UX Audit
UX audit driven by session data, not designer opinions.
Heuristic evaluation, WCAG 2.2 AA accessibility check and 50+ session recordings. Each issue ranked by lost revenue.
- Issues backed by session recordings, funnel data or WCAG criteria
- Implementation-ready with Figma annotations
- Mobile and desktop analysed separately

Trusted by ambitious brands worldwide
Rated #1 SEO agency in The Netherlands by SEMrush
100% Dedicated to SEO
SEO is all we do, and we're good at it
Most UX audits are a designer's opinion in a PDF. Most designer opinions do not survive contact with users. We start with the session recordings.
Our senior UX auditors combine Hotjar or Microsoft Clarity session replays, GA4 funnel and scroll-depth analysis, dead-click detection and a Nielsen 10 heuristic evaluation with a WCAG 2.2 AA accessibility audit using axe DevTools and WAVE. Every recommendation lands in a prioritised Notion or Linear tracker your team can ship against, with Figma annotations and code references where they help.
How a LASEO UX audit runs in five phases
A three- to four-week audit broken into five phases, each with a deliverable you can sign off. No black box, no proprietary methodology that locks you in.
- 01
Quant baseline (week 1)
GA4 funnel analysis per device, bounce and exit-rate review per template, scroll-depth analysis on the top 20 pages, dead-click detection in Hotjar or Microsoft Clarity. Deliverable: baseline report identifying the five funnel steps with the biggest drop-off and the screens worth investigating in qual.
- 02
Qual analysis (weeks 1-2)
50+ session recordings reviewed across device types and traffic sources, tagged for friction patterns: form abandonment, navigation confusion, rage-clicks, mis-tapped CTAs. If budget allows, five short user interviews on the worst-performing flows. Deliverable: friction inventory with timestamped recordings.
- 03
Heuristic and accessibility evaluation (weeks 2-3)
Nielsen 10 heuristic evaluation across the audited templates, plus a full WCAG 2.2 AA accessibility check using axe DevTools, WAVE, manual keyboard testing and screen-reader walkthrough on NVDA or VoiceOver. Information architecture review including tree-testing in Maze if scope allows.
- 04
Prioritised issue list and Figma annotations (week 3)
All issues consolidated into a single ranked list in Notion or Linear, each tied to a session recording, a funnel drop or a failed WCAG criterion. Figma annotations on the affected screens. Each issue has expected impact in conversion or accessibility terms, effort estimate, and a recommended fix.
- 05
Implementation handoff and post-fix monitoring (week 4)
90-minute working session walking your design and engineering team through the top-priority issues. Post-fix, we re-check the top five issues in GA4 and Hotjar to confirm the friction is gone. Optional ongoing engagement: monthly UX scorecard tracking the same metrics.
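The drop-off ranking in the phase 01 deliverable is simple arithmetic once per-step session counts are exported from GA4. A minimal sketch in Python, with made-up step names and numbers (none of this is client data):

```python
# Hypothetical per-step session counts for a checkout funnel,
# as exported from a GA4 funnel exploration. Numbers are illustrative.
funnel = [
    ("product_view", 12000),
    ("add_to_cart", 4200),
    ("checkout_start", 3100),
    ("payment_info", 1400),
    ("purchase", 1150),
]

def drop_offs(steps):
    """Sessions lost and percentage drop between consecutive funnel steps."""
    out = []
    for (name_a, n_a), (name_b, n_b) in zip(steps, steps[1:]):
        lost = n_a - n_b
        out.append({
            "step": f"{name_a} -> {name_b}",
            "sessions_lost": lost,
            "drop_pct": round(100 * lost / n_a, 1),
        })
    # Rank by absolute sessions lost: that is where fixes pay back fastest.
    return sorted(out, key=lambda d: d["sessions_lost"], reverse=True)

for row in drop_offs(funnel):
    print(row)
```

With these numbers the product page to cart step tops the list; the qual phase then explains why users bail there.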

Think of us as your fractional head of UX research.
We sit between your product team and your developers. A quant baseline from GA4 funnels and scroll-depth. A qual layer from session recordings and short user interviews. A heuristic and accessibility pass on top. The result is a single prioritised issue list with Figma annotations and code references, not a slide deck of subjective critiques.
A UX audit is an evidence-led diagnosis of where users struggle on your site. Not a redesign, not a brand exercise, not an opinion piece. The goal is a ranked list of fixable issues with evidence attached. Distinct from a design audit, distinct from a CRO test, distinct from an analytics review.
UX audit vs design audit
A design audit looks at the visual layer: typography, spacing, hierarchy, brand consistency. A UX audit looks at the full experience: can users complete the task, where do they hesitate, what breaks accessibility. Design audits use a designer's eye; UX audits use session recordings and analytics. You can buy both; they answer different questions.
UX audit vs CRO programme
An audit is a diagnostic at a point in time. A CRO programme is ongoing experimentation against the issues the audit found. The audit tells you where to test, the CRO programme runs the A/B tests and ships the winners. Most companies audit once a year and run CRO continuously.
Heuristic evaluation is necessary, not sufficient
Nielsen 10 heuristics are a useful checklist. They will catch obvious problems like inconsistent navigation or missing system feedback. They will not catch the specific friction your users hit because they do not know your product. A heuristic-only UX audit is half an audit.
Accessibility is part of UX, not a separate audit
WCAG 2.2 AA covers keyboard navigation, contrast, focus order, screen-reader announcement, motion preferences and form labels. Around one in five of your users benefits directly from accessibility fixes. Excluding accessibility from a UX audit produces a half-finished list.
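One of those checks, contrast, is worth demystifying: WCAG defines it as a ratio of relative luminances, and AA requires at least 4.5:1 for normal-size text. A minimal Python sketch of the published formula (the hex colours are arbitrary examples, not client branding):

```python
def _linearise(channel):
    """sRGB channel value (0-255) to a linear value, per the WCAG definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour):
    """WCAG relative luminance of a colour like '#767676'."""
    h = hex_colour.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearise(r) + 0.7152 * _linearise(g) + 0.0722 * _linearise(b)

def contrast_ratio(fg, bg):
    """Contrast ratio between two hex colours, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# AA threshold for normal-size text is 4.5:1.
print(round(contrast_ratio("#000000", "#ffffff"), 2))  # 21.0, the maximum
print(round(contrast_ratio("#777777", "#ffffff"), 2))  # just under 4.5, fails AA
```

Tools like axe DevTools and WAVE run this same calculation across every text node, which is why contrast failures surface so reliably in an automated pass.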
Tools we actually use
Hotjar or Microsoft Clarity for session recordings and heatmaps, Fullstory for enterprise volumes, GA4 for funnel and scroll-depth analysis, axe DevTools and WAVE for accessibility, Maze for tree-testing the navigation IA, Figma for the annotation deliverable. We do not use proprietary internal tools.
Audit pricing in Europe
A serious senior-led UX audit in Europe runs EUR 4,500 to EUR 9,000 depending on scope, number of templates and whether user interviews are included. Anyone selling a UX audit for EUR 1,500 is selling a heuristic checklist filled in by a junior, with no session data or accessibility coverage.
Why LASEO vs alternatives
UX audit options compared honestly
Design agencies, analytics consultants and in-house teams all run UX audits in their own way. Here is what each is good at, and where LASEO fits.
Opinion piece in PDF format
- Visual critique by a senior designer based on their experience and your screenshots. No session recordings, no analytics, no accessibility coverage.
- Funnel analysis and bounce-rate report. Knows where users drop, has no design eye for why and no recommended fix.
- Done by a designer who is too close to the product, with no external benchmark and no time blocked for accessibility testing.
- Not included, sold separately by a different vendor, often three months later.
- 60-page PDF that gets read once and forgotten.
- Delivered, invoiced, gone. Your team works out how to action the findings on their own.
Evidence-led, ship-ready issue list
- Heuristic evaluation is one of five inputs, alongside GA4 funnels, 50+ session recordings, WCAG 2.2 AA accessibility and tree-testing. Every issue has evidence attached.
- Quant baseline is step one, not the whole audit. Senior UX auditors interpret the why, run the heuristic pass and recommend the specific fix per issue.
- External senior eyes, dedicated three to four week scope, full accessibility pass on every audit, no internal politics to navigate.
- WCAG 2.2 AA audit included in every engagement, using axe DevTools, WAVE, manual keyboard testing and NVDA or VoiceOver screen-reader walkthrough.
- PDF for leadership, Notion or Linear tracker for execution, Figma annotations for designers, Looker Studio dashboard for ongoing tracking. Four formats, one source of truth.
- 90-minute handoff session with design, engineering and product. Two weeks of Slack support. Post-fix re-check on the top five issues to confirm friction is gone.
UX audits in production
UX audits that paid back the engagement many times over
What our UX audit clients say
UX audit results in their own words
Their UX audit found a contrast-failing 'Add to cart' button that was costing us about 8 percent of mobile conversions. Three days of dev work, one ticket, measurable lift the week after. Nobody on our design team had spotted it in two years.
We had run our own usability tests and were convinced the checkout was fine. The Hotjar replays LASEO pulled showed 38 percent of mobile users hesitating for 12+ seconds on the shipping step. We rewrote two lines of copy and the abandonment rate dropped.
The accessibility section of the audit alone was worth the engagement. They found eleven WCAG 2.2 AA failures our previous design agency had never mentioned, including keyboard traps in our filter UI. Our legal team also slept better afterwards.

UX audit scoping call
Bring a UX problem to a senior UX auditor.
30 minutes with a LASEO senior UX auditor. We will not pitch. We will look at your site, name the three flows we suspect are losing the most conversions, and tell you whether a UX audit is the right next step or if you need something else.