If you work in UX research, and especially in UX optimization, you’ve probably come across the term UX Audit. A UX Audit is essentially a methodological research approach in which one or more UX specialists evaluate an interface using a combination of methods to uncover experience issues and identify opportunities for improvement. It doesn’t necessarily mean that something is “wrong” with the interface; rather, it reflects the understanding that there is always room to optimize, and that UX is a continuous process, never a finished product.
In many cases, full redesigns simply aren’t feasible due to budget constraints, tight timelines, limited team resources, or competing priorities. And truthfully, it’s often not even necessary. This is where optimization comes in. A strong UX specialist should be able to recognize both the strengths and weaknesses of an interface and know how to leverage both to incrementally improve the overall experience.
In this “not‑so‑short” article, I’ll bring together the most valuable lessons I’ve learned over two years auditing global brands’ interfaces and e‑commerce experiences. My goal is to share the principles, methods, and practical insights that consistently lead to impactful, data‑driven UX improvements.
Lesson 1. When to apply UX Audits?
I’ll have to be “that UX Researcher” and say: it depends.
There are many situations where a UX Audit can generate meaningful business value. Generally, auditing implies that there is something to audit, so it’s not typically a discovery method used at the very beginning of a UX process. But once there’s an existing interface (even a very early one), different contexts may trigger an audit, and each context shapes how we approach it as UX designers and researchers.
Below are some of the most common scenarios I’ve encountered, along with how they influence the audit strategy.
“Our KPIs are dropping”
This is the most targeted audit that usually comes up. Because you start with clues about what to look for, your recommendations can be more focused. Cross‑validate different data sources to confirm the assumed problem, investigate recent interface changes that could be contributing to friction, benchmark against competitors or similar interfaces to see how others solve the issue, and prioritize recommendations that directly address the KPI decline.
“User Complaints are increasing”
These audits start from qualitative signals like user frustration, feedback emails, call center notes, store‑level feedback, etc. To approach them, categorize complaints into thematic groupings to understand which issues surface most often. Conduct cognitive walkthroughs and session replay deep‑dives aligned with those themes. Treat user complaints as hypotheses and validate them through behavioral data whenever possible.
“We want to redesign our interface”
In redesign contexts, constraints are usually lighter, and there’s more room for larger changes. Given that freedom, run a comprehensive audit covering the entire experience and use the findings to shape the redesign strategy. Provide practical, ideally visual suggestions for solving each issue and ensure a clear chain of reasoning: problem → insight → recommendation.
“We want to do A/B Testing”
Many times, UX Audits belong as much to CRO as to UX. This is UX at its closest to marketing, and it’s typically applied to commercial interfaces such as e-commerce sites, brand websites and mobile apps, or lead-generation landing pages. When auditing in the context of A/B testing, focus on identifying measurable hypotheses, and look for quick wins and known friction points (mobile PDPs, CTAs, checkout). Translate findings into high‑impact test ideas and ground recommendations in known best practices for e‑commerce, marketing funnels, or landing pages.
“We simply want to optimize”
This might sound like the easiest request, but in reality it’s the toughest and most time‑consuming, mainly because you don’t know what you’re looking for, so you must look at everything. In this case you have the chance to take a more strategic role: instead of just looking at what you are told to, take a holistic view and define a UX optimization roadmap proactively. Start with the most critical and impactful issues and work down to the more subtle details. Pay attention to UX debt that often goes unnoticed in targeted audits. It’s especially important to ask stakeholders about budget, resources, UX maturity, and short‑ and long‑term objectives.
“We are just now launching our interface”
This scenario is tough because you’ll have minimal behavioral data and a small user pool. In my opinion, if there isn’t enough data for a quantitative approach, the better route is to go qualitative with moderated usability testing. You don’t necessarily need a usability testing tool that lets you run sessions very formally. Instead, run semi-structured formative tests with a small sample (5 to 8 participants, for example) and focus on qualitative insights that allow you to identify key usability issues and iterate rapidly. If usability testing isn’t an option for any reason, the highest‑fidelity alternatives include pure heuristic evaluations with proven frameworks, cognitive walkthroughs, task analysis, or comparison against best‑practice playbooks (there are some available online!). Just make sure to let your stakeholders know that you are working from assumptions and personal expertise, not actual user behavior. Validate critical flows like checkout, registration, and onboarding, as well as quick wins that won’t clog development teams’ pipelines.
Bonus: In some cases, internal feedback loops can be incredibly valuable. Talk to SEO specialists, QA testers, webmasters, product managers, and developers. They often have practical context or recurring issues on their radar that can enrich your hypotheses or reveal problems you’d never find alone.
Understanding the context is half the audit.
Lesson 2. Communicate with your stakeholders
I cannot stress enough how important this step is.
Stakeholders who own the website or interface will always have more context than you.
They know the goals, budgets, priorities, product lifecycle stage, available tools, timelines, roadmaps, and even what has already been tried in the past. Before you dive into any analysis, you’ll want them to fill out a brief survey (the more detailed, the better) and, ideally, you should meet them. You’d be surprised by how much you can learn in a 30‑minute call versus any form response.
Most stakeholders won’t have the time (or the UX maturity) to clearly articulate their motivations in a written format. That’s why a simple video call can make a huge difference. It’s your chance to build rapport, understand their expectations, and get a feel for the real underlying problems. And by understanding them, you’ll be in a much better position to tailor your audit to what they truly need, not just the content and scope, but also the depth, which flows/screens to focus on, and even the language you use.
For example, a project manager won’t care about which specific heuristic is being violated; they care about the potential business impact. A designer, on the other hand, might want exactly that level of detail.
BTW! Define Scope and Objectives
Especially when it’s the first time this stakeholder is requesting a UX Audit, they might not know what to expect from you. So take a bit of time before the project starts to explain what you’ll be doing, what data you’ll be basing the audit on, and the timeline for the project, and request any access you’ll need, e.g., GA4 and other available tools, CMS access, or a dummy account to review gated content, portals, or checkout flows.
From my experience, active listening is key. Acknowledge their thoughts and concerns but also remember that you’re probably the most UX‑experienced person in the room. That means you have a responsibility to listen beyond the words being said. Ask yourself whether the issues they’re raising are truly the root cause or if they’re just symptoms of something deeper.
Bonus: If you struggle with these kick‑off meetings, the 5 Whys Framework is an extremely accessible way to get to root causes quickly. And don’t be afraid to challenge assumptions. There’s a sweet spot between respecting the stakeholder’s knowledge and gently showing that some of their assumptions might not be entirely accurate.
Another thing is defining objectives. Sometimes the goal of an audit is vague, and, as we discussed previously, that’s okay. But other times, clients will have a crystal‑clear idea of what they want, and you really need to pay attention to that. If the client is interested in improving accessibility, spending hours on CTA optimization is not only irrelevant, but it will also actively hurt your project. It extends your workload unnecessarily, leaves you less time to focus on what really matters, and ultimately results in a longer, denser, less helpful report that will frustrate both you and the stakeholder.
It can be tempting to include every single observation you find, because you see all the imperfections. But in my experience, adopting a “less is more” mentality almost always helps your clients more. It positions you as a strategic partner who understands the business and can prioritize effectively, instead of someone who just checks boxes and sends an overwhelming list of issues.
That being said, don’t hesitate to let stakeholders know when there are other areas worth auditing. Just don’t cram it into the current scope. Instead, propose it as a follow‑up once your primary work is done. This keeps your scope clean while showing initiative and reinforcing trust.
Lesson 3. It’s not (only) about opinion and taste
While I genuinely believe UX professionals should develop a critical eye based on experience and visual culture, a UX Audit is not a design critique, and the final outcome should never be a list of personal opinions. This is where a lot of designers slip up. Instead of being methodical and backing their recommendations with data, they lean on taste. And taste, more often than not, is neither aligned with user needs nor business goals.
How do you avoid this pitfall?
Data! Don’t be afraid to combine your UX expertise with as much relevant data as you can gather. Data comes in many shapes and from many sources (some more “important” than others). A good UX Specialist should be able to read quantitative and qualitative data, cross-reference different signals, and interpret everything in the context of the project.
Here are a few of the main data sources you’ll likely use in UX Audits, and how I personally approach them:
Heatmaps and Interaction Maps
If you’re analyzing specific pages or need a granular view of how users behave, heatmaps are your best friend. And don’t stop at click maps! Look at dead clicks, rage clicks, scroll depth, average fold, first‑click patterns, and so on. Heatmaps become especially powerful when you combine them to tell a story.
For example, low submit‑button interactions + very low scroll depth = a form that’s probably too long and pushing the CTA way down.
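This kind of signal combination can even be turned into a simple flagging rule. The sketch below is purely illustrative: the metric names and thresholds are hypothetical, so adapt them to whatever your heatmap tool actually exports.

```python
# Illustrative sketch: flag pages whose combined heatmap signals suggest
# the CTA is buried below a long form. Metric names and thresholds are
# hypothetical -- tune them to your own tool's exports and baselines.

def flag_buried_cta(pages, max_submit_rate=0.02, max_scroll_depth=0.40):
    """Return URLs where submit interactions AND scroll depth are both low."""
    flagged = []
    for page in pages:
        if (page["submit_click_rate"] <= max_submit_rate
                and page["avg_scroll_depth"] <= max_scroll_depth):
            flagged.append(page["url"])
    return flagged

sample = [
    {"url": "/contact", "submit_click_rate": 0.01, "avg_scroll_depth": 0.35},
    {"url": "/pricing", "submit_click_rate": 0.08, "avg_scroll_depth": 0.90},
]
print(flag_buried_cta(sample))  # -> ['/contact']
```

The point isn’t automation for its own sake; it’s that writing the rule down forces you to state explicitly which combination of signals supports your observation.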
Just keep in mind the downside: heatmaps are usually very granular. If you’re trying to understand global navigation patterns, say, how users interact with the header, heatmaps typically won’t show aggregated behavior across pages.
Session Replays
Analyzing session replays is time‑consuming, but incredibly valuable, especially when usability testing isn’t an option. They let you see what users are doing in real time. I’ve used them a lot when comparing behavior across devices, for example, understanding why the same component works great on desktop but completely fails on mobile.
Bonus: Include short session replay clips when doing a stakeholder presentation. They’re a very visual way to illustrate a point to less technical audiences.
User Journey Analysis
Many analytics tools let you see the paths users take from entry to exit. This is perfect for spotting friction in key flows. It becomes really useful when combined with metrics like abandonment rate, bounce rate, and time on page.
These journeys can also be shaped into funnels or journey maps. Break them down by device, geography, source, etc., and you’ll uncover patterns you’d never see otherwise. You can even translate these insights into proto‑personas if needed.
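In code, the funnel view of a journey is just step counts and the drop-off between consecutive steps. The numbers below are fabricated for illustration:

```python
# Hypothetical funnel: users reaching each step of a checkout journey.
funnel = [("product", 10000), ("cart", 3200), ("checkout", 1500), ("purchase", 900)]

# Drop-off rate between each pair of consecutive steps.
for (step_a, users_a), (step_b, users_b) in zip(funnel, funnel[1:]):
    drop = 1 - users_b / users_a
    print(f"{step_a} -> {step_b}: {drop:.0%} drop-off")
```

Running this per device or traffic source (one funnel list per segment) is exactly the kind of breakdown that surfaces the patterns mentioned above.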
Longitudinal Behavioral UX Analytics
This can feel intimidating if you’re not used to quantitative analysis, but analytics can massively level up your audit by adding a time dimension. You’ll be able to understand patterns, confirm trends, validate past decisions, and measure changes.
Do numbers tell the whole story? No. But they do make your arguments easier to communicate, especially when dealing with stakeholders who all “speak different languages.” Numbers work a lot like a universal translator.
Bonus: Learn your dashboards and some basic data‑visualization principles! It makes a huge difference. I’ll share tools and resources later in the article, but in my most challenging audits, the biggest difficulty was cleaning, preparing, and presenting data. To be honest, I never looked into data visualization much, but it is an incredibly powerful asset in your UX toolset. Federica Fragapane has an incredible online course on this topic; I’ll leave it in the resource chapter.
Surveys, Complaints, Comments & Reviews
These user-generated signals bring the “why” behind behavior, something analytics alone can’t provide. They reveal frustrations, expectations, and sentiments that numbers struggle to capture. Again, they take longer to analyze, but AI can help a lot if you know how to use it.
My approach to this data: start by categorizing it into logical groupings, identify high-frequency pains as well as connections between themes, cross-reference with findings from heuristic evaluations or other methods, and translate everything into clear observations.
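As a rough illustration of the categorization step, here is a minimal keyword-based tagger with a frequency count. The theme/keyword map is made up; in practice you’d refine it iteratively (or hand the tagging to a text-classification model or an LLM, as hinted above).

```python
# Minimal sketch: tag each complaint with a theme via keyword matching,
# then count theme frequency. Keywords here are illustrative placeholders.
from collections import Counter

THEME_KEYWORDS = {
    "checkout": ["checkout", "payment", "card"],
    "performance": ["slow", "loading", "lag"],
    "navigation": ["find", "menu", "lost"],
}

def tag_theme(complaint: str) -> str:
    """Return the first theme whose keywords appear in the complaint."""
    text = complaint.lower()
    for theme, keywords in THEME_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return theme
    return "other"

complaints = [
    "Payment keeps failing at checkout",
    "The site is so slow on my phone",
    "Can't find the returns page anywhere",
    "Checkout form lost my card details",
]
counts = Counter(tag_theme(c) for c in complaints)
print(counts.most_common())  # checkout surfaces as the top theme here
```

The output of this kind of tally is what feeds the “high-frequency pains” step: it tells you which themes deserve the cognitive walkthroughs and replay deep-dives first.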
Bonus: Stakeholders love seeing direct user voices. Use quotes, word clouds, sentiment barometers, or incorporate them into user journey maps! It makes everything more real and human.
BTW! Unleash the UX Researcher in you
I often see UX Specialists relying either only on tools or only on their expertise. In reality, the best audits combine expert evaluation + data + first-hand research. So don’t just analyze what tools give you. Pair it with usability testing, heuristic evaluations, competitive benchmarking, cognitive walkthroughs, and anything else time allows.
Also, beyond being a UX Researcher/Designer/Specialist or whatever you call it, you’re probably also acting as a UX Advocate in your team or organization. That means bringing the user into the conversation whenever you can. I strongly encourage you to push for methods like user interviews, usability tests, card sorting, first-click testing, five‑second testing, or tree testing. They’re often cheap, fast, and surprisingly insightful. And giving users a voice, even in small ways, can be the difference between an average audit and a meaningful one that serves both business goals and user needs.
Lesson 4. The tools of the trade
To be completely honest, this is one of the least important parts of learning how to run UX Audits. Ironically, that’s because there are so many tools out there and most of them are pretty great. Sure, some have proprietary features, some lean more toward marketing or CRO, some will match your workflow better than others… but generally, they all do similar things. After a handful of audits, you’ll know exactly what each tool can and can’t do.
Like many things in UX, tools come and go. They evolve, get acquired, get rebranded, get discontinued, and get replaced. That’s why I always say:
Be tool‑agnostic.
Focus on building your skills, your critical thinking, and your methodological intuition. The tool is just the vehicle.
That said, here are the tools I know and recommend, grouped by what they’re mainly used for (even though many of them do more than one job):
Behavioral Analytics and Event Tracking
These tools capture behavioral events, funnels, trends, and all the longitudinal data we talked about earlier. They’re the backbone for understanding “what” users do over time.
Some of the industry standards:
- Google Analytics 4 (free, tons of tutorials everywhere);
- Amplitude;
- Mixpanel;
- Adobe Analytics.
Granular Analysis
These tools give you qualitative insights without recruiting users. Think heatmaps, interaction data, session replays, etc. All the juicy stuff you can explore solo.
My go‑tos:
- Glassbox;
- Microsoft Clarity (free and honestly shockingly good);
- Hotjar;
- FullStory.
Voice of User Platforms (Surveys, NPS, CSAT, Qualitative Feedback)
These tools collect user sentiment, from structured surveys with Likert Scales, to open‑ended comments.
Solid options:
- Qualtrics;
- GetFeedback;
- Typeform (they always look amazing);
- Survicate.
Accessibility Auditing
Accessibility is increasingly required by good practice and, recently in the EU, by law, so you’ll likely bump into accessibility auditing tools, including:
- Deque / Axe DevTools;
- Google Lighthouse (also a good starting point).
Information Architecture Optimization
For labeling, navigation, IA, and structure. Note that these require real users, which can introduce legal/procurement hurdles. Good choices include:
- Optimal Workshop;
- UXTweak;
- Lyssna.
Performance Checkers
Website speed, stability, and technical errors matter more than many UX Specialists initially think. I was never required to be deeply versed in these topics, but I’ve realized that some knowledge of how pages and screens load helps you identify bottlenecks and give better recommendations. Other tools often have performance metrics integrated, but if you don’t have access to those, you can start with these:
- Google Lighthouse;
- PageSpeed Insights;
- Semrush Web Core Vitals.
Data Analysis and Reporting
There is a plethora of tools that help in data analysis and visualization. If you’re lucky, your team might have a data analyst or someone that can help you in these tasks. But if you have to do it on your own, modern tools also help a lot, even if you are not super experienced in this field.
At minimum, get comfortable with Excel or Google Sheets: pivot tables, filters, simple data cleaning, basic charts, and CSV handling.
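The pivot-table idea translates directly to code, which can help demystify it. Here is the same device-by-page aggregation in plain Python, with fabricated analytics rows (pandas’ `pivot_table` does the equivalent in one call if you go that route):

```python
# The pivot-table idea in miniature: aggregate a metric across two
# dimensions. Rows are fabricated analytics exports for illustration.
from collections import defaultdict

rows = [
    {"device": "mobile",  "page": "/home",     "bounces": 120},
    {"device": "mobile",  "page": "/checkout", "bounces": 310},
    {"device": "desktop", "page": "/home",     "bounces": 40},
    {"device": "desktop", "page": "/checkout", "bounces": 55},
    {"device": "mobile",  "page": "/checkout", "bounces": 90},
]

# Sum bounces per (device, page) pair -- exactly what a pivot table does.
pivot = defaultdict(int)
for row in rows:
    pivot[(row["device"], row["page"])] += row["bounces"]

for (device, page), total in sorted(pivot.items()):
    print(f"{device:8} {page:10} {total}")
```

Whether you do this in Sheets, pandas, or a loop like the one above, the mental model is identical: pick two dimensions, pick a metric, aggregate.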
If you know your way around data, tools such as Power BI or Looker Studio are also good complements for more complex analysis and visualizations.
Bonus: Always, always turn on Autosave in Excel. Do it immediately. No exceptions. You’ll thank me later.
Again, this entire section is the most likely to become outdated. Tools change constantly. What doesn’t change is your ability to think, analyze, interpret, and connect dots. Focus on that.
BTW! Mind your Data Retention Limits
If you work with the same brand or product for a while, sooner or later you’ll want to compare current performance to past time periods. And I can almost guarantee there will come a moment where you realize you can’t because the data is gone. Many tools only retain data for X months or X events, and once it’s gone, it’s gone.
Honestly, there’s no magic fix here, but here are two pieces of advice that will save you future heartache:
- Try to anticipate what you’ll need and save important data locally. Screenshots, CSV exports, dashboards, charts, etc. Be organized, name files clearly, and keep everything accessible for future reference if needed.
- Learn your tool’s limitations as soon as you start using it. And when discussing audit scope with stakeholders, set expectations early. Let them know what is and isn’t possible based on retention limits.
Your future self will thank you.
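The “save it locally before it expires” habit can be as simple as a dated copy script. A minimal sketch, assuming CSV exports; the paths and label are placeholders to adapt to however you organize audit materials:

```python
# Hedged sketch: copy a tool export into a clearly named, dated archive
# folder so it survives the tool's retention window. Paths are placeholders.
import shutil
from datetime import date
from pathlib import Path

def archive_export(export_path: str, archive_dir: str, label: str) -> Path:
    """Copy an export to archive_dir as '<YYYY-MM-DD>_<label>.csv'."""
    dest_dir = Path(archive_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / f"{date.today().isoformat()}_{label}.csv"
    shutil.copy2(export_path, dest)
    return dest

# Usage (hypothetical paths):
# archive_export("downloads/ga4_funnel.csv", "audits/acme/exports", "ga4_funnel")
```

The date-first naming convention means the folder sorts chronologically on its own, which pays off the day you need “the same dashboard, but from last quarter.”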
Lesson 5. What are we hunting for
When scopes and goals are unclear, UX Audits can become daunting. You don’t want to leave money on the table, and yet there’s always so much to look for, often with deadlines that are, let’s say, not ideal. When you’re first starting out, it’s easy to fall into the trap of listing issues randomly without much structure. Over time, the big realization is that the more methodical you are, the less likely you are to miss something important.
To help you get started, I’ve put together a set of common “topics” that usually show up in UX Audits, along with the types of questions you might try to answer for each one. Think of this as a checklist you can refer to, not something you must always follow.
Every audit is different, and there’s no single “correct” recipe. Be structured, but stay flexible. And above all, be critical.
Additionally, I’ll try to describe my personal process step-by-step. Take it with a grain of salt, as there is no one correct way to do it and chances are that you’ll adopt your own process as you go.
What to look for on a general UX Audit
Again, you don’t have to look at every single one; take this as a more or less complete list of ubiquitous elements of a website experience that you can refer to when needed. Likewise, you might consider other elements in your audits or opt to answer different questions.
Usability
- Can users complete core tasks without confusion or unnecessary steps?
- Are error messages clear and helpful?
- Is the interface consistent in patterns and behaviors?
Website Performance
- Does the site load quickly across devices and network conditions?
- Are performance bottlenecks impacting user experience (e.g., slow scripts, large images)?
- Are the Core Web Vitals scores acceptable?
Navigation and Information Architecture
- Do users understand where they are and how to get to their desired destination?
- Is the menu structure logical and aligned with user mental models?
- Are search and filtering options effective?
Accessibility
- Does the site meet WCAG standards for color contrast, keyboard navigation, and screen reader compatibility?
- Are interactive elements accessible to all users?
- Are alt texts and ARIA labels properly implemented?
Visual Design
- Is the visual design aligned with brand guidelines?
- Are colors, typography, and spacing consistent across pages?
- Does the design support readability and scannability?
Imagery and visuals
- Are images optimized for performance without losing quality?
- Do visuals support content and not distract from key actions?
- Are decorative images properly marked for accessibility?
Copy
- Is the language clear, concise, and aligned with brand tone?
- Are CTAs descriptive, action-oriented, and easy to understand?
- Is microcopy (form labels, error messages) helpful and user-friendly?
Conversion-Rate Optimization
- Are CTAs visible and placed at the right points in the journey?
- Are there unnecessary steps in the conversion funnel?
- Is trust-building content (reviews, security badges) present where needed?
Key User Flows and Funnel Performance
- Where do users drop off in critical flows (checkout, signup)?
- Are there friction points in multi-step processes?
- Are success metrics (completion rates) tracked and improving?
Form Optimization
- Are forms easy to complete with minimal cognitive load?
- Are validation messages clear and real-time?
- Is the number of required fields reasonable?
Layout
- Does the layout create a clear visual hierarchy that guides users toward key actions?
- Are spacing and alignment consistent across pages and components?
- Does the layout adapt well to different content lengths without breaking the design?
Responsiveness
- Is the experience optimized for all breakpoints (mobile, tablet, desktop)?
- Do interactive elements remain usable and visible on smaller screens?
- Are images, text, and components scaling properly without causing horizontal scrolling or layout shifts?
My process
This is just my way of doing things and you’ll likely evolve your own over time. UX Audits are not linear, even though they sometimes look like they should be. Take this as a starting point.
Before the Audit
- Read through the survey/briefing — Get familiar with the product, the niche, the business, and the basic context;
- Meet with the Stakeholder — Listen more than you talk. Clarify expectations, KPIs, and the underlying motivations behind the request;
- Decide on Scope and Goals — Agree on what will be audited, in what depth, and under which timeline;
- Prepare the Template — Ensure that your audit report is structurally and visually tailored to the brand you are working with;
- Ensure that you have the required accesses and data is available — Request access to any tools collecting website data and to a dummy account (if needed), ask to see previous audits (if available).
During the Audit
From General to Specific, from data-driven to expertise-based
- Investigate Website-wide Longitudinal Behavioral UX Analytics — Get a high-level view of the overall experience and trends over time. This sets your baseline;
- Look at data per page and section — Apply the aforementioned methods, including heatmaps, UX analytics, typical navigation patterns before and after the page, and anything else that is available and relevant. Do this for every section, page, and flow agreed during scope definition. Don’t forget to break down your data by relevant criteria, e.g., device, user type, etc.;
- Run a Heuristic Evaluation and Best Practice Comparison — Use your expertise, frameworks, and competitive benchmarks to identify issues and opportunities;
- Write clear, specific, and actionable observations — Write one observation per issue found, each composed of a description of the issue and a separate recommendation. Try not to go beyond one paragraph for each;
Bonus: Number observations for easy referencing, and add a level of criticality (apply a prioritization framework or eyeball it to define Low, Medium, and High importance) and a category (you can use the list in Lesson 5);
- Illustrate your findings — Add screenshots, session replay clips/GIFs, best practices from other websites, and (if time allows) mockup proposals to your observations. You don’t have to do this for every single observation, only for the most critical, impactful, or hardest to explain verbally.
Bonus: Don’t be afraid to annotate your screenshots or add elements like red rectangles to frame where you want your audience to look.
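If you’d rather not eyeball the criticality levels, a tiny scoring rule makes them reproducible across auditors. The impact × frequency product and the cutoffs below are an illustrative convention of mine, not a standard framework:

```python
# Illustrative prioritization sketch: score each observation on impact and
# frequency (1 = low, 3 = high) and map the product to a criticality level.
# The cutoffs are an example convention, not an industry standard.

def criticality(impact: int, frequency: int) -> str:
    """impact, frequency: 1 (low) to 3 (high)."""
    score = impact * frequency
    if score >= 6:
        return "High"
    if score >= 3:
        return "Medium"
    return "Low"

print(criticality(3, 3))  # -> High
print(criticality(3, 1))  # -> Medium
print(criticality(1, 2))  # -> Low
```

Even a rule this crude helps when two auditors share a report: it turns “feels important” into something you can defend and apply consistently.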
After the Audit
- Create an Action Plan — Create categories based on logical criteria (e.g., most critical, low-hanging fruit, etc.) and organize your observations into these categories, creating a kind of timeline or roadmap;
- Design and compile an effective Executive Summary and General Appreciation — Place it at the beginning of the audit report: 1 or 2 pages summarizing the report and pointing out the most critical observations;
- Review everything — Audits can get quite extensive and sometimes take shape over several weeks, so review your work before exporting and delivering;
- Review again — If you can afford it, take a day away from the report. Come back with fresh eyes;
- Send the report to the relevant stakeholders — Send it in a common format like PDF (not Figma, unless you know your stakeholder has an account and knows their way around it). Be cordial and make yourself available to answer questions. Don’t forget to suggest a couple of timeslots for a discussion and presentation session;
- Present your work — In the discussion session, go through the Executive Summary and General Appreciation. If the stakeholder had time to review the deck and has discussion topics, let them guide the meeting. If not (likely), have a set of top 10 observations prepared to present in more depth (I’ll give you some additional tips in Lesson 6);
- Reach out! — Don’t let the audit die after the presentation. Reach out to the stakeholder after a while to ensure your work was effective, gather feedback, and present ideas for the next audit.
Lesson 6. The value is in the message
When you’ve spent weeks examining every little detail of an interface, the last thing you want is for all that knowledge to die quietly in someone’s inbox, buried under 200 unread emails. Unfortunately, this happens a lot. Stakeholders are busy, juggling fires left and right, and sometimes your carefully crafted audit is simply not their top priority.
While this can feel out of our control, I believe we should take accountability even after we conclude the audit. Over time, I’ve seen audits disappear into the void, and I’ve tried different ways to increase engagement and keep the momentum alive. There’s no perfect formula but there are things you can do to dramatically improve the chances of your work being seen, understood, and acted upon.
1. Keep It Simple, Stupid (KISS)
Just kidding but also not really. Don’t try to cram your entire audit into a 30‑minute presentation. Focus on:
- Answering the stakeholder’s real questions
- Highlighting the observations that actually matter for their objectives
If you overshare, you lose them. If you’re concise, you keep them.
2. Use short sentences and avoid technical jargon.
UX is full of terminology we use every day without realizing how opaque it sounds to others. Be empathetic and speak plainly. You don’t sound “more expert” when people can’t follow you, you just sound confusing. Clarity builds credibility.
3. Make it visual.
Images, videos, annotated screenshots, data visualizations: these are your storytelling superpowers.
Visuals help stakeholders stay engaged, understand your points faster, remember your message, and see the problem without needing to dig through the entire report.
If you can prepare a shorter “presentation-friendly” version with just the deep dives you plan to show, even better.
4. Time your presentation intentionally.
Always schedule the presentation 3 to 7 days after delivering the report.
This gives stakeholders enough time to skim the document but not enough time to forget it exists or get pulled into something else.
5. Mind Your Deliverables
A good audit can be more than a plain PDF. Consider improving your outputs across several dimensions:
- Improve your report’s layout, structure, colors, readability.
- Include different formats, e.g., databases, dashboards, mockups, or annotated screenshots.
- Add hypotheses and practical proposals, not just problems.
- Tag your observations! Quick wins, content issues, CRO issues, accessibility, performance, etc. This makes your audit more semantic and more searchable.
6. Know Your Audience
Don’t be afraid to send the audit and the presentation invite to people beyond the main stakeholder, especially UX‑friendly allies who might support or influence decision‑making even if they’re not final approvers. The more internal advocates you have, the more likely your recommendations will move forward.
As good a UX professional as you are, chances are you are not the decision maker regarding roadmaps and acting upon recommendations. This means that your potential for impact is limited not only by your capacity to extract impactful, grounded observations and practical, actionable recommendations, but also by your capacity to convince others that those are worthwhile, profitable investments.
Lesson 7. What to Avoid
To finish up, I want to leave you with some of the mistakes I’ve made myself and others I’ve seen from UX Specialists over the years. Consider these gentle warnings from someone who has stepped on all the rakes so you (hopefully) don’t have to 🙂
1. Only pointing out the bad stuff
It’s easy to focus exclusively on what’s not working. After all, that’s literally your job in an audit. But remember: every interface, good or bad, is the result of real effort from real people. And often, the person you’re presenting to is one of them.
In plain words, be nice.
In the general appreciation section, save at least one bullet for what is working well. In the presentation, take 30 seconds to point out something positive. People are much more open to criticism when they’re not in defensive mode.
2. Accusatory Language
This ties directly to the previous point. Avoid words like terrible, bad, or anything that frames your observations as personal judgments.
Ground your observations in user behavior, best practices, heuristics and pattern recognition. Express confidence but avoid absolute certainty. Explain your reasoning. Show your thought process.
Here is an example:
Judgmental Observation: “The current checkout flow is poorly designed. It is not at all optimized and users will get confused because the form is badly structured and the buttons are in the wrong place. The form must be redesigned according to best practices.”
Neutral Observation: “Session Replay review showed that several users hesitated before completing the form, often pausing or re‑filling the fields. This suggests that the current layout may not clearly signal the expected input hierarchy. To improve form completion rate, consider placing the Submit button closer to the form and grouping related fields.”
3. The Cookie-Cutter Approach
Early in my UX Auditing career, I assumed all projects would follow the same structure. I was very wrong 🙂. Even interfaces we had already worked on behaved unpredictably in terms of audit duration. Trying to apply the same recipe every time will lead to bland, unhelpful audits.
Instead, embrace the particularities of each project and adjust your expectations accordingly. It's good practice to add a buffer to your time estimates, as this allows room for the unexpected. This gets easier as you gain experience and confidence.
4. Not taking into account previous audits
One of the worst feelings is recommending something that has already been tried and failed. Even worse if your own team recommended it to begin with.
Be mindful of contradictions and get familiar with the history of your interface before proposing additional changes.
5. Inconsistent Audits (when >1 Auditor)
Large projects are often shared between multiple researchers. The most common approach is to divide pages or flows between two auditors. I’ve done this many times with mixed success.
When more than one auditor is working on the same project, there are a few risks you need to be aware of. Different people naturally bring different writing styles, different prioritization logic, and different interpretations to the table. In some cases, they may even arrive at contradictory conclusions. All of this can easily make the final audit feel inconsistent or confusing for the stakeholder.
To avoid that kind of chaos, alignment is everything. Before starting, make sure both auditors agree on priorities and scope, so you’re working toward the same objective. Take the time to understand each other’s writing styles and find a comfortable middle ground that feels cohesive. Don’t shy away from openly discussing disagreements, as those conversations usually lead to better insights anyway. And, importantly, both auditors should review the full audit before delivering it.
A shared final pass helps ensure consistency, accuracy, and a unified voice.
6. Not leveraging AI, or relying on it too much
This topic is still a bit taboo in many teams. The reality is that AI is neither the enemy nor the magic solution to everything. It’s a tool (a powerful one) but only as effective as the person using it.
In practice, you’ll usually find two extremes:
On one side, there’s the AI‑Avoider: the person who believes AI has no place in research. They insist on doing every single task manually, which often means they take significantly longer, burn out more easily, and spend hours on repetitive work that AI could handle in minutes.
On the opposite side, you’ll find the AI‑Evangelist: someone who’s convinced that a database, a few screenshots, and a clever prompt can automate the entire audit. They stop thinking critically, rely on AI as a crutch, lose accountability for the quality of their insights, and end up delivering shallow, ungrounded work.
As with most things, the truth lives somewhere in the middle!
AI truly shines at processing large amounts of data quickly, categorizing user‑generated comments, summarizing patterns, and even helping refine wording. I use it myself for clarity, since English isn’t my native language. It can dramatically speed up tedious or repetitive tasks and free up time for the parts of the audit that require actual expertise.
But there are limits. AI cannot think strategically, understand nuance, make judgment calls, or take responsibility for the recommendations in your audit. That part is entirely on you. As the 1979 IBM Training Manual wisely put it:
“A computer can never be held accountable, therefore a computer must never make a management decision.”
Use AI mindfully, not as a replacement for your thinking, but as a way to enhance both the quality and the efficiency of your work. No one is amazing at everything. Let AI fill in the gaps, not define the outcome.
Conclusion
Logically, I can’t share any of the UX Audits I’ve worked on in the past (NDAs are NDAs), but there are plenty of great examples online if you want inspiration or a sense of structure.
This turned into a longer article than I expected (classic UX‑nerd energy 😅), but I really hope you found value in it. If you have questions, want to share your own experiences, or just want to chat about UX, audits, or research in general, feel free to reach out. I’m always happy to connect.
Thanks for reading and happy auditing!
Last but not least, Resources and Goodies
- UX Audit — How to Conduct One (Steps, Templates & Checklist)
- A Complete Guide to UX Audit: Templates + Checklist | UXtweak
- The Ultimate UX Audit Guide (+ Free Checklist)
- UI UX Audit Process — How to Stop Skipping Them [Free UX Audit Checklist]
- UI UX Audit & Analysis Tutorial | GYMSHARK Case Study Included | 2022
- Ecommerce UX Research & Insights Platform — Baymard
- Conversion Design Optimization Experts (CRO through Design) | ConversionWise
- Nielsen Norman Group: UX Training, Consulting, & Research — NN/G
- https://cxl.com/
- Conversion Rate Optimization and Landing Page Course | Udemy
- The complete Conversion Rate Optimization course | Udemy
- Free User Experience (UX) Design Tutorial — UX Audit Course: Conduct Heuristic Evaluation — Part 1 | Udemy
- Free User Experience (UX) Design Tutorial — UX Methods Fundamentals | Udemy
- Online Course — Data Visualization and Information Design: Create a Visual Model (Federica Fragapane) | Domestika
- Optimization by Oliver Youtube Series
Everything I know about running UX Audits was originally published in UX Planet on Medium, where people are continuing the conversation by highlighting and responding to this story.