Got an email Tuesday morning.
“Hey, can you do a quick design audit? We think our dashboard needs a refresh. Shouldn’t take long — just want to check if anything looks off. Can you turn it around by Friday?”
Friday was four days away.
“Quick design audit” is the second-most optimistic phrase in product development. (First place: “just a small change.”)
Here’s what actually happened: Three weeks. 847 screens. €18,000. Findings document: 47 pages. Things that “looked off”: everything.
This wasn’t unusual. This is what happens when teams ask for a “quick audit” without understanding what design audits actually involve.
Let me correct some misconceptions.
Misconception #1: “Just Check Our Spacing”
What they say: “Can you do a quick design audit? Just check spacing, alignment, maybe button consistency. Visual stuff.”
What they actually mean: “Can you do QA?”
What they think they’re asking for: A 30-minute scan where I point out padding inconsistencies and maybe suggest a new shade of blue.
What they’re actually asking for: A comprehensive UX audit examining whether their interface works for humans.
I had a client ask for “spacing audit only” last year. Found the real problem in 5 minutes:
Their entire onboarding assumed users understood industry jargon. Navigation used internal company abbreviations. Primary CTA: “Initiate Protocol Sequence.”
(Nobody knows what that means.)
Spacing was fine. Messaging was incomprehensible.
“Quick spacing check” would’ve missed the actual problem: users couldn’t figure out what anything did.
Reality: Spacing is a symptom, not the disease. Most visual inconsistencies point to deeper structural problems.
Misconception #2: “Make It Look Nicer”
What they say: “We need an audit. Product works fine, just doesn’t look modern enough.”
What they actually mean: “Fix our conversion rate without admitting conversion’s the problem.”
What they think audits do: Polish. Modernization. Making things “pop.” (Whatever that means.)
What audits actually do: Diagnose why users are confused, frustrated, or leaving.
Had a SaaS company ask me to “make their dashboard look nicer.” Conversion from trial to paid: 18%.
Spent three days in their product. Dashboard looked fine. Modern. Clean. On-trend.
Problem: Dashboard showed 23 metrics simultaneously. No hierarchy. No guidance. Users spent the first week trying to understand what they were looking at instead of getting value.
Made it “look nicer”? No. Removed 17 metrics. Surfaced the 3 that mattered. Added empty states explaining what each metric meant.
Trial-to-paid conversion: 34% within 6 weeks.
Misconception #3: “Tell Us What’s Broken”
What they say: “Just tell us what’s broken. We’ll fix it.”
What they actually mean: “Give us a list we can hand to developers.”
What actually happens: Everything’s broken. Question is: what matters?
I audited a fintech product last year. Found 127 distinct UX problems across onboarding, dashboard, and settings.
Client: “Great! Send the list.”
Me: “Which problems are you solving first?”
Them: “All of them.”
No. You’re not fixing 127 things. You’re fixing the 5 things killing activation.
The actual problems:
- KYC verification buried in settings (users couldn’t find it)
- Payment flow had no error states (failed transactions looked successful)
- Empty states said “No data” (users thought product broken)
- Navigation used 14 different labels for the same section
- Onboarding skipped critical setup (users hit error on first action)
Fixed those 5. Activation went from 41% to 68% in 8 weeks.
Reality: Audits aren’t to-do lists. They’re triage. Prioritizing what matters vs. what annoys designers.
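If it helps to see the triage logic as something concrete, here’s an illustrative sketch in TypeScript. The fields, weights, and scoring formula are my invention for illustration, not a standard audit methodology: rank findings by how many users they hit and how badly, then fix the top handful.

```ts
// Illustrative triage: rank findings by user impact, not by count.
// Fields and weighting are hypothetical, not a standard methodology.
type Finding = {
  title: string;
  usersAffectedPct: number; // share of users who hit this issue (0-100)
  severity: 1 | 2 | 3;      // 3 = blocks a core task, 1 = cosmetic
};

const findings: Finding[] = [
  { title: "KYC verification buried in settings", usersAffectedPct: 90, severity: 3 },
  { title: "Payment flow has no error states", usersAffectedPct: 15, severity: 3 },
  { title: "Inconsistent button padding", usersAffectedPct: 100, severity: 1 },
];

// Severity is squared so a task-blocker outranks a widespread annoyance.
const score = (f: Finding) => f.usersAffectedPct * f.severity ** 2;

const top5 = [...findings].sort((a, b) => score(b) - score(a)).slice(0, 5);
top5.forEach((f, i) => console.log(`${i + 1}. ${f.title} (score: ${score(f)})`));
```

The padding issue touches every user and still lands last. That’s the point of triage.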
Misconception #4: “Can’t We Just Run Lighthouse?”
What they say: “Why do we need an audit? We can just run automated tools.”
What they actually mean: “This sounds expensive and we’d rather not pay for it.”
What automated tools check: Contrast ratios, alt text, page load speed, semantic HTML, and ARIA labels.
What automated tools miss: Your signup flow loses 60% of users at step 3. Your navigation makes no sense outside your company. Your dashboard takes 5 minutes to understand. Your CTAs use jargon nobody recognizes.
Audited a B2B SaaS tool last month. Lighthouse score: 94. Accessibility: perfect. Performance: excellent.
Actual user experience: users couldn’t figure out how to add their first project. Feature was there. Just buried under “Advanced Configuration” in settings.
Support tickets: 40% were “How do I get started?”
Moved “Add Project” to empty state. Support tickets dropped to 12%.
Lighthouse never would’ve caught this.
Reality: Automated tools check technical compliance. Design audits check whether humans can actually use your product.
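To be clear, the automated baseline is worth having; it’s just not an audit. Here’s a minimal sketch of a programmatic Lighthouse run, assuming Node 18+ in an ESM project with the lighthouse and chrome-launcher packages installed (the URL is a placeholder):

```ts
// Minimal sketch: run Lighthouse programmatically and print category scores.
// Assumes `npm install lighthouse chrome-launcher`.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

const url = "https://example.com/dashboard"; // placeholder

const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
const result = await lighthouse(url, {
  port: chrome.port,
  onlyCategories: ["performance", "accessibility"],
});

// Scores come back as 0-1; null means the category couldn't be computed.
if (result) {
  for (const [name, category] of Object.entries(result.lhr.categories)) {
    console.log(`${name}: ${Math.round((category.score ?? 0) * 100)}`);
  }
}

await chrome.kill();
```

A 94 from this tells you markup and load times pass. It can’t tell you that “Add Project” is buried under “Advanced Configuration.”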
Misconception #5: “Can You Do It By Thursday?”
What they say: “We need this fast. Can you turn around an audit by end of week?”
What they actually mean: “We’re having a board meeting and need to look like we’re doing something.”
What they think audits take: A few hours. Maybe a day if it’s thorough.
What audits actually take: Three weeks. Minimum.
Here’s what goes into proper auditing:
Week 1: Understanding (32 hours). Map every user flow, test every path, document where flows break, screenshot every state, review navigation patterns, check mobile vs. desktop, test accessibility with screen readers.
Week 2: Analysis (28 hours). Heuristic evaluation against usability principles, visual hierarchy audit, copy audit, consistency review, interaction patterns, competitive benchmarking.
Week 3: Synthesis (24 hours). Prioritize findings by impact, document each issue with screenshots, recommend solutions (not just “fix this”), estimate implementation effort, create actionable roadmap.
Total: 84 hours over 3 weeks
This doesn’t include meetings, client review, or follow-up questions.
“By Thursday” means you get a vibe check, not proper analysis.
Reality: Good audits take time because products are complex and problems are layered.
What Audits Actually Diagnose
After 47 audits, here’s what they actually find:
Structural problems: Navigation that makes sense internally but confuses users, features hidden where nobody finds them, onboarding that teaches product not value, dashboards with no hierarchy.
Message problems: Copy that sounds smart internally but means nothing to users, CTAs using jargon instead of verbs, error messages that blame users, empty states that say nothing helpful.
Flow problems: Signup processes that lose users at preventable points, workflows requiring 14 steps when 4 would work, features requiring documentation to understand, settings buried where nobody finds them.
Trust problems: Buttons that look clickable but aren’t, loading states that feel like errors, success confirmations that look like failures, designs that feel unfinished or broken.
These aren’t cosmetic problems. These are “users can’t complete tasks” problems.
Real Example: The Dashboard Nobody Understood
Client asked for “quick audit” of their analytics dashboard.
What they wanted: Visual polish. Modern look. Better colors.
What I found: 6 stakeholders had added features over 18 months. Nobody owned overall vision. Result: 28 metrics simultaneously with zero hierarchy.
New users spent an average of 11 minutes on first login trying to understand what they were looking at.
The actual problems:
- No onboarding explanation
- Metrics displayed in the order they were built, not importance
- 12 “nice to have” metrics nobody used
- 4 critical metrics buried in the bottom-right
- Empty states said “No data yet!” (users thought the product was broken)
- Every metric used different date ranges
What I recommended:
- Remove 20 metrics from the default view
- Surface the 4 metrics that mattered
- Add progressive disclosure for advanced metrics
- Explain each metric on first view
- Standardize date ranges
- Add a “Getting Started” flow for first-time users
Results:
- Time-to-first-action: 11 minutes → 90 seconds
- Feature adoption: 23% → 67%
- Support tickets about “dashboard confusion”: 34% → 8%
This wasn’t visual polish. This was product strategy work disguised as an audit.
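To make one of those fixes concrete, here’s a hypothetical sketch (TypeScript/React, not the client’s actual code) of the empty-state pattern: explain the metric, then hand users the next step instead of “No data yet!”

```tsx
// Hypothetical empty state (React + TypeScript): explain, then point forward.
// Component and prop names are invented for illustration.
type EmptyMetricProps = {
  metricName: string;    // e.g. "Activation rate"
  explanation: string;   // what the metric means, in plain language
  onConnect: () => void; // kicks off whatever produces data for this metric
};

function EmptyMetric({ metricName, explanation, onConnect }: EmptyMetricProps) {
  return (
    <section aria-label={`${metricName} (no data yet)`}>
      <h3>{metricName}</h3>
      {/* Say what the metric means, not just that it's empty. */}
      <p>{explanation}</p>
      {/* Offer the next step instead of a dead end. */}
      <button onClick={onConnect}>Connect a data source</button>
    </section>
  );
}
```

“No data yet!” reads as a bug. A plain-language sentence plus a button reads as a starting point.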
Why Audits Take Three Weeks (Not Three Days)
Because surface problems hide structural problems.
Day 1 discovery: Buttons use 8 different sizes
Day 3 discovery: No design system documentation
Day 7 discovery: Designers and developers never aligned on sizing scale
Day 12 discovery: Different teams built different features with no coordination
Day 18 discovery: No product owner making final decisions on consistency
Button sizes aren’t the problem. Lack of product leadership is the problem.
Can’t find that in three days.
Same pattern every audit:
- Week 1: Document symptoms
- Week 2: Find root causes
- Week 3: Recommend solutions that address causes, not symptoms
Skip weeks 2-3, you get cosmetic fixes that don’t solve anything.
What You’re Actually Asking For
1. If you want spacing and alignment checked: hire QA, not a designer. Cost: €500. Timeline: 2 days.
2. If you want a visual refresh: hire a visual designer. Cost: €3K-8K. Timeline: 2-4 weeks.
3. If you want to know why users are confused: that’s a design audit. Cost: €15K-25K. Timeline: 3-4 weeks.
4. If you want comprehensive product UX strategy: that’s product design consulting. Cost: €40K+. Timeline: 8-12 weeks.
“Quick design audit” tries to get #3 for the price of #1 and timeline of #2.
Doesn’t work.
When You Actually Need a Design Audit
You need one when:
- Users consistently get stuck at the same points
- Support tickets ask “how do I…” for features that exist
- New users take days to understand your product
- Feature adoption is significantly lower than expected
- Conversion is dropping but you don’t know why
- Multiple teams built features with no coordination
You don’t need one when:
- The product isn’t live yet (you need product design, not an audit)
- You already know what’s broken (just fix it)
- You want validation that everything’s fine (it’s not, but an audit won’t help if you won’t act)
- You’re looking for cosmetic polish (hire a visual designer)
- The timeline is “by Thursday” (not enough time)
The Follow-Through Problem
Here’s what usually happens after audits:
Week 1: Present findings. Team excited. “This explains everything!”
Week 2: Leadership reviews recommendations. “These all make sense.”
Week 3: Engineering estimates implementation. “6 weeks minimum.”
Week 4: Product reprioritizes. “Let’s tackle the quick wins first.”
Week 8: Quick wins shipped. 3 of 47 recommendations implemented.
Month 6: Team asks: “Can we do another audit?”
You paid €18K for 47 recommendations. Implemented 3. Now want to pay €18K for 47 more.
Reality: Audits only work if you actually implement findings. Otherwise you’re collecting expensive PDFs that die in Notion.
Most valuable audits I’ve done: client implemented 80%+ within 3 months.
Least valuable: client framed findings document, hung it on wall, changed nothing.
Final Thought: What Audits Actually Diagnose
Audits don’t find cosmetic problems.
They find systemic problems that manifest as cosmetic symptoms.
Your button sizes are inconsistent because you have no design system.
You have no design system because no one owns design consistency.
No one owns it because product grew faster than process.
Process broke because six teams built features independently.
Six teams built independently because no one’s making final decisions.
The “quick spacing audit” reveals organizational dysfunction.
That’s why it takes three weeks, costs €18K, and can’t be done by Thursday.
If you want spacing checked, run Lighthouse.
If you want to understand why your product confuses users, frustrates designers, and ships broken features, that’s what design audits actually do.
Just don’t call it “quick.”
