If Your Portfolio Was Made With AI Tools for UX Design, Show Me One Decision You Made


A client – B2B SaaS, 34 employees, Series A, scaling fast – needed a contractor for a 3-month engagement. Dashboard redesign, onboarding flow, design system foundations. Senior-level work. $8,500/month.

They asked me to help evaluate candidates. I look at portfolios, sit in on one call, give a recommendation. Not a formal arrangement – just something I do for clients I’ve worked with before.

Three applicants. The third one stopped me mid-scroll.

Eleven case studies. Every screen pixel-perfect. Layouts that felt like they’d been pulled from the best of Dribbble and then somehow refined further. Consistent visual language across wildly different industries – fintech, healthtech, a logistics platform – all equally polished.

“This person is very good,” I told my client. “Let’s talk to them.”

The call was 40 minutes. It took 20 to unravel.


The Moment It Broke

I asked a question I ask every designer I evaluate: “Walk me through one decision in this case study that you pushed back on.”

Not “walk me through your process.” Not “what tools do you use.” Specifically: one moment where someone wanted something, you disagreed, and you had a reason.

Silence. Then: “The client was pretty aligned with my direction throughout.”

Eleven case studies. Every client perfectly aligned. No pushback, no constraints, no “engineering said this would take six weeks so we simplified.” Just clean outcomes and satisfied stakeholders, wall to wall.

I asked about the logistics dashboard – the most complex-looking one. “What was the hardest flow to solve here?”

“The data hierarchy. I used Galileo to generate several layout options and we refined from there.”

We. Meaning Galileo and him.

“How did you decide which option to go with?”

“The client preferred this one.”

Not: “We tested it with three users and drop-off on the settings tab dropped 40%.” Not: “Engineering flagged the nested table approach so we went with cards instead.” Just: the client preferred it.

I have nothing against using AI tools for UX design in a workflow. I use them. The problem wasn’t the tools.


What the Math Told Me

My client needed someone for 3 months at $8,500/month. That’s $25,500.

They also needed the work to survive handoff to a 4-person engineering team, integrate with an existing component library, and hold up under real user behaviour – not just look good in a Figma presentation.

Here’s what the AI-portfolio problem actually costs when you hire the wrong person for that engagement:

Week 3: engineering flags that 6 of the 14 new components don’t match the existing design system. The lead engineer estimates 40 hours of rework. At their blended rate of $95/hour, that’s $3,800 before the project is halfway through.

Week 5: first usability session with 6 users. Four of them can’t find the secondary navigation – the most visually refined element in the whole design, and completely invisible to anyone who hadn’t built it.

Week 7: the client asks why the new onboarding flow has 11 steps when the old one had 6. The designer’s answer is “it felt more complete.” That is not an answer. That is a sentence shaped like an answer.

Month 2: the designer is still producing beautiful screens. The engineering team has stopped asking questions about them because the answers aren’t useful.

Month 3: the engagement ends. Engineering spends the next 6 weeks rebuilding approximately 30% of what was designed – the parts that looked right but didn’t account for how the product actually worked.

My client estimated the rework cost at around $6,800 in engineering time. That’s conservative – it doesn’t count the 6 weeks of post-engagement cleanup, which at roughly 240 engineering hours × $95/hour runs closer to $22,800.

The full damage: $25,500 for the original contract. $6,800 in mid-project rework. $22,800 in post-project engineering cleanup. Another $8,500 for a second contractor to finish what the first one couldn’t.

Total: $63,600 and five months for work scoped at $25,500 and three months.
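If you want to sanity-check the total, the arithmetic is a few lines. All figures are the article’s estimates, not invoiced amounts:

```python
# Back-of-envelope cost of the failed engagement, using the
# article's own estimates (not invoiced amounts).
original_contract = 3 * 8_500      # 3 months at $8,500/month = $25,500
mid_project_rework = 6_800         # engineering rework flagged mid-project
cleanup = 240 * 95                 # ~6 weeks of cleanup at $95/hour = $22,800
second_contractor = 8_500          # one more month to finish the work

total = original_contract + mid_project_rework + cleanup + second_contractor
print(total)  # 63600
```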

The designer’s rate was not the problem. The portfolio showed outputs, not thinking. That’s a different kind of expensive.


Why This Keeps Happening

AI tools for UX design have made it genuinely easy to produce work that looks like senior output. Galileo, Uizard, Relume, Midjourney for moodboards, ChatGPT for flows – a motivated junior designer with $200/month in subscriptions and 6 months of practice can generate a portfolio that visually competes with someone who’s shipped 40 real products.

The gap between “looks senior” and “is senior” has never been cheaper to fake. That’s not an accusation – it’s just what the tools enable.

The issue is that a portfolio built primarily with AI tools for UX design hides the one thing that actually matters in a contractor: judgment under constraint.

Judgment under constraint looks like this:

“We had 3 weeks, not 6, so I cut the research phase to 4 interviews instead of 12 and compensated by doing live observation sessions during onboarding.” That’s a decision. That’s judgment.

“The component library had 3 button variants. I wanted 5 but engineering would have needed 2 extra weeks, so I made 3 work.” That’s a constraint. That’s real product design.

“Users kept abandoning the form on step 4. We found out in testing it was the address field autocomplete breaking on mobile. I redesigned around it.” That’s friction. That’s how real UX work actually goes.

None of this shows up in an AI-generated portfolio because AI tools don’t have constraints. They have infinite iterations, zero engineering pushback, and no users who do unexpected things.


What I Ask Now

I still help clients evaluate designers occasionally. My question list is shorter than it used to be. I don’t ask about tools, process, or methodology.

I ask three things:

“What’s the worst version of this project you considered?”

Good designers remember the bad options. They remember them specifically, because they spent time with them before ruling them out. If someone can’t describe a direction they rejected and why, they didn’t make decisions – they generated outputs and picked the best-looking one.

“What did engineering push back on?”

Every real project has this moment. A designer who’s shipped actual products can tell you exactly what engineering pushed back on, why, and how it changed the design. A designer who’s worked primarily with AI tools for UX design has never had this conversation because the tools don’t push back.

“What would you do differently?”

Not “what went well.” What would you change. Real project work leaves regrets – flows that should have been simpler, research that happened too late, components that created problems six months after launch. If someone has no regrets across eleven case studies, they either have amnesia or the case studies aren’t real.


The Actual Problem With AI-Generated Portfolios

It’s not that they use AI. I’ve recommended designers who use AI tools heavily and do excellent work – because the tools are in service of decisions they’re already capable of making. The tool is fast. The thinking behind it is still theirs.

The problem is when the portfolio is the entire evidence of capability. When there’s no story underneath the screens. When every project ended perfectly, every client was delighted, and every design decision was apparently obvious in retrospect.

Real product design is not like that. It’s full of moments where the right answer required understanding the business, the users, and the engineering reality simultaneously – and none of those three were cooperating.

An AI tool can generate a beautiful screen. It cannot generate that understanding. And a portfolio that doesn’t show that understanding – however beautiful – is not a portfolio of design work. It’s a portfolio of outputs.

There’s a difference. It costs $63,600 to learn it the hard way.

I don’t recommend those designers anymore.

Even when the screens are genuinely very good. Which they often are.

__
DNSK WORK
Design studio for digital products
https://dnsk.work