I can't write code. I want to be clear about that upfront. I don't mean I'm a beginner who's being modest, or that I know a bit of Python but nothing fancy. I mean I can't write code. I don't understand syntax. I couldn't look at a function and tell you what it does. If you showed me a CSS file and asked me to change the background colour, I would probably break something. I'm, by any reasonable definition, a non-coder who builds products.

That sentence used to be a contradiction. Now it's a description of my daily life. Because AI changed the equation entirely. Not in the vague, hype-cycle way that people mean when they say "AI is changing everything." In a specific, concrete, personal way: AI does the making. I do the judging. And it turns out that the judging - the taste, the direction, the knowing what's right - is the harder part.

0 lines of code written · 100% taste-directed · 14 products live
The director, not the actor

Think about a film director. They don't operate the camera. They don't edit the footage. They don't compose the score or build the sets or apply the makeup. But they're responsible for the film. Every frame reflects their vision, their taste, their judgement about what works and what doesn't. The director's job isn't to do each thing. It's to know what each thing should be.

That's my relationship with AI. I'm the director. Claude Code is the crew. I describe what I want - not in technical terms, because I don't have those, but in taste terms. I want this to feel like Monocle's editorial layout. I want the spacing to feel generous without being empty. I want the typography to have weight without heaviness. I want the colour palette to feel warm but not soft. These aren't engineering specifications. They're taste specifications. And they're the only input I have.

This is both the power and the terror of building with AI as a non-coder. The power is that taste becomes sufficient. You don't need to know how to implement a gradient border or a scroll-triggered animation or a responsive grid layout. You need to know whether the result looks right. You need to know when something is off by three pixels. You need to know when a colour is too saturated or a font weight is too heavy or a transition is too fast. These are taste decisions, not technical decisions. And taste is the one thing I've spent my entire career developing.

The terror is that there's nowhere to hide. When a developer builds something and it looks bad, there's plausible deniability. "I'm a backend person." "Design isn't my strength." "I'll clean up the UI later." When a non-coder builds with AI, the output is a pure expression of their direction. Every aesthetic choice was deliberate. Every design decision was mine. If it looks bad, it's because my taste failed, not because my coding skills were lacking. That exposure is more vulnerable than writing code ever could be.

When AI does the making, there is nowhere to hide. Every design decision is yours. If the output is bad, it is because your taste failed - and that is more exposing than any technical limitation.

What taste actually means in practice

People talk about taste like it's an abstract quality - something you either have or you don't, like perfect pitch or the ability to roll your tongue. But in practice, taste is a series of very specific decisions made under pressure. It's looking at two almost identical versions of a page and knowing which one is right. It's saying "the padding needs to be 24px not 20px" without being able to explain why, except that the 20px version feels cramped and the 24px version feels considered. It's rejecting a perfectly functional layout because something about it feels generic, even though you can't articulate what.

When I built Modern Retro, the AI-generated images of modern brands as 1970s stores, taste was literally the entire product. There was no functionality to speak of - it was images on a page. The product was the aesthetic judgement. Which brands to choose. What the prompts should describe. Which generated images to keep and which to regenerate. How to present them. What the surrounding design should feel like. Every single decision was a taste decision. And every single decision was visible.

I think about the Japanese concept of kodawari - an uncompromising devotion to pursuing perfection in a craft. The sushi chef who selects fish at the market at 4am. The knife maker who hand-finishes each blade. The carpenter who spends days on a single joint. This is taste operating at its highest level - not as an opinion about what's good, but as an obsessive commitment to getting every detail exactly right. When AI handles the execution, kodawari is all you have left. Your obsessiveness is the product.

Taste isn't abstract. It's saying "24px not 20px" without knowing why. It's rejecting the functional version because it feels generic. It's the only input when AI does the rest.

The democratisation problem

Here's the thing nobody in the AI discourse wants to say out loud: if anyone can build a product with AI, then the products themselves are no longer the differentiator. The tool is accessible to everyone. The execution speed is approximately the same for everyone. The technical barrier has been removed for everyone. What remains as the differentiator is the thing that was always there but was previously obscured by technical ability: taste.

This is, I think, the most profound shift that AI tools have created. Not the democratisation of building - that's just the mechanism. But the revelation that building was never the hard part. Knowing what to build was the hard part. Knowing how it should look was the hard part. Knowing what it should feel like was the hard part. The code was always just the translation layer between vision and reality. Now that the translation is automated, the vision stands exposed.

Trove and CultureTerminal were built with the same tools, by the same person, in the same months. But they feel completely different. Trove feels personal, introspective, like a private library. CultureTerminal feels fast, opinionated, like a newsroom. Those differences aren't technical. The underlying code is similar. The differences are pure taste - different aesthetic decisions driven by different product visions, each one reflecting a deliberate choice about what this particular thing should feel like. The AI didn't make those choices. I did. The AI just executed them.

The vulnerability of pure direction

I want to end with the uncomfortable part. Building with AI as a non-coder is more vulnerable than any other way of building, because there's no excuse layer. There's no "I would have designed it better but I couldn't figure out the CSS." There's no "the design is just a placeholder until I learn React." There's no "I'm a developer, not a designer." Every output is the result of direction, and direction is taste, and taste is identity. If someone looks at something I built and thinks it's ugly or generic or poorly considered, they aren't criticising my technical skills. They're criticising my judgement. My eye. The thing I've spent twenty years developing. The thing I believe is my only real competitive advantage.

That vulnerability is why I think non-coders building with AI is actually a braver act than it appears from the outside. It looks easy - just tell the AI what you want and it builds it. But "what you want" is the whole game. "What you want" is every aesthetic principle you've ever absorbed, every design reference you've ever stored, every opinion you've ever formed about what good looks like. "What you want" is your taste, laid bare, with no technical complexity to hide behind.

Zero lines of code written. One hundred percent taste-directed. Fourteen products live. That isn't a shortcut. That's the hardest test of taste I've ever taken.