Spotify knows I like house music. It knows I listen to Floating Points and Four Tet and Caribou. It knows I play ambient electronic music when I'm working and harder stuff when I'm not. Based on this knowledge, it serves me an endless stream of music that sounds exactly like the music I already listen to. Technically, it's doing its job perfectly. Practically, it's trapping me in a sonic echo chamber where nothing surprising ever happens.

This is the fundamental problem with recommendation engines. They're mirrors, not windows. They reflect your existing taste back at you, refined and polished and endlessly reinforced. What they don't do — what they can't do — is show you the thing that will change your mind about everything.

The surprise that changes everything

I can trace nearly every significant shift in my taste to an accident. A surprise. Something I didn't ask for and wouldn't have chosen.

When I was sixteen, a friend's older brother left a copy of The Face on the kitchen table. I'd never seen anything like it. The typography, the photography, the way it mixed fashion with music with politics with art. It wasn't aimed at me. I wasn't the target audience. But it cracked open a door to a world I didn't know existed, and I walked through it.

In my twenties, a record shop owner in Nottingham handed me an album I hadn't asked about. "You'll like this," he said, based on nothing more than a five-minute conversation about what I'd been listening to. He was right. That album led me to a label, which led me to a genre, which fundamentally reshaped my musical taste for the next decade.

An algorithm can predict what you'll click. It can't predict what will change how you see the world.

In my thirties, a colleague sent me a link to a design blog I'd never heard of. The aesthetic was nothing like what I'd been drawn to. Angular where I liked curves. Minimal where I liked maximalism. I found it uncomfortable at first, which is precisely why it was important. It pushed against my instincts, and in doing so, it expanded them.

None of these moments would have happened inside an algorithm. Spotify wouldn't have played me that album. Netflix wouldn't surface the film that unsettles you; Amazon wouldn't suggest the book you're not the audience for. Because algorithms don't deal in surprises. They deal in predictions. And predictions, by definition, are based on what has already happened.

The engagement trap

Here's the uncomfortable truth about recommendation engines: they're not designed to improve your taste. They're designed to maximise your engagement. Those are very different objectives.

Improving your taste means exposing you to things that are challenging, unfamiliar, and potentially uncomfortable. It means risk. It means the possibility that you'll encounter something you don't like and disengage. No product manager in Silicon Valley is optimising for that.

Maximising engagement means giving you what's comfortable. What you'll click on. What you'll finish. What will keep you on the platform for another ten minutes. It means safety. It means the warm bath of familiarity. It means your Discover Weekly is full of songs that sound slightly different from each other but fundamentally identical.

The result is a kind of taste atrophy. Your preferences become more refined but also more narrow. You know exactly what you like, and you like exactly what you know. The muscle that used to respond to surprise — the thrill of encountering something genuinely new — weakens from disuse.
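This narrowing has a textbook analogue in the explore/exploit trade-off. Here's a deliberately toy sketch (hypothetical clusters and artist names, not any real platform's system): a recommender that only exploits your known taste never leaves the starting cluster, while even a small exploration rate eventually surfaces the whole catalogue.

```python
import random

# Toy catalogue: items grouped by cluster. "house" is the listener's known taste.
CATALOGUE = {
    "house": ["Floating Points", "Four Tet", "Caribou"],
    "jazz": ["Alice Coltrane", "Pharoah Sanders"],
    "folk": ["Vashti Bunyan", "Nick Drake"],
}

def recommend(history, epsilon, rng):
    """Pick a cluster: exploit the most-played one, or explore at rate epsilon."""
    if rng.random() < epsilon or not history:
        cluster = rng.choice(list(CATALOGUE))           # explore: any cluster
    else:
        cluster = max(set(history), key=history.count)  # exploit: the favourite
    return cluster, rng.choice(CATALOGUE[cluster])

def simulate(epsilon, rounds=1000, seed=0):
    rng = random.Random(seed)
    history = ["house"]  # the platform already knows one preference
    for _ in range(rounds):
        cluster, _item = recommend(history, epsilon, rng)
        history.append(cluster)
    return len(set(history))  # how many clusters the listener ever encountered

# Pure exploitation: the feed never leaves the starting cluster.
print(simulate(epsilon=0.0))  # 1 cluster
# A little exploration: the whole catalogue eventually appears.
print(simulate(epsilon=0.1))  # 3 clusters
```

The numbers don't matter; the shape does. Exploration has to be an explicit parameter, and a system tuned purely for clicks has every incentive to set it as close to zero as it can get away with.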

What got lost

The internet used to be full of surprise. The early web was a series of rabbit holes. You'd click a link on someone's blog, which led to a forum post, which led to a personal website, which led to a recommendation you'd never have found through any formal channel. The infrastructure was designed for wandering, not consuming.

Even early social media had this quality. Twitter's chronological timeline was a stream of consciousness from people you'd chosen to follow, and their retweets introduced you to voices and ideas outside your immediate circle. Tumblr was an engine of unexpected aesthetic discovery. You'd reblog something beautiful from someone you'd never heard of, and your dashboard would gradually shift in directions you couldn't have predicted.

All of that has been replaced by the feed. The algorithmic, engagement-optimised, personalised feed. It knows what you like. It gives you more of it. And slowly, imperceptibly, it turns the vast, chaotic, surprising internet into a very expensive mirror.

The deliberate cure

If algorithms won't challenge your taste, you have to do it yourself. And that requires deliberate effort — the kind of effort that feels unnatural in an age of frictionless consumption.

Browse a physical bookshop. Not the bestseller table. The back shelves. The sections you'd normally walk past. Pick up the book with the cover that intrigues you and the subject that doesn't. This is how I found half the books on my shelves, and they're the ones that shaped me most.

The growth rule: Once a week, deliberately consume something outside your usual taste. A genre you don't listen to. A publication you've never read. A creator whose aesthetic is nothing like yours. Taste grows through friction, not comfort.

Follow someone online whose taste is nothing like yours. Not someone you disagree with politically — that's a different kind of discomfort. Someone whose aesthetic references, cultural touchpoints, and creative instincts come from a completely different world. Let their perspective sit alongside yours without trying to reconcile them.

Click the link you'd normally skip. Read the article that doesn't match your usual diet. Watch the film that's not in your genre. Listen to the album that sounds wrong at first. "Wrong at first" is often where growth lives.

Why this matters for building

I think about this constantly as a builder. Every product I make is a reflection of my taste — my references, my instincts, my aesthetic sensibility. If my inputs are narrow, my outputs will be narrow. If I only consume things that confirm what I already believe about design, my designs will be predictable.

The best products I've built came from surprising inputs. Modern Retro exists because I combined two things that shouldn't go together — 1970s retail aesthetics and contemporary brands. That combination didn't come from an algorithm. It came from the kind of lateral thinking that only happens when your inputs are wide and weird and occasionally uncomfortable.

Taste requires friction. It requires the discomfort of encountering something that doesn't fit your existing framework and having to expand the framework to accommodate it. Algorithms remove friction. They smooth everything into a seamless, personalised, perfectly comfortable experience where nothing challenges you and nothing changes.

That's not discovery. That's stagnation with better UX.

The algorithm knows what you liked yesterday. A good bookshop owner, a friend with different taste, a random link on a weird blog — they know what you might love tomorrow. And that difference is everything.