Every morning at 7am, an AI reads 50+ culture feeds, identifies the five signals that matter, writes a briefing, records it in a cloned voice, and publishes it as a podcast. I built it in a weekend. It has been running autonomously for months.

No team. No editor. No production schedule. No recording studio. One person, a stack of free tools, and a very specific idea about what a daily culture briefing should sound like.

The project is called The Pattern. Its tagline is "Before it's obvious." And I am not writing about it because the technology is impressive, although it is. I am writing about it because The Pattern represents something bigger: proof that one person can now do what used to require a newsroom.

How it actually works

The pipeline is surprisingly simple once you break it down into steps.

Step 1: Feed ingestion. The Pattern consumes data from CultureTerminal, another project I built. CultureTerminal monitors 50+ RSS feeds from curated sources across fashion, design, technology, music, brands, and lifestyle. Every morning, it scores and ranks hundreds of articles using a custom relevance algorithm. The Pattern inherits all of that intelligence as its raw material.
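To make the scoring step concrete, here is a toy sketch of what a relevance algorithm like CultureTerminal's might look like. It is not the real code: every keyword, weight, and field name below is an assumption, chosen only to illustrate the shape of "score and rank by signal strength, source trust, and recency".

```python
# Toy relevance scoring: keyword hits, weighted by source trust, decayed by age.
# All weights, keywords, and field names are illustrative assumptions.

SIGNAL_WEIGHTS = {          # keywords that tend to mark cultural signals
    "collaboration": 3.0,
    "rebrand": 2.5,
    "launch": 2.0,
    "archive": 1.5,
}
SOURCE_WEIGHTS = {"dezeen.com": 1.2, "hypebeast.com": 1.0}  # curated trust levels

def score_article(article: dict) -> float:
    """Keyword hits, boosted by source trust, decayed by article age."""
    text = (article["title"] + " " + article["summary"]).lower()
    keyword_score = sum(w for kw, w in SIGNAL_WEIGHTS.items() if kw in text)
    source_boost = SOURCE_WEIGHTS.get(article["source"], 0.8)
    recency = 1.0 / (1.0 + article["age_hours"] / 24.0)  # newer ranks higher
    return keyword_score * source_boost * recency

articles = [
    {"title": "Nike x Jacquemus collaboration launch", "summary": "",
     "source": "hypebeast.com", "age_hours": 3},
    {"title": "A quiet rebrand", "summary": "studio archive notes",
     "source": "dezeen.com", "age_hours": 30},
]
ranked = sorted(articles, key=score_article, reverse=True)
```

The point of the sketch is the structure, not the numbers: the curation lives in which keywords and sources get weight at all, which is a taste decision, not an engineering one.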

Step 2: AI synthesis. Claude Haiku receives the day's top stories and a carefully written system prompt. The prompt is not generic. It has a specific editorial voice, a structure (Culture Pulse score, The Lead, Five Signals, The Pattern, One to Watch, Conversation Starters), and strict rules about tone and length. The AI does not just summarise. It synthesises, finding the connections between seemingly unrelated stories and pulling out the thread that ties them together.
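The synthesis step can be sketched roughly as below. The real system prompt is much longer and private; the section names come from this article, while the function names, model id, and prompt wording here are my illustrative assumptions.

```python
# Minimal sketch of the synthesis step. The real system prompt is private;
# the section names are from the briefing structure, everything else is assumed.
SYSTEM_PROMPT = """You are the editorial voice of The Pattern, a daily culture \
briefing. Do not summarise stories one by one. Synthesise: find the connections \
between seemingly unrelated stories and pull out the thread that ties them \
together. Output these sections, in order: Culture Pulse score, The Lead, \
Five Signals, The Pattern, One to Watch, Conversation Starters."""

def build_user_message(stories: list[dict]) -> str:
    """Flatten the day's ranked stories into one prompt block for the model."""
    lines = [f"- [{s['source']}] {s['title']}: {s['summary']}" for s in stories]
    return "Today's top stories:\n" + "\n".join(lines)

def synthesise(stories: list[dict]) -> str:
    """Call Claude Haiku. Requires `pip install anthropic` and ANTHROPIC_API_KEY."""
    import anthropic  # deferred so the prompt helpers work without the SDK
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    resp = client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=2048,
        system=SYSTEM_PROMPT,
        messages=[{"role": "user", "content": build_user_message(stories)}],
    )
    return resp.content[0].text
```

Everything that makes the output distinctive lives in the system prompt, which is exactly why the prompt, not the plumbing, carries the editorial voice.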

Step 3: Voice narration. ElevenLabs takes the written briefing and reads it aloud using a clone of my voice. I trained the model with about 30 minutes of recorded audio. The result is uncanny. People who know me cannot always tell whether they are listening to me or the AI. That is both thrilling and slightly unsettling.
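The narration step is a single call to ElevenLabs' text-to-speech endpoint. A hedged sketch, assuming the public REST API shape; the voice id comes from the cloned voice, and the voice_settings values here are illustrative, not the project's real configuration.

```python
import os

# Assumed endpoint shape from ElevenLabs' public REST API.
ELEVEN_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"

def tts_payload(briefing_text: str) -> dict:
    """Build the request body; settings values are illustrative defaults."""
    return {
        "text": briefing_text,
        "model_id": "eleven_multilingual_v2",
        "voice_settings": {"stability": 0.5, "similarity_boost": 0.75},
    }

def narrate(briefing_text: str, out_path: str = "briefing.mp3") -> None:
    """POST the briefing and save the returned MP3. Needs `pip install requests`
    plus ELEVEN_VOICE_ID and ELEVEN_API_KEY in the environment."""
    import requests
    resp = requests.post(
        ELEVEN_URL.format(voice_id=os.environ["ELEVEN_VOICE_ID"]),
        headers={"xi-api-key": os.environ["ELEVEN_API_KEY"]},
        json=tts_payload(briefing_text),
        timeout=120,
    )
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)
```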

Step 4: Automated deployment. A GitHub Actions workflow triggers everything at 7am UTC. It fetches the CultureTerminal data, runs the AI synthesis, generates the audio, builds the HTML page, creates the RSS feed entry, and deploys the whole thing to Netlify. By the time I wake up, today's briefing is live on the website and available on Spotify.
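The orchestration can be sketched as a scheduled GitHub Actions workflow. This is a hedged reconstruction: the job names, script paths, and secret names are assumptions, not the real repository layout.

```yaml
# Hypothetical workflow sketch: runs the whole pipeline daily at 07:00 UTC.
name: daily-briefing
on:
  schedule:
    - cron: "0 7 * * *"     # 07:00 UTC, every day
  workflow_dispatch:         # manual re-run if a morning fails
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: python pipeline.py   # fetch data, synthesise, narrate, build HTML + RSS
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
          ELEVEN_API_KEY: ${{ secrets.ELEVEN_API_KEY }}
      - run: npx netlify-cli deploy --prod --dir dist
        env:
          NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}
```

One detail worth noting: GitHub's cron schedules are best-effort and can start several minutes late, which is harmless for a briefing published before most listeners wake up.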

The entire pipeline runs without me touching it. I get a push notification when it is done, and I listen to the briefing over a hot chocolate. That is the extent of my daily involvement.

One person can now do what used to require a newsroom. Not replacing journalism, but proving that synthesis and curation at scale is possible with taste and AI.

The taste layer is everything

Here is the part that matters most, and the part that most people building with AI miss entirely: the pipeline is not the product. The taste is the product.

Anyone could build a similar pipeline. The technology is available to everyone. What makes The Pattern work is not the code. It is the 50+ feeds I chose to monitor. It is the editorial voice I wrote into the system prompt. It is the structure I designed for each briefing. It is the decision to focus on cultural convergence rather than individual news stories. It is the judgment about what constitutes a "signal" versus what is just noise.

The feeds are curated, not random. I spent months testing and rejecting sources. Some publications are brilliant but too niche. Others are popular but lack editorial substance. The selection criterion is not "who gets the most traffic." It is "who consistently publishes stories that reveal where culture is heading." That is a taste decision, and it shapes everything downstream.

The AI prompt is where my editorial voice lives. I did not ask Claude to "summarise the news." I gave it a specific angle: look for patterns, find the connections, identify what is shifting. The briefing is not a news roundup. It is an analytical lens on culture. That distinction is crucial and it comes from having a point of view about what a culture briefing should be.

What surprised me

The cloned voice is genuinely uncanny. I expected it to sound robotic or flat. It does not. ElevenLabs has reached a point where the inflections, pauses, and emphasis feel natural. When I shared early episodes with friends, several of them asked if I had recorded it myself. The technology has crossed the uncanny valley in a way I did not expect.

Daily consistency builds something real. I underestimated the power of showing up every single day, even when "showing up" means the machine shows up on your behalf. The archive has grown into something genuinely useful. Months of daily briefings create a searchable culture database. You can go back and see what was happening in design three weeks ago, or track how a particular brand story evolved over time. The consistency is the product as much as any individual briefing.

The format shapes the audience. Making it a podcast, not just a blog post, changed who consumed it. People listen during commutes, during workouts, during morning routines. The audio format created a daily ritual that a written briefing would not have achieved. The medium matters as much as the message.

The archive is the hidden value. Each daily briefing is a snapshot of what mattered in culture that day. String enough of them together and you have something that does not exist anywhere else: a time-stamped record of cultural attention. What were people talking about in fashion on February 12th? What was the design world focused on three Tuesdays ago? The Pattern knows.

Why this matters beyond one project

The Pattern is a proof of concept for something much bigger than a daily culture briefing. It demonstrates that the economics of content have fundamentally changed.

A traditional newsroom producing a daily culture briefing would need editors, writers, audio producers, a distribution team, and a technology platform. The annual cost would be six figures at minimum. The Pattern costs effectively nothing to run: GitHub Actions is free for public repos, Claude Haiku costs pennies per call, ElevenLabs has a generous free tier, and Netlify hosts it for nothing.

I am not arguing that this replaces journalism. It does not. Original reporting, investigative work, and boots-on-the-ground coverage require humans doing human things. What The Pattern replaces is the synthesis layer: the aggregation, the curation, the "here is what matters today" editorial function that newsrooms spend significant resources on.

One person with taste, a clear editorial point of view, and access to AI tools can now produce a daily publication that rivals what small editorial teams used to produce. That is not a future prediction. That is what I am doing right now, every morning at 7am, while I sleep.

The real lesson

Building The Pattern taught me something I keep coming back to across all of my projects: AI does not replace taste. It amplifies it. The pipeline is powerful, but it is only as good as the decisions that shaped it. Which sources to monitor. What voice to give the AI. How to structure the output. When to ship and what to leave out.

Those are all human decisions. Taste decisions. And they are the reason The Pattern works as a product, not just a script.

The tools are available to everyone. The taste is not. That is the gap, and that is where the value lives.

Listen to The Pattern: today's culture briefing.
Visit The Pattern.