Chasing the Signal: Human Authenticity in an AI-Saturated World
There's a concept I keep coming back to, one I've been chasing for years — literally. I even built a website around it called Chasing the Signal. The idea is simple: the world is full of signal and noise. The signal is whatever carries value — knowledge, connection, meaning. The noise is everything that gets in the way.
That framing has followed me across careers, technologies, and conversations. And lately, it's the lens through which I see everything happening with AI.
A Brief History of Noise Reduction
Think about how humans have communicated across time, and you'll notice a pattern: every major leap forward has been, at its core, an act of noise reduction.
Start with oral history. Before writing, everything was passed down through speech — stories, laws, lineage, belief. The problem? The telephone game. You tell someone, they tell someone, and by the end of the chain, the original signal has degraded. That degradation is noise. The value erodes with every transmission.
Then came writing. Scribes in antiquity spent their lives making copies of copies of copies. A massive leap — but not a perfect one. Each copy introduced subtle errors, variations, interpretations. The signal was more durable, but it still wasn't clean.
Fast forward to computers and keyboards. The keyboard was a revolutionary interface — it gave individuals direct access to the digital world. But it came with friction. You had to learn to type. Mistakes crept in. Language barriers added another layer of complexity. More noise.
Then the iPhone changed everything again. Multi-touch put a natural, intuitive interface in everyone's hands. There are videos of children barely a year old navigating content through touch — because it just makes sense. No training required. Another layer of noise, stripped away.
AI Is Just the Next Abstraction Layer
Which brings us to artificial intelligence. And here's where I think the framing matters a lot.
We don't write assembly code anymore. We don't write in machine language. Over decades, we've built abstraction layer upon abstraction layer — compilers, interpreters, high-level languages, frameworks — all of which translate human intent into something the machine can execute. Each layer made it possible for more people to build, create, and contribute without needing to understand what's happening three levels down.
AI is the next layer.
You no longer need to memorize exact syntax to interact with a system. You no longer need to think of yourself as a developer — you can think of yourself as a maker, a builder. The same way that high-level programming languages democratized software development, AI is democratizing creation itself. That's not a threat. It's a progression.
The doomers will point to complexity, to sophistication, to the sheer scale of what these systems can do. And they're not wrong that it's impressive. But let's be honest: AI hasn't invented anything that didn't exist before. What DeepMind's AlphaFold did with protein folding — extraordinary. But human scientists could have gotten there eventually. AI just collapsed the timeline from multiple lifetimes to a few years. The advantage is speed and access, not some new form of creativity conjured from nothing.
AGI, ASI — these acronyms (artificial general intelligence, artificial superintelligence) get thrown around to generate buzz, to attract venture capital, to dominate headlines. They may mean something someday. But right now, framing AI as the next abstraction layer is both more accurate and more useful.
The New Noise: AI Slop
Here's the problem with every noise-reduction breakthrough: it also opens the door to new noise.
Printing presses enabled mass communication — and mass misinformation. The internet gave everyone a voice — and gave spammers a megaphone. AI is no different. The same tools that strip away friction and democratize creation are also flooding the internet with what people are calling AI slop: content that was generated automatically, without a human genuinely behind it. Vibe-coded blog posts. Auto-generated takes. Articles that look real but were never actually thought by anyone.
It's everywhere. And the more of it there is, the harder it becomes to find the signal.
This is the authenticity problem. And it's not hypothetical — it's already here.
The Human Stamp
So here's an idea I've been sitting with: what if we had a simple, universal way to attest that a human was at the source of a piece of content?
Not a guarantee that no AI was involved — that ship has sailed, and frankly, AI as a tool isn't the problem. I used AI to help shape and structure these very thoughts from a rambling morning commute into something readable. The raw material is mine. The ideas are mine. The voice is mine. AI was just part of the process — the same way a word processor, an editor, or a transcription service might be.
The question isn't "did AI touch this?" The question is "is there a human at the source?"
Think of it like the tag at the end of a political ad: "I'm so-and-so, and I approve this message." It's accountability. It's a signal to the reader that a real person stood behind this content, thought about it, and put their name on it. Not a guarantee of quality, but a claim of origin.
There's been industry-level conversation about watermarking AI-generated content. C2PA (the Coalition for Content Provenance and Authenticity) standards, metadata embedding, detection tools — all of it points in the same direction. But those are top-down solutions, and they're slow. What if creators just started doing this themselves, from the bottom up? A simple icon. A consistent mark. Something that says: human-originated.
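To make the idea concrete, here's a minimal sketch in Python of what a bottom-up stamp could look like. The function names and fields are hypothetical, not part of C2PA or any existing standard: the stamp binds a claim of origin to the exact text via a content hash, and a real scheme would add a public-key signature so the claim itself couldn't be forged.

```python
import hashlib
from datetime import datetime, timezone

def make_stamp(content: str, author: str) -> dict:
    """Create a hypothetical 'human-originated' stamp: a claim of
    origin bound to the exact text through a SHA-256 content hash."""
    return {
        "author": author,
        "claim": "human-originated",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "content_sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
    }

def verify_stamp(content: str, stamp: dict) -> bool:
    """Check that the stamped hash still matches the content,
    i.e. the text wasn't altered after the author approved it."""
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    return stamp["content_sha256"] == digest

post = "The raw material is mine. The ideas are mine. The voice is mine."
stamp = make_stamp(post, "Jane Doe")

assert verify_stamp(post, stamp)                          # untouched text passes
assert not verify_stamp(post + " (bot edit)", stamp)      # altered text fails
```

Note what this does and doesn't prove: verification only shows the text wasn't changed after stamping. The accountability still comes from a real person putting their name on the claim.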
It wouldn't solve everything. People would abuse it. Bad actors would misuse it. But it would at least begin to rebuild the expectation that when something is marked this way, a real person put real thought into it. That's worth something.
What We're Really Chasing
Here's how it all ties together.
Every era has had its version of signal versus noise. And every era has found tools that helped suppress the noise and amplify the signal. That's been the project of human progress, more or less.
AI, at its best, is the latest version of that tool. It removes friction. It collapses distance between intent and output. It lets more people build, create, and communicate than ever before.
But the noise it introduces — the slop, the inauthenticity, the flood of content without a human voice behind it — is real. And combating it requires both the technology and the culture to catch up.
I don't have a complete answer. But I know the question worth asking: in a world where anyone can generate anything, how do you find — and signal — that this came from a real person with something real to say?
That's the signal worth chasing.

These ideas started as rambling thoughts on a Monday morning commute. More to come.