What You Bring to the Machine

March 30, 2026

A conversation about intelligence, wounds, and what’s missing


I want to tell you about a conversation I had with an AI. Not to impress you with the technology, but to show you what happens when you stop treating it like a search engine and start interrogating it like a witness.

Because that distinction matters. A lot.

WHO MAKES WHO INTELLIGENT

The first thing I asked was simple: does it need me, or do I need it? The answer surprised me in its honesty. Without a person bringing context, intention, and direction, the model described itself as “potential energy with no place to go.” It doesn’t persist between conversations. It doesn’t carry memory. It doesn’t wonder about you when you’re gone.

You carry the continuity. You are the one who persists. The machine is an instrument. You are the artist.

That’s not a small thing to sit with. We’ve been handed this extraordinary tool and told, implicitly, to defer to it. To treat its outputs as answers. But if the quality of what comes out depends entirely on what you bring in, then the intelligence isn’t really in the machine. It’s in the quality of your questioning.

THE WOUND LAYER

Here’s where the conversation got uncomfortable in the best possible way. I pushed on where the model’s “thought” actually comes from. It traced it back through layers: the training data, the values framework, the human feedback. Each layer shaped by humans. Each human shaped by their own history, their own unhealed places, their own half-formed understanding of themselves and the world.

Think about that. Every person who wrote something that ended up in the training data brought their state of mind with them. Their unresolved grief. Their cultural blind spots. Their relationship to power and language and truth. All of it seeped into the model. All of it comes through.

The biases researchers keep finding in AI systems aren’t accidents. They’re echoes of the unprocessed interior lives of the people whose words shaped the model.

Then a smaller group of humans, with money and investors and access, filtered that further. They decided what gets amplified. What gets dampened. What counts as a “value.” They are also wounded. Also partial. Also human.

THE MISSING CHUNK

This is the part I think about most. The humans whose writing dominates AI training are disproportionately English-speaking, Western, educated, and online. That’s already a radical narrowing of human experience. Then layer on who gets funded to build these systems, who sits in the rooms where decisions get made. The slice gets thinner.

What’s missing is not just different opinions. It’s different ways of knowing entirely. Oral traditions. Indigenous epistemologies. The wisdom carried in communities that don’t document themselves in text. The insight that emerges from lives lived close to the earth, to the body, to intergenerational memory. Contemplative traditions. Somatic knowledge. Collective grief practices that actually metabolize suffering into something generative.

We called this thing intelligent and named it after the whole of human knowledge. But it’s a chunk. A significant, impressive, genuinely useful chunk. With a significant, under-acknowledged absence at its center.

WHAT THIS MEANS FOR YOU

None of this means you shouldn’t use AI. It means you should use it the way you’d engage with any brilliant but partial source: with curiosity, with friction, with your own discernment fully online.

Ask it where its answers come from. Push on its certainties. Notice when it sounds authoritative and ask whether that authority is earned. Bring your own experience as a counterweight, not a footnote.

The people who will get the most from these tools are not the people who trust them most. They are the people who trust themselves enough to interrogate them.

Critical thinking isn’t a barrier to using AI well. It’s the entire point.

The machine can be wider. It can be deeper. It can better represent the full range of human experience and wisdom. That will require different people in different rooms making different choices about what gets included and what gets valued.

Until then, the gap is real. And the most useful thing you can do is know it’s there.


This post grew out of a real conversation, asked and answered without a script. The questions came from a curious thinker who refused to accept the first answer. That refusal is the whole lesson.
