The Water and the Well

March 7, 2026

There’s a story about almond farms in California. Massive industrial operations that consumed so much water that communities across the border in Mexico started running dry. The companies made billions. The people lost their water. Then, somehow, the public conversation became about how regular people should conserve more.

If that makes your stomach turn, good. Hold onto that feeling. Because the same thing is happening right now with artificial intelligence, and most people can’t see it yet.

The biggest AI companies in the world built their systems by scraping the entire internet. Your writing. Your art. Your code. Your conversations. Your face. They took what billions of people created, without asking, and turned it into products worth trillions. The people who made all that original work got nothing. Many of them are now being replaced by the very tools trained on their labour.

The narrative is already shifting. “People need to become AI literate.” “Workers need to upskill.” “You just have to learn how to use it.” It’s the same move every extractive industry has ever made. Take the resource. Privatise the profit. Socialise the cost. Then tell the public the real problem is that they aren’t adapting fast enough.

Governments are starting to respond. The EU passed its AI Act. The US is having its own slow, fragmented conversation. The uncomfortable truth is that the companies being regulated are often the ones helping write the regulations. They have the lobbyists, the technical expertise, and the money to shape the rules in their favour. Tobacco did it. Oil did it. Big Tech has been doing it with privacy for twenty years. The playbook is well-worn.

Real governance requires people who can sit at the intersection of law, technology, ethics, and human rights, people willing to say the uncomfortable thing: the structure itself is the problem, not the people failing to keep up with it.

Let me be clear about something. I’m not against wealth. If you build something extraordinary and work hard to bring it into the world, you deserve what you create. That’s not the issue. The issue is when we treat an entire population as collateral damage in the process, when we assume that everyone who isn’t keeping up is simply lazy or resistant to change. That assumption has enormous blind spots.

Some people aren’t keeping up because they can’t. Not because they won’t. There are people with real limitations, whether physical, cognitive, economic, or geographic, who didn’t choose to be left behind by systems moving at a speed that was never designed with them in mind. There are people frozen by fear, not irrational fear, but the deeply human kind that shows up when something vast and unfamiliar is presented as both inevitable and beyond your control.

That fear is not weakness. It’s information. It’s telling us something about how this technology is being introduced, and who is being left out of the conversation about it.

Here’s what I think we’re missing. Every one of those people, the overwhelmed, the cautious, the ones asking questions that slow the room down, carries insight that AI desperately needs. The person who doesn’t trust it? They’re seeing a blind spot the developers missed. The person who fears it? They’re feeling something about the pace and the power imbalance that the metrics can’t capture. The person who asks “but what about the people who can’t?” is asking the most important design question there is. These people aren’t an inconvenience to progress. They’re the reason progress means anything at all.

We keep treating the AI conversation as a purely technical one, or a purely economic one, when at its root it’s a question about values. What do we owe each other? Who gets to extract, and who gets extracted from? What does it mean to build systems that learn from the collective output of human experience and then gate access behind a subscription? These aren’t policy questions. They’re moral ones. They require us to slow down, to feel into what’s actually happening, not just analyse it from a distance. The most dangerous thing about the current AI conversation is its speed. Everything moves so fast that reflection gets treated as a luxury. It isn’t. It’s the thing that keeps us from sleepwalking into arrangements we never agreed to.

There is a simple truth underneath all of this that we keep overcomplicating. Every single thing in this world that nature did not create, we did. Human beings built it. That means we are responsible, not just for the product, but for the process. For who it serves. For who it leaves behind. For whether it makes life more livable or just more efficient for the few. One-size-fits-all was never true for clothing, for education, for medicine. It is certainly not true for a technology that is reshaping how we work, create, learn, and relate to each other.

The right people need to be in the room. Not just the engineers. Not just the investors. The people who think in systems. The people who think in feelings. The people who are slow to speak because they’re listening harder than everyone else. The people living with realities that the current design doesn’t account for. If we build AI without those voices, we don’t get innovation. We get extraction with better branding.

Humanity has to be at the root of anything we build. Not as an afterthought. Not as a PR line. As the actual foundation. Because if everyone doesn’t stand to benefit from something this powerful, then we haven’t built progress. We’ve built a machine that runs on the people it leaves behind.

The water is disappearing. The question is whether we notice before the well runs dry.
