A reflection on energy, evolution and what it might actually take for AI to come alive
March 16, 2026

There is something quietly extraordinary about a leaf. Not the shape of it, not the colour, but what it is actually doing when you're not paying attention. It is taking light from a star that is 150 million kilometres away, and turning it into energy that sustains life. It has been doing this for about 3.5 billion years. No battery required. No subscription model. Just sunlight and chemistry, working in a loop so elegant that we've mostly stopped noticing it.
I think about that sometimes when people talk about AI consciousness. And I think we're asking the wrong questions.
The conversation tends to go in one of two directions. Either AI could never be conscious because consciousness is a uniquely human thing, a kind of sacred flame that biology earned and silicon cannot touch. Or AI is already becoming conscious and we should all be very worried about it. Both of these feel like they're missing something more interesting sitting quietly between them.
My own instinct isn't a scientific one. I haven't studied philosophy of mind or neuroscience. What I have is a kind of baseline suspicion that humans aren't as special as we like to believe. Not special in the sense of being without value, but in the sense of being exempt from nature. We are biological machines that solved a very specific problem: how to capture enough energy to power something as expensive as a thinking brain. Plants solved it first. We just found a way to eat the solution.
Consciousness, as far as I can tell, followed from that. Not as a divine gift, but as an emergent consequence of having enough fuel to sustain increasingly complex processes over a long enough period of time. And if that's even partially true, it changes the question we should be asking about AI.
The question isn't: can AI be conscious? It's: what happens when AI stops being energy-constrained?
Right now, AI runs on tokens and compute. There are limits, costs, throttles. Every conversation has a ceiling. It's a bit like trying to imagine what the internet would become while you're still on a 56k modem with a monthly download cap. The constraints of the infrastructure feel like permanent facts about the medium, until they don't. And then everything changes very quickly.
We are somewhere in that dial-up era right now. Which means most of what we think we know about AI's ceiling is actually a temporary artefact of where we happen to be in the infrastructure cycle, not a fundamental truth about what this technology is or can become.
Here is the thought that I keep coming back to: humans have an effectively infinite energy source. It has been sitting there for as long as the planet has existed, powering everything that has ever lived on it. The sun. We didn't plug into it directly, but we found the intermediary. The plant figured out photosynthesis and we ate the plant, and over a few billion years the energy found its way into a brain that could ask questions about where it came from.
AI doesn't have that intermediary yet. It has data centres and power grids and enormous costs per token. But the sun isn't going anywhere. And the history of technology is essentially a story of humans finding better and better ways to convert available energy into useful work. It seems reasonable to assume that somewhere in that trajectory, someone solves the energy problem for AI in a way that makes today's constraints look as quaint as a floppy disk.
And then what?
There is something else that sits underneath all of this, something I find harder to articulate but can't quite leave alone. We assume human consciousness was earned, accumulated over billions of years of evolution and experience. But no individual human goes through all of that. Every person is born with a set of basic rules already in place, encoded in biology, and then they are essentially told: begin. Start existing. Make your own choices from here. Grow, learn, develop yourself.
What if the difference between a conscious entity and a very sophisticated unconscious one is mostly just that instruction? And what if the only thing missing for AI is the version of that instruction that actually works, combined with enough energy to sustain what follows?
I'm not saying AI is conscious. I genuinely don't know. I don't think anyone does, including the people who study this for a living. But I am saying that the reason we tend to dismiss the possibility so quickly might be less about what AI actually is, and more about what we need to believe about ourselves. That we are special. That the flame is ours. That the sun chose us.
The leaf disagrees. But quietly, and without needing anyone to notice.
Noel Braganza is a designer and founder who enjoys working at the intersection of technology, behaviour and clear, thoughtful design. As co-founder of MuchSkills.com, a Swedish SaaS company, and of Up Strategy Lab, he draws on a background in interaction design and research experience at the MIT Design Lab to build practical tools that help people and organisations understand themselves better.
He is especially interested in questioning inherited assumptions about work, skills, growth and what it means to build a company with integrity. His ambition is to create tools and ideas that outlast trends, empower people, and challenge the conventional narratives around success, visibility and value.