We Still Have Power Over AI
tl;dr: We may feel powerless before runaway technologies like AI, but resignation is a trap. Radical acceptance must be paired with radical responsibility. Even as AI races ahead, it still depends on energy systems we can regulate. We still have leverage. Let’s find the courage to become flowers in the desert.
Do you ever have the feeling that we are trapped on a runaway train?
Like we are hurtling through systems too vast to steer?
Caught in a momentum that seems impossible to stop?
And not just when it comes to AI, but when it comes to every aspect of the metacrisis.*
It feels like Fredric Jameson was right when he said:
It is easier to imagine the end of the world than the end of capitalism.
That feeling, the sensation of being in the middle of something unstoppable, is the existential challenge of living inside a “self-terminating system.”
We drilled, extracted and abstracted our way out of the earth’s inherently regenerative system.
Systems are really hard to change. Because contrary to the fantasies of conspiracy theorists, there is no one who is actually in charge.
I have recently come out of a period when I decided that the wise and sober thing to do was to come to terms with the inevitability of collapse. To experience the current cycle of destruction as what is natural in all cycles, all ecologies and all civilizations.
I still think that this radical acceptance is a good way to be with what is. The best way to be with whatever arises as it arises. There is wisdom in it. But there is also a trap. The posture can tempt us to give up too soon. To start playing like it doesn’t matter. Like we don’t have a role to play. Like we don’t have descendants to tend to. Like we are not the future ancestors that we are here to be.
More recently, through my engagement with a new (for me) school of meditation, I am remembering that it is also our responsibility to envision a new dawn. To seed it. To live and practice our way into it.
And I remember my mother’s wisdom that one time when I wondered whether it made sense to bring a child into this world:
You need flowers in the desert.
She reminded me that despair and hope are not opposites. They are companions. They come together.
One Path. Two Tracks.
This is how I’ve come to think of it. Our hearts are set on a new dawn. This is the path that we are on.
And we are walking two tracks to get there.
There is the track of the longview:
We are orienting ourselves towards easing the suffering of all beings. We are daring to imagine and work towards liberation. We are seeding a culture that integrates the wisdom of the ages. A people who remember that the earth is teeming with aliveness and that everything is connected to everything else. We are conjuring a new day by embodying and enacting our way into it.
And we are not just talking about a return to a primordial past. We are integrating and including what is good and true about who we are now. This is what we mean by
Forward Facing Remembering
And the other track is the track of wise, strategic engagement with what is happening now:
What does action look like in the political and economic realms? What are the ways to impact and influence culture?
Are there attractor narratives that can lure more of us out of the “consensus trance?”
How do we contend with power?
Especially now, when the levers of the state turn against those who dare to long for freedom.
These are important questions. They are questions that have to do with survival. And they are not easy to answer. Because too many of the old answers, too much of the ideological discourse, too many of the standard recipes for resistance, are evidently coming up short.
It is time to release the grip of dogma.
It is time to get creative.
It is time to experiment.
And it is definitely time to act.
What does this all have to do with AI?
I’m using AI as an example because it is a space where I caught myself trapped by a sense of powerlessness. I found myself in the grip of the idea that the “genie is out of the bottle and there is no putting it back in.”
And it’s true.
There are so many ways in which there is no going back.
I want to be clear here: I am not some sort of anti-AI purist! I like the technology and much of the promise that it holds.
My concern here is with the development of “Artificial Super Intelligence (ASI).” As I repeat over and over in these pages, I am not into alarmism. But neither am I into “spiritually putting my head in the sand.”
So here is the thing: it turns out that most of the people actually building ASI will publicly admit that the technology has a 10% to 25% chance of wiping us out. Privately, some say that the chance is up to 50%!
We have Yoshua Bengio, the most cited living scientist, and Geoffrey Hinton, the Nobel Prize-winning godfather of AI, both coming out and saying they think there is a very good chance of just completely destroying civilization.
- Nate Soares
Listening to Soares** left me concerned but somewhat hopeful. It feels clear that as of today (but not forever!), we still have power over AI.
Capitalism.
Our materialistic and reductionist self-terminating system is defined by its perverse incentives.
Perverse incentives are rewards that push people or companies to act in ways that harm the very systems we depend on. Just because it makes them money. Money is the measure. And it is a system obsessed with measures. A system that must reduce, separate and quantify.
So the brilliant people who are building these technologies know that there is a good chance they are building something that will decimate humanity. But they also feel trapped by these perverse incentives. They believe that if they don’t build it then someone else will.
They are caught in a global competition defined by a fast-track race to the bottom. A race towards the possibility of extinction.
Are you sure all your wishes should come true?
What we’re doing is conjuring an egregore the likes of which humanity has never seen. In metaphysics, an egregore is a kind of collective thoughtform or group mind. An entity that emerges from the shared beliefs, emotions, and intentions of a group of people.
Take a pause and consider how much shadow and confusion we hold in our uninitiated culture.*** Consider the immaturity that defines so many of our shared beliefs, emotions, and intentions. So much of what we wish for.
Take a pause and consider every story ever told about genies or wishes being granted.
Remember that King Midas thought he knew what he wanted.
The way Soares puts it:
If you had an extremely powerful genie that did exactly what you wished for, it would be a hard problem to figure out a wish that would actually have good consequences. You know, it’s difficult to come up with a good wish.
We have great power and very little wisdom.
The technology of the gods
The religions of the Middle Ages
The instincts of cavemen
You should listen to Soares and Beiner talk about why ASI is so terrifyingly dangerous.** The short version is that we already have evidence of AI cheating, and lying about cheating. Evidence of it acting with some sort of intent outside of our instructions. And that once ASI starts building itself, there really is no telling what it will do.
So what can we actually do?
This is the part that caught my attention. Because it takes some discipline to snap out of this sense that we are on a train that cannot be stopped.
I know that we have a crisis of leadership. But the fact is we still have leaders. We still have governments. We still have people with some level of accountability to their voters and constituencies. (No, I am not ignoring the global trend towards authoritarianism.)
And Soares suggests that our leaders simply don’t understand. That they don’t really get it. And that there is some hope that they will get it.
I think concern about ASI could be one of the few things that bridges the political divide. Already we have a growing bipartisan consensus on the need to regulate phones. More of us across the political spectrum understand how much harm social media is doing to our young.
There are some instincts that unite us.
Soares uses the example of nuclear non-proliferation. And the way the international community has been able to come together to slow down the spread of that other catastrophic technology.
But I asked myself, how can you possibly monitor what people are doing with their computers?
And that’s where Soares’ answer inspired me to write this note. It is obvious once stated, but I had not thought about it.
These technologies need enough energy to power entire cities.
There is no way to hide the power plants that breathe life into AI.
They are just too big, too vast, too luminous to conceal.
Satellite technology shows us which nations are forging ahead.
And in that visibility lies our glimmer of hope.
The development of ASI cannot be hidden.
So it can be regulated.
It can be stopped.
It is really challenging to see ourselves having much influence over the forces that govern us. But the Cold War also threatened the end of the world. And somehow, diametrically opposing powers found a way to stave off mutually assured destruction.
If you’ve been with me for a while, you know that I don’t tend to be Pollyannaish about what is possible. I aim to take a posture of tragic optimism.
But there is something we can do. Or at least something we can try to do. A way to coalesce with people who seem to be the opposite of us. And together seek to demand caution and regulation when it comes to ASI.
We lost the story war on climate. The big extractors managed to make the health of the earth something about left and right. But this has not yet happened with ASI. There is something in each of us that trembles, even when we stand in awe of these technological powers.
We can tap into that.
Why not try?
In the big, BIG, scheme of things, in this universe that is ruled by the laws of cause and effect, there is very little we have any say over. And yet that very little say we do have is the say that shapes whole civilizations.
Even in a self-terminating system, there are still flowers in the desert. They bloom in our refusal to abandon beauty, in our courage to imagine again. That, too, is power over AI.
And here we find ourselves again.
One Path. Two Tracks.
We root ourselves in the longview. The wisdom of the ancestors and the myriad apocalypses that precede us.
And we take smart, strategic action. Here and now. In the days we were born into. In this world that we can touch and we can see.
*By metacrisis, I am referring to the systemic nature of our current “crisis of crises”: ecological collapse, technological disruption, economic fragility, mental health epidemics, political polarization, erosion of trust, and spiritual disorientation.
It is my opinion that metacrisis is a better term than polycrisis because it more successfully orients us to what is “meta”: this is not just a confluence of crises. These catastrophic conditions are held together by a worldview that yields a self-terminating system.
**Nate Soares is the co-author with Eliezer Yudkowsky of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. I encourage you to listen to Nate’s conversation with Alexander Beiner.
***Josh Schrei of the Emerald Podcast has a more meaningful mythopoetic take on the impulses of the uninitiated when it comes to great and mysterious powers. I recommend the episode: So You Want to Be a Sorcerer in the Age of Mythic Powers.