I Sat Through Geneva So You Don’t Have To
On the Limits of Ethical Debate in Global AI Spaces
I didn’t go to Geneva to speak.
I didn’t even apply to be on a panel. I went to witness and to make a statement by showing up.
We could have launched Clause 0 virtually. We could’ve posted a link and left it at that. But that would’ve missed the point. Clause 0 is all about drawing a line, and I wanted to draw it from the heart of where “AI for Good” is being defined. There was no better place to launch it than… Geneva.
Geneva isn’t just any city. It’s where diplomacy happens. It’s where human rights are negotiated, where international governance is supposed to be centered (if any of that still means anything).
When the UN, the ITU, and dozens of partners host a global summit on AI in this city, under the name “AI for Good,” the message is clear to me: this is where power meets ethics. Or is it?
And yet, last year when I looked at the agenda, I didn’t see real ethics anywhere.
But I went again. At this year’s AI for Good Summit, I found plenty of sessions on robotics and youth innovation, and plenty of optimism around climate tech (the kind that talks about emissions, not extraction or displacement). There was AI for disaster relief, for tree planting, for social good… but nothing on AI used for surveillance, targeting, or forced displacement.
Palestine, predictably, didn’t exist.
The AI for Good agenda was dominated by optimistic, sponsor-safe topics: robotics, climate innovation, youth entrepreneurship, global health, and standardization. But there was no visible engagement with AI’s role in:
Occupation and border control
Surveillance regimes
Military partnerships
Data colonialism
Nor any specific mention of Palestine, Gaza, or genocide-enabling infrastructures
Instead, what you’ll find, often backed by CSR budgets, are polished, well-intentioned projects: AI tools for maternal health, robotics for demining, early warning systems for climate resilience, and youth-led tech innovation across the Global South. Many of these efforts are meaningful. But they exist within a curated frame: one that celebrates innovation while steering clear of the deeper systems that produce harm.
These absences aren’t neutral. They’re designed.
I reckon there’s real good happening: researchers doing serious work, projects with real impact, efforts to make AI more inclusive and less harmful. I’ve seen people trying. But even the most meaningful work tends to stay within safe boundaries: what can be funded, celebrated, or published without friction. Few go near the deeper harms, and fewer still ask who benefits when ethics stays polite.
I didn’t want to just critique from afar. So I went. With two senior members of my team. We split up and sat in different sessions. We took notes. We cross-checked our observations. And it confirmed what I had seen before: the story of AI is being carefully told. It’s a polished story, a sponsored story, and a narrow one.
Then, something happened that brought the silence into full view.
Dr. Abeba Birhane, a cognitive scientist known for her research on algorithmic harm, relational ethics, and the decolonization of computational systems, took the main stage, willing to address power directly (picture below).
Her voice didn’t flatter the narrative, and the livestream cut… mid-sentence!
Just like that. No warning. No explanation.
She had come to make a statement. So had I. And her being silenced made mine.
Clause 0 was already written. We had spent the last three months in a virtual room: technologists, lawyers, and compliance professionals drafting the one line that was missing from every framework. The refusal no one wanted to say out loud.
“AI, data, and cloud systems must not enable displacement, surveillance, violence, or erasure.”
We wrote it precisely because we knew this would happen. In Geneva. And everywhere else.
That’s it.
Not the ceiling. The bare minimum.
If we can’t agree that these uses are off-limits, then what are we even talking about?
Geneva made it clear that the stage is curated.
But I still believe in showing up. I believe in being in the room, even when the room is designed to ignore you. I believe in presence as resistance. That’s why I went.
And if you’ve been in those rooms, if you’ve felt the tension but stayed quiet: I’m not here to judge you. I’m just here to say that you’re not alone, and it’s time we start naming what we see.
Clause 0 isn’t asking you to join something. It’s asking you to hold a line.
With clarity. With integrity. Without compromise.
If that speaks to you: sign it. Share it. Use it.
We launched it in Geneva, because that’s where silence was loudest.
Now it’s up to you to decide what kind of voice you want to be.




