This Is Not Training
This is what it looks like to refuse being shaped elsewhere
Next week I’m heading to a neighbouring Southeast Asian capital with a few colleagues to train senior government officials on policymaking related to frontier AI risks. These are not theoretical conversations — the folks who will be in the room are actively shaping the architecture of decisions that will determine whether AI serves their society and communities, or undermines them.
I’m excited. But as I fight a bout of jetlag from a recent trip and stare up at the ceiling at night, I’ve also been thinking a lot about that word: training.
For most of my previous career in international development, “training” — or what we often called “capacity building” — carried an assumption I no longer believe in, partly because of a formative incident early in my career. It was uncomfortable, but I’m grateful for it.
Years ago, early in my career, I was working in Jakarta after the terrorist bombing of the Australian Embassy in 2004, helping mobilise support for victims. We had secured funding for trauma counseling, but instead of going directly to a local Indonesian organisation with deep expertise, the funds were routed through a large international NGO under the familiar banner of “capacity building.” After one meeting, an irate director of that local organisation pulled me aside on the street in front of the bombed-out embassy and said something I’ve never forgotten: “Don’t ever tell me you are capacitating me. You know nothing about trauma counseling. That international NGO knows nothing about trauma counseling. Who is capacitating whom here?” (I wrote about that experience in this piece.)
At the time I was utterly embarrassed. But her voice has never left my head, even today. Only in hindsight do I feel I can finally and fully grasp her words.
Similarly, even in the governance realm in which I usually worked, the assumption was simple: that some countries lacked the knowledge, skills or institutional sophistication to govern effectively. And that others — usually from the West — were there to help fill that gap.
In hindsight, it’s difficult to separate that posture from its colonial roots.
I don’t say this lightly. I come from that world. I spent years working on strengthening governance, institutions, human rights and inclusive societies. Training was always central to that work. And to be fair, not everyone approached it this way; some brought an explicitly post-colonial mindset. I personally leaned more heavily into radical participatory approaches, trying to surface lived experience and local knowledge, even when it wasn’t “efficient”. But I also paid expensive Western consultants to parachute in and do their thing.
But even so, it took me years to fully grasp what was missing. I suspect that the problem was never simply a lack of capacity. Rather, it was (and is) about an asymmetry of power. And the confidence to confront it.
So as I prepare for this work next week, I find myself rejecting the idea that what we are doing is “capacity building” in the traditional sense. This is not about filling a deficit. This is not about transferring knowledge from North to South (nor should it be). And it is certainly not about “capacitating” an underdeveloped bureaucracy.
Countries in this region are not necessarily lacking in capacity. All sit in the middle-income strata, with large and often young populations and a clear sense of the future they want.
The fact that senior government officials are choosing to engage in this kind of training is not a sign of weakness. It is a sign of strength. I think of it as a deliberate act of preparation.
An effort to understand a rapidly evolving technology not because they are behind, but because they refuse to be outmatched. Because they have already seen what happens when technology outpaces governance (viz. social media).
They have lived through the social, political, and human consequences of (foreign) social media platforms that arrived without accountability — platforms that reshaped public discourse, harmed children, enabled exploitation, extracted wealth and concentrated power far beyond their borders.
They have experienced what it feels like to be on the receiving end of that power — to be subject to decisions made in boardrooms thousands of miles away. Or to be on calls with aggressive executives from big tech companies demanding they make certain choices. (I know, I’ve been on those calls.)
This time, they are choosing something different.
They are choosing to learn. To understand and to engage. And to build the confidence and vision needed to exercise agency over technologies that will shape their societies for decades to come.
I’ve written elsewhere about what I think of as a kind of “Bandung 2.0” moment — or perhaps more fittingly, Bandung 2.ai: Bandung in the age of AI. In 1955, Indonesia hosted the Bandung Conference, where newly independent nations came together to reject colonial domination and assert their right to determine their own political and economic futures.
What I see now is not a repetition, but an evolution of that same impulse. Then, the struggle was against territorial and political control. Today, it is against technological and infrastructural dependence. But the underlying principle is the same: a refusal to have one’s future shaped elsewhere. In that sense, efforts like this — to understand, govern, and ultimately shape AI — can be seen as part of a longer arc of asserting sovereignty, dignity and control over one’s own destiny.
So that, to me, is what this work is really about.
Not capacity building. Not training. But agency building.
Because if we are honest, the country where most big tech originates — the United States — has done the opposite. The American government has outsourced its expertise to private AI labs, to corporate-backed research centers, to industry associations whose incentives are aligned with profit, market share and power. It has allowed itself to be guided — and often captured — by the very actors it is meant to regulate.
What I see increasingly across Asia is a different posture, one that ensures governance derives from what benefits citizens as human beings, not consumers. There is a willingness to engage seriously with the technology, to understand its risks and possibilities, and to shape it in ways that serve their people best.
As part of this work, our organisation, born here in Asia, is building the architecture of a sovereign knowledge ecosystem in the region — one that centres regional expertise, lived experience and local histories in the governance of frontier technologies.
This is important to me because the global majority does not need to rely solely on frameworks, theories and models developed elsewhere. They can – and should – adopt and adapt them where it makes sense, and chart new paths where it does not, grounded in their own realities, in their communities and within their institutions. They can draw on their own experiences of harm, resilience and adaptation and force products, systems and “innovations” to conform to the future their people want.
In other words, knowledge creation drawn locally can help them shape their own visions of what technology should do — and who it should serve.
There is a long history behind this. Moments where countries came together to assert a different path, resist alignment with dominant powers and imagine alternative futures.
That spirit feels relevant to me again.
This is of particular importance because what is at stake with AI is not just innovation, but power. Who shapes it. Who benefits from it. And who bears its costs.
In that sense, this work is not about catching up. But rather, it is about refusing to be governed by technologies — and by companies — that do not answer to their people. It is about ensuring that the trajectory of these systems can be bent toward human flourishing, not just shareholder value.
So no — this is not training or capacity building. It is confidence building. It is coalition building. It is about strengthening the ability of policymakers to act with confidence, independence and purpose in the face of immense external pressure.
There is a new frontier in AI governance, and it might not come exclusively from where the technology is built.
It may in fact come from where the stakes are most deeply felt — and where the will and determination to shape it is strongest.