Unbounded: 2 Minutes Edgewise delivers sharp, fast takes on current events, fresh revelations, and just cool things. Provocative or hopeful or fiery — it will always be brief, always grounded, and always unbounded.
Last week, I joined a civil society-led conference on AI where one theme kept surfacing: building trust.
If we want ethical and responsible AI that contributes to our human flourishing, then civil society and the private sector must develop enough trust to genuinely collaborate.
And yet, what I saw is what I’ve seen at so many similar events over the years. Civil society organizations struggle to get big tech to take them seriously or to “make time for them” by actively participating in events.
When they do agree to participate, all too often tech company reps parachute in for their panel, maybe stay for the coffee break, and then vanish. If you’re lucky, they linger for half a day. Rarely do they stay on for a full multi-day program. And almost never for the interactive sessions where advocates and activists debate core issues, surface disagreements, or grapple with hard trade-offs while respecting and promoting our human rights.
Intentionally or not, the message this behaviour sends is: we already know what we need to know, and this isn’t worth our time, but I need to keep you ‘engaged’.
Talk amongst yourselves.
This frames civil society as a small, sometimes annoying player to be managed, not a partner to be heard and respected.
In my view, meaningful collaboration requires moving beyond the performative. Trust doesn’t come from a single appearance on stage to share your wisdom or perspective with the civil society crowd. It comes from being present long enough to hear what’s uncomfortable, to sit with voices that don’t sound like your own, to recognize the expertise and commitment that civil society brings. And then act on what you’ve learned.
From the other side, big tech often assumes civil society will participate when needed, especially when it’s footing the bill for airfare or accommodation. That generosity isn’t neutral: it creates a dependency, and with it a sense of obligation. If your travel is covered, you’re expected to show up fully, to sit through the entire two-day program, to give your time and attention in return.
In other words, civil society is expected to commit fully to these moments curated by tech. Why should private sector participation be any different?
I understand the pressures — different incentives, relentless calendars, a thousand competing priorities, performance bonuses determined by doing more than time and budget permit. But that doesn’t make civil society any less demanding, any less sharp, or any less essential as a stakeholder to take seriously.
When a tech sector colleague slips out after their speaking slot, the message is unmistakable: we don’t really need you. And if that’s the dynamic, how can we possibly build the trust needed for ethical AI governance together?
At that recent conference, organizers made it clear to me that participants were asked to stay for the full two days. From what I saw, civil society honored that. What if big tech did too? What if they joined not just to speak, but to listen — with open minds and open hearts — for the duration?
The signal it would send would be powerful: that they see this work as partnership, not performance.
Remember, trust doesn’t begin with legal- and comms-approved statements, press releases or presentations. It begins with something far simpler, and far harder: showing up, and staying in the room.
The commitment of time matters most.
And for those in tech who give their time (you know who you are), it’s what I’m most grateful for.
AND *THAT* IS 2 MINUTES EDGEWISE, UNBOUNDED.