Disorientation: AI and the Climate Crisis
Unbounded: 2 Minutes Edgewise delivers sharp, fast takes on current events, fresh revelations, and just cool things. Provocative or hopeful or fiery — it will always be brief, always grounded, and always unbounded.
Speaking at the Prince Mahidol Awards Conference in Bangkok this week, I reflected on artificial intelligence and climate disinformation, as a package. In short, we are facing twin emergencies. One is climate change — visible and urgent. The other is a crisis of truth — harder to see but just as dangerous.
And the tools we hoped would help — artificial intelligence, ‘democratised’ social media platforms, communications technologies — are now, paradoxically, accelerating confusion. We all too often focus on “misinformation” as some fixable effluent of technological innovation — which sits quite well with big tech because it doesn’t get at their destructive business models.
The truth is, America’s tech business models, incentivised by profiteering, create a danger that is not simply misinformation, but better understood as DISORIENTATION: people just don’t know what to trust or where to look.
AI as an accelerant
Most people think of AI as ChatGPT, but it was ever-present in our lives well before that launched publicly. AI is the invisible scaffolding behind YouTube recommendations, Facebook timelines, and TikTok virality. The same technology that makes Spotify's music picks feel uncanny is also used to spread climate disinformation, with scale, speed, and eerie precision.
Generative AI makes disinformation cheap, easy and hard, all at the same time:
Cheap to create convincing climate denial or delay narratives.
Easy to tailor content to specific audiences — whether it’s doubt about renewables or exaggerating the costs of transition.
Hard to distinguish between credible and manipulated content.
We’re already consuming fake protest footage, synthetic climate “experts,” and polished content that mimics real journalism. Sometimes it’s quite obvious and sometimes not so much.
And that’s the point: it chips away at what we trust and how we know.
This disorientation stems from a deteriorating information ecosystem. When everything feels uncertain, nothing can mobilise us.
Our information ecosystem is fragmenting — just when we need it most
We live in a fractured attention economy where truth competes with engagement, and AI-generated and AI-enabled noise drowns out real reporting and journalism. One study showed that more than 1 in 5 videos recommended to new YouTube users are low-quality or misleading AI slop. Imagine being 17, trying to understand climate change, and ending up with AI-generated propaganda before you've even encountered peer-reviewed science.
Implication: synthetic content dominates discovery before public interest journalism ever surfaces.
Stop treating AI and climate as separate issues
Right now, fossil fuel interests can use AI tools to scale greenwashing. Language models can be groomed to produce industry-friendly narratives that delay climate action. And unregulated agents could eventually flood feeds with misleading content: 24/7, no fatigue, no oversight.
Governance conversations focus (rightly) on bias, fairness, and privacy. But climate integrity and public trust need to be part of that conversation too.
It’s not just the lying
AI can help with the climate crisis — smarter grids, better modeling — if it’s deployed within a just, accountable framework. But techno-optimism is no substitute for democratic governance.
And the real threat isn’t just that AI might lie to us. It’s that it may crowd out our capacity to know, to act, and to trust — just when we need all three the most.
So, maybe we shouldn’t silo these issues.
Climate integrity and information integrity must be considered in tandem. In a world of engineered confusion, safeguarding trust is not just a policy task, but a climate — and existential — imperative.
We can’t afford to remain disoriented any longer.
What do YOU think?