Online algorithms could help save the planet with just a few small tweaks

The Conversation
10 Feb 2025

Have we tried everything to tackle the climate crisis? At least one simple idea has hardly been explored: prioritizing climate content on social media.

The climate crisis is seriously aggravated by a lack of attention, including in the recent United States presidential election campaign. But algorithmic recommenders could help, as they are responsible for a significant proportion of how human attention online is allocated. Algorithmic recommenders are artificial intelligence systems that suggest content, such as news feeds, music or videos, to people based on their behaviour and preferences.

Take YouTube, where hundreds of millions of users watch billions of hours of content each day. That's a huge amount of brain time. But how do these users select the handful of videos they watch, out of the billions of videos uploaded online? Well, in 70 per cent of cases, they simply follow YouTube's automated recommendations. This system determines a massive proportion of human attention.

Effectively leveraging this attention could help achieve vital advances in climate action across the political spectrum.

In a recent article published in Ethics and Information Technology, we argue that YouTube - the world's biggest online video library - should tune its recommendation algorithm in a way that favours the mitigation of the climate crisis. We even propose a precise figure: two per cent of recommendations should be selected for their climate content.

This goal raises a number of critical questions.

What kind of videos could be recommended? Educational videos on climate change are clear candidates, but so are talks by climate activists, as well as content that encourages viewers to mobilize or change their behaviour, for example by promoting public transport, plant-based cooking or climate demonstrations. The two per cent figure is a proposal, not a dogma. It's far from invasive, but it's still significant.
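To make the proposal concrete, here is a minimal sketch of how a quota like this could be layered on top of an existing ranker. It assumes a list of recommendations already ordered by the engagement-based system and a separately curated pool of climate videos (the function name, parameters and curation step are illustrative assumptions, not YouTube's actual implementation):

```python
import random

def apply_climate_quota(ranked, climate_pool, quota=0.02, seed=None):
    """Reserve roughly `quota` of recommendation slots for climate content.

    ranked: video IDs ordered by the existing engagement ranker.
    climate_pool: vetted climate video IDs (curated, for example, with
    input from bodies like the IPCC, as the article suggests).
    Returns a list the same length as `ranked`.
    """
    rng = random.Random(seed)
    n = len(ranked)
    # Never reserve more slots than we have climate videos for.
    k = min(len(climate_pool), max(1, round(n * quota)))
    reserved = set(rng.sample(range(n), k))
    climate_it = iter(climate_pool)
    ranked_it = iter(ranked)
    # Fill reserved slots from the climate pool, all others from the ranker.
    return [next(climate_it) if slot in reserved else next(ranked_it)
            for slot in range(n)]
```

On a feed of 100 recommendations, a two per cent quota swaps in just two climate videos, which illustrates how light-touch the intervention is.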

Another fundamental question is: who decides which videos are good for the climate? From the Intergovernmental Panel on Climate Change to relevant non-governmental organizations to video hosting platforms themselves, there are potential avenues for determining climate-positive content. In any of these cases, transparency will be key to effectiveness.

First, as American researcher Tarleton Gillespie explains in his book Custodians of the Internet, YouTube already moderates content, and this is a central part of its business. For example, it removes pornographic, violent or illegal content in the name of user safety and well-being, and in accordance with copyright or local laws. Our proposal is merely an extension of these efforts.

Currently, YouTube's algorithmic system does not appear to be programmed to promote climate-relevant content, which endangers the viability of climate content creators. Its own researchers report that it instead maximizes user engagement.

YouTube's algorithm is extremely powerful. If the platform were to direct some of its users' attention to pro-climate action content, it would likely go a long way toward boosting awareness and encouraging action on climate change. There is a strong argument to be made for programming the algorithm along these lines. Simply put, a significant potential benefit for us all is possible at relatively little cost.

Research has also found that YouTube has, in the past, contributed to spreading false information about the climate crisis. A 2024 report found that YouTube earned millions of dollars a year from content that promoted climate denial.

YouTube says that it won't show ads on "content that crosses the line to climate change denial." However, video-sharing platforms have a moral responsibility to also promote information that is factual. This could be done by amplifying climate videos as we propose.

YouTube's algorithm may be likened to a librarian who is tasked with deciding how the library's books are displayed. In the context of the climate crisis, a wise and informed librarian should put forward at least some books on this issue. Online algorithms should be designed less like an attention-grabbing machine and more like a responsible librarian.

Our proposal would likely not be without detractors. For example, would it amount to manipulating users? Our proposal is overtly about influencing people's attitudes in favour of tackling the climate crisis. But it's not about imposing specific content on the user, who remains free to choose whether to watch the content. The nudge is very gentle - and hardly all that different from the algorithmic nudges taking place all across the internet.

Our proposed intervention acts on only a small fraction of recommendations. No one will force viewers to watch videos featuring Greta Thunberg, David Suzuki or Michael Mann. But if successful, our proposal could help avoid the serious problems that would result from climate inaction.

In the face of the growing environmental crisis, recommendation algorithms like YouTube's could help us build climate bridges across political divides, promote action and raise awareness - all essential tools to building a more just future.