How Can AI Address Climate Justice When Women’s Voices Are Silenced? – State of the Planet

An AI-generated image of a woman standing by data banks in an AI data center
Credit: franganillo/Pixabay

Artificial Intelligence (AI) is rapidly becoming central to environmental decision-making. But unless women’s voices, labor and lived realities are embedded in its foundations, it risks reinforcing the very inequities it claims to solve. 

In many parts of the world, women stand on the frontlines of climate stress. According to the United Nations, if current trends continue, an estimated 341 million women will still lack electricity in 2030—85 percent of them in sub-Saharan Africa—forced to depend on costly, labor-intensive fuels. Women around the world also collectively spend 200 million hours, the equivalent of nearly 23,000 years, hauling water every day instead of studying, earning or participating in public life; meanwhile, a large AI data center can consume up to 5 million gallons of water a day to keep its servers cool. As women ration every bucket and queue at communal taps, these facilities secure access to water through long-term contracts and public incentives. Data centers also draw uninterrupted power from national grids, with their electricity demand expected to soar to around 1,050 terawatt-hours annually—on par with some of the world’s largest national consumers of power.  
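The hours-to-years equivalence cited above is a simple unit conversion; a minimal back-of-envelope check (using only the figures stated in the text) looks like this:

```python
# Back-of-envelope check of the UN figure cited above.
# Assumption: "years" here means calendar years of continuous time (24 h/day).
DAILY_HOURS_HAULING = 200_000_000   # hours women collectively spend hauling water per day
HOURS_PER_YEAR = 24 * 365           # 8,760 hours in a calendar year

years_equivalent = DAILY_HOURS_HAULING / HOURS_PER_YEAR
print(f"{years_equivalent:,.0f} years")  # ≈ 22,831, i.e. "nearly 23,000 years" per day
```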

The Gendered Digital Divide 

Despite their central role in society, women face persistent barriers to digital access and representation, resulting in their underrepresentation within AI training data, system design and governance processes. In many regions, women face structural barriers to internet connectivity and computing devices—the basic conditions required for digital participation. Men are 21 percent more likely to be online globally, with the gap widening to 52 percent in least developed countries. This means women are systematically underrepresented in the data infrastructures that increasingly inform climate and energy policy.  

And even when access is available, evidence from AI adoption demonstrates that this inequality persists. An analysis of 133 AI systems found that 44.2 percent exhibited gender bias, while 25.7 percent showed both gender and racial bias. Global evidence shows that women are 20-25 percent less likely than men to adopt generative AI tools under equal conditions of access and exposure. Women also face substantially higher automation risks while remaining underrepresented in AI workforces and digital infrastructure access. These patterns matter because AI systems increasingly shape decision-making related to the environment, including risk modeling, disaster prediction, insurance pricing, energy forecasting and resource allocation.  

For example, disaster response models, in the absence of women’s input, might prioritize asset recovery while overlooking critical issues—particularly those disproportionately faced by women—such as prolonged heat exposure, the safety and sanitation conditions of evacuation shelters, medication continuity during displacement and informal income losses. When these gendered dimensions of climate harm are not encoded into datasets or decision criteria, they cannot shape outcomes, just or otherwise. Over time, this pattern risks redefining priorities around whatever is present in the data and easiest to quantify. 


The Geography of AI 

Intersectional factors such as class, race, caste, migration status, geography and age further reveal how AI-driven climate governance aligns technological power with social privilege, reinforcing existing inequalities. The geographies of data centers—where they are sited and who is asked to live with their risks—often reveal mechanisms of injustice. AI facilities are increasingly clustered in regions already marked by environmental vulnerability, weak regulatory protections and limited political leverage. This spatial concentration adds to a longer history in which resource-intensive infrastructures have been justified in the name of economic development, while harming marginalized communities. The distribution of AI’s material footprint shows how women, particularly women of color, disproportionately absorb the everyday consequences of environmental degradation.  

In the U.S. South, for example, more than 1,000 data centers are either operating or proposed, with many of these facilities sited in parts of Georgia, Alabama, South Carolina, Mississippi and Tennessee. These are often in counties and neighborhoods with large Black and Hispanic populations, and in communities where women already face the highest energy burdens in the country and legacies of toxic siting. Reproductive justice scholarship indicates that women living in over-polluted neighborhoods are already subject to elevated risks of preeclampsia, gestational diabetes, adverse birth outcomes, reduced fertility and fibroids linked to chronic exposure to fine particulate matter and traffic-related air pollution. The continued placement and expansion of AI data centers in these regions may deepen existing reproductive health inequities rather than alleviate them via economic growth—unless community-driven environmental and reproductive justice safeguards are meaningfully integrated into siting, regulation and oversight processes. 

These dynamics are not confined to the United States: they are embedded in a broader global North–South divide in AI’s political economy. In this sense, AI’s global geography is replicating historically unequal divisions of labor and value, rearticulated through contemporary digital and physical infrastructures. The systematic exclusion of women’s knowledge, labor and lived experience from climate and environmental AI risks creating a reinforcing feedback loop in which women are exploited physically and digitally, with the benefits accruing to only a few. This dynamic remains largely invisible because it plays out inside computational frameworks; nevertheless, its cumulative consequences can be disastrous.  


Centering Feminism in AI’s Climate Future 

AI must address questions of bias, representation and inclusion in its data and algorithms. It must also interrogate how knowledge is produced and used—through digital infrastructures that intensify social injustices and amplify environmental risk for women and marginalized communities worldwide. An assessment of AI and the Sustainable Development Goals (SDGs) found that while AI could potentially support 134 of the 169 SDG targets, fewer than 2 percent of documented AI use cases explicitly target SDG-5 (gender equality), compared with more than 25 percent focused on SDG-7 (affordable and clean energy) and SDG-13 (climate action) combined.  

Without considering the participation and needs of women around the world, AI in environmental decision-making operates as a mechanism of expropriation and exploitation, extracting value from women’s time, care work and environmental stewardship while limiting their role in decision-making and benefit distribution. A feminist approach to AI governance as it shapes climate and environmental systems needs to be readily adopted, addressing questions such as who produces data, whose labor is recognized, who bears environmental costs and who participates in decision-making.

Responding to these dynamics requires coordinated investments in women’s digital and green skills, data infrastructures that account for informal and care economies, enforceable environmental accountability across AI supply chains and leadership roles for women within climate and technology institutions. Together, these interventions can promote distributive justice by enabling those most exposed to climate and AI risks to shape and benefit from the futures these systems create. Without these steps, there can be no climate justice.  

Pavi Selvakumar is a postdoctoral research scientist at the Columbia Climate School. She is interested in exploring how AI and climate justice can be integrated to support equitable resilience planning in frontline communities.

Marco Tedesco is a research professor at the Lamont-Doherty Earth Observatory, and an adjunct scientist at the NASA Goddard Institute for Space Studies.

Views and opinions expressed here are those of the authors, and do not necessarily reflect the official position of the Columbia Climate School, Earth Institute or Columbia University.
