Is It Worth It? AI’s Global Environmental Footprint: Energy, Water, and E-Waste

Lately, developments in AI seem to be coming at a feverish pace with no end in sight. From the major players like OpenAI, Google, Meta, and even Apple, down to the onslaught of tools from companies formed seemingly out of nowhere, AI shows no sign of slowing down.

If you’re like me, you find it all fascinating and often downright cool. But you may also find yourself pondering the same question that runs through my head from time to time: “Is it worth it?”

As someone who has always been fascinated by technology, and who has built a career in it, I still hold to the belief that technology should help human life, not harm or replace it. In trying to keep up with all things AI, I have written about, and spoken with professionals about, many of the genuinely human-helping things AI can do. Yet in light of everything that goes into these tools, I can’t help but keep asking, “Is it worth it?” Not that I’m jumping on the “gloom and doom” bandwagon, but we simply can’t ignore the major negative side effects of AI technology and its development.

Generative AI, and specifically models such as ChatGPT, Bing Copilot, and other tools powered by OpenAI, requires vast amounts of energy for both training and operation, raising concerns about its environmental impact.

The training process alone for a large language model like GPT-3 (the model behind the original ChatGPT) can consume up to 10 gigawatt-hours (GWh) of electricity, roughly equivalent to the annual electricity consumption of over 1,000 U.S. households. That energy use translates into a substantial carbon footprint, estimated at between 55 and 284 tons of CO2 for training GPT-3 alone, depending on the electricity source.
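To make that comparison concrete, here is a minimal back-of-envelope sketch in Python. The ~10,000 kWh annual figure for an average U.S. household is my own assumption (roughly in line with published EIA averages), not a number from the sources above.

```python
# Back-of-envelope check: how many U.S. households' annual electricity
# use does a 10 GWh training run correspond to?
TRAINING_ENERGY_KWH = 10_000_000   # 10 GWh expressed in kWh (from the text)
HOUSEHOLD_ANNUAL_KWH = 10_000      # assumed average U.S. household per year

households = TRAINING_ENERGY_KWH / HOUSEHOLD_ANNUAL_KWH
print(f"Training energy ≈ annual electricity of {households:,.0f} households")
# -> about 1,000 households, in line with the "over 1,000" figure above
```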

Running these models also demands significant energy, albeit less than training. A single ChatGPT query can consume 15 times more energy than a Google search. As AI, particularly generative AI, becomes more integrated into various sectors, demand for data centers to handle the processing will grow. Data centers already account for about 1–1.5% of global electricity consumption and 0.3% of global CO2 emissions, and this demand is projected to escalate, potentially pushing global AI electricity consumption to a level comparable to the annual consumption of a country such as Argentina or Sweden by 2027. Water consumption for cooling these data centers is another environmental concern, with estimates indicating that global AI demand could be responsible for withdrawing 4.2–6.6 billion cubic meters of water annually by 2027.
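To give a feel for how per-query energy adds up, here is a rough, hedged sketch. Only the 15x multiplier comes from the figures above; the ~0.3 Wh per Google search is an assumed, commonly cited estimate, and the daily query volume is purely hypothetical.

```python
# Rough scaling of per-query energy, using the "15x a Google search" figure.
GOOGLE_SEARCH_WH = 0.3          # assumed energy per Google search, in Wh
CHATGPT_MULTIPLIER = 15         # from the comparison above
QUERIES_PER_DAY = 100_000_000   # hypothetical daily query volume

chatgpt_query_wh = GOOGLE_SEARCH_WH * CHATGPT_MULTIPLIER    # ≈ 4.5 Wh/query
daily_mwh = chatgpt_query_wh * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh

print(f"≈ {chatgpt_query_wh:.1f} Wh per query, "
      f"≈ {daily_mwh:,.0f} MWh per day at {QUERIES_PER_DAY:,} queries/day")
# -> roughly 450 MWh per day under these assumptions
```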

The ICT sector, which encompasses AI infrastructure, accounts for about 2% of global CO2 emissions, a share expected to rise as generative AI models are developed and used more widely. And while the financial costs of these operations are substantial, with estimated daily operating costs for ChatGPT reaching $700,000, the environmental costs, particularly energy consumption and carbon footprint, are significant and warrant just as much attention.

Electronic waste (e-waste) from AI hardware contains harmful substances such as mercury, lead, and cadmium.

These substances can leach into soil and water, posing risks to human health and to ecosystems. The World Economic Forum (WEF) projects that annual e-waste could exceed 120 million metric tonnes by 2050. Managing and recycling e-waste responsibly is crucial to prevent environmental damage and limit the release of toxic substances, and stricter regulations and ethical disposal methods are needed to handle the e-waste associated with AI safely.

Global Impact of AI Training on Water Resources

Research indicates that global AI demand could lead to the withdrawal of 4.2–6.6 billion cubic meters of water annually by 2027, more than half of the United Kingdom’s total annual water withdrawal. AI’s impact on water consumption, alongside its other potential environmental effects, is often overlooked, and the lack of data shared by developers only compounds the problem.

Water Consumption of ChatGPT

One report states that ChatGPT consumes approximately 800,000 liters of water per hour, enough to meet the daily water needs of 40,000 people.
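A quick sanity check on what those two numbers imply, working only from the figures in the report cited above; the per-person result is derived, not a claim from the report itself.

```python
# What per-person daily water allowance makes 800,000 liters equal the
# daily needs of 40,000 people?
WATER_LITERS = 800_000   # hourly consumption figure from the report above
PEOPLE = 40_000          # people whose daily needs it would cover

liters_per_person_per_day = WATER_LITERS / PEOPLE
print(f"{liters_per_person_per_day:.0f} liters per person per day")
# -> 20 L/day, roughly the commonly quoted minimum for basic daily needs
```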

Factors Contributing to Water Consumption in Data Centers

Data centers, which house the servers and equipment for storing and processing data, require significant amounts of water both for cooling and for the electricity generation that powers them. Rising demand for AI services means rising demand for data centers, which, as noted above, already account for about 1–1.5% of global electricity consumption and 0.3% of global CO2 emissions.
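For scale, here is a hedged estimate of what that 1–1.5% share corresponds to in absolute terms. The ~25,000 TWh figure for global annual electricity consumption is my own rough assumption, not a number from this article.

```python
# Translate the 1-1.5% data-center share into absolute energy terms.
GLOBAL_ELECTRICITY_TWH = 25_000      # assumed global annual consumption
SHARE_LOW, SHARE_HIGH = 0.01, 0.015  # 1-1.5% share from the text

low_twh = GLOBAL_ELECTRICITY_TWH * SHARE_LOW     # 250 TWh
high_twh = GLOBAL_ELECTRICITY_TWH * SHARE_HIGH   # 375 TWh
print(f"Data centers: roughly {low_twh:.0f}-{high_twh:.0f} TWh per year")
```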

Reducing the Environmental Impact of AI

Several strategies can be implemented to reduce the energy consumption and environmental impact of AI systems like ChatGPT.

  • Enhancing the efficiency and design of the hardware and software used to run AI models. One example is using liquid immersion cooling instead of air cooling, which removes heat more effectively and can reduce carbon emissions and water usage in data centers.
  • Powering data centers with renewable energy sources like wind, solar, and hydro power. Some countries, such as Norway and Iceland, have low-cost, green electricity due to abundant natural resources. Taking advantage of this, numerous large organizations have established data centers in these countries to benefit from low-carbon energy.
  • Limiting the use of AI models to essential and meaningful applications, avoiding their use for trivial or harmful purposes.
  • Increasing transparency and disclosing water efficiency data and comparisons of different energy inputs related to AI processes.

Need for Transparency and Accountability

There is a growing call for transparency about the operational and developmental emissions of AI processes, including disclosure of water-efficiency data and comparisons between different energy inputs. Open data is essential for assessing and comparing the true environmental impacts of different language models. For instance, a data center powered by coal will have a far larger carbon footprint than one powered by solar or wind, but confirming that requires access to open data. A comprehensive evaluation should consider economic, social, and environmental factors together. Community engagement, local knowledge, and individual understanding can all help persuade developers to share this data, and greater awareness of potential environmental impacts, alongside the more widely discussed ethical concerns surrounding AI, could strengthen the case for a more accountable and responsible AI industry.

All said and done, and given the information available, it is difficult at best to answer whether AI will ever be “worth it”. In my opinion, we need more tangible, positive outcomes before we can truly have an answer. At this point in history, though, I find it harder and harder to believe there will ever be a payoff large enough to justify the resources pouring into the development stages alone. I’m not calling for a halt to all AI development and use (we’re far beyond that being a sensible answer), but I do believe we should collectively commit to more efficient, less wasteful paths forward.