- Reporter: Noor Mubarak
- Editor: Nour Amr
- Image Credits: Shutterstock
AI feels invisible. It takes only a few seconds to ask ChatGPT a question, but every reply draws on significant amounts of energy and water: the models need large quantities of electricity to run and water to keep their hardware cool. That's why even quick answers carry a real environmental cost.
Training and running AI models every day isn't free for the environment, and experts warn that while AI can benefit the environment in some ways, the growing carbon footprint it leaves behind could easily worsen the climate crisis.
A 2023 study from the University of California, Riverside, found that training large AI models like GPT-3 can use more than 700,000 liters of water. That’s about the same amount it takes to manufacture 320 electric cars. And that’s just for training. Using AI tools like ChatGPT daily also consumes a significant amount of energy, usually derived from fossil fuels, and requires large amounts of water to cool the systems and prevent overheating.
So, is AI a major threat to the environment?
Not exactly, says Dina Mahmoud, an assistant professor of Computer Science and Engineering at AUC. She believes the situation is more complicated.
“Yes, large-scale AI uses significant energy and water,” she explained, “but much of the environmental cost is actually due to inefficient hardware design. Newer systems, like FPGA-based architectures (Field-Programmable Gate Arrays), are significantly more energy efficient because they’re customized to the algorithms they run.”
Mahmoud researches how to build secure, energy-saving systems that combine CPUs and FPGAs. She said this design helps reduce wasted energy, though she acknowledged that it's difficult to balance energy efficiency with system security and reliability.
“Security features often require extra processing or hardware,” she said. “So there’s always a trade-off. But if we design with these priorities in mind from the start, we can minimize the overhead.”
Even leaders in the AI field are talking about this issue. In a recent exchange on X, Sam Altman, the CEO of OpenAI, noted that something as small as replying “thank you” to ChatGPT could cost thousands or even millions of dollars a year in energy. His point is that even small online actions can add up to a large carbon footprint.
But not everyone thinks AI deserves its reputation as an environmental villain.
Seif Eldawlatly, another professor in AUC’s Computer Science and Engineering Department, warned against oversimplifying the issue.
“To say that AI is destroying the environment is misleading,” he said. “Yes, large language models consume a lot of energy, but AI also enables tools that reduce environmental harm.”
He pointed to AI tools used in Egypt that detect illegal construction on farmland from satellite imagery, a process that saves energy compared to manual inspection. Other AI systems help sort recyclables or cut down on car trips by matching patients with the right doctor through chatbots.
“You have to weigh the energy cost of AI tools against the energy they save elsewhere,” said Eldawlatly. “Sometimes using AI can mean one visit to the doctor instead of three, which saves fuel and electricity.”
Ahmed Rafea, a professor and graduate program director of AUC’s Computer Science and Engineering Department, said AI can help the environment if it’s used the right way. He explained that older AI tools, like expert systems used in farming, helped save water and fertilizer without consuming much energy. However, newer AI systems in agriculture, which rely on machine learning and cloud computing, can use significantly more power.
“The promise is there,” he said, “but the environmental cost of training large ML models can’t be ignored.”
Rafea believes that to reduce this impact, we need smarter methods like transfer learning and federated learning, and we should power data centers with renewable energy.
“Streamlined architectures, transfer learning, and decentralized approaches like federated learning can all help reduce computational demands,” he explained.
Still, all three experts agree that AI systems must become more efficient.
“There’s a push now for models that use less power and fewer training resources,” said Eldawlatly. “We’ll see smaller, more efficient models replacing today’s energy-hungry giants.”
As AI spreads to more parts of life, experts say we need to rethink how it’s built. Mahmoud emphasized that sustainability must be embedded from the start.
“You can’t just bolt on sustainability at the end,” she said. “We have to build smarter from the ground up.”