Sam Altman Defends AI Resource Use at India AI Summit

OpenAI CEO Sam Altman has defended artificial intelligence against criticism of its resource consumption. Speaking at the India AI Impact Summit, he addressed growing concerns about AI’s water and energy use and, in particular, rejected claims that AI data centers consume extreme amounts of water.

In an interview with The Indian Express, Altman responded to criticism circulating online. He specifically denied viral posts claiming that ChatGPT uses gallons of water per query. According to him, those claims are completely inaccurate and disconnected from reality.


Water Usage in Data Centers Remains a Concern

Traditionally, data centers use water-based cooling systems to prevent overheating. However, newer technologies now reduce water dependency. In fact, some modern facilities no longer rely on water at all.

Nevertheless, many existing centers still use water for cooling. As computing demand increases, experts expect pressure on water systems to grow. For example, a recent report by Xylem and Global Water Intelligence predicts that cooling-related water use could triple within the next 25 years. Therefore, the debate around water sustainability continues.


Energy Consumption Requires Clean Solutions

While Altman dismissed water-related claims, he acknowledged energy consumption as a genuine issue. Importantly, he clarified that the concern lies in total usage, not in individual AI queries.

As AI adoption expands worldwide, overall electricity demand rises. Consequently, Altman urged the industry to transition quickly toward cleaner energy sources. He highlighted nuclear, wind, and solar power as key solutions. According to him, rapid energy transformation is essential for long-term sustainability.


AI vs Human Learning: A Different Comparison

Meanwhile, Altman responded to comments by Microsoft co-founder Bill Gates regarding brain efficiency. He argued that many comparisons between AI and humans lack context.

For instance, critics often focus on the energy required to train AI models. However, they rarely consider the energy required to educate humans. Altman noted that humans spend nearly 20 years learning. During that time, they consume food, electricity, and other resources.

Therefore, he suggested a different benchmark. Instead of comparing training costs, he proposed comparing the energy needed for a trained AI model to answer a question with that required for a human to answer the same question. By that measure, he believes AI has already achieved competitive efficiency.

This stage, known as “inference,” typically consumes far less energy than model training.
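Altman’s proposed benchmark amounts to a simple energy-per-answer calculation: power draw multiplied by time, for the machine and for the person. The sketch below illustrates that arithmetic only; every figure in it is a hypothetical placeholder chosen for illustration, not a measurement from the article.

```python
def per_answer_energy_wh(power_watts: float, seconds: float) -> float:
    """Energy in watt-hours for a task drawing `power_watts` for `seconds`."""
    return power_watts * seconds / 3600.0

# Hypothetical: an inference server slice drawing 300 W for 2 seconds per answer.
ai_wh = per_answer_energy_wh(power_watts=300.0, seconds=2.0)

# Hypothetical: the human brain runs at roughly 20 W; assume 60 seconds to answer.
human_wh = per_answer_energy_wh(power_watts=20.0, seconds=60.0)

print(f"AI (hypothetical figures): {ai_wh:.3f} Wh per answer")
print(f"Human (hypothetical figures): {human_wh:.3f} Wh per answer")
```

Under these made-up assumptions the two come out in the same ballpark, which is the shape of comparison Altman is arguing for; real values depend entirely on the model, hardware, and question.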


Online Debate and Industry Reaction

Unsurprisingly, Altman’s remarks triggered online debate. Many users expressed concern about AI replacing human jobs. At the same time, some experts criticized the human-AI comparison.

For example, Sridhar Vembu, co-founder and chief scientist of Zoho Corporation, opposed equating machines with humans. He argued that technology should not be viewed as equal to human beings.


Rapid Expansion of Global Data Centers

Meanwhile, governments and corporations continue investing billions in AI infrastructure. As a result, new data centers are emerging across the world.

According to a May report by the International Monetary Fund, global data center electricity consumption in 2023 matched levels seen in Germany or France. Notably, this surge followed the rapid adoption of ChatGPT.

In response, some governments now fast-track approvals for new energy projects. However, environmental groups warn that such expansion could conflict with global climate targets.

Additionally, local communities in the United States have opposed certain projects, with residents fearing pressure on electricity grids and rising power costs. Recently, San Marcos, Texas, rejected a $1.5 billion data center proposal after months of public resistance.


Future of AI Energy Infrastructure

Amid this growing debate, technology leaders continue advocating diversified energy strategies. Altman, in particular, supports combining renewable and nuclear sources.

Ultimately, the future of AI infrastructure will depend on balancing innovation with sustainability. Therefore, the conversation around energy and water use is likely to continue.
