There’s a paradox at the heart of Nvidia’s sustainability performance

The world’s largest supplier of AI hardware makes super-efficient computer chips, but is Nvidia reducing the industry’s environmental toll or driving it?

Santa Clara, CA - Feb. 1, 2018: NVIDIA Corp. Photo via 123rf

This year, Corporate Knights’ Global 100 ranking of the most sustainable companies in the world saw one notable addition: Nvidia, the world’s largest company, with a peak market capitalization exceeding $5 trillion, ranked 53rd.

By one estimate, Nvidia provides the hardware for more than 70% of the market for AI chips, and estimates from 2023 and 2024 put its share closer to 95%. That dominance rests on efficiency: Nvidia’s chips perform the heavy computational tasks required for machine learning at a far lower energy cost than general-purpose chips.

“This year we created a definition for sustainability in AI hardware based on energy processing per unit,” explains Michael Yow, director of rankings at Corporate Knights. The sustainability threshold used by Corporate Knights is measured in gigaFLOPS (billions of floating-point operations per second) per watt – the number of computations a system performs for each watt of energy it consumes. The threshold was first established by the Green500, a ranking in which Nvidia’s Grace Hopper chips power seven of the top 10 most energy-efficient supercomputer systems.
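For readers curious about the arithmetic, the metric reduces to a simple ratio of sustained performance to power draw. The sketch below is illustrative only; the system and numbers are hypothetical, not figures from the Green500 list or the Corporate Knights methodology:

```python
# Illustrative calculation of a Green500-style efficiency metric:
# gigaFLOPS (billions of floating-point operations per second) per watt.
# All numbers below are hypothetical.

def gflops_per_watt(performance_gflops: float, power_watts: float) -> float:
    """Energy efficiency: sustained compute divided by power consumed."""
    return performance_gflops / power_watts

# A hypothetical system sustaining 10 petaFLOPS (10,000,000 gigaFLOPS)
# while drawing 200 kilowatts (200,000 watts):
efficiency = gflops_per_watt(10_000_000, 200_000)
print(efficiency)  # 50.0 gigaFLOPS per watt
```

The higher the ratio, the more computation a data centre extracts from every watt of electricity, which is why the same figure can be read either as an environmental credential or as an invitation to build bigger.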

As a result, $75 billion of Nvidia’s 2025 revenues – about 57% of the total – was deemed sustainable according to the Corporate Knights methodology. That sustainable revenue was not only one of the largest gross totals in the ranking; it was also among the fastest growing, registering 123.5% growth since 2022. (All figures in USD.)

The environmental impact of data centres populated with millions of these chips is increasingly well understood. One study in December reported that current AI infrastructure has roughly the carbon footprint of the city of New York (about 80 million tons) and that its collective water use is in the range of the global bottled water industry (765 billion litres).

But these figures pale in comparison to the potential future impact. Last September, OpenAI’s Sam Altman issued an internal memo saying the company’s “audacious long-term goal is to build 250 gigawatts of capacity by 2033.” An analysis from Truthdig found that this would put ChatGPT’s energy use on par with that of India’s 1.5 billion people, which, depending on how that energy is sourced, could produce carbon emissions twice those of ExxonMobil, currently one of the largest non-state emitters in the world.

Nvidia is the primary supplier of the chips populating OpenAI’s data centres, and the two companies recently announced a partnership to continue to build out this infrastructure.

The power sources

There is an irony to all this: the scale of these operations, and their environmental impact, is almost exclusively a result of the resource efficiency of Nvidia’s chips. Data centre operations for AI would be impossible if they relied on general-purpose servers. It’s the efficiency of Nvidia’s chips that has enabled this “insatiable demand” for computing power. In the United States, it’s estimated that AI data centres could account for as much as 12% of all electricity consumption by 2028.

Where is that power coming from? A 2025 report from the International Energy Agency found that coal currently accounts for about 30% of electricity generation for AI data centres globally, mostly in China and the United States. Renewables – wind, solar and hydro – account for 27%, and natural gas accounts for 26% (40% in the U.S.).

Yow points out that Nvidia as a company cannot be held responsible for the power sources of the AI industry, even if its chips do make up much of its infrastructure. “Would we be having this conversation if all the electricity was 100% carbon-free?” he asks. “The problem is not with the product but rather with the lack of planning and faster deployment of renewable energy.”

Because of these outsized electrical needs, AI companies are exploring options for generating their own on-site power. Altman has, for example, personally invested in a start-up developing nuclear fusion, often called the holy grail of clean power generation, and his company has joined the likes of Google, Microsoft and Amazon in investing in small modular nuclear reactor technology. Nevertheless, the most common on-site power sources at present are fossil fuels. At OpenAI’s Stargate facility under construction in West Texas, one of the largest in the world, the operators are currently deploying dozens of gas turbines adapted from aircraft engines.

That data centre is populated with Nvidia chips and operated by OpenAI and Oracle. Though the majority of Nvidia chips are not in Nvidia-owned or -operated data centres, the company reported that all offices and data centres under its control are powered exclusively by renewable energy and that it purchases additional carbon-free electricity to cover 100% of the footprint of its leased data centres.

The optimistic outlook

Nvidia has taken measures to address water consumption at its own data centres, where water is used to cool overheating servers. In its 2025 annual sustainability report, the company said it conducts annual water-risk assessments near all its facilities. Its sites in Santa Clara, California, and Hyderabad, India, have water treatment facilities so wastewater can be used for landscape irrigation. The company is also introducing closed-loop liquid cooling systems to reduce water use, and the Blackwell computing architecture – the successor to the Grace Hopper chips – is 300 times more water-efficient than air-cooled architecture. (This does not address water use during manufacturing, however, which relies on jets of ultra-pure water to etch the silicon wafers, water that becomes contaminated in the process. Nor does it pertain to the data centres Nvidia does not control or operate.) An Nvidia spokesperson declined to participate in this story beyond referencing existing company communications.

Those bullish on the prospect of an AI-optimized future will argue that the environmental impact of AI itself will be dwarfed by the efficiencies the expansion of these systems will allow. On one blog, Nvidia cited reports looking at projected U.S. energy demand into 2035. If AI applications are “fully realized,” estimates suggest they could save nearly 2,500 petajoules (PJ) of energy, roughly 2.5% of total U.S. energy consumption in 2023.

Such savings would be transformative, and there is a much wider range of environmental applications that AI might come to revolutionize: wildfire detection and management, global climate simulations and extreme weather modelling, electrical grid applications to manage fires and outages, wildlife population tracking, materials science for cleantech applications, and other conservation or carbon-reduction efforts.

To date, the economy-altering value of these companies has been largely measured against their future potential, even as the imminent costs of doing business continue to mount. The environmental balance sheet isn’t so different.

Tristan Bronca is a magazine writer and editor based in Newmarket, Ontario.
