Here’s a trick question: What is the greatest source of new energy in North America since 1975? It’s not solar. It’s not wind. It’s not nuclear.
It’s energy efficiency.
Energy efficiency is the cheapest source of new energy because every kilowatt-hour saved on the grid is one that someone else can use.
Within the world of computing, price-performance gains are even more pronounced. A 2023 iPhone 15 is 60,000 times cheaper than a 1976 Cray-1 supercomputer, 180,000 times more energy efficient and 5,000 times more powerful.
Kevin Weil, chief product officer at OpenAI, says that OpenAI’s models are improving 10-fold every year in energy efficiency. Deferring compute-heavy tasks such as AI training to times when data-centre loads are light and distributing loads across many data centres – in essence adjusting when and where power is used – could unlock 200 gigawatts of load flexibility by 2030, argues Amory Lovins, a co-founder of the Rocky Mountain Institute, which is on a mission to transform global energy systems. That’s far more than enough to power all projected new data centres from existing utility assets.
AI driving efficiency in buildings
Computers aren’t the only arena where efficiency is making giant leaps. Buildings represent roughly 40% of global energy consumption and 75% of U.S. electricity use, making them an enormous target for efficiency improvements. Research published in Nature Communications found that AI applications could reduce building energy consumption by 8% to 19% by 2050, and up to 40% when combined with other policies such as retrofits and low-carbon power generation. Building efficiency can free up more electricity than AI will ever require.
This creates a fascinating dynamic: while AI consumes energy, it also enables efficiency gains across the broader economy. The key is ensuring that AI deployment is strategic and coupled with robust measurement and verification protocols. AI for buildings isn’t theoretical; companies like BrainBox AI and others are already deploying systems that optimize HVAC (heating, ventilation, air conditioning), lighting, and energy storage in real time based on occupancy, weather and grid conditions.
“Artificial intelligence is the latest development in a long-standing megatrend in which information, analysis and innovation have been replacing the waste of energy and materials that characterize the overbuilt technologies of the 20th-century fossil economy,” notes Ralph Torrie, director of research at Corporate Knights. “Of course it uses electricity, but not nearly as much as it displaces.”
There are many examples of rapid, unexpected gains in energy efficiency. In 2021, all the blockchain-based cryptocurrencies combined used somewhere between 190 and 250 terawatt-hours of electricity, which is just about 1% of global electricity demand that year, or roughly the same as Taiwan’s consumption. Critics called blockchain technology fundamentally unsustainable.
Then in September 2022, Ethereum underwent a major transformation it called “The Merge,” shifting its consensus mechanism from “proof of work” to “proof of stake.” This cut the network’s energy consumption by 99.9%. The network’s annual energy use dropped from 80 terawatt-hours – the same amount of electricity that Austria or Finland uses in a year – to just 0.01 terawatt-hours.
Proof of stake achieved these gains by eliminating wasteful competition. Instead of miners racing to solve puzzles, Ethereum now relies on validators who are chosen to create new blocks and confirm transactions based on the amount of Ether (Ethereum’s native currency) they have staked as collateral. This method secures the network through financial commitment rather than raw computational power.
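The stake-weighted selection described above can be sketched in a few lines of Python. This is a toy illustration with made-up validator names and stake values, not Ethereum’s actual algorithm, which uses RANDAO-based randomness and fixed 32-ETH validator units:

```python
import random

# Hypothetical validators and their staked Ether (illustrative numbers only)
stakes = {"alice": 32, "bob": 64, "carol": 96}

def choose_validator(stakes, rng=random):
    """Pick the next block proposer with probability proportional to stake.

    A simplified sketch of stake-weighted selection: the more collateral
    a validator has at risk, the more often it is chosen. No puzzle-solving
    race, so no energy-hungry mining hardware is needed.
    """
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]
```

Because selection replaces competition, securing the network costs roughly the electricity of running ordinary servers rather than warehouses of mining rigs.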
The shift to proof of stake also enhanced the network’s security, scalability and environmental sustainability. This landmark move positioned Ethereum as a leader in sustainable blockchain innovation and demonstrated that large-scale decentralized systems can evolve to meet global energy and climate goals without compromising performance or decentralization.
Ethereum co-founder Anthony Di Iorio says that Ethereum’s massive efficiency gain is part of a broader pattern of disruptive innovation across technologies: “When real limits appear, innovation often accelerates in unexpected ways. Constraints can become the spark for new possibilities.”
For Di Iorio, the key lesson is that “fundamental architectural redesign beats incremental optimization. When you eliminate structural inefficiency rather than just making a system slightly less inefficient, you unlock orders-of-magnitude improvements.”
DeepSeek, a Chinese AI company, also showed that algorithmic innovation can cut costs – in its case by 97% – even under severe hardware constraints. Meanwhile, SETI@home coordinated millions of personal computers to create a huge virtual supercomputer to power its search for extraterrestrial intelligence.
Better orchestration, multi-tenant GPU sharing (in which multiple users share computational resources) and carbon-aware scheduling could push AI infrastructure utilization from today’s 25–40% to 55–60%, effectively doubling capacity without building any new facilities.
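The core of carbon-aware scheduling is simple: a deferrable job is placed in whichever region and hour has the cleanest forecast grid. A minimal sketch, using invented region names and carbon-intensity figures (real schedulers pull these from a grid forecast service):

```python
# Hypothetical forecast carbon intensities in grams of CO2 per kWh,
# keyed by (region, hour of day). Illustrative numbers only.
forecast = {
    ("us-west", 2): 120, ("us-west", 14): 310,
    ("eu-north", 2): 45,  ("eu-north", 14): 60,
}

def schedule(job_kwh, forecast):
    """Place a deferrable job (e.g. an AI training run) in the slot with
    the lowest forecast carbon intensity.

    Returns the chosen (region, hour) slot and the estimated emissions
    in grams of CO2 for a job that consumes job_kwh kilowatt-hours.
    """
    slot = min(forecast, key=forecast.get)
    return slot, job_kwh * forecast[slot]
```

In this toy forecast, a 100 kWh job lands in the overnight "eu-north" slot, emitting 4.5 kg of CO2 rather than the 31 kg it would emit in the dirtiest slot.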
“The question isn’t whether AI will consume catastrophic amounts of energy,” Di Iorio notes. “The question is whether we will apply what we’ve already learned about radical efficiency gains.” That means implementing the measurement and transparency frameworks that enable market discipline and establishing the policy guardrails that ensure that efficiency serves sustainability rather than just enabling endless expansion.
The path forward: Three essential actions
Three near-term actions could save 15 to 70 terawatt-hours by 2028:
Default to efficiency. Major cloud platforms and AI frameworks should make lean, right-sized models the default choice rather than requiring developers to opt in. When efficiency becomes the path of least resistance rather than an extra step, adoption accelerates dramatically.
Focus on hardware use. Better workload-management software can increase effective utilization from today’s 40% to 55–60%.
Mandate transparency. Energy consumption per task should be as visible as price and latency. When enterprises and governments demand kilowatt-hours-per-million-tokens disclosure in their AI procurement, providers will compete on efficiency. Market mechanisms work, but they require information.
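The disclosure metric proposed above is straightforward to compute. A sketch, with purely illustrative usage figures:

```python
def kwh_per_million_tokens(total_kwh, tokens_served):
    """Normalize a provider's energy use to the proposed procurement
    metric: kilowatt-hours consumed per million tokens served."""
    return total_kwh / (tokens_served / 1_000_000)

# Illustrative only: a service that used 500 kWh to serve 2 billion
# tokens scores 0.25 kWh per million tokens.
```

With a common yardstick like this in procurement contracts, buyers could compare providers on energy the same way they already compare them on price and latency.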
The projections of AI’s looming energy crisis aren’t wrong if we assume nothing changes. But stasis isn’t how technology works when talented people face hard constraints with clear incentives to solve them.
Jim Harris is an author, environmentalist and the former leader of the Green Party of Canada