My 12-year-old took on a small project this summer: writing a report on every breed of domestic cat.
It was more challenging than expected (who knew there was no consensus on how many breeds there actually are?), but after diligently completing the work, he set out to create a cover page.
He'd used reliable online sources for his research, but for this last bit, he relaxed and turned to ChatGPT.
The chatbot did its thing, and a few seconds later, he had his image: an illustration of 10 multicoloured kitties, with the unsurprising title, All Cat Breeds.
It was cute enough, but as I watched ChatGPT churn, all I could think was: Is it worth it? Leaving aside the loss of creativity (not so long ago, my son would have designed such a cover with his own imagination and pencil crayons), what was this frivolous exercise doing to the environment?
And with AI increasingly an inescapable part of our everyday lives, how much worse will things get?
Generative AI requires shocking amounts of energy to run.
According to an April report by the International Energy Agency, data centres accounted for about 1.5 per cent of the world's electricity consumption in 2024. But that consumption, driven largely by the exponential acceleration of AI use, is projected to double by 2030 to 945 terawatt hours (TWh), more electricity than the entire country of Japan uses for everything.
In Ontario, the province's electricity system operator projects net annual energy demand from data centres to grow from three TWh in 2026 to 13 TWh in 2050, an increase of 423 per cent.
Quebec, meanwhile, appears headed for an electricity shortage of its own. AI is so power-hungry that several of the biggest tech companies are currently building or buying their own energy infrastructure to supply their ever-expanding systems.
Then, there鈥檚 the water.
Data centres need a lot of water for their cooling systems and to control humidity. And, unlike the water used for, say, bathing or cleaning, that liquid is not returned to the sewage system after use; it just evaporates.
The IEA estimates that data centres already consume a jaw-dropping volume of water each year, a figure expected to rise to 1,200 billion litres by 2030. Making matters worse, more and more data centres are being built in areas where water is already scarce.
With AI use exploding, and ChatGPT alone commanding a massive and growing user base, the environmental impact is expected to exact an even bigger toll.
But Gen AI is also in its relative infancy. Optimization, ideally, will follow innovation.
“I do think, over time, out of necessity, a lot of this is going to get more energy efficient,” says Kathleen Kauth, the chief operating officer at Mantle, which advises the real estate sector on climate strategy. “Nobody wants to create a huge emissions problem or a power scarcity problem. That's nobody's goal.”
But a goal of several Canadian AI companies is to make those emissions and that power scarcity less of a problem. They're developing ways to make AI leaner, faster and greener.
If this is in the industry's self-interest (energy use is the main limiting factor on AI growth), it's also in the planet's.
Small is beautiful
Simply put, Gen AI consumes energy in two main ways. One, large language models, the engines of chatbots like ChatGPT, need to be trained on enormous sets of data, which requires equally enormous amounts of electricity.
Two, every time a user asks a chatbot a question or requests that it produce text, code or sounds, the chatbot needs yet more electricity to process that request.
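To get a feel for the scale of that second, per-request cost, here's a rough back-of-envelope sketch. Every number below is an illustrative assumption, not a figure from the IEA or any chatbot provider:

```python
# Back-of-envelope sketch (illustrative numbers only):
# assume each chatbot query costs about 0.3 watt-hours of electricity
# and a popular service handles about one billion queries per day.
WH_PER_QUERY = 0.3                  # assumed energy per request, watt-hours
QUERIES_PER_DAY = 1_000_000_000     # assumed daily traffic

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000        # watt-hours -> megawatt-hours
yearly_gwh = daily_mwh * 365 / 1_000    # megawatt-hours -> gigawatt-hours

print(f"{daily_mwh:,.0f} MWh per day, roughly {yearly_gwh:,.0f} GWh per year")
```

Even under these modest assumptions, answering everyday questions adds up to the annual output of a mid-sized power plant, before any training costs are counted.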
But what if those LLMs were smaller?
One startup, with offices in Toronto, the U.K. and several European cities, has developed a quantum-inspired technique to dramatically compress models so that they perform their computations more efficiently and thus require less energy.
The result, says CTO Samuel Mugel, is models that perform just as well as the original (sometimes even better) but need only half as much energy, a significant saving at the data-centre level.
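The startup's quantum-inspired method is proprietary, but a simpler, widely used compression technique, 8-bit quantization, gives a feel for the basic trade-off: store each model weight as a small integer plus one shared scale factor, instead of a full-precision float. The weights below are made up for illustration:

```python
import array

# Generic sketch of post-training quantization, one common way to shrink a
# model (not the quantum-inspired technique described in the article).
weights = [0.12, -0.53, 0.98, -0.99, 0.04, 0.76]   # toy example weights

scale = max(abs(w) for w in weights) / 127          # map range to [-127, 127]
quantized = array.array('b', [round(w / scale) for w in weights])
restored = [q * scale for q in quantized]           # approximate originals

full_bytes = len(array.array('d', weights).tobytes())   # 64-bit floats
small_bytes = len(quantized.tobytes())                  # 8-bit integers
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(f"{full_bytes} bytes -> {small_bytes} bytes (max error {max_err:.4f})")
```

The storage drops eightfold while each restored weight stays within a small rounding error of the original, which is why compressed models can often match the accuracy of their full-sized parents.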
“The environmental impact of AI is potentially way beyond energy consumption,” Mugel says, pointing out that AI's massive energy needs have meant a revival of dirty sources like coal. (U.S. President Donald Trump signed an executive order in April to this effect.) “But a big part of our value proposition is: Let's push the needle on what we're able to do with current resources.”
The company has just released its smallest models to date, which it's named SuperFly and ChickenBrain because they're so tiny they could, theoretically, run on hardware smaller than a fly's or a chicken's brain. SuperFly is small enough to run locally on automobiles, appliances and smartphones without cellular connectivity.
Do you want to go faster?
From smartphones to data centres, semiconductor chips make the digital world run.
But semiconductor design has changed little since the technology was pioneered in the late 1940s. As Blumind CEO Niraj Mathur points out, it's insufficient for the insatiable computational demands of modern AI.
“It was never designed to run neural networks,” he says, referring to the brain-inspired architecture that underpins the technology.
What Blumind is currently developing is an analogue chip specifically for that purpose.
In a digital chip, memory and processing are separate units, and data must travel between the two, which increases latency (basically, the time it takes for the AI to “think” and respond to an input). Blumind's chips, in contrast, process analogue signals and perform computations within the memory itself, rather than converting them to binary code.
The result, Mathur says, is chips that, depending on workload and application, could use 100 to 1,000 times less energy than digital chips. This makes it possible to run extremely small neural networks directly on devices like smart glasses, where battery size is inherently limited.
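What would even the low end of that range mean for a battery-powered gadget? A toy calculation, in which every number is an assumption for illustration rather than a Blumind specification:

```python
# Illustrative only: what "100 to 1,000 times less energy" could mean for a
# small wearable. All figures below are assumptions, not chip specs.
DIGITAL_UJ_PER_INFERENCE = 1_000   # assumed: 1 millijoule per inference
ANALOG_REDUCTION = 100             # low end of the claimed 100-1,000x range
BATTERY_J = 500                    # assumed tiny wearable battery, in joules

analog_uj = DIGITAL_UJ_PER_INFERENCE / ANALOG_REDUCTION
digital_runs = BATTERY_J * 1_000_000 / DIGITAL_UJ_PER_INFERENCE
analog_runs = BATTERY_J * 1_000_000 / analog_uj

print(f"digital: {digital_runs:,.0f} inferences per charge; "
      f"analog: {analog_runs:,.0f}")
```

Under these assumptions, the same charge goes from half a million inferences to 50 million, the difference between a gadget that dies by lunch and one that runs all week.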
Blumind is focused on this edge computing market for the moment, but Mathur sees a day where similar technology could be deployed on a much larger scale.
“A lot of the innovation that's happening on the devices, on the edge, will soon translate into efficiencies at the data centre as well.”
Making it do twice the work
Cooling systems can consume a substantial share of a data centre's energy, with much of that energy used to expel heat out of the centre and into the atmosphere.
It's effectively a double whammy of emissions. But rather than letting that hot air just escape (the last thing the atmosphere needs), you could theoretically harness its energy.
This is what the startup Qscale plans for the 142-megawatt data centre it's building in Lévis, Que. Partnering with Les Fraises de l'Île d'Orléans (FIOINC), a local fruit company, Qscale will use its waste heat to warm greenhouses a kilometre away. They're also giving the heat away for free, via a district heating network (a closed-loop system that captures and transports the heat) built with FIOINC's help.
“We're not trying to monetize it,” says Qscale CEO Martin Bouchard. “For us, it's giving back to society. To the people that are basically accepting that data centres are using the energy in their communities.”
While the system adds about five to 10 per cent to the cost of the data centre, the benefits, for Bouchard, are worth it.
Not only does it enable companies like FIOINC to grow things like blackberries, whose importation creates its own share of GHG emissions, it also helps offset some of the environmental costs associated with AI.
“It's basically like you're using each electron two times,” says Bouchard.
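Bouchard's "each electron two times" arithmetic can be sketched roughly. The 142 MW capacity comes from the article; the share of input power recovered as usable heat is purely an assumed placeholder:

```python
# Rough sketch of reusing a data centre's waste heat for greenhouses.
DATA_CENTRE_MW = 142            # capacity figure from the article
HEAT_RECOVERY_FRACTION = 0.3    # assumed share of input power reused as heat

reused_mw = DATA_CENTRE_MW * HEAT_RECOVERY_FRACTION
effective_mw = DATA_CENTRE_MW + reused_mw   # computing work plus useful heat
print(f"{reused_mw:.1f} MW of waste heat reused; "
      f"{effective_mw:.1f} MW of useful output from {DATA_CENTRE_MW} MW in")
```

The electricity does its computing work first, then a chunk of the resulting heat does a second job, displacing whatever fuel the greenhouses would otherwise have burned.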
Mantle鈥檚 Kathleen Kauth is cautiously optimistic that such moves will help us responsibly adopt AI, and limit the negative impacts on power and emissions.
“The digital infrastructure industry takes it on the chin sometimes, and rightfully so in many ways,” she says. “But we work across all the real estate segments — industrial, commercial, residential — and far and away, digital infrastructure is the most institutionally committed to improving climate outcomes in their operations. And I don't think that's going to change.”