AI growth fuels data center demand

Few people had heard of consumer-targeted AI tools five years ago. Last year, however, witnessed a boom in artificial intelligence (AI) tools such as OpenAI’s ChatGPT, Google’s Gemini and Meta AI’s Llama, while platforms like YouTube launched a series of generative AI tools of their own.

While we’ve seen a boost in consumer usage of AI-backed tools, have you ever questioned how the computational power needed to support such rapid expansion is achieved? This is where data centers come into play.

According to Cisco, a data center is a physical facility that organizations use to house their critical applications and data. The facility is designed to harbor “a network of computing and storage resources that enable the delivery of shared applications and data.”

A data center contains various components that support information flow and storage, including routers, switches, firewalls, storage systems, servers, and application-delivery controllers.

However, the International Energy Agency (IEA) reported earlier this year that electricity demand from global data centers is likely to double between 2022 and 2026, largely owing to the growth of AI.
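As a rough back-of-the-envelope check, doubling over the four years from 2022 to 2026 implies an annual growth rate of about 19%. The sketch below is illustrative arithmetic only, not a figure from the IEA report:

```python
# Back-of-the-envelope: if global data center electricity demand
# doubles over the four years from 2022 to 2026, the implied
# compound annual growth rate is 2**(1/4) - 1.
years = 2026 - 2022
growth_factor = 2.0  # demand doubles over the period
cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # about 19% per year
```

In other words, demand would need to grow by nearly a fifth every single year to double over that window.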

AI computational power drives surging energy demands

Cory Lopes-Warfield, editor-in-chief at Tech For Good and co-founder and CXO of eight startups, told EM360Tech that AI is trained on data, and needs data that’s not shared or common to differentiate itself from other models, reduce hallucinations, and more. Having proprietary data is what makes models unique and better.

“However, AI is machine learning that just predicts the next token - that’s the compute,” he added. “It’s a series of complex mathematical equations, and they all need to be calculated using compute. That said, chips are getting way better and compute faster using less energy.”

The power needed to run data centers, driven by AI growth, is producing emissions that contribute to climate change.

Google’s greenhouse gas (GHG) emissions appear to have soared by a whopping 48% in just five years as a consequence of building data centers for artificial intelligence.

The tech giant stated in its annual environmental report that emissions rose by 13% in 2023 alone, reaching 14.3m metric tons. This could be a setback to the firm’s goal of reaching net zero across all its operations and value chains by 2030. Google says, “Our net-zero goal is supported by an ambitious clean energy goal to operate our offices and data centers on 24/7 carbon-free energy, such as solar and wind.”
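For context, one can back out what those percentages imply about earlier years. The sketch below is illustrative arithmetic, assuming the 13% rise is year-over-year and the 48% rise is measured against a baseline five years earlier; neither baseline figure appears in this article:

```python
# Rough sketch of what Google's reported percentages imply.
# Assumptions: the 13% rise is year-over-year into 2023, and the
# 48% rise is relative to a 2019 baseline (five years earlier).
emissions_2023 = 14.3  # million metric tons CO2e, per the report

prior_year = emissions_2023 / 1.13  # implied 2022 level, ~12.7m
baseline = emissions_2023 / 1.48    # implied 2019 level, ~9.7m

print(f"Implied 2022 emissions: {prior_year:.1f}m metric tons")
print(f"Implied 2019 emissions: {baseline:.1f}m metric tons")
```

Under those assumptions, Google would have added roughly 4.6m metric tons of annual emissions over the five-year period.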

Hindering clean energy goals

Circling back to the role of AI in data centers, Bloomberg reported in June 2024 that the dramatic increase in power demands from Silicon Valley’s growth-at-all-costs approach to AI also threatens to upend the energy transition plans of entire nations and the clean energy goals of trillion-dollar tech companies.

John Ketchum, chief executive officer at NextEra Energy Inc., told Bloomberg that power demand is projected to increase by 40% over the next 20 years in the US, mainly because of booming demand for data centers.

He says AI is behind the boom, owing to the energy required to train models as well as the inference process, by which AI draws conclusions from data it hasn’t seen before. “It’s 10 to 15 times the amount of electricity.”

Warfield told EM360Tech that the most resource-intensive AI technology is quantum AI, because the complexity of quantum computers makes it difficult to implement correctly. Text-to-video is another resource-intensive AI-powered technology.

However, the entrepreneur added that AI’s carbon footprint is continually being reduced. “It can be used to forecast and otherwise avert climate risks and the risks technology poses to the environment.”

When asked about the increasing demand for data centers due to AI impacting carbon emissions and overall energy consumption, Warfield said that it depends wholly on the data centers.

“TermoBuild is an example of buildings that generate their own energy and heating and cooling and cloud storage - data centers built by them don’t impact carbon outputs and so on. Therefore, it’s incumbent upon the data centers to ‘do it right,’” Warfield emphasized.

TermoBuild is an engineering services company based in the US and Canada that specializes in delivering sustainable, high-efficiency buildings. The firm is known for low-energy building solutions that aim to decarbonize construction and rethink conventional building design, with projects in Canada, India, and the USA.

The company achieves this by integrating sustainable solutions into building design, such as Thermal Storage Ventilation, which is estimated to save between 35% and 50% of operational energy compared with standard building systems.

AI to the rescue: Supporting sustainability efforts

Warfield says some of the strategies he has observed mitigating the environmental impact of AI-driven data centers include “Groq LPU chips (made in the USA), small language models, UBC (universal basic compute), and autonomous agents that can be deployed locally.”

The AI expert also emphasized that the technology can be used to its advantage. “AI can run the centers and be programmed to run them sustainably and in line with the Sustainable Development Goals, and much more.”

When asked about businesses’ ethical responsibility to manage the environmental impact caused by AI technologies, Warfield, who has co-founded several tech startups, said that firms are absolutely, “100%” responsible.

“Companies will ruin humanity and this planet with their new innovation without oversight, accountability, guardrails, and ethics prioritized,” he said. “AI can actually be net-positive (it can).”

He added that, based on his experience, there is a need to implement blockchain. “Tie the AI tokens to the blockchain using smart contracts.”

To balance the benefits of AI with sustainability goals, Warfield advises leaning into the technology, having an intentional and mindful roadmap, and hiring experts like him “to get it right” rather than risk getting it wrong or underleveraging it.