The Greening of Data Centers & AI

Artificial Intelligence (AI) is reshaping industries and our daily lives. No longer confined to R&D or enterprise applications in science, finance, and automation, it has now emerged onto the consumer scene, where the number of daily users will scale up rapidly. Beyond modeling the weather, AI will be at your beck and call in your phone or Alexa, answering your pressing questions about the best way to cook sous-vide. But this transformative technology comes with a hefty environmental price: the vast energy consumption of data centers. Energy costs and carbon footprint are critical concerns.

On this topic, the Cleantech Council meets to review startups and technologies working on green data centers on September 11, 2024 in Silicon Valley.

Fortunately, a new generation of startups is rising to the challenge, developing innovative solutions to make AI data centers more sustainable. They're tackling the problem from multiple angles, from revolutionary cooling methods to AI-powered energy optimization.

The Energy Appetite of AI

AI's remarkable capabilities are fueled by massive data centers that house countless servers and storage devices, all working tirelessly to process and analyze information. These data centers require enormous amounts of electricity to operate, primarily for computation and cooling.

The International Energy Agency (IEA) forecasts global data center energy consumption to more than double, from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026 - roughly the annual electricity consumption of Japan. BCG projects that US data centers' share of electricity demand will explode from 2.5% of aggregate demand in 2022 to 7.5% in 2030, with 1% of that from GenAI alone. (“Great Scott, Marty! How am I going to generate that kind of power?”)
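For a rough sense of scale, the IEA endpoints above imply data center energy use compounding at over 20% per year. A quick back-of-envelope check (assuming smooth compound growth between the two figures, which is a simplification):

```python
# Back-of-envelope: implied compound annual growth rate (CAGR) of global
# data center energy use, from the IEA's 460 TWh (2022) -> 1,000 TWh (2026).
# Assumes steady compound growth between the two endpoints.

twh_2022 = 460.0
twh_2026 = 1000.0
years = 2026 - 2022

cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")
```

A growth rate in that range, sustained, is what makes efficiency gains in cooling, chips, and scheduling so consequential.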

Worse, AI's energy demand is growing superlinearly with usage. Larger, more complex AI models require increasingly powerful hardware and longer training times, driving up energy consumption. Additionally, the proliferation of AI applications across industries further amplifies the need for energy-hungry data centers. This growth is not sustainable, and solutions are urgently needed to mitigate the environmental impact of AI.

Startup Solutions for a Greener AI

A wave of startups is tackling this challenge head-on, developing innovative technologies and approaches to reduce energy consumption and carbon emissions in AI data centers. Here are some notable examples:

  • Lightmatter (Boston, Massachusetts): Lightmatter is harnessing the power of light to build more energy-efficient AI hardware. Their photonic chips use light instead of electricity to perform computations, promising significant reductions in energy consumption and heat generation.

  • Groq (Toronto / Mountain View, California): Groq is developing a new type of AI chip called the Tensor Streaming Processor (TSP). The TSP is designed to deliver high performance while consuming significantly less power than traditional GPUs.

  • Fungible (acquired by Microsoft): Fungible’s Data Processing Unit (DPU) offloads networking tasks from server CPUs, reducing their workload and energy consumption.

  • Run:ai (NYC / Tel Aviv): Run:ai's Atlas platform optimizes the utilization of AI infrastructure, ensuring that resources are used efficiently and reducing energy waste. Their solution helps data centers get the most out of their existing hardware.

  • ZutaCore (San Jose, California): ZutaCore is revolutionizing data center cooling with its HyperCool2 liquid cooling system. This innovative technology directly cools processors and other components, significantly reducing energy consumption compared to traditional air cooling methods.

  • CoolIT Systems (Calgary, Canada): CoolIT Systems specializes in advanced liquid cooling technologies that significantly reduce the energy required to cool data center servers. Their direct contact liquid cooling (DCLC) solutions are designed to handle the increasing heat densities of modern data centers.

  • LiquidStack (San Jose, California): LiquidStack has developed a two-phase immersion cooling technology that uses a non-conductive liquid to dissipate heat from servers. This solution offers energy savings and more compact data center designs.

  • DDN ExaScaler (Chatsworth, California): ExaScaler focuses on optimizing power delivery and management within data centers. Their software platform utilizes AI to analyze real-time data and dynamically adjust power distribution, ensuring that servers receive the optimal amount of energy needed, thus preventing overconsumption.

  • Kinetica (San Francisco, California): Kinetica offers a high-performance database platform specifically designed for AI and machine learning applications. Their solution aims to reduce energy consumption by accelerating data processing and analysis.

  • Graid Technology (Santa Clara, California): Graid Technology's innovative SupremeRAID technology offers significant energy efficiency improvements for storage systems. Their solution offloads RAID processing from the CPU to a dedicated card, reducing CPU utilization and overall energy consumption.

Additional Avenues for Energy Savings

In addition to startups, other promising approaches, including AI-based tools, are being explored to reduce energy consumption in data centers:

  • Software optimization: Developing AI algorithms that are inherently more energy-efficient, requiring less computational power to achieve comparable results. Of course, AI itself is a useful tool for finding, simulating, and testing various algorithms.

  • Digital Twins: Data center digital twins can be used to test new physical layouts and architectures, as well as new technologies. A digital twin offers a virtual testbed for the combined physical and digital models before committing to costly real-world changes.

  • Hardware-software co-design: Designing hardware and software in tandem to optimize energy efficiency throughout the entire AI system.

  • Energy-aware machine learning: Training AI models to consider energy consumption as a factor in their decision-making processes.

  • Waste heat recovery: Utilizing the excess heat generated by servers to warm buildings or other processes, reducing the need for additional energy sources.

  • Renewable energy integration: Powering data centers with renewable energy sources like solar and wind to reduce reliance on fossil fuels.

  • Geographic Least-Cost, Least-CO2 Workload Distribution: With geographically distributed data centers, workloads can be routed to the data centers with renewable energy sources nearby. The workload can “chase the sun” or “follow the wind” in real time.
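The “chase the sun” idea above can be sketched as a simple carbon-aware scheduler. This is a minimal illustration, not a production system: the region names, carbon-intensity figures, and capacity numbers below are hypothetical placeholders (in practice they would come from a grid-carbon data feed and the cluster manager):

```python
# Minimal sketch of a carbon-aware scheduler: route a batch AI job to
# whichever region currently has the lowest grid carbon intensity and
# enough spare capacity. All regions and numbers are illustrative.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Region:
    name: str
    carbon_intensity: float  # gCO2 per kWh, ideally refreshed from a live grid feed
    free_capacity: int       # GPU slots currently available

def pick_region(regions: list[Region], gpus_needed: int) -> Optional[Region]:
    """Return the lowest-carbon region that can fit the job, or None if none can."""
    candidates = [r for r in regions if r.free_capacity >= gpus_needed]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r.carbon_intensity)

# Example: midday sun makes the solar-heavy region the cleanest choice.
regions = [
    Region("us-east", carbon_intensity=420.0, free_capacity=64),
    Region("nv-solar", carbon_intensity=90.0, free_capacity=32),
    Region("eu-wind", carbon_intensity=150.0, free_capacity=8),
]
best = pick_region(regions, gpus_needed=16)
print(best.name)
```

A real deployment would also weigh data-transfer cost, latency, and data-residency rules against the carbon savings, and re-evaluate as grid conditions change through the day.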

A Sustainable Future for AI

The growing importance of AI and its environmental impact have created a pressing need for energy-efficient solutions. The startups and innovators highlighted in this article are leading the charge, working to ensure that AI's growth doesn't come at the expense of the planet.

While the challenge remains huge, the progress made by these startups is encouraging. With continued innovation and collaboration, we can find a path where AI thrives alongside a healthy environment. The journey toward green AI is underway, driven by the ingenuity and determination of those who believe technology can be a force for good in the fight against climate change. No, technology is not “the solution” to all the world’s problems, nor is it a slam dunk for green data centers, but we’d be crazy not to try.

By Derek Kerton, Cleantech Council Chairperson
