
Shaping the Future of Data Centers

Feb. 26, 2025
Innovative strategies for electrical design engineers to address growing energy demands in the data center space

Data centers are vital to maintaining our digital world, playing a crucial role in business operations and daily life. However, the demand for data center capacity is increasing at an unprecedented rate, driven by the rapid growth of advanced technologies such as artificial intelligence (AI), machine learning (ML), and cloud computing. McKinsey & Company estimates data capacity demand in the United States will surge from 25 GW in 2024 to over 80 GW by 2030.

While this growth presents exciting opportunities, it also comes with challenges. With AI/ML advancements driving future data center demand, data center developers, owners, and operators are under pressure to keep pace with the increasing power density and capacity needs associated with these computing requirements. The latest advancements in data center networking, compute, and storage technologies must be deployed to handle staggering volumes of data and traffic with minimal latency — all of which must be adequately supported by power and cooling infrastructure.

Given the projected data demands over the next decade and the current constraints of the aging United States electrical grid, electrical design engineers are in a unique position to help data center operators/owners reduce energy consumption and improve performance through innovative power generation and cooling strategies, including on-site power generation and liquid cooling.

Power considerations for high-power-density data centers

Data center energy consumption will continue to grow as the need for data, digitalization, cloud migration, and new technologies expands. According to the “2024 United States Data Center Energy Usage Report,” U.S. data center power needs will likely double or triple current levels within the next four years, rising from around 4% of total U.S. electricity consumption in 2023 to between 6.7% and 12% in 2028. The report further anticipates a compound annual growth rate (CAGR) for electricity consumption of 13% to 27% between 2023 and 2028, with estimated low- and high-end annual energy use of roughly 325 TWh and 580 TWh in 2028. This range of annual energy use would translate to a total data center power demand of between 74 GW and 132 GW.
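
As a rough cross-check on how the report's annual energy and demand figures relate, the sketch below converts annual terawatt-hours into average gigawatts and then into a demand figure at an assumed 50% load factor (the load factor is an illustrative assumption, not a number from the report):

```python
HOURS_PER_YEAR = 8760

def avg_power_gw(annual_twh):
    """Average power (GW) implied by an annual energy total (TWh)."""
    return annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, spread over the year

def demand_gw(annual_twh, load_factor=0.5):
    """Power demand (GW) needed to deliver that energy at a given load factor."""
    return avg_power_gw(annual_twh) / load_factor

for twh in (325, 580):
    print(f"{twh} TWh/yr -> {avg_power_gw(twh):.0f} GW average, "
          f"~{demand_gw(twh):.0f} GW demand at a 50% load factor")
```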

AI/ML advancements are major factors driving U.S. data center demand and power consumption. AI-specific data center energy usage is projected to grow at an average rate of 43% annually over the next four years, with AI data capacity demand rising 33% per year. Goldman Sachs estimates that a single query to ChatGPT uses around six to 10 times the power of a traditional Google search. Furthermore, the growth of AI, fueled by chip suppliers such as NVIDIA and AMD, has prompted data center giants (including Google, Meta, Amazon, and Microsoft) to start building super-sized hyperscale data centers requiring much more power to meet their energy consumption needs. In existing hyperscale data centers, cabinet power densities range from 10 kW to 20 kW per rack. However, AI-ready racks equipped with resource-intensive GPUs require more than 130 kW per rack. This shift in cabinet power density is also occurring in existing colocation and enterprise data centers, where there’s a significant need for greater cooling and power to accommodate AI/ML.
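
To put that density shift in perspective, the hypothetical comparison below contrasts a row of conventional racks with a row of AI-ready racks (the 15 kW midpoint and 10-rack row length are illustrative assumptions):

```python
TRADITIONAL_RACK_KW = 15   # midpoint of the 10 kW to 20 kW range cited above
AI_RACK_KW = 130           # AI-ready rack with resource-intensive GPUs
RACKS_PER_ROW = 10         # hypothetical row length, for illustration only

traditional_row_kw = RACKS_PER_ROW * TRADITIONAL_RACK_KW
ai_row_kw = RACKS_PER_ROW * AI_RACK_KW

print(f"Conventional row: {traditional_row_kw} kW")
print(f"AI-ready row:     {ai_row_kw} kW "
      f"(~{ai_row_kw / traditional_row_kw:.0f}x the power to deliver and heat to reject)")
```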

Power distribution will eventually need to evolve to support the increased and continued use of high-power-density IT equipment. This includes distributing power at higher voltages and possibly changing the type of operating current (i.e., alternating current versus direct current). Such changes will require IT hardware with power supplies that accommodate higher voltages, along with power distribution infrastructure that supports these different voltage conditions. These new conditions will also require revisions to power coordination studies and arc flash values, as well as staff training to operate in these environments.
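
The sketch below illustrates one reason higher distribution voltages matter: for the same rack power, raising the line voltage lowers the current the distribution path must carry. The 130 kW rack and unity power factor are illustrative assumptions.

```python
import math

def three_phase_current(power_w, line_voltage_v, power_factor=1.0):
    """Line current (A) for a balanced three-phase load: I = P / (sqrt(3) * V_LL * PF)."""
    return power_w / (math.sqrt(3) * line_voltage_v * power_factor)

RACK_KW = 130  # AI-ready rack density discussed above
for volts in (208, 415, 480):
    amps = three_phase_current(RACK_KW * 1000, volts)
    print(f"{RACK_KW} kW rack at {volts} V three-phase -> ~{amps:.0f} A of line current")
```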

On-site generation for power-constrained sites

Utilities in the United States are currently struggling to meet increasing power demands, with challenges such as congestion, reliability, and inadequate transmission capacity impacting the U.S. power grid. In recent years, states like Texas and California have experienced shortages and constraints, leading to significant service outages and interruptions. Over the next five years, U.S. electricity demand could rise by 128 GW, largely due to data center and manufacturing growth.

The rising power requirements for high-performance computing and AI will only further strain data center energy infrastructure, amplifying the challenges of sourcing sufficient power capacity. It is estimated that $50 billion in new generation capacity is needed just to support data centers. Intensifying data center power requirements could also create roadblocks to meeting sustainable energy goals at a time when data centers are facing increasing regulatory pressure to reduce carbon emissions.

Power generation availability is crucial for data centers to meet the ever-increasing demands of AI/ML computing workloads. As such, data center operators are exploring on-site power generation and alternative power sourcing strategies. On-site power generation using technologies like fuel cells, small modular reactors, and renewable energies paired with storage solutions can help minimize the impact of increased energy demands, enhance resilience, and reduce dependence on the traditional grid.

Fuel cell technology. Fuel cell technology is emerging as a key solution in the push for energy diversification and alternative energy sources. With the ability to produce energy on site, enhancing electrical reliability and energy security, these systems can reduce the burden on the existing power grid and provide power where access to the grid is limited. Fuel cells are particularly well suited to microgrids powering critical facilities like data centers.

Natural gas fuel cells convert the chemical energy from methane in natural gas into electricity with minimal footprint or infrastructure at the data center’s location, reducing the need for transmission lines or related materials. While not entirely “green,” these systems produce less than half the emissions of coal or oil, generating approximately 0.96 pounds of CO2 per kilowatt hour.
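
As a rough illustration of what that emissions rate implies, the sketch below applies the 0.96 lb CO2/kWh figure to a hypothetical 10 MW load served continuously by natural gas fuel cells (the load size and constant operation are assumptions for illustration):

```python
LB_CO2_PER_KWH = 0.96        # natural gas fuel cell emissions rate cited above
LB_PER_METRIC_TON = 2204.6
HOURS_PER_YEAR = 8760

def annual_co2_metric_tons(load_mw):
    """Approximate annual CO2 (metric tons) for a constant load on natural gas fuel cells."""
    kwh_per_year = load_mw * 1000 * HOURS_PER_YEAR
    return kwh_per_year * LB_CO2_PER_KWH / LB_PER_METRIC_TON

# Hypothetical 10 MW critical load running year-round
print(f"{annual_co2_metric_tons(10):,.0f} metric tons of CO2 per year")
```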

Hydrogen fuel cells operate similarly but use hydrogen instead of natural gas, emitting only water and no carbon. Innovations in fuel cell efficiency, hydrogen production, and renewable energy integration, along with supportive government policies and a growing commitment from the private sector, are making this technology more viable and cost effective. Prominent tech companies, including Microsoft and Google, are currently exploring the use of hydrogen fuel cells in data centers to achieve their carbon-free energy goals.

Natural gas generation. Power constraints and the long wait for utility power upgrades are pushing some data center operators and developers toward on-site natural gas-fired generation to meet the demand for reliable and efficient services.

In recent years, energy companies have observed a shift in data center site selection toward regions with greater energy and supply infrastructure. A significant portion of U.S. data center construction is now concentrated in areas with access to natural gas resources, such as Northern Virginia, near the Marcellus shale production region, and Dallas-Fort Worth, which has access to gas from the Permian Basin in West Texas. S&P Global estimates that demand for natural gas to support data centers could increase to between three and six billion cubic feet per day over the next six years, with this infrastructure playing a crucial role in supporting data centers by 2030.

An example of this trend is the Microsoft Dublin data center campus, which received approval in 2023 to construct a 170-MW gas-fired power plant to provide power for its operations. In another instance, plans for a 3-GW data center in Western Pennsylvania include installing natural gas turbines for on-site generation for the data center while drawing electricity from the local power utility to power the rest of the site. Also, ExxonMobil recently announced its commitment to providing data centers with low-carbon electricity by coupling carbon capture with natural gas-fired power plants by the decade’s end.

Small modular reactors (SMRs). Although SMRs are still in the development phase and not yet available for commercial use, data center operators are beginning to look toward them for future on-site power generation needs. With roughly one-third of the generating capacity of traditional nuclear power reactors, SMRs use nuclear fission to generate heat and produce a large amount of low-carbon electricity — up to 300 MW(e) per unit.

A key benefit of SMRs is their smaller size, which allows for installation on site or within a reasonable distance of data centers. This can help overcome grid constraints and improve energy resiliency. They are also scalable and have the potential for incremental power additions. Local authorities having jurisdiction (AHJ) will play a significant role in determining whether SMRs can be a viable power source for data centers.
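
As a simple sizing illustration, the sketch below estimates how many 300 MW(e) units a hypothetical campus would need, including a spare unit for maintenance (the 1 GW campus load and one-unit redundancy are assumptions, not figures from any announced project):

```python
import math

SMR_UNIT_MWE = 300   # upper-end unit output cited above

def smr_units_needed(campus_demand_mw, unit_mwe=SMR_UNIT_MWE, spare_units=1):
    """Whole SMR units needed to carry a campus load, plus spare units for redundancy."""
    return math.ceil(campus_demand_mw / unit_mwe) + spare_units

# Hypothetical 1 GW campus: 4 units to carry the load plus 1 spare
print(smr_units_needed(1000))   # -> 5
```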

In October 2024, Google and Kairos Power announced an agreement to develop, construct, and operate advanced reactor plants to supply clean electricity to Google data centers. Concurrently, Amazon Web Services (AWS) signed a Memorandum of Understanding (MOU) with Dominion Energy to explore an SMR project that could bring at least 300 MW of power to Virginia, where AWS is expanding its data center presence.

Renewable clean energy. Data centers are increasingly turning to renewable energy sources like solar and wind to power their operations, reduce reliance on fossil fuels, and minimize carbon emissions. While integrating clean energy technologies presents challenges for affordability and reliability, advancements in energy storage systems, grid management, and energy systems integration are helping companies overcome these obstacles.

Leading tech companies like Google and Amazon are at the forefront with ambitious goals to run their data centers entirely on renewable energy. Google has committed to operating on carbon-free energy in all its data centers by 2030, while Amazon has announced plans to power its operations with 100% renewable energy by 2025.

Energy storage systems. Grid-scale energy storage systems (ESSs) store energy and supply it back to the grid when needed. While an ESS is not a means of power generation, this technology plays a key role in maintaining a reliable power supply, especially when using alternative power sources.

In particular, battery energy storage systems (BESSs) installed on site can help lessen dependence on the grid, ensuring service remains stable, resilient, and reliable as energy demands continue to grow. These systems can aid in peak shaving and load leveling, voltage and frequency regulation, and emergency power supply. They can also facilitate broader adoption of renewable energy sources, helping data centers manage intermittent generation patterns associated with wind and solar energy.
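
A minimal sketch of the peak-shaving idea appears below: when site load exceeds a utility import limit, the battery supplies the difference, subject to its remaining stored energy. The values and control logic are simplified illustrations; real BESS controllers also manage charging windows, state-of-charge limits, and frequency regulation.

```python
def peak_shave_step(site_load_kw, grid_limit_kw, soc_kwh, step_hours=0.25):
    """One dispatch step of a simple threshold-based peak-shaving scheme.

    Returns (grid_draw_kw, battery_discharge_kw, new_soc_kwh).
    """
    excess_kw = max(site_load_kw - grid_limit_kw, 0.0)
    # Discharge only what the battery can actually deliver during this step
    discharge_kw = min(excess_kw, soc_kwh / step_hours)
    grid_draw_kw = site_load_kw - discharge_kw
    new_soc_kwh = soc_kwh - discharge_kw * step_hours
    return grid_draw_kw, discharge_kw, new_soc_kwh

# Example: 12 MW site load against a 10 MW utility limit with 4 MWh stored
# -> grid draws 10 MW, battery supplies 2 MW, 3.5 MWh remains
print(peak_shave_step(12_000, 10_000, 4_000))
```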

Recently, the State of Virginia, along with South Carolina and private sector partners, was awarded an $85.3-million federal grant for renewable energy initiatives, which included the installation of a large-scale battery energy storage system at the Iron Mountain data center campus in Manassas.

Cooling considerations for increasing power densities

As data centers face increasing power densities driven by advancements in AI and ML, cooling has become key in managing power consumption and improving data center performance. Although traditional air conditioning systems have been useful for maintaining suitable temperatures over the last few decades, they are limited in handling the thermal load from densely packed, powerful servers. Consequently, many operators are shifting to advanced cooling solutions like liquid cooling, which is gaining traction in the data center cooling market.

Providing the computing power needed to meet the ever-growing demand for AI and ML applications requires deploying sophisticated infrastructure with high-powered general-purpose graphics processing units (GPUs). Advances in processor technology have introduced chips with thermal design power ranging from 300 W to 1,000 W, with the possibility of 2,000 W processors in the near future.

This increase in power density and power-hungry silicon devices poses challenges for traditional air cooling systems to effectively and efficiently dissipate heat. High air temperatures can negatively impact the performance of electronic components within data centers, potentially leading to reduced efficiency or hardware failure in extreme cases. For systems to remain reliably operational, data centers need to implement sophisticated and redundant cooling systems to maintain optimal temperatures.
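
A back-of-envelope airflow calculation shows why. Using the common approximation CFM ≈ 3.16 × W / ΔT(°F), which assumes roughly sea-level air density, the airflow needed to hold a 20°F air temperature rise grows quickly with rack density (the rack sizes below come from the densities cited earlier):

```python
def required_cfm(heat_load_w, delta_t_f=20):
    """Airflow (CFM) to remove a heat load at a given air temperature rise.

    Uses the approximation CFM ~= 3.16 * W / dT(F), valid near sea-level air density.
    """
    return 3.16 * heat_load_w / delta_t_f

for rack_kw in (15, 130):
    print(f"{rack_kw} kW rack -> ~{required_cfm(rack_kw * 1000):,.0f} CFM at a 20 F rise")
```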

With its efficiency and energy-saving capabilities, liquid cooling can facilitate better management of thermal output and overall performance for data centers. Unlike air, liquid efficiently absorbs and dissipates heat from data center components, allowing for closer component packing and reduced space requirements.

Liquid cooling methods also keep electronics at a more consistent temperature by targeting the hottest spots. This can increase the life of the hardware and allow it to operate at higher speeds than those originally intended by manufacturers. A December 2022 study published by ASME on the power usage effectiveness of an air-liquid hybrid-cooled data center found that liquid cooling reduced total data center power by more than 10%.
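
Power usage effectiveness (PUE) is the usual yardstick here: total facility power divided by IT equipment power. The sketch below shows how a roughly 12% reduction in total facility power shows up as a lower PUE; the baseline PUE of 1.5, the fixed IT load, and the 12% savings are illustrative assumptions rather than figures from the study.

```python
def pue(total_facility_kw, it_kw):
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_kw

IT_LOAD_KW = 1_000                       # hypothetical IT load
air_cooled_total = IT_LOAD_KW * 1.5      # assumed PUE of 1.5 for the air-cooled baseline
hybrid_total = air_cooled_total * 0.88   # assumed ~12% lower total power with hybrid liquid cooling

print(f"Air-cooled PUE:    {pue(air_cooled_total, IT_LOAD_KW):.2f}")
print(f"Hybrid-cooled PUE: {pue(hybrid_total, IT_LOAD_KW):.2f} "
      f"({1 - hybrid_total / air_cooled_total:.0%} less total facility power)")
```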

The adoption of liquid cooling as an alternative or supplement to air cooling in data centers has dramatically increased over the last few years due to the rapid increase in power consumption for AI/ML workloads. Cloud service providers (CSPs) and original equipment manufacturers (OEMs), in particular, have intensified their adoption, as CPU power has steadily increased from around 100 W to greater than 400 W over the last five years.

Many liquid cooling options are entering the data center cooling market for various applications, with some solutions better suited to certain conditions than others.

  • Rear-door heat exchangers (RDHx) cool components with liquid-cooled heat exchangers installed at the back of the rack. They are categorized into passive and active heat exchangers. While neither delivers liquid directly to the server or chip, both rely on fans to either push or pull heated exhaust air across liquid-filled coils that absorb the heat before the air is returned to the data hall.
  • Immersion cooling typically involves submerging the server and its components in a carbon-steel or stainless-steel tank filled with a dielectric, thermally conductive fluid. The fluid rapidly absorbs heat, effectively keeping temperatures low. These systems typically use either a single-phase or a two-phase fluid.
  • Direct-to-chip liquid cooling (DLC) employs a liquid cooling mechanism that directly contacts the chip’s surface, quickly absorbing and removing the heat produced by the chip, leading to efficient heat transfer and dissipation. In single-phase DLC, fluid (deionized water or water treated with bacterial growth inhibitors) circulates through the microchannels within the cold plate and then moves to a heat exchanger to dissipate the heat. Two-phase DLC involves fluid entering the cold plate and changing from liquid to vapor through an evaporative process with the vapor returning to a heat exchanger and condensing into a liquid.

Of the three technologies, DLC is one of the most commonly deployed liquid cooling systems to date and has been implemented in many large-scale data centers worldwide. DLC primarily focuses on cooling processors and other high-heat-flux components, leaving less energy-intensive heat-generating components to be cooled by air. These components are typically cooled by the computer room cooling system, which also cools other IT equipment, including switches and storage hardware.
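
For a sense of the coolant flows involved, the single-phase energy balance Q = ṁ·cp·ΔT gives the flow a cold plate needs to absorb a chip's heat at a given coolant temperature rise. The sketch below assumes treated water and a 10 K rise; both are illustrative assumptions.

```python
CP_WATER = 4186      # J/(kg·K), specific heat of water
RHO_WATER = 997      # kg/m^3, density of water

def coolant_flow_lpm(chip_power_w, delta_t_k=10):
    """Coolant flow (L/min) to absorb a chip's heat at a given temperature rise.

    Single-phase energy balance Q = m_dot * cp * dT (no boiling assumed).
    """
    mass_flow_kg_s = chip_power_w / (CP_WATER * delta_t_k)
    return mass_flow_kg_s / RHO_WATER * 1000 * 60   # kg/s -> L/min

for tdp_w in (300, 1000):
    print(f"{tdp_w} W chip -> ~{coolant_flow_lpm(tdp_w):.2f} L/min per cold plate at a 10 K rise")
```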

Optimizing the future of data centers

Rapid advancements in technology — particularly in AI and ML — are shaping the future of data centers, driving unprecedented power demand. As data center developers, owners, and operators face the challenge of meeting this growing energy requirement while maintaining operational efficiency and resilience, adopting innovative design and infrastructure solutions becomes imperative.

Integrating advanced power management strategies, on-site generation, and effective cooling mechanisms will be crucial for optimizing performance and mitigating strain on existing electrical grids. By embracing these changes, data center operators can ensure they are well equipped to navigate the complexities of a technology-driven world and capitalize on the opportunities that lie ahead.

About the Author

Matt Koukl

Matt Koukl, DCEP-G, is a Principal and Market Leader at Affiliated Engineers, Inc. Matt has nearly two decades of experience leading Mission Critical projects and supporting the planning and design of data center facilities. He is also the current Chair for ASHRAE Technical Committee 9.9, defining standards for mission critical facilities, data centers, technology spaces, and electronic equipment.
