In today’s digitally savvy world, where every click, swipe or stream connects to a whole universe of information, sits a hidden driving force: the data center. These facilities house the infrastructure that corporate giants need to deliver their services, and they serve as the unsung heroes of our online lives, letting us play what we like, shop what we want and order what we need (among many other things) with remarkable efficiency.

However, beneath their impressive processing capabilities and sleek exterior exists a little-known paradox: data centers are insatiable energy beasts. According to the International Energy Agency (IEA), they are responsible for about 1.5-2% of global energy consumption – nearly as much as the airline industry – and can use up to 50 times the energy of a similar-sized commercial office building.

More importantly, this energy consumption isn’t slowing down. There are more than 9,000 data centers (mostly in the U.S.), and the footprint keeps growing, given the rising computational demands of the digital age. At this pace, data centers’ overall energy demand is expected to climb to as much as 8% of total energy consumption by 2030.

That could spell trouble for our planet — if appropriate interventions aren’t made in the design and operation of these facilities.


Cooling: The big energy problem

Data centers vary in size, ranging from small 100-square-foot hubs to massive 400,000-square-foot facilities housing thousands of cabinets. At their core, however, they are all driven by servers and IT equipment: CPUs, GPUs, storage and networking devices. These components, combined with auxiliary systems, run continuously to ensure uninterrupted delivery of data and services – and consume energy in the process.

But here’s the thing: like all electronics, the servers and IT equipment in a data center produce a lot of heat while running and demand cooling in return, which further increases the facility’s power consumption and the costs of the business running it.

“The conventional way to take that heat out is by throwing cold air on it. And to create more and more cold air, you need energy. So the issue is how much energy does that system need to create the cold air needed to cool down the servers. A cooling system in a very critical data center can take up to 40% of the overall energy usage of the data center. So, if a high compute data center is taking 100 to run everything, a cooling system could almost take as much as 40 of the energy, which is definitely not the best,” Pankaj Sharma, executive vice president of secure power division and data center business at Schneider Electric, told VentureBeat.

Even though factors like location (building the facility in a fairly cold region) and managed provisioning (using cooling systems only as needed) can help reduce cooling demands to some extent, the reality is that traditional heating, ventilation and air conditioning (HVAC) systems, comprising air handlers, chillers and the like, are gradually maxing out. The physics of air cooling is so energy-intensive that these systems can only do so much to keep up with the growing server densities that artificial intelligence (AI) and other modern applications demand.

In a nutshell, as the number of CPUs and GPUs optimized for HPC and other next-gen workloads grows, so will the heat they generate and the energy footprint of the cooling systems removing it. Rack power densities could climb to 50, 60 or even 70 kilowatts per server rack, whereas data centers currently run at 10-15 kilowatts per rack, and even that only in very extreme environments.
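
To put those figures in perspective, here is a rough back-of-the-envelope sketch in Python. The numbers are purely illustrative, assuming cooling accounts for roughly 40% of total draw as Sharma describes and ignoring other overheads; it is not a model of any real facility.

```python
# Back-of-the-envelope sketch (illustrative numbers only): if cooling takes
# ~40% of a facility's total power, how much total power does a rack imply?
# This ignores other overheads and is not a model of any real data center.

def facility_power_kw(it_load_kw: float, cooling_share: float = 0.40) -> float:
    """Total facility power if cooling is `cooling_share` of the total
    and the IT load makes up the rest (a simplification)."""
    return it_load_kw / (1.0 - cooling_share)

for rack_kw in (10, 15, 50, 70):  # today's typical racks vs. denser AI racks
    total = facility_power_kw(rack_kw)
    cooling = total - rack_kw
    print(f"{rack_kw:>3} kW IT rack -> ~{total:.0f} kW total, ~{cooling:.0f} kW for cooling")
```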

“Other cooling methods, such as water-based cooling systems, can consume vast amounts of water, amounting to hundreds of billions of gallons annually. It can put pressure on local water resources, especially in regions with water scarcity concerns,” Tiwei Wei, assistant professor of mechanical engineering at Purdue University, told VentureBeat. “Plus, there’s also the concern of environmental impact. The energy used for cooling (via HVAC systems) contributes to a data center’s operational CO2 carbon footprint and…can contribute to climate change and other environmental issues.”

How to make cooling energy-efficient in high-compute scenarios?

To make cooling efficient in high-compute scenarios, like generative AI training, organizations have to look beyond HVAC-based air cooling and take a systematic three-pronged approach covering design, technology and management of their data center.

On the design front, the focus should be on building the facility in a way that integrates natural ventilation into mechanical cooling systems without letting external heat in. This approach, called passive cooling, can include measures such as designing the facility with good insulation, using heat-absorbing roofing material and taking advantage of shade and breezes.

Once natural ventilation is in place, the technology piece comes in. As part of this, teams working with HVAC systems can explore “liquid cooling” as an option.

“Liquid cooling conducts up to 3,000 times more heat than air cooling. When taking a comprehensive view of data center efficiency, liquid immersion cooling was able to handle higher compute density, while reducing power consumption by 50%, data center building size by 61% and land use by 32% – all while using no water at all,” Joe Capes, the CEO of LiquidStack, told VentureBeat. His company uses an “open bath” system under which servers are immersed in a tank of dielectric liquid for cooling.

Currently, liquid cooling is not as widely adopted as air cooling, but it is seen as a promising solution. The U.S. Department of Energy (DOE) has already backed multiple liquid cooling technologies through a $40 million grant program aimed at accelerating the development of solutions that could reduce the energy footprint of data center cooling.

While LiquidStack is focusing on immersion-based liquid cooling, JetCool, one of the vendors supported under the DOE program, is working on direct-to-chip microconvective liquid cooling, which uses microjets to target hotspots on chips with pinpoint accuracy. Unlike heat sinks or cold plates that pass fluid over a surface, this approach impinges fluid directly at the surface, maximizing heat extraction and improving thermal performance for energy savings.

“Our cold plates are installed today at federal labs, utilizing coolants over 65C, eliminating the use of its evaporative cooler, and saving 90% in water usage per year,” Bernie Malouin, founder & CEO at JetCool, told VentureBeat. “Our self-contained solutions are deployed today at colocations where they are seeing a chip temperature reduction of 30% while also reducing power consumption by 14%. These self-contained solutions are also deployed at financial institutions where they are cooling next-generation chipsets at ambient temperatures over 35C.”

Notably, Wei from Purdue University is also working in the same area with a chip-level, two-phase impingement jet cooling solution that offers a significant improvement in thermal performance while reducing the need for pumping power.

“Our innovative solution holds tremendous potential in the cooling of data centers housing high-performance computing systems. Additionally, it extends its benefits to cooling other high-power electronic devices and the wireless ecosystem, including emerging data-intensive applications like fully immersive reality and 3D holography as well as the highly potent next-generation 6G network,” he noted.

In most cases, direct-to-chip liquid cooling technologies can work in conjunction with air cooling systems. This allows organizations to pair them with HVAC systems that have high seasonal energy efficiency ratio (SEER) ratings and maximize energy savings. Beyond this, they can take additional measures such as using energy recovery ventilation (ERV) systems, which recover and reuse energy from outgoing air, as well as segregating hot and cold air and maintaining dynamic fan controls.
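
To see why a high SEER rating matters, consider a simple, hypothetical calculation based on the standard SEER definition (BTUs of cooling delivered per watt-hour of electricity consumed). The heat load and ratings below are invented purely for illustration.

```python
# Hypothetical illustration of why SEER matters: electricity needed to remove
# the same amount of heat at different SEER ratings (figures are made up).
# SEER = BTUs of cooling delivered per watt-hour of electricity consumed.

BTU_PER_KWH = 3412  # conversion factor for the heat that must be removed

def cooling_electricity_kwh(heat_kwh: float, seer: float) -> float:
    """Electricity (kWh) needed to remove `heat_kwh` of heat at a given SEER."""
    heat_btu = heat_kwh * BTU_PER_KWH
    return heat_btu / seer / 1000.0  # watt-hours -> kilowatt-hours

heat_to_remove_kwh = 100_000  # illustrative seasonal heat load
for seer in (10, 14, 20):
    kwh = cooling_electricity_kwh(heat_to_remove_kwh, seer)
    print(f"SEER {seer:>2}: ~{kwh:,.0f} kWh of cooling electricity")
```

Roughly doubling the SEER rating halves the electricity spent removing the same amount of heat, which is why pairing efficient HVAC with direct-to-chip liquid cooling compounds the savings.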

Monitoring and management for maximizing efficiency 

Finally comes management: monitoring the data center’s power consumption and making sure the cooling systems in place deliver maximum efficiency. This task is usually handled manually, but teams can leverage new-age innovations like AI-driven self-optimization tools that continuously analyze temperature data and optimize cooling systems in real time.

This enables proactive adjustments that eliminate hot spots and reduce overcooling, ultimately leading to energy savings.
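
At their simplest, such tools close a feedback loop between temperature telemetry and cooling setpoints. The sketch below is a heavily simplified, hypothetical illustration of that loop; the sensor readings, thresholds and setpoint rules are invented for the example, and real products rely on far more sophisticated models.

```python
# Hypothetical, heavily simplified sketch of a temperature-feedback loop
# of the kind AI-driven cooling-optimization tools automate. The sensor
# data and setpoint rules here are invented for illustration only.

import random

TARGET_INLET_C = 25.0   # desired rack inlet temperature
DEADBAND_C = 1.0        # tolerance before adjusting
SETPOINT_STEP_C = 0.5   # how much to nudge the cooling setpoint each cycle

def read_inlet_temps(num_racks: int = 8) -> list[float]:
    """Stand-in for real sensor telemetry: returns simulated inlet temps."""
    return [TARGET_INLET_C + random.uniform(-3.0, 3.0) for _ in range(num_racks)]

def adjust_setpoint(current_setpoint_c: float, inlet_temps: list[float]) -> float:
    """Lower the setpoint when any rack runs hot (protects the hardware),
    raise it when everything is overcooled (saves energy)."""
    hottest = max(inlet_temps)
    if hottest > TARGET_INLET_C + DEADBAND_C:
        return current_setpoint_c - SETPOINT_STEP_C  # cool more aggressively
    if hottest < TARGET_INLET_C - DEADBAND_C:
        return current_setpoint_c + SETPOINT_STEP_C  # ease off, save energy
    return current_setpoint_c                        # within tolerance

setpoint_c = 22.0
for cycle in range(5):
    temps = read_inlet_temps()
    setpoint_c = adjust_setpoint(setpoint_c, temps)
    print(f"cycle {cycle}: hottest inlet {max(temps):.1f}C -> setpoint {setpoint_c:.1f}C")
```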

According to EkkoSense, a company that optimizes data centers with AI, organizations could save more than $1.7 billion in cooling energy costs globally simply by applying best practices for data center cooling and optimization in a systematic and coordinated manner.

Dean Boyle, the CEO and co-founder of the company, said they have already made a positive impact by helping clients reduce their carbon emissions related to cooling power by approximately 4,100 tonnes of CO2 equivalent per year. 

“This reduction is equivalent to saving more than 10 MW of cooling power and cutting cooling energy expenses by $10 million. These figures continue to grow as more clients benefit from these practices every day,” he said.


Shubham Sharma

