The next wave of AI infrastructure faces a critical challenge: cooling, not power, is the true bottleneck. With NVIDIA's Vera Rubin Ultra requiring 600kW racks by 2027, cooling infrastructure will reshape the global data center landscape.

According to Uptime Institute research, cooling systems consume up to 40% of data center energy. Yet the physics of thermal transfer creates absolute limits on what air cooling can achieve. This reality is creating a three-tier market segmentation worldwide:

1. Liquid-cooled facilities (30-150kW+ per rack) capturing premium AI workloads
2. Enhanced air-cooled sites (15-20kW per rack) limited to standard enterprise computing
3. Legacy facilities facing increasing competitive disadvantages

The challenge manifests differently across regions:

- Tropical markets (#Singapore, #Brazil) battle 90%+ humidity that reduces cooling efficiency by 40%
- Water-stressed regions face constraints, with cooling towers consuming millions of liters daily
- Temperate regions benefit from free-cooling opportunities but still require liquid solutions for #AI densities

Regional innovations demonstrate tailored approaches:

1. #Singapore's STDCT has achieved PUE values below 1.2 despite challenging humidity
2. #SouthAfrica's MTN deployed solar cooling to address energy reliability concerns
3. #Jakarta's SpaceDC uses specialized designs for both climate and power-stability challenges

Research from ASME shows that transitioning to 75% liquid cooling can reduce facility power use by 27% while enabling next-gen compute densities in any climate.

The Global Cooling Adaptation Framework provides a strategic approach:

1. Regional Climate Assessment
2. Thermal Capacity Planning
3. Water-Energy Optimization
4. Infrastructure Evolution Timeline

For investors, the implications extend beyond operations. Facilities with limited cooling capabilities may find themselves at a disadvantage when competing for higher-margin segments, regardless of location advantages.
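The "cooling at 40% of facility energy" and "PUE below 1.2" figures connect through the same ratio. A back-of-the-envelope sketch with purely illustrative numbers (not from any specific facility):

```python
# PUE = total facility power / IT equipment power; 1.0 is the ideal.
def pue(it_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    """Power Usage Effectiveness from the three main power buckets."""
    return (it_kw + cooling_kw + other_overhead_kw) / it_kw

# Air-cooled site where cooling draws ~40% of total facility energy:
# total = 10 MW, of which cooling = 4 MW, other overhead = 1 MW, IT = 5 MW.
print(round(pue(5000, 4000, 1000), 2))   # 2.0

# A site hitting PUE < 1.2: same 5 MW of IT load, but cooling plus
# other overhead trimmed to under 1 MW combined.
print(round(pue(5000, 700, 250), 2))     # 1.19
```

The gap between 2.0 and 1.2 is the margin the post's three-tier segmentation is pricing in: at the same IT load, the liquid-cooled tier simply buys far fewer megawatts of overhead.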
What cooling strategies is your organization implementing to prepare for the 600kW future? Read the full analysis in this week's article. #datacenters
How Cooling Technology is Advancing for AI
Explore top LinkedIn content from expert professionals.
Summary
Advancing cooling technology for artificial intelligence (AI) is essential to meet the growing energy and thermal demands of high-performance data centers. Innovations like liquid cooling, underwater data centers, and waste heat reuse are reshaping infrastructure to support AI while addressing environmental and sustainability concerns.
- Adopt liquid cooling systems: Transition to advanced methods like direct-to-chip or immersion cooling to handle the heat generated by high-density AI hardware and ensure cost efficiency.
- Explore innovative solutions: Consider sustainable approaches such as underwater data centers or systems that recycle waste heat, reducing environmental impact and operational costs.
- Plan for future demand: Proactively assess your facility's thermal capacity and energy needs to prepare for rising AI workloads and ensure long-term scalability.
-
AWS Builds Custom Liquid Cooling System for Data Centers

Amazon Web Services (AWS) is sharing details of a new liquid cooling system to support high-density AI infrastructure in its data centers, including custom designs for a coolant distribution unit and an engineered fluid. "We've crossed a threshold where it becomes more economical to use liquid cooling to extract the heat," said Dave Klusas, AWS's senior manager of data center cooling systems, in a blog post.

The AWS team considered multiple vendor liquid cooling solutions, but found none met its needs and began designing a completely custom system, which was delivered in 11 months, the company said.

The direct-to-chip solution uses a cold plate placed directly on top of the chip. The coolant, a fluid specifically engineered by AWS, runs in tubes through the sealed cold plate, absorbing the heat and carrying it out of the server rack to a heat rejection system, and then back to the cold plates. It's a closed-loop system, meaning the liquid continuously recirculates without increasing the data center's water consumption.

AWS also developed a custom coolant distribution unit, which it said is more powerful and more efficient than its off-the-shelf competitors. "We invented that specifically for our needs," Klusas said. "By focusing specifically on our problem, we were able to optimize for lower cost, greater efficiency, and higher capacity." Klusas said the liquid is typically at "hot tub" temperatures for improved efficiency.

AWS has shared details of its process, including photos: https://lnkd.in/e-D4HvcK
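The basic physics behind sizing such a loop is the steady-state heat balance Q = ṁ·c_p·ΔT. A rough sketch with water-like coolant properties (illustrative only; AWS has not published its engineered fluid's properties or its CDU flow rates):

```python
# How much coolant must circulate to absorb a rack's heat output?
# Q = m_dot * c_p * delta_T, solved for volumetric flow.
def flow_rate_lpm(heat_kw: float, cp_j_per_kg_k: float,
                  density_kg_per_l: float, delta_t_k: float) -> float:
    """Coolant flow (litres/min) needed to carry heat_kw away
    with a delta_t_k temperature rise across the cold plates."""
    mass_flow_kg_s = heat_kw * 1000 / (cp_j_per_kg_k * delta_t_k)
    return mass_flow_kg_s / density_kg_per_l * 60

# 100 kW rack, water-like coolant (c_p ~ 4186 J/kg*K, ~1 kg/L),
# 10 K rise across the loop:
print(round(flow_rate_lpm(100, 4186, 1.0, 10), 1))   # ~143.3 L/min
```

Running the loop at "hot tub" temperatures, as Klusas describes, doesn't change this balance; it raises the temperature at which heat is rejected, which makes the heat-rejection side (dry coolers, chillers) cheaper to operate.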
-
Headline: China Sinks Data Centers into the Ocean to Tackle AI Cooling Crisis

Introduction: To support its aggressive push into artificial intelligence and cloud computing, China is rapidly expanding its data center infrastructure. But this expansion poses a growing challenge: how to cool vast server farms without depleting precious water supplies. In a bold and innovative move, China is deploying data centers underwater, turning to the ocean as a sustainable cooling solution—and in doing so, it may be outpacing the rest of the world.

Key Details:

1. AI Demands Fuel Data Center Growth
• China's economic strategy prioritizes AI, digital infrastructure, and cloud computing as critical engines of future growth.
• These technologies depend on high-performance data centers, which consume massive energy and water resources for cooling.

2. Water Scarcity vs. Data Center Demand
• Traditional land-based data centers use hundreds of thousands of gallons of water per day to dissipate heat.
• Many are located in arid regions like Arizona, Spain, and parts of the Middle East due to their low humidity, despite water scarcity in these areas.
• As these centers proliferate, they compete directly with agriculture and human consumption, prompting sustainability concerns.

3. China's Ocean-Based Solution
• In response to the growing water challenge, China is leading the deployment of underwater data centers, placing them offshore to utilize natural ocean cooling.
• This method drastically reduces water usage and energy costs while avoiding the land-use conflicts associated with traditional facilities.
• China's efforts appear to be ahead of other nations, which have only experimented with submerged servers on a limited scale.

4. Environmental and Strategic Implications
• Underwater data centers may reduce carbon footprints and eliminate the need for massive evaporative cooling systems.
• However, there are questions about long-term maintenance, ecological impact, and geopolitical access to maritime infrastructure.
• The shift could reinforce China's position in the global AI arms race by improving data center efficiency and reducing operational constraints.

Why It Matters: As AI continues to drive demand for computing power, the environmental costs of data centers—especially water usage—are becoming unsustainable. China's underwater strategy not only offers a bold path to sustainability but also serves as a geopolitical differentiator in the digital era. If successful at scale, ocean-based data centers could reshape the future of computing infrastructure worldwide, offering a cleaner, cooler alternative to traditional server farms on land. https://lnkd.in/gEmHdXZy
-
AI’s Biggest Bottleneck Isn’t Code. It’s Concrete, Copper, and Cooling.

Let’s get real for a second. Everyone’s obsessed with the next big AI model, but almost nobody wants to talk about the hard limits: Power. Heat. Space. You can’t ship intelligence if you can’t plug it in.

According to Goldman Sachs, global data center power demand is set to rise 165% by 2030, with AI workloads as the primary driver. https://lnkd.in/gKcsRuxj

In major regions, data center vacancy rates are below 1%. That means even if you have the hardware and the talent, your biggest challenge is often finding enough megawatts, enough cooling, and enough floor space to actually run your workloads.

From my vantage point—deploying AI at scale—the constraints are physical, not theoretical. Every breakthrough in model design gets matched by an even bigger jump in energy and cooling requirements. No grid, no cooling, no go.

What’s shifting right now? Direct-to-chip and immersion cooling are turning waste heat from a liability into an asset, doubling compute density per rack. Infrastructure leaders are designing for sustainability and modular deployment—not just patching legacy hardware.

The next leap in AI won’t come from a new algorithm. It’ll come from infrastructure that’s actually ready for it.

Here’s my challenge to every operator, investor, and AI team: Are you tracking your megawatts and thermal loads as closely as your training parameters? Are you planning for true density, or just hoping the power and space show up?

Bottom line: The future of AI will be won by teams who master both the software and the physical world it runs on. Code matters. But so does concrete, copper, and cooling.
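To put the 165% figure in perspective: as a compound rate it is steep but not absurd. A quick check, assuming the Goldman Sachs projection runs from 2023 to 2030 (the baseline year is our assumption, not stated in the post):

```python
# +165% means demand ends at 2.65x the baseline.
growth_multiple = 2.65
years = 2030 - 2023          # assumed 7-year projection window

# Implied compound annual growth rate (CAGR).
cagr = growth_multiple ** (1 / years) - 1
print(f"{cagr:.1%}")         # roughly 15% per year, compounding
```

A grid interconnect or a substation typically takes years to permit and build, which is exactly why ~15% compounding demand growth turns into sub-1% vacancy: supply cannot compound at that pace.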
-
🚨 Cooling the AI Megawatt: 7 Innovators Redefining Data Center Thermal Design 💧

The AI era isn’t just driving compute demand—it’s rewriting the rulebook for thermal design. With chip TDPs > 1kW and rack densities topping 100kW, traditional air-cooled systems are hitting the wall.

🔥 The signal just got louder: Microsoft is rolling out LG’s full-stack liquid cooling suite—chillers, CDUs, cold plates, and CRAHs—across a wave of new AI data centers.

💰 According to Dell’Oro, the liquid-cooling hardware market will hit $15B in 5 years. Investors are paying attention—and so should you.

🧊 Here are 7 Companies Leading the Thermal Arms Race:
1️⃣ LiquidStack – Two-phase immersion pioneer backed by $35M from Tiger Global
2️⃣ Iceotope – Just launched Iceotope Labs, the first AI-focused liquid cooling test center
3️⃣ Submer – Closed-loop immersion with heat-reuse potential for district heating
4️⃣ CoolIT Systems – KKR-owned, pushing 4kW cold plates & high-pressure loop design
5️⃣ ZutaCore – Waterless two-phase cooling, now integrated with Carrier’s HVAC lineup
6️⃣ Vertiv – “Switchable” colocation-ready systems; future-proofs for air + liquid hybrids
7️⃣ Schneider Electric – Partnered with $NVDA on turnkey EcoStruxure™ cooling blueprints

🏆 Honorable Mention: Nautilus Data Technologies — using river water for open-loop cooling & deploying CDU pods for third-party DCs. 🌊🛳️

🧠 What Operators & Investors Need to Know:
💡 Hybrid is the New Normal – Designs that toggle between air & liquid de-risk capex as AI loads evolve
💰 Funding is Flowing – Strategic HVAC players and PE firms are betting liquid will be mandatory for >50kW racks
📐 Standards Are Lagging – Early adopters who solve for serviceability and safety can turn risk into revenue
🌱 Sustainability = Permits – Cooling solutions that reuse heat or eliminate water help meet ESG goals + unlock interconnection

🔭 What’s Next? If 2023 was the pilot phase, 2025 is the deployment phase. Expect more OEM x Cooling JV deals, more campus-scale rollouts, and municipal regulations focused on water use and heat recycling.

💬 Your Turn: Which tech will dominate AI cooling?
🌡️ Immersion?
🥶 Cold Plate?
🌬️ Rear-door heat exchangers?
🧪 Something we haven’t even seen yet?

Drop your thoughts (and cooling war stories) 👇

#AI #DataCenters #LiquidCooling #ThermalDesign #Infrastructure #Hyperscale #Sustainability #CoolingInnovation #EnergyEfficiency #HeatReuse
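"Hitting the wall" at 100kW racks is a fluid-properties problem, not a fan-engineering one. A sketch with textbook values (air c_p ≈ 1005 J/kg·K at ~1.2 kg/m³; water c_p ≈ 4186 J/kg·K at ~1000 kg/m³; real designs vary):

```python
# Volume of fluid needed per second to carry away a given heat load
# at a given temperature rise: V = Q / (c_p * delta_T * density).
def vol_flow_m3_s(heat_w: float, cp: float,
                  density: float, delta_t: float) -> float:
    return heat_w / (cp * delta_t) / density

heat_w, dt = 100_000, 15   # 100 kW rack, 15 K rise in the fluid
air   = vol_flow_m3_s(heat_w, cp=1005, density=1.2,    delta_t=dt)
water = vol_flow_m3_s(heat_w, cp=4186, density=1000.0, delta_t=dt)

print(f"air:   {air:.2f} m^3/s (~{air * 2118.9:.0f} CFM)")
print(f"water: {water * 1000:.2f} L/s")
print(f"ratio: ~{air / water:.0f}x more volume as air")
```

On these assumptions, one rack needs on the order of ten thousand CFM of air versus under two litres per second of water: a volumetric gap of three-plus orders of magnitude that no amount of fan and plenum engineering closes.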
-
Everyone is getting excited about the new Data Center 'gold rush', and understandably so (so am I). In my opinion, though, the cooler stuff is happening behind the scenes with energy and heat optimization.

Schneider Electric, for example, is teaming up with NVIDIA to create a solution to the problem of heat and power in hyperscale data centers. Through high-density liquid cooling technology and its EcoStruxure asset-monitoring system, Schneider Electric is solving one of the biggest headaches in the sector: power and cooling. By moving away from traditional air cooling, it can handle NVIDIA AI racks of up to 132 kilowatts each.

As this technology continues to improve, you will see the rate of hyperscale #datacenter construction increase rapidly.
-
📢 Microsoft Discovery AI Agent: Generating more energy with fewer emissions demands bold innovation – and AI is helping deliver it.

Last year, I shared how Microsoft and Pacific Northwest National Laboratory (PNNL) used #AI to identify a novel solid-state electrolyte that requires up to 70% less lithium—a breakthrough that could reshape the future of battery storage.

Today, I’m excited to share another leap forward: the launch of the Microsoft Discovery AI Agent, a new agentic AI system that accelerates scientific discovery by autonomously generating hypotheses, analyzing literature, and even designing experiments.

One of the most compelling recent outcomes? Discovery AI helped identify a non-PFAS coolant prototype—a safer, more sustainable alternative to traditional fluorinated compounds—in 200 hours instead of months or years. This is a game changer for the industrial sector and datacenters, where cooling is essential but environmental impact must be minimized.

🎥 Watch the video below to see how Discovery AI is transforming the pace and scale of scientific innovation.

This is just the beginning. With agentic AI, we’re not just imagining a more secure, resilient, equitable, and sustainable #energy future—we’re helping build it. https://lnkd.in/gYhDzxU5