Cooling Is the New Compute: How LiquidJet Unlocks the Next Wave of AI Infrastructure

The AI revolution isn’t coming. It’s here — and it’s accelerating faster than almost any technological curve before it. Model sizes are ballooning, inference demand is surging, and data centers are racing to deliver more performance per rack than ever imagined. But as performance soars, a hard, physical barrier is rising right alongside it.

That barrier isn’t compute.
It’s cooling.

For years, AI performance scaling has been powered by more powerful GPUs and accelerators, denser racks, and more energy-hungry workloads. But the hidden cost of that performance has been thermal. Every watt of compute eventually becomes a watt of heat — and extracting that heat from increasingly dense chips is pushing today’s cooling infrastructure to its limits.

A quiet but pivotal shift is happening in how the industry thinks about growth: cooling has become the new limiter.

The Increasing Energy Squeeze

A recent Deloitte report, “Can US infrastructure keep up with the AI economy?”, warns that power demand from AI is already outpacing available grid capacity in key regions, creating major constraints for data-center growth. Thermal systems — cooling pumps, chillers, and fluid loops — represent a significant portion of that energy draw.

Meanwhile, in its report “The cost of compute: A $7 trillion race to scale data centers,” McKinsey & Company forecasts a staggering $7 trillion in global investment required to scale compute infrastructure to meet future AI demand. That’s not just a GPU story. Cooling and energy systems are embedded in that cost structure — meaning every watt saved, every loop simplified, every pressure drop reduced, every improvement in cooling efficiency directly impacts the bottom line.

And the Stanford Institute for Human-Centered Artificial Intelligence 2025 AI Index Report paints the macro picture: model complexity and compute requirements are growing exponentially, not linearly. The old infrastructure playbook won’t keep up.

Why Cooling Is Breaking at Scale

Today’s most powerful AI accelerators are running at thermal densities that were unthinkable a few years ago. GPUs with power envelopes of 1.4 kW today, projected to exceed 4.0 kW with NVIDIA’s upcoming Feynman generation, are becoming the new normal. Hotspot power densities are skyrocketing.

Traditional skived 2D microchannel cold plates weren’t built for this world. They require high coolant flow rates to remove heat, they generate significant pressure drops, and they struggle to keep up with localized hotspot power flux.

These limitations compound quickly at scale:

  • More pumps and energy to maintain flow
  • More plumbing complexity
  • Higher TCO for data-center operators
  • And most critically, thermal ceilings that limit chip performance

What used to be an engineering detail has become a boardroom issue. If cooling can’t keep up, AI infrastructure growth slows down.

Enter LiquidJet: Cooling Re-Engineered for the AI Era

LiquidJet is a next-generation precision cold plate engineered to break through this barrier. Its breakthrough lies in its 3D short-loop jet channel microstructure — an industry-first design that flips conventional thermal limits on their head.

LiquidJet delivers:

  • 2x higher hotspot power density — up to 600 W/cm² at 40 °C inlet temperature and 0.02 °C·cm²/W TIM resistance
  • 50% higher kW/lpm efficiency, removing more heat with less coolant, enabling more chips per rack
  • 4x lower pressure drop, reducing risk of leaks
  • Custom-fit design to match any SoC or GPU power map with pinpoint precision
  • Easy drop-in upgrades, no change to liquid cooling infrastructure required
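
As a sanity check on the hotspot figure above, the temperature rise across the thermal interface material alone follows from one-dimensional conduction: ΔT = q″ · R_TIM. A back-of-the-envelope sketch, reading the quoted TIM resistance as 0.02 °C·cm²/W (the cold plate’s own conduction and convection resistances are not modeled here):

```python
# Temperature rise across the TIM at the quoted hotspot heat flux.
# 1-D conduction through the interface layer: dT = q'' * R_tim
q_flux = 600.0   # W/cm^2, hotspot heat flux from the spec above
r_tim = 0.02     # degC*cm^2/W, thermal interface resistance (as read from the spec)

delta_t_tim = q_flux * r_tim
print(f"TIM temperature rise at the hotspot: {delta_t_tim:.1f} degC")  # 12.0 degC
```

At 600 W/cm², the interface alone eats 12 °C of the junction-to-coolant budget — which is why hotspot-targeted cold plate design matters as much as bulk flow.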

While traditional cold plates typically require flow rates around 1.4 lpm per kilowatt of heat removed, LiquidJet achieves the same thermal performance at roughly 1 lpm per kilowatt. In plain language: it does more with less. That means less pumping energy, lower operational costs, and higher rack density.
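
Those specific-flow figures translate directly into rack-level coolant demand. A rough sketch (the rack heat load is an illustrative assumption, not a vendor figure):

```python
# Coolant flow needed per rack at the two specific-flow figures above.
rack_power_kw = 120.0   # assumed AI rack heat load (illustrative)

flow_2d = 1.4 * rack_power_kw    # lpm, traditional 2D microchannel plates (~1.4 lpm/kW)
flow_jet = 1.0 * rack_power_kw   # lpm, LiquidJet (~1.0 lpm/kW)

print(f"2D microchannel: {flow_2d:.0f} lpm per rack")
print(f"LiquidJet:       {flow_jet:.0f} lpm per rack")
print(f"Coolant flow reduction: {1 - flow_jet / flow_2d:.0%}")  # ~29%
```

The same ~29% flow reduction holds whatever the rack power, since both figures scale linearly with heat load.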

Problem #1: Power Supply Strain

AI growth is colliding with the limits of the power grid.

Deloitte’s infrastructure analysis highlights a harsh reality: grid operators are struggling to keep pace with data-center expansion. In some regions, wait times for new power connections are measured in years. Cooling systems are one of the largest non-compute energy consumers in a data center.

LiquidJet directly addresses this bottleneck. By lowering flow requirements and reducing pressure drop, it cuts the energy footprint of cooling loops. Less power to pump coolant. Less wasted capacity. More sustainable rack design.

This efficiency unlocks headroom within existing grid allocations, enabling operators to scale compute without waiting for new power infrastructure.
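
The grid-headroom argument follows from basic pump hydraulics: ideal pumping power is flow times pressure drop, P = Q · Δp / η. Combining the ~0.71x flow (1.0 vs 1.4 lpm/kW) with the 4x lower pressure drop quoted above compounds multiplicatively. A sketch (the baseline loop parameters and pump efficiency are illustrative assumptions):

```python
# Ideal hydraulic pumping power: P = Q * dp / eta
def pump_power_w(flow_lpm: float, dp_kpa: float, eta: float = 0.5) -> float:
    """Hydraulic pump power in watts for a given flow and pressure drop."""
    q_m3s = flow_lpm / 1000.0 / 60.0   # lpm -> m^3/s
    dp_pa = dp_kpa * 1000.0            # kPa -> Pa
    return q_m3s * dp_pa / eta

# Illustrative baseline loop: 168 lpm at 150 kPa across traditional cold plates.
baseline = pump_power_w(168.0, 150.0)
# LiquidJet: ~0.71x flow (1.0 vs 1.4 lpm/kW) and 4x lower pressure drop.
liquidjet = pump_power_w(168.0 / 1.4, 150.0 / 4.0)

print(f"baseline pump power:  {baseline:.0f} W")
print(f"liquidjet pump power: {liquidjet:.0f} W")
print(f"relative pump power:  {liquidjet / baseline:.2f}x")  # ~0.18x
```

Because flow and pressure drop multiply, the two improvements together cut ideal pumping power to under a fifth of the baseline — energy that goes back into the compute budget.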

Cooling efficiency is power efficiency. And power efficiency is growth.

Problem #2: Capital Cost Explosion

The cost to scale AI infrastructure is massive — and cooling is baked into it.

McKinsey’s $7 trillion forecast underscores just how high the stakes are. As data centers expand to support hyperscale AI workloads, every component that simplifies infrastructure or lowers TCO compounds across thousands of racks.

Traditional skived 2D microchannel cold plates demand large pumping stations, complex manifolds, and expensive plumbing architectures to handle high pressure drops. That complexity translates to CapEx.

LiquidJet’s radically lower pressure drop enables:

  • More accelerators per loop
  • Simpler, smaller pumping systems
  • Lower installation and maintenance costs

For large AI deployments, this isn’t a small advantage. It’s a structural shift in cost per watt of cooled compute.

Problem #3: Exponential Compute Demand

The thermal curve is lagging behind the compute curve.

The Stanford AI Index highlights what the industry already knows: compute demand is growing exponentially. Chip power envelopes are increasing year over year — from 1.4 kW to beyond 4 kW per chip — while rack densities are compressing.

That means cooling must do more, in less space, with less energy.

This is exactly where LiquidJet shines. By surgically targeting hotspots and maximizing thermal efficiency per liter of coolant, all while reducing pressure drop, it creates a path for next-generation racks without requiring exotic two-phase cooling systems and massive infrastructure retrofits.
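
One way to see why kW-per-lpm efficiency matters at 4 kW chip envelopes: the bulk coolant temperature rise through a cold plate is fixed by energy balance, ΔT = P / (ṁ · cp). A sketch with water-like coolant properties (the fluid properties are assumptions, not source figures):

```python
# Bulk coolant temperature rise through a cold plate: dT = P / (mdot * cp)
RHO = 1000.0   # kg/m^3, water-like coolant density (assumption)
CP = 4186.0    # J/(kg*K), water-like specific heat (assumption)

def coolant_dt(power_w: float, flow_lpm: float) -> float:
    """Bulk coolant temperature rise in K for a given heat load and flow."""
    mdot = RHO * flow_lpm / 1000.0 / 60.0   # lpm -> kg/s
    return power_w / (mdot * CP)

# A 4 kW accelerator at the ~1 lpm/kW specific flow quoted above:
dt = coolant_dt(4000.0, 4.0)
print(f"coolant rise across a 4 kW chip: {dt:.1f} K")  # ~14.3 K
```

A ~14 K bulk rise is workable at a 40 °C inlet only if the cold plate’s own thermal resistance is low enough — which is the role the short-loop jet channels play.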

LiquidJet isn’t just a cooler. It’s a growth enabler for the AI era.

Why This Matters Now

The race to scale AI infrastructure isn’t slowing down. Enterprises, hyperscalers, and governments are pouring trillions into compute build-outs. But there’s an uncomfortable truth under that growth story: if cooling doesn’t innovate as fast as chips, the curve flattens.

  • Data-center operators face energy caps.
  • Developers face thermal throttling.
  • The industry faces spiraling costs.

LiquidJet offers a different trajectory — one where cooling isn’t a limiter but a force multiplier. By removing thermal and energy barriers, it creates space for continued exponential compute growth without proportionally increasing cost or carbon footprint.

The Bigger Picture: Cooling as Strategic Infrastructure

We tend to think of GPUs, accelerators, memory and networking as the “main characters” in AI infrastructure. But behind every petaflop, there’s a watt of heat that has to go somewhere.

Cooling is no longer an afterthought. It’s a strategic lever — one that determines how fast and how far the AI revolution can scale.

As our CEO and Founder, Seshu Madhavapeddy, has repeatedly stated:

“Thermal architecture is the foundation on which the future of AI will be built.”

LiquidJet was built precisely for this inflection point. Its efficiency, hotspot performance, and ease of integration give operators the ability to scale now — not years from now when alternative exotic cooling solutions catch up.

Looking Forward

We’re entering an era where every watt counts. The efficiency of tomorrow’s data centers won’t be decided by compute alone — it will be decided by how smartly we manage heat.

LiquidJet isn’t just cooling better. It’s changing what’s possible for AI infrastructure at scale.

Imagine an industry where:

  • Power isn’t the limiting factor.
  • Thermal design keeps up with silicon innovation.
  • Scaling AI isn’t a $7 trillion problem — it’s a sustainable growth path.

That’s the future LiquidJet is enabling.

References

  • Deloitte, “Can US infrastructure keep up with the AI economy?”
  • McKinsey & Company, “The cost of compute: A $7 trillion race to scale data centers”
  • Stanford Institute for Human-Centered Artificial Intelligence, “2025 AI Index Report”

Published: October 16, 2025