Infrastructure Intelligence

Heat Is The New Constraint

Hyperscalers are deploying $300B+ of capex in 2025, but AI chips now exceed 1,000W TDP. Air cooling can't keep up; liquid cooling is becoming mandatory.

- Air cooling (traditional CRAC): <1,000W TDP
- Direct-to-chip (cold plates): 1,000–1,600W TDP
- Immersion / two-phase (dielectric fluid): >1,600W TDP

Nvidia Blackwell, at 1,200W+ TDP, forces the liquid transition; next-gen inference chips are approaching 2,000W.
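The tiers above can be sketched as a simple selector. The thresholds are the ones quoted in this briefing; the function name is illustrative, and real deployments also depend on rack density, facility design, and vendor specs.

```python
# Map a chip's TDP (watts) to the cooling tier quoted in this briefing.
# Boundary handling at exactly 1000W / 1600W is an assumption.
def cooling_tier(tdp_watts: float) -> str:
    if tdp_watts < 1000:
        return "air (traditional CRAC)"
    elif tdp_watts <= 1600:
        return "direct-to-chip (cold plates)"
    else:
        return "immersion / two-phase (dielectric fluid)"

print(cooling_tier(700))   # legacy server class
print(cooling_tier(1200))  # Blackwell class
print(cooling_tier(2000))  # next-gen inference class
```

Blackwell-class parts land in the direct-to-chip band, while the 2,000W inference chips fall past the immersion threshold, which is the transition the briefing describes.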

Direct-to-Chip

Cold plates target individual processors.

- Deployment: retrofit existing racks
- Serviceability: hot-swap capable
- Heat capture: 70–80% of chip heat
- Leading players: CoolIT, Motivair

Wins on serviceability and deployment speed.

Immersion Cooling

Servers are submerged in dielectric fluid.

- Deployment: new facility design
- Serviceability: tank extraction required
- Heat capture: 95–100% of all heat
- Leading players: Submer, Iceotope, GRC

Wins on total energy efficiency and heat recovery.
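The heat-capture gap is what drives the efficiency claim: whatever the liquid loop doesn't capture must still be removed by the facility's air systems. A minimal sketch of that residual load, assuming a hypothetical 120 kW AI rack (the rack figure is illustrative, not from this briefing):

```python
# Residual heat the facility's air handling must still remove,
# given the capture fractions quoted above (70-80% DTC, 95-100% immersion).
def residual_air_load(rack_kw: float, capture_fraction: float) -> float:
    """Heat (kW) left for room air handling after liquid capture."""
    return rack_kw * (1.0 - capture_fraction)

rack_kw = 120.0  # hypothetical dense AI rack
print(residual_air_load(rack_kw, 0.75))  # DTC midpoint: ~30 kW to air
print(residual_air_load(rack_kw, 0.97))  # immersion: ~3.6 kW to air
```

At the quoted capture rates, direct-to-chip leaves roughly an order of magnitude more heat for air handling than immersion does, which is also why immersion is the better fit for waste-heat recovery.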
$1B+ in venture capital has flowed into cooling startups since 2023.
Company | Deal Type | Value | Multiple
Motivair (DTC) | Acquired by Schneider | $1.1B | 25× revenue
CoolIT (DTC) | Acquired by KKR | Undisclosed | 23× revenue
Submer (Immersion) | Series B | $131M | —
Iceotope (DTC) | Series C | $78M | —
Adoption trajectory, 2024 → 2029: AI servers 36% → 59%; HPC clusters 40% → 62%.

Capital flows to atoms, not bits. Data centers are becoming thermal management companies that happen to run compute. The constraint has shifted from silicon performance to heat removal.

Every 1% improvement in PUE cascades through operating costs. The infrastructure buildout touches geothermal waste heat recovery, SMRs for baseload power, and the $400B+ AI capex super-cycle.
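To make the cascade concrete, a rough sketch of the PUE arithmetic. The facility size, PUE, and power price are illustrative assumptions, not figures from this briefing:

```python
# PUE = total facility power / IT power, so annual facility energy cost
# scales linearly with PUE at a fixed IT load.
def annual_energy_cost(it_load_mw: float, pue: float, price_per_mwh: float) -> float:
    """Annual facility energy cost in dollars (8,760 hours/year)."""
    return it_load_mw * pue * 8760 * price_per_mwh

# Hypothetical 100 MW IT load, PUE 1.30, $60/MWh power.
baseline = annual_energy_cost(100, 1.30, 60)
improved = annual_energy_cost(100, 1.30 * 0.99, 60)  # 1% PUE improvement
print(f"annual savings: ${baseline - improved:,.0f}")  # ≈ $683k/year
```

Under these assumptions a single 1% PUE improvement is worth roughly $0.7M per year for one 100 MW facility, which is the cascade the paragraph above describes.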