
Neel Somani on the Energy Economics of Liquid Cooling in…
Neel Somani, a researcher and technologist with a background in large-scale computational systems, notes that modern AI deployment is no longer limited by raw processing power alone. Thermal management has become one of the defining economic questions in data center design, particularly as next-generation AI hardware begins to exceed the limits of conventional cooling architectures.
As artificial intelligence infrastructure expands beyond traditional enterprise computing, engineers and investors are confronting a basic physical constraint: heat.
“The conversation around AI infrastructure often focuses on chips, models, and power contracts,” says Neel Somani. “But in practice, the cooling layer determines whether that compute can actually operate at scale.”
The challenge is growing quickly. High-density AI racks built for modern accelerator systems now demand thermal performance far beyond what air-based systems were designed to handle. As rack power densities climb past that threshold, cooling is shifting from an operational detail into a capital allocation decision with direct consequences for energy pricing, asset valuation, and long-term grid participation.
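One common way to see why cooling becomes a capital allocation question is through power usage effectiveness (PUE), the ratio of total facility power to IT power. The sketch below is purely illustrative: the IT load, PUE values, and electricity price are assumptions chosen for the example, not figures from the article.

```python
# Illustrative sketch: how cooling overhead (via PUE) translates into annual
# energy cost for a fixed IT load. All numbers are assumed for illustration.

HOURS_PER_YEAR = 8760

def annual_energy_cost(it_load_mw: float, pue: float, price_per_mwh: float) -> float:
    """Total facility energy cost per year for a given IT load and PUE.

    PUE = total facility power / IT power, so facility power = IT load * PUE.
    """
    facility_mw = it_load_mw * pue
    return facility_mw * HOURS_PER_YEAR * price_per_mwh

# Assumed scenario: a 10 MW IT load at $70/MWh.
air = annual_energy_cost(10, 1.5, 70)      # assumed PUE 1.5 for air cooling
liquid = annual_energy_cost(10, 1.15, 70)  # assumed PUE 1.15 for liquid cooling
print(f"air:    ${air:,.0f}/yr")
print(f"liquid: ${liquid:,.0f}/yr")
print(f"saving: ${air - liquid:,.0f}/yr")
```

Even with these placeholder figures, a PUE gap of a few tenths at a 10 MW IT load translates into a seven-figure annual energy difference, which is why the cooling layer shows up in asset valuation rather than just in operating budgets.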