FAU Innovation for Next-Generation AI Factories Featured by NVIDIA

by Gisele Galoustian | Wednesday, Nov 19, 2025

As the United States races to build out the next wave of artificial intelligence factories – specialized data centers designed to train and deploy large-scale AI models – researchers from the College of Engineering and Computer Science at Florida Atlantic University have unveiled pioneering findings that show how liquid-cooling technologies can dramatically boost performance and energy efficiency in high-density computing environments.

The research, now featured on NVIDIA’s website, demonstrates that direct-to-chip liquid-cooled GPU (Graphics Processing Unit) systems deliver up to 17% higher computational throughput while reducing node-level power consumption by 16% compared to traditional air-cooled systems. These results translate to potential annual facility-scale savings of $2.25 million to $11.8 million for AI data centers operating between 2,000 and 5,000 GPU nodes.

The study, titled “Comparison of Air-Cooled Versus Liquid-Cooled NVIDIA GPU Systems,” was co-authored by Arslan Munir, Ph.D., associate professor in the FAU Department of Electrical Engineering and Computer Science, alongside collaborators from Johnson Controls and Lawrence Berkeley National Laboratory, and was conducted using NVIDIA HGX™ H100 GPU systems.

“Our research shows that liquid cooling fundamentally unlocks higher sustained performance, superior energy efficiency, and the thermal headroom needed to push AI workloads to their full potential,” said Munir. “These results provide the technical foundation for designing AI factories capable of handling extreme rack densities while maintaining optimal performance, sustainability and cost efficiency.”

The findings from this research arrive at a pivotal time. According to recent federal and industry reports, U.S. investment in AI and data-center infrastructure is expected to exceed $400 billion over the next several years, as the government and private sector build out new AI-training campuses and cloud facilities nationwide. These AI factories are the backbone of modern innovation, powering breakthroughs in health care, national security, transportation and climate research. However, they also consume vast amounts of energy.

“By directly confronting the thermal and power-efficiency bottlenecks that define today’s large-scale GPU clusters, this research is charting a clear path toward truly sustainable, high-performance computing,” said Stella Batalama, Ph.D., dean of the College of Engineering and Computer Science. “This work demonstrates how innovative cooling technologies can lower operational costs, reduce environmental impact, and significantly increase compute density within existing data-center footprints – advancing both the science and the scalability of AI.”

Key findings from the FAU-led study reveal:

  • Thermal Advantage: Liquid cooling maintained GPU temperatures between 46°C and 54°C, compared to 55°C to 71°C for air cooling.
  • Performance Gain: Up to 17% higher computational throughput and 1.4% faster training times for large AI workloads.
  • Energy Efficiency: Average 1 kilowatt reduction per server node, equating to 15% to 20% lower facility-level energy use.
  • Operational Impact: Up to $11.8 million in annual energy-cost savings for large AI data-center deployments.
  • Sustainability: Liquid cooling shifts thermal load from inefficient node-level fans to centralized systems, enabling more accurate Power Usage Effectiveness (PUE) metrics and a reduced carbon footprint.
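The scale of the reported savings can be sanity-checked with a back-of-envelope calculation combining the per-node power reduction with fleet size. The electricity rate and utilization below are illustrative assumptions, not figures taken from the white paper:

```python
# Rough annual energy-cost savings from a per-node power reduction.
# Assumptions (not from the study): nodes run year-round at full load,
# and electricity costs $0.13 per kWh.

def annual_savings(nodes, kw_saved_per_node=1.0,
                   hours_per_year=8760, usd_per_kwh=0.13):
    """Annual cost savings in USD for a fleet of server nodes."""
    return nodes * kw_saved_per_node * hours_per_year * usd_per_kwh

for n in (2000, 5000):
    print(f"{n:,} nodes: ${annual_savings(n):,.0f}/year")
```

With these assumed inputs, a 1-kilowatt reduction per node yields savings in the low millions of dollars per year at 2,000 nodes, on the same order as the study’s facility-scale estimates; the actual range depends on local electricity rates and duty cycle.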

The work underscores FAU’s growing leadership in intelligent systems, data-center innovation, and quantum- and AI-driven computing. Munir’s Intelligent Systems, Computer Architecture, Analytics and Security Laboratory in the College of Engineering and Computer Science is pioneering research that intersects AI hardware acceleration, hybrid quantum-classical computing, and sustainable HPC system design.

“Energy-efficient AI infrastructure isn’t just an engineering optimization – it’s a national imperative,” said Munir. “By rethinking how we cool and power AI factories, we can dramatically increase performance while aligning with global sustainability and cost-reduction goals.”

Collaborators on the white paper include Imran Latif, vice president of global technology and innovation for data centers at Johnson Controls; Alex Newkirk, energy technology researcher at Lawrence Berkeley National Laboratory; FAU doctoral researcher Hayat Ullah; and Kansas State University doctoral researcher Ali Shafique, both supervised by Munir.

The white paper, Comparison of Air-Cooled Versus Liquid-Cooled NVIDIA GPU Systems, can be accessed here.