NVIDIA: How AI can Make Data Centres Sustainable
As AI’s capabilities continue to expand, the market is expected to keep growing rapidly.
Companies across virtually every industry are exploring and implementing AI.
NVIDIA, a dominant force in AI development, believes AI itself can help reduce energy consumption.
Joshua Parker, Senior Director of Corporate Sustainability at NVIDIA, says: “AI, I firmly believe, is going to be the best tool that we’ve ever seen to help us achieve more sustainability and more sustainable outcomes.”
The problem with AI and data centres
Data centres host a wide range of workloads, including data storage, cloud services, media streaming and AI.
They usually run 24/7 to meet constant demand for these services and can contain tens of thousands of devices.
These machines need constant power and cooling, requiring huge amounts of electricity.
Jensen Huang, CEO and Co-Founder at NVIDIA, explains: “Data centres are already about 1-2% of global electricity consumption and that consumption is expected to continue to grow.
“This continued growth is not sustainable, neither for operating budgets nor for our planet. Accelerated computing is now the most sustainable way to advance computing.”
About NVIDIA
Chip-maker NVIDIA has become one of the largest American companies by market cap and is a dominant supplier of AI hardware and software.
The company says it invented the graphics processing unit (GPU) in the 1990s, a technology now essential to modern computing.
Headquartered in California, USA, NVIDIA has more than 26,000 employees.
In March 2024 it became the third company in the history of the US to close with a market capitalization in excess of US$2tn.
In 2023 the company had revenue of US$60.9bn.
How AI can help data centres to become more sustainable
NVIDIA says that AI’s “superpower” lies in its ability to optimise workloads using accelerated computing platforms.
These platforms combine GPUs and CPUs that can handle complex computations quickly and efficiently.
NVIDIA says these systems can be up to 20 times more energy efficient than traditional CPU-only systems for AI inference and training.
Joshua says: “The change in efficiency is really, really dramatic.
“If you compare the energy efficiency for AI inference from eight years ago until today, it’s 45,000 times more energy efficient.”
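To put those figures in perspective, here is a quick back-of-envelope calculation. It assumes the 45,000x gain compounds smoothly year over year, and the 1,000 MWh workload is purely hypothetical, chosen only to illustrate the “up to 20 times” claim:

```python
# Back-of-envelope check of the efficiency figures quoted above.
# Assumption: the eight-year gain compounds smoothly year over year.

total_gain = 45_000      # quoted efficiency gain for AI inference over eight years
years = 8

# Implied average year-over-year improvement factor
annual_factor = total_gain ** (1 / years)
print(f"~{annual_factor:.1f}x more efficient each year")  # roughly 3.8x

# Illustrative savings from the "up to 20x" accelerated-computing claim:
# a hypothetical workload drawing 1,000 MWh on CPU-only systems
cpu_only_mwh = 1_000
accelerated_mwh = cpu_only_mwh / 20
print(f"{cpu_only_mwh} MWh on CPUs vs {accelerated_mwh:.0f} MWh accelerated")
```

In other words, a 45,000-fold improvement over eight years works out to roughly a 3.8x gain every year, compounding.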
Direct-to-chip liquid cooling allows data centres to cool more efficiently than traditional air conditioning.
“Our recommended design for the data centres for our new B200 chip is focused all on direct-to-chip liquid cooling,” Joshua explains.
AI addressing climate change
AI can do more than just optimise energy consumption.
AI-enhanced weather forecasting is becoming more accurate, helping communities prepare for climate-related weather events.
Digital twins can also make a big impact, allowing companies to simulate and optimise energy consumption without having to make changes in the real world.
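The idea can be sketched in a few lines. The model below is a toy digital twin of data-centre cooling, with entirely made-up numbers and a hypothetical `cooling_energy_kw` function; it is not NVIDIA’s tooling, just an illustration of testing setpoints in software before touching the real facility:

```python
# A toy "digital twin" of data-centre cooling: a simplified model
# (all coefficients hypothetical) used to compare cooling setpoints
# in simulation rather than on live equipment.

def cooling_energy_kw(setpoint_c: float, it_load_kw: float) -> float:
    """Very rough model: cooling draw falls as the setpoint rises."""
    baseline = 22.0                              # reference setpoint in degrees C
    factor = 0.35 - 0.02 * (setpoint_c - baseline)
    return it_load_kw * max(factor, 0.1)         # floor so draw never hits zero

# Compare candidate setpoints entirely in simulation
it_load = 500.0  # kW of IT equipment
for setpoint in (20.0, 22.0, 24.0, 26.0):
    total = it_load + cooling_energy_kw(setpoint, it_load)
    print(f"setpoint {setpoint:.0f} C -> total draw {total:.0f} kW")
```

A real digital twin would be calibrated against sensor data from the actual facility, but the workflow is the same: change a parameter in the model, observe the simulated energy impact, and only then roll the change out physically.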
AI is also supporting the development of new materials for clean energy technologies such as electric vehicles and solar panels.
Joshua says: “AI and accelerated computing in general are game-changers when it comes to weather and climate modelling and simulation.”