Generative AI consumes enormous amounts of electricity and water, a load the aging U.S. grid is straining to handle.

Anamika Dey, editor

Brief news

  • The rapid expansion of artificial intelligence has driven a surge in data center construction and power consumption.
  • Low-power CPUs from companies like Arm are popular among big tech companies because they can cut data center power usage by 15%.
  • The energy demands of AI, particularly in data centers, are a growing concern, with estimates showing that training and running AI models contribute significantly to greenhouse gas emissions.

Detailed news

The rapid expansion of artificial intelligence has set off a boom in data center construction, and the power needed to run and cool the servers has risen sharply. There are growing worries about whether the U.S. can generate enough electricity for broad AI adoption, and whether its aging grid can handle the load.

“If we don’t change our perspective on the power problem soon, our aspirations will remain out of reach,” said Dipti Vachani, head of automotive at Arm. The semiconductor company’s low-power CPUs are popular with tech giants like Google, Microsoft, Oracle, and Amazon, chiefly because they can reduce data center power usage by 15%.

Nvidia’s newest AI processor, Grace Blackwell, incorporates Arm-based CPUs. Nvidia claims it can run generative AI models on 25 times less power than the previous generation.

“Conserving every ounce of power will require a completely different design approach compared to when the goal is to maximize performance,” Vachani said.

Optimizing computing efficiency is central to solving the AI energy dilemma: lowering power usage while maintaining performance. But efficiency gains alone fall well short of what is needed.

Goldman Sachs found that a ChatGPT query uses approximately 10 times the energy of a Google search. Generating an AI image uses about as much energy as charging a smartphone.
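For a sense of scale, here is a back-of-the-envelope sketch. The article gives only the roughly 10x ratio; the absolute figure of about 0.3 watt-hours per Google search, the query volume, and the per-home usage are outside assumptions for illustration, not numbers from this report.

```python
# Back-of-the-envelope energy comparison. The ~0.3 Wh per Google
# search is a commonly cited estimate; the article only states the
# ~10x ratio, so treat the absolute numbers here as assumptions.

GOOGLE_SEARCH_WH = 0.3                     # assumed Wh per Google search
CHATGPT_QUERY_WH = GOOGLE_SEARCH_WH * 10   # ~10x, per the Goldman Sachs comparison

queries_per_day = 1_000_000                # hypothetical daily query volume
daily_kwh = queries_per_day * CHATGPT_QUERY_WH / 1000

print(f"One ChatGPT query: ~{CHATGPT_QUERY_WH} Wh")
print(f"{queries_per_day:,} queries/day: ~{daily_kwh:,.0f} kWh/day")

# At ~30 kWh/day for a typical U.S. home (another assumption), that
# volume equals the daily electricity use of roughly this many homes:
print(f"~{daily_kwh / 30:,.0f} homes' worth of daily electricity")
```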

The problem is not new. In 2019, estimates showed that training a single large language model produced carbon emissions equal to the lifetime emissions of five gas-powered cars.

The hyperscalers building data centers to handle this growth are also driving up emissions. Google’s environmental report shows a roughly 50% rise in greenhouse gas emissions between 2019 and 2023, with its data centers’ energy use contributing to the increase, even though the report notes those data centers are 1.8 times more energy efficient than average. Microsoft’s data centers likewise contributed to a large emissions rise between 2020 and 2024.

In Kansas City, where Meta is building an AI-focused data center, power demand is so high that plans to close a coal-fired power plant have been delayed.

The pursuit of energy
Over 8,000 data centers exist worldwide, the majority of them in the U.S., and the count is expected to rise significantly before the end of the decade owing to AI. Boston Consulting Group expects data center demand to grow 15%–20% per year through 2030, by which point data centers are projected to consume 16% of U.S. power. That share has climbed dramatically since OpenAI released ChatGPT in 2022, and it is equivalent to the energy consumption of a large fraction of U.S. households.
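As a rough sanity check on that projection, a minimal sketch compounding 15%–20% annual growth; the 2024 start year and the baseline of 1.0 (today's demand) are assumptions for illustration.

```python
# Compound data center demand growth per the BCG projection of
# 15%-20% per year through 2030. The 2024 baseline year and the
# normalized baseline of 1.0 are assumptions for illustration.

def compounded(rate: float, years: int, baseline: float = 1.0) -> float:
    """Demand after `years` of growth at `rate` per year."""
    return baseline * (1 + rate) ** years

years = 2030 - 2024  # six years of growth from an assumed 2024 baseline
low, high = compounded(0.15, years), compounded(0.20, years)
print(f"By 2030: {low:.1f}x to {high:.1f}x today's demand")
# 1.15^6 is about 2.3x and 1.20^6 about 3.0x, so demand roughly
# doubles to triples under the projected growth rates.
```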

CNBC visited a data center in Silicon Valley to learn how the industry is coping with rapid expansion and power shortages.

Jeff Tench, Vantage Data Centers’ executive vice president for North America and APAC, expects demand from AI-specific applications to outpace that of cloud computing.

Many prominent tech companies host their servers with Vantage. Its data centers draw enough power to supply tens of thousands of homes, according to Tench.

Some individual customers lease entire facilities, and for AI applications that demand can rise to hundreds of megawatts, Tench said.

Vantage is based in Santa Clara, California, a desirable location for data centers because it sits close to customers with heavy data demands; Nvidia’s headquarters was visible from the roof. Tench attributed a slowdown in Northern California to a power shortfall from local utilities.

Vantage is opening campuses in Ohio, Texas, and Georgia.

Tench said the company looks for locations with ready access to wind and solar energy, and for ways to tap existing infrastructure, whether by converting coal-fired plants to natural gas or by drawing power from nuclear plants.

Several AI companies and data center operators are exploring ways to generate power on-site.

Sam Altman, CEO of OpenAI, strongly believes this is necessary. His latest investment is a solar company that makes shipping-container-sized modules combining panels and power storage. Altman has also invested in Oklo, a nuclear fission company building small A-frame reactors, and in Helion, a nuclear fusion startup.

Last year, Microsoft agreed to buy fusion electricity from Helion starting in 2028. Google has partnered with a geothermal firm that says its plant can generate enough power for a large data center. Vantage has built a 100-megawatt natural gas plant that powers one of its Virginia data centers off the grid.

Power grid strengthening
Even when enough power is generated, the aging grid often fails to deliver it: transmission from where power is produced to where it is consumed is congested. One option is adding hundreds or thousands of kilometers of new transmission lines.

Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside, says that process is costly and time-consuming, and the expense is often passed on to residents through higher utility bills.

Ratepayers in Virginia’s “data center alley” opposed a plan to build new transmission lines, fearing higher bills.

Another option is predictive software that can reduce failures at one of the grid’s weakest points: the transformer.

Every unit of power produced must pass through a transformer, says VIE Technologies CEO Rahul Chaturvedi, who estimates the U.S. has 60–80 million of them.

The average transformer is 38 years old, making them a common cause of power disruptions, and replacing them is expensive and slow. VIE’s sensor is small and attaches easily to a transformer; it predicts failures and identifies which transformers can carry more load, so that load can be shifted away from those at risk of failing.
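VIE's actual models are proprietary, so purely as a toy illustration of the general idea, flagging a transformer whose sensor readings drift from its own baseline, here is a sketch; the function, readings, and z-score threshold are all invented for illustration and are not VIE's method.

```python
# Toy anomaly check for transformer health monitoring. This is NOT
# VIE Technologies' method; the readings and the z-score threshold
# are invented purely for illustration.
from statistics import mean, stdev

def flag_anomaly(history: list[float], latest: float, z_threshold: float = 3.0) -> bool:
    """Flag a reading that drifts far from the transformer's own baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# Example: vibration readings from a healthy baseline, then a spike.
baseline = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50]
print(flag_anomaly(baseline, 0.51))  # False: within the normal range
print(flag_anomaly(baseline, 0.95))  # True: candidate for load shifting
```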

Chaturvedi says business has grown significantly since ChatGPT’s introduction in 2022 and will likely keep rising next year.

Keeping servers cool
Ren estimates that generative AI data centers will need 4.2 billion to 6.6 billion cubic meters of water by 2027 to maintain ideal temperatures, more than half the U.K.’s annual water withdrawal.

Many people worry about AI’s energy use, but that problem is solvable, argues Tom Ferguson, managing partner of Burnt Island Ventures, pointing to options like nuclear power. Water, he believes, is the factor that will truly shape AI.

Ren’s research team found that every 10 to 50 ChatGPT prompts consume about as much water as a standard 16-ounce bottle.
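Worked out per prompt, that implies only a few tens of milliliters each; a quick sketch using the article's figures and the standard fluid-ounce conversion.

```python
# Per-prompt water use implied by Ren's estimate: every 10-50
# ChatGPT prompts consume roughly one 16-ounce bottle of water.
BOTTLE_OZ = 16
OZ_TO_ML = 29.5735  # standard U.S. fluid-ounce-to-milliliter conversion

for prompts in (10, 50):
    ml_per_prompt = BOTTLE_OZ * OZ_TO_ML / prompts
    print(f"{prompts} prompts/bottle -> ~{ml_per_prompt:.0f} mL per prompt")
# ~47 mL per prompt at 10 prompts per bottle, ~9 mL at 50.
```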

Much of that water goes to evaporative cooling. Vantage’s Santa Clara data center, however, uses large air-conditioning units that cool the facility without consuming water.

Direct-to-chip cooling using liquid is another approach.

For many data centers, that would require extensive retrofitting. Tench said Vantage adopted a design about six years ago that lets it tap into a cold-water loop on the data hall floor.

Apple, Samsung, and Qualcomm have touted on-device AI, which saves power by keeping energy-intensive queries off the cloud and out of capacity-constrained data centers.

“We will have as much AI as those data centers can support, and it may be less than what people aspire to. But many people are working on ways to ease those supply constraints,” Tench said.

Source: CNBC
