The global race to develop more sophisticated and versatile AI models is driving a buildout of the physical infrastructure that supports them. Microsoft, Amazon, Meta and Google parent Alphabet – four of the leading players in the AI industry – plan to invest a combined $325 billion in capital expenditures in 2025, much of it on data centers and associated hardware.
According to Yahoo Finance, the average size of proposed data centers doubled from 150 megawatts to 300 megawatts between January 2023 and October 2024. In December, Meta said it would build a $10 billion data center in Louisiana that could consume the equivalent of the energy output of two large nuclear reactors.
The announced spending plans came after the January release of DeepSeek’s highly efficient open AI platform, which led some experts and investors to question how much infrastructure – from high-performance chips to new power plants and electrical equipment – the AI boom really requires. But experts interviewed by Facilities Dive see AI efficiency gains as a net positive for data center developers, operators and customers in the long term, despite short-term uncertainty that could temporarily create overbuilding in some markets.
“All technologies have become more efficient”
DeepSeek’s late-January commercial launch surprised many in North America, but the rise of a more efficient and capable AI model follows a well-worn pattern of technological development, said KR Sridhar, founder, chairman and CEO of power solutions provider Bloom Energy.
“In 2010, it took about 1,500 watts to facilitate the flow of a gigabyte of information,” Sridhar said. “Last year, it took about a tenth of that … [but] the growth in total traffic – and, therefore, in chips and energy – has been twice that amount.”
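Sridhar’s figures illustrate a Jevons-style dynamic: per-gigabyte power fell roughly tenfold, yet total power still grew because traffic grew even faster. A minimal arithmetic sketch, using his approximate numbers and a hypothetical traffic multiplier chosen to match his claim that total energy doubled:

```python
# Illustrative arithmetic based on Sridhar's approximate figures.
# The 20x traffic multiplier is a hypothetical value implied by, not stated in, the quote.

watts_per_gb_2010 = 1500                     # ~1,500 W per gigabyte in 2010, per Sridhar
watts_per_gb_recent = watts_per_gb_2010 / 10  # "about a tenth" of that last year

traffic_growth = 20  # hypothetical: traffic up ~20x over the same period

# Total power scales with (power per gigabyte) x (traffic volume)
total_power_ratio = (watts_per_gb_recent * traffic_growth) / watts_per_gb_2010
print(total_power_ratio)  # 2.0 -> total power doubled despite a 10x efficiency gain
```

The point of the sketch is that efficiency gains alone do not imply lower aggregate energy demand; whether demand falls depends on how fast usage grows relative to efficiency.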
As the development of generative AI models has become more capital-intensive, the payoff from efficiency gains has also grown, Morningstar DBRS senior vice president Victor Leung wrote in a Jan. 28 research note.
“DeepSeek’s approach to building and training its model could be considered a manifestation of this push toward more efficient models, and is to be expected as it and other companies aim to reduce the cost of deploying AI,” Leung said. He added that efficiency improvements could moderate long-term data center demand.
As the tech giants’ recent announcements suggest, the biggest AI players are unlikely to pull back planned investments on account of DeepSeek, Ashish Nadkarni, group vice president and general manager of IDC’s worldwide infrastructure research, told Facilities Dive.
For companies such as Meta and Microsoft, the push to build AI capacity as quickly as possible is “existential,” with Apple the only major technology company to have bucked the trend so far, Nadkarni said. These companies, and others like Elon Musk’s xAI, use large amounts of proprietary data to train differentiated foundation models for emerging “agentic AI” tools and other future innovations, he added.
Less training, more inference and liquid cooling?
Despite the scale of their planned and ongoing AI investments, large technology companies “remain at the starting line,” with commercial payoffs uncertain and probably still distant, Nadkarni said.
Efficiency advances that enable cheaper AI deployments could “stabilize the AI industry in the long term” by reducing the risk of runaway capital spending that could end with “an investment bubble bursting in AI-related sectors,” Leung said.
A reduction in the computing power needed to train new models could tip the balance of new data center deployments toward smaller “inference” data centers, which serve user queries rather than model development and refinement, experts say. That would mean proportionally less development on the scale of Meta’s two-gigawatt Louisiana facility or Compass Datacenters’ planned campus in suburban Chicago, which could eventually include five massive data centers.
“An acceleration of [large language models] in the large training centers will lead to a natural acceleration of inference data centers … located in close proximity to people, processes and machines,” Sridhar said. Inference data centers typically consume tens of megawatts of power, compared with hundreds of megawatts for centralized training facilities.
For now, data center proposals continue to grow, with the average proposed facility roughly doubling in size from about 150 megawatts to 300 megawatts between early 2023 and mid-2024, according to an October 2024 report by Wood Mackenzie.
Even if AI efficiency gains continue, new data centers are unlikely to change their plans to use water-free liquid cooling systems. Compass Datacenters says its Chicago-area campus will use water-free liquid cooling, while Microsoft recently unveiled a new closed-loop cooling system it said could eliminate the need for 125 million liters of water annually at new facilities.
Taylor also expects liquid cooling to play a key role in new inference data centers in “edge” environments closer to users.
“[The] edge has harsh operating and service requirements, such as air pollution, hot and humid climates [and] smaller footprints … liquid cooling shines in these kinds of environments,” he said.
However, some operators of existing air-cooled data centers may rethink capital-intensive plans to retrofit them with liquid cooling systems, which are “still several years” from widespread deployment, Nadkarni said.
“[Liquid cooling] is a much heavier lift for retrofits,” he said.
A reset for the power business
Amid long construction timelines and grid interconnection bottlenecks, data center developers such as Meta and xAI are moving “to secure as much natural gas as possible,” said Gabriel Kra, managing director at venture capital firm Prelude Ventures.
Entergy Louisiana plans to build three combined-cycle gas plants to power the new Meta campus, while xAI deployed more than a dozen gas turbines last year while awaiting a grid connection for its Memphis campus.
In the long term, the growth of inference data centers will create significant demand for local electricity in already-congested metro areas, “further increasing the value of on-site power,” Sridhar said.
But many data centers will still connect to the grid, and efficient AI models call into question the long-term economics of the premium-priced power deals announced recently, Kra said.
“If AI load growth is suddenly reduced by a factor of five or ten, that has big implications,” Kra said. “We’ve heard about recent launches of companies that are premised on expensive contracts with hyperscalers … [this] seriously calls those business models into question.”
On the bright side, lower-than-expected data center energy demand could pay environmental dividends by reducing the need for older coal-fired power plants, Kra added.
Tailwinds for colocation and on-premise compute
If AI models continue to post efficiency gains, large enterprises such as global automakers may bring more AI development and computing power in-house rather than relying on third-party resources, Nadkarni said.
That would require capital investment by those companies, but it could reduce demand for centralized AI compute, he said.
Colocation data centers, where customers rent space to house their computing equipment rather than maintaining their own facilities, are meanwhile “growing extremely fast,” Taylor said.
Colocation facilities are typically located in well-connected, well-powered sites near population centers, making them attractive to users who need computing power on short notice, said Scott McCrady, a customer executive at colocation firm Cologix. Those located at major network hubs also offer low-latency compute, which is essential for inference applications whose users expect fast outputs, he said.
However, experts say the tailwinds benefiting on-premise compute and colocation may not be enough to sustain the current data center boom.
For starters, hyperscalers may push for shorter leases on large-scale facilities as they weigh uncertainty around future computing requirements, Leung said.
And the broader industry could be in for a temporary period of overcapacity, similar to what happened after the dot-com boom of the late 1990s and early 2000s, Nadkarni said. It took years to absorb the glut of fiber networks and data centers built at that time.
“I have a concern that we’ll see the same thing this time,” Nadkarni said. “I don’t think compute [demand] can keep up with the construction at the same pace.”