Johnny Clemmons is global vice president and head of industries at software company SAP. The opinions are the author’s own.
Artificial intelligence is driving a massive and unprecedented expansion of infrastructure, including $40 billion in annual data center spending in the US alone.

The pace of data center construction is accelerating worldwide. OpenAI’s $400 billion Stargate initiative will add 10 gigawatts of data center capacity with partners Oracle and SoftBank, alongside Amazon’s $100 billion global data center expansion effort. Moody’s projects $3 trillion in global data center spending over the next five years.
As hyperscalers and others invest to meet the growing appetite for AI data capacity, it is vitally important that they also bring more standardization and modularity to data center design and construction.
In short, the architectural aesthetics of these projects are secondary to efficiency, safety, security and repeatability. This means that unless a particular project has unique environmental, location or regulatory requirements, highly customized designs are unnecessary, even frivolous.
In most building design, the occupant experience comes first. For data centers, while the safety and comfort of those working on-site should remain a high priority, the primary goal is creating an optimal environment for the server racks and their contents.
As much as stakeholders may feel inclined to treat a data center like most other buildings, with customizations and unique design elements to make the building more attractive and the occupant experience richer, these considerations take a backseat to standardization, predictability, modularity, repeatability, and sustainability.
In fact, data centers can be more repeatable than most other types of construction projects. Many of their critical components—power density, cooling strategies, safety zones, structural loads, and commissioning requirements—tend to follow predictable patterns. Once these patterns are identified, with the help of artificial intelligence of course, they can be used to inform design and construction strategies from the start.
This, in turn, creates an opportunity to develop a repeatable model or formula that data center developers can easily apply to future projects.
Excessive customization leads to longer lead times, higher and less predictable costs, inconsistent quality and greater risk. Modularization, on the other hand, allows projects to be designed, approved, built and commissioned more quickly, at a more predictable and often lower cost, with fewer surprises along the way.
Here are the specific advantages that modularization brings:
Speed: Modularization and standardization can significantly speed up projects without compromising quality. The ability to develop one-megawatt modules, for example, with standardized HVAC configurations, materials, and labor requirements allows developers to scale individual components and then assemble the project as a whole. AI tools can help design and configure the modules themselves.
Off-site manufacturing of modules, as well as continuous commissioning within a project, where stakeholders can identify and address issues as they arise, offer additional time savings.
Better cost control, if not cost reduction: Modularization brings greater certainty to project costs and likely cost savings, thanks to less customization and economies of scale in purchasing.
Stricter standardization for interior specifications: Inside a data center, every tenth of a degree of temperature and every millimeter of space counts. Miscalculations can lead to costly inefficiencies and even equipment failures. An intelligent, repeatable model draws on historical performance data and design specifications to optimize the data center’s internal configuration.
Along these lines, AI can create and manage a reference library of data center-specific construction best practices. This simplifies construction, especially for those with less experience.
Predictability in terms of equipment and labor requirements: Repeatable designs allow stakeholders to develop standard requirements for the parts, tools, equipment, and skilled labor needed on site, as well as for how long. These factors can then be adjusted according to the site conditions of a specific project.
Project monitoring and quality control: The modular approach allows contractors and other project stakeholders to monitor construction processes, progress and quality in near real time to ensure everything is delivered, built and installed to specification and code. Because the project is predictable and follows a template, AI can also help here, monitoring the work and alerting the right people when discrepancies arise.
Sustainability monitoring and reporting: Smart tools can help project stakeholders measure, track, and report carbon footprint, use of recycled materials, and other important sustainability-related metrics to meet regulatory requirements and internal goals/benchmarks.
Put all these elements together and the result is a repeatable set of specifications, processes and practices that bring data centers online faster, at a more predictable and likely lower cost, without compromising performance. This is how our AI future should be built.
