How will AI impact the data center industry?
Release time:
2023-08-25
Developments in artificial intelligence (AI), especially generative AI products such as ChatGPT, have dominated media headlines over the past year. Beyond the potential to disrupt or improve everyday life, an often overlooked consequence of AI, as with every widely adopted technology, is its impact on the data center.
Having weathered the introduction and rapid adoption of mobile devices and the cloud, data centers have become adept at taking a proactive approach to new technologies.
With AI still in a relatively immature state, this is a critical time for data center professionals to consider how to respond to the coming AI boom.
Adapt to new workloads
Artificial intelligence can be divided into four broad categories: natural language processing (NLP), computer vision, machine learning, and robotics. Robotics is particularly sensitive to latency, so it typically relies on edge computing deployed close to the physical processes being managed. The first three categories, however, are expected to drive real growth in demand for data center solutions.
Meeting this growing demand is no easy task. Not only do you need to consider the physical impact of hosting a large number of servers to accommodate higher density workloads, but you also need to consider how to integrate new technologies, such as liquid cooling and immersion cooling, to combat the heat that these servers will generate.
In addition, the load is unstable: a huge surge can occur at any time, whereas the loads data centers have historically managed were fairly smooth and consistent.
One of the biggest challenges is that AI is not a homogeneous entity, but a technology with two distinct phases: training and inference.
Successful data centers will learn to adapt to both. AI training need not focus on resilience and redundancy, but rather on cost, PUE (power usage effectiveness), and overall efficiency. Inference, on the other hand, is very sensitive to latency and needs to sit close to metropolitan centers to ensure fast response times for user interfaces and applications.
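The PUE metric mentioned above is simply total facility energy divided by IT equipment energy, so values approach 1.0 as cooling and power-distribution overhead shrink. A minimal sketch, with purely illustrative readings:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical snapshot: 1,500 kW total facility draw, 1,000 kW of IT load.
print(round(pue(1500, 1000), 2))  # -> 1.5
```

A training-oriented facility optimizing for efficiency would aim to push this ratio toward 1.0, for example by cutting cooling overhead with the liquid and immersion techniques discussed later.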
Regulatory aspects
The difficulty for regulators is that no one yet knows how AI will develop. The technology is largely in its infancy, and regulators understandably want to cover all potential hazards.
The EU's AI Act is a clear example: regulators classify applications into four key risk levels: unacceptable risk, high risk, limited risk, and minimal or no risk. Meanwhile, the NIS2 directive expands the number of sectors expected to comply with its original cybersecurity regulations, which now include the digital realm.
The challenge for many industries, including data centers, will be ensuring compliance with changing regulations. AI is advancing faster than anything we've seen in recent years, and data centers are sure to feel the ripple effects as regulators keep updating parameters and defining new risk boundaries.
Addressing critical shortages
It is well known that the strategic value of microprocessors makes them subject to government trade restrictions. With the acceleration of diverse AI adoption, and the huge workloads required for these applications, graphics processing units (GPUs) are becoming increasingly scarce.
Scaling up production is not a simple solution. Recent estimates suggest that building a two-nanometer chip fab in the United States or Europe would cost about $40 billion. While there has been a concerted effort to spread production across multiple regions, and businesses such as Vattr and Northern Data are moving in earnest to create a whole new "AI cloud" industry, microprocessor shortages are sure to remain a sore point until supply matches demand.
The data center shortage is also a concern, but the challenge here is not innovation, but limited land and power resources, not to mention politics.
Addressing the data center shortage requires a two-pronged approach: (a) maximizing power capacity to provide the low latency levels AI requires, and (b) doing the same in areas with more available land. Finding remote locations for AI training, so that it does not consume the capacity metropolitan areas need for inference, is a particularly valuable approach.
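The two-pronged siting rule above can be sketched as a simple placement policy: latency-sensitive inference goes to metro sites, while latency-tolerant training goes to remote sites with more land and power headroom. All site names and fields here are illustrative, not a real scheduler:

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    metro: bool               # close to a population center (low latency)
    power_headroom_kw: float  # spare power capacity at the site

def place(workload: str, sites: list[Site]) -> Site:
    """Inference prefers metro sites; training prefers remote ones."""
    preferred = [s for s in sites if s.metro == (workload == "inference")]
    candidates = preferred or sites  # fall back if no preferred site exists
    # Among acceptable sites, pick the one with the most spare power.
    return max(candidates, key=lambda s: s.power_headroom_kw)

sites = [Site("paris-edge", True, 500), Site("rural-north", False, 5000)]
print(place("training", sites).name)   # -> rural-north
print(place("inference", sites).name)  # -> paris-edge
```

The point of the sketch is the separation of concerns: training capacity can be optimized purely for land and power, leaving scarce metro capacity free for inference.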
Reconfigure the Data Center for Artificial Intelligence
This concept of maximizing existing resources should shape how data centers are reconfigured, because it puts sustainability at the heart of the strategy.
In France, "zéro artificialisation nette" (net zero land take) is a commitment designed to stop urban sprawl and preserve the biodiversity of green spaces. For data centers, this means fully exploiting the potential of existing buildings and making those sites as dense as possible. Doing so, however, requires some reconfiguration.
You need to evaluate how to maximize the space at these existing sites in order to prioritize efficiency to support high AI workloads. Sustainability is no longer an intangible concept, but a very real issue that should define reconfiguration strategies on a global scale.
If you don't start making better decisions to extend the life of data center equipment, such as moving to liquid and immersion cooling technologies, then earlier efforts to adapt to AI-intensive infrastructure will largely be in vain.
Staying ahead of the AI revolution is an ambitious goal for any industry, including the data center. But by adopting advanced cooling technologies, complying with ever-changing regulations, and taking every opportunity to advocate for sustainable development, we believe there is potential to thrive in this new technological era.