How ChatGPT turned the data centre industry on its head

Danny Quinn, MD of DataVita, Scotland’s largest data centre provider

I've witnessed firsthand the transformative effects of the artificial intelligence (AI) revolution on the data centre industry. Historically, attracting data centre opportunities to Scotland posed significant challenges – primarily due to the lack of geographical necessity for many businesses to locate here. However, the landscape has dramatically shifted since the beginning of 2022, driven by the exponential rise of AI.

In early 2022, our data centre, with a 16MW capacity, had sold approximately 3MW over five years. That changed almost overnight with the explosion of AI demand. To meet it, we've more than doubled our capacity to 40MW and plan to bring over 300MW online in the next few years, all driven by the needs of AI.

AI became mainstream almost overnight, thanks to ChatGPT. The core technology behind ChatGPT isn't new – deep neural networks have dominated the field since around 2012, and Google introduced the transformer architecture in 2017. But ChatGPT captured public attention because of OpenAI's substantial investment in computational resources, training large-scale models like GPT-3, with its 175 billion parameters – a quantum leap from previous models.

Researchers still don't fully understand why stacking many transformer layers produces such powerful models, but their success lies in the complex and effective way those layers process and integrate information. Advances in hardware affordability, cloud computing efficiency, and algorithms made training models at this scale feasible.

OpenAI's demonstration of what is possible with vast computational power ignited a global AI arms race. Now, everyone from enterprises to the public sector wants a piece of what is likely to be one of the most defining technologies in human history. The entry point into this market, based on our experience, starts at around 2,000 GPUs – equating to roughly 3MW of power. However, this is just the beginning, and that number will almost certainly rise.
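As a rough sanity check on those figures, the per-GPU power they imply is simple arithmetic (a back-of-the-envelope sketch; the ~1.5kW-per-GPU result bundles cooling, networking, and other facility overheads, and is inferred from the article's numbers rather than quoted from any hardware spec):

```python
# Back-of-the-envelope: what does an entry point of 2,000 GPUs at ~3 MW imply per GPU?
GPUS = 2_000
SITE_POWER_MW = 3.0  # figure quoted in the article

watts_per_gpu = SITE_POWER_MW * 1_000_000 / GPUS
print(f"Implied draw per GPU (incl. facility overhead): {watts_per_gpu:.0f} W")  # 1500 W
```

Roughly 1.5kW per GPU is plausible for a fully burdened figure, since a modern data-centre accelerator can draw several hundred watts at the chip alone before cooling and power-distribution losses are counted.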

The current AI arms race comprises three main components:

1. Access to the latest hardware: Securing the newest and most powerful GPUs and other hardware is crucial.

2. Data centre space: The physical space to house this hardware is in high demand.

3. Power availability: Ensuring sufficient power to run these systems is critical.

For instance, Meta aims to deploy 600,000 GPUs by the end of the year, requiring 180MW of power – equivalent to the consumption of a small city. It's worth bearing in mind that this is just one company's demand, and it alone will drive steep competition for access to hardware and infrastructure.
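The same arithmetic applied to Meta's reported target is instructive (again a sketch based on the article's figures; the 300W-per-GPU result is closer to a single accelerator's chip-level draw, which may suggest the 180MW figure excludes cooling and other overheads – an inference, not a stated fact):

```python
# Back-of-the-envelope: Meta's reported target of 600,000 GPUs at 180 MW.
GPUS = 600_000
POWER_MW = 180.0  # figure quoted in the article

watts_per_gpu = POWER_MW * 1_000_000 / GPUS
print(f"Implied draw per GPU: {watts_per_gpu:.0f} W")  # 300 W
```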

As a result, many leading AI companies are now exploring the use of private nuclear reactors to support their new models. What began as an IT challenge has morphed into a classic resource grab, where the fastest and most efficient deployments will dominate the market.

We are in an exciting yet uncertain time. The primary goal in the coming years is to be at the forefront of this revolution, despite the inherent risks and challenges involved. The AI-driven boom has already transformed the data centre industry, and its impact will continue to shape the future of technology and resource management in the years ahead.