The intelligence revolution: where data centre AI is going next



AI could add US$13 trillion to global economic output in the next decade

Capable of reducing overheads and protecting systems, AI is becoming integral to data centre management. With estimates that it could add US$13 trillion to global economic output in the next decade, Melanie Mingas explores the latest breakthroughs and asks where the tech is likely to go next.

This article first featured in the Apr/May 2020 edition of the Data Economy Magazine.

In the world of consumer tech, Artificial Intelligence (AI) has been making headlines for years. From the animation of robo-pets to writing music and news stories, AI’s only barrier, ironically, is human imagination.

In the data centre, AI is predominantly utilised to tackle the greatest challenge in modern computing: how to increase capacity while reducing the demand for energy.

In response to this brief, many AI developers are focused on creating tools to promote eco-friendly facilities management, while also potentially reducing HR overheads.
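To make the idea of an eco-friendly facilities tool concrete, the sketch below searches for the cooling setpoint that minimises estimated total power draw for a given IT load. It is purely illustrative: the cost model, coefficients and function names are all invented for this example and do not describe any vendor's product.

```python
# Minimal sketch (hypothetical model): pick the cooling setpoint that
# minimises estimated total facility power for a given IT load.

def estimated_power_kw(setpoint_c, it_load_kw):
    """Toy cost model: chiller power falls as the setpoint rises,
    while fan power climbs once the room runs warm."""
    chiller = it_load_kw * 0.30 * max(0.0, 27.0 - setpoint_c) / 10.0
    fans = it_load_kw * 0.02 * max(0.0, setpoint_c - 24.0) ** 2
    return it_load_kw + chiller + fans

def best_setpoint(it_load_kw, candidates=range(18, 28)):
    """Grid-search the candidate setpoints for the cheapest one."""
    return min(candidates, key=lambda s: estimated_power_kw(s, it_load_kw))

print(best_setpoint(500))  # → 25
```

In a real deployment the hand-written cost function would be replaced by a model trained on the facility's own sensor data, which is exactly why Koerner stresses collecting the right data first.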

“You can think big but start with a smaller project,” says Ivo Koerner, VP of systems hardware at IBM Europe. “The investment you need to drive in a data centre is similar to all AI projects – you need to ensure you are collecting the right data, you have it, and you potentially find ways to enrich that data with external data to get better insight,” he adds.

IBM’s exploration of AI started in the 1950s and evolved to advance the areas of machine learning, deep question answering and cognitive architectures, among others.

IBM is behind the Deep Blue chess-playing programme and the Watson machine learning system, and AI is embedded in its sales process and production environment inspections. Today, the firm isn’t just implementing solutions across its network of 60 data centres in six regions; for its outsourcing-based business model, efficiency is the Holy Grail and automation through AI is key to that efficiency.

“AI allows you to automate higher complexity tasks because you leave rules-based process optimisation and enter an optimisation level where you really need to have knowledge and skills,” Koerner explains.

“We see that in data centre operations but also in different industries where clients augment operations with AI; those are very complex scenarios and operating environments,” he continues.

This work isn’t without challenge. According to Koerner, the tools and skills needed to build an AI solution aren’t easily found in the marketplace, meaning data centre operators have three choices: build on-premise, purchase infrastructure, or utilise a cloud offering – and the work doesn’t end there.

“It starts with building the right technology by sourcing the right skills. You need to find the right way to apply AI but you should never forget that it’s also going to create a cultural change in your organisation,” says Koerner. “Of course, you then need to think how you can inject AI into the decision process of the business, because that is the ultimate goal,” he adds.

OUT OF THIS WORLD

Located at the junction of multiple subsea cables and satellite feeds in Cornwall, UK, is Goonhilly, a carrier-grade satellite ground station that not only acts as a communication and data hub but generates masses of its own data.

It also offers more than 2,000 sqm of AI-augmented data centre space for third-party use.

“AI is complex maths run by algorithms that we are using to solve very complex challenges. It challenges your design philosophy and also your build ethics,” says Chris Roberts, head of data centre and cloud at the Goonhilly Earth Station.

Goonhilly began integrating AI a little over a year ago, initially implementing a solution to address power density per rack and most recently looking at immersion liquid cooling. Today an additional goal is on the cards: to facilitate collaboration between academia and enterprise and, ultimately, progress AI capability.

In mid-2019, the station opened a new, green data centre as part of this goal. It offers high performance GPU-based compute and storage for decentralised and centralised AI and ML applications for the automotive, life sciences and aerospace sectors.

“You can’t just talk about AI; you have to live it, and collaboration is incredibly important. Even though we have some fantastically clever scientists and engineers who know all about space and data centres and connectivity, we aren’t all astrophysicists and not all of us have skills in the new applications that are coming out,” says Roberts.

Advising on how data centre operators can mobilise such information, he adds: “Know your place, know where you fit in it and work very hard to get the right blend of partners in to deliver value back to the rest of the world.”

INTELLIGENT SECURITY

Security is one area where AI is coming into its own. Already some of the most secure places on the planet, data centres are now looking to AI to keep ahead of both physical and virtual threats. According to Andrew Tsonchev, director of technology at Darktrace, over recent years AI has evolved from completing low-level mundane tasks, such as basic instructions, to more creative high-level tasks.

“What has happened is an encroachment of AI up this hierarchy and I’m not sure how far it will go. One optimistic view is that in the future AI will do everything – from protecting our networks from threats to designing the networks and data centres,” Tsonchev comments.

“From a security point of view the good news is you can certainly think of adopting AI as a force multiplier at the minute in terms of freeing up people to do the more thoughtful tasks,” he adds. While we remain “very far” from the point where AI is fully autonomous and predictive – and even further from a future where it designs its own next generation technology – today AI’s main draw is the resilience it facilitates.

As Tsonchev highlights, the point of investing in such solutions is not to avoid attack but to create a system that can survive adverse events. “For that you need AI that will immediately respond and make smart decisions about how to contain stuff,” Tsonchev says.

Those smart decisions are based on modelling techniques, which essentially build a decision tree towards a workable solution: discovering what would happen if something were unplugged, without actually unplugging it.

That solution can then be modelled by the AI and if it doesn’t work, another can be explored until the issue is resolved.

“Most knowledge is lost in the complexity of systems, but if you can extract that knowledge and tell a system that there is an attacker in the network, you can then ask it to recommend a solution to contain the threat,” Tsonchev says.
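The "unplug it in simulation, not in reality" idea above can be sketched as a toy search over containment actions. Everything in the snippet is hypothetical: the network map, the host names and the assumption that cutting a single link is the response are invented for illustration and do not represent Darktrace's actual method.

```python
# Hypothetical sketch: simulate cutting each network link, then keep
# only the cuts that isolate the attacker from the critical asset
# while leaving legitimate critical paths intact.
from collections import deque

NETWORK = {                      # adjacency: who can reach whom
    "attacker": {"web"},
    "web": {"db", "backup"},
    "ops": {"db"},
    "db": set(),
    "backup": set(),
}
CRITICAL_PATHS = [("ops", "db")]  # must survive any containment action

def reachable(graph, start):
    """Breadth-first search: every node reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def simulate_cut(graph, link):
    """Return a copy of the graph with one link 'unplugged'."""
    src, dst = link
    copy = {n: set(edges) for n, edges in graph.items()}
    copy[src].discard(dst)
    return copy

def containment_options(graph):
    """Try each cut in simulation; keep the ones that work."""
    links = [(s, d) for s, edges in graph.items() for d in edges]
    good = []
    for link in links:
        trial = simulate_cut(graph, link)
        isolates = "db" not in reachable(trial, "attacker")
        paths_ok = all(d in reachable(trial, s) for s, d in CRITICAL_PATHS)
        if isolates and paths_ok:
            good.append(link)
    return good

print(containment_options(NETWORK))
```

Here only two cuts both contain the attacker and preserve the ops-to-db path; a real system would rank such options by further criteria, such as service impact, before recommending one.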




WHERE NEXT?

In attaching a dollar value to the immediate growth trajectory for AI tech, the World Bank’s official estimate draws on a 2018 McKinsey Global Institute report, which predicts a gradual 16% increase in global economic output, valued at US$13 trillion, by 2030.

Adopting a slightly more urgent tone, a 2019 report published by Gartner concluded that more than 30% of data centres that fail to implement AI solutions “won’t be operationally and economically feasible by 2020”.

In conclusion, the intelligence might be artificial, but the benefits can be very real. For those who own and operate data centres, the pressure is certainly on to create robust infrastructure that is scalable, personnel-light, secure and, wherever possible, not terrible for the environment.

AI has taken a role in all of this, but its full potential is far from realised.

“The more altruistic we can be the better. I would think we need to focus on improving the flow of data and enabling people to be more efficient,” says Roberts. “AI generally is going to touch everything – how do you narrow it down?”
