Saturday, November 25, 2017


‘AI is going to be one of the most spectacular drivers for the edge,’ says VaporIO president



A driverless car is travelling along a motorway in sunny California when, suddenly, the vehicle in front of it brakes hard. The car cannot react in time, crashes, and kills its occupants. The time it took for the vehicle and the surrounding infrastructure to process the information was not fast enough to avoid disaster. It is to that end – to avoid scenarios like this and many other use cases – that edge computing sites and AI will have to collaborate, as VaporIO’s president Don Duet explains to João Marques Lima.

Neural networks, deep learning, deep dreaming, and so on. The artificial intelligence spectrum is about to enter the data centre as never before.

A latency of up to ten milliseconds that was acceptable yesterday might not deliver the desired business outcome today, and tomorrow that latency will have to be in the order of nanoseconds if a business or application is to deliver the value it set out to provide in the first place.

This article originally appeared in the latest issue of Data Economy Magazine.

When building an edge architecture, AI capabilities will be paramount both to curbing latency and to the survival of the edge itself.

With thousands – if not, in the long term, millions – of edge nodes being deployed across the globe, it will be physically and economically unviable to rely on human staff to manage the systems.

That is why, according to Don Duet, president of VaporIO, AI will be a major ally of the edge revolution, enabling new and bigger opportunities for most sectors.

“AI is going to be one of the most spectacular drivers for the edge and it continues to propagate,” Duet said.

The whole idea of the edge is to bring compute power closer to the end user, to the application that needs to run faster and with greater reliability.

Think of driverless cars or a natural disaster response robot. Yet, simply deploying the infrastructure out in the world will not solve this. With the right AI brain power, the time taken for decisions and data processing at the edge can be cut to as little as three milliseconds.

“That fundamentally opens up a whole new set of use cases for AI,” Duet added.

“Financial services are a great example of the edge, because when we look at the market structure, electronic trading has evolved into low latency. It is all edge, down to the milliseconds, nanoseconds.

“They all matter because you are trying to make decisions effectively and as close to real time as possible using all the AI array.”

The need to build these distributed systems with the right AI components brings to the discussion table the ongoing calls for more collaboration, not just within the data centre industry but across the wider technology world.

As Duet warns, the edge “isn’t going to be a winner-takes-all solution by any means”.

“It is going to be a collaborative effort across many different companies and people that come with different points of view; suppliers with different points of view; service providers, end customers, and so on.

“[The industry has to] bring them together and get them working together to make it simple and easy and therefore reduce the hurdle to business.”

 

FIRST MAJOR EDGE DEPLOYMENTS

Don Duet, President, VaporIO

Thinking of itself as being “like Apple”, but in a startup phase with investments from Goldman Sachs and Crown Castle, VaporIO has set out an ambitious plan to deploy thousands of edge data centres next to cell towers across the US.

“We want to be the enablers of the edge and hopefully make the edge as simple as cloud; make the edge something that doesn’t require 600 years of planning,” says Duet.

To enable the edge, VaporIO has built its own hardware, resulting in the Chamber, a cylindrical micro data centre structure designed to be composable.

“We look at ourselves as the plumbers. We are the ones that make [edge applications, such as driverless cars] possible but we are not an effective part of the stack. Yet, we are in the critical part of the stack.”

Cost-wise, says Duet, the cost of deploying such environments is minimal compared to the construction of multi-MW centralised facilities.

CAPEX can be as low as “$100,000 per site type-thing” if the facility where the infrastructure is going to be deployed doesn’t require much retrofitting work to improve air quality, air flow, power supply, and other on-site requirements.

“We can run in really harsh climates. We don’t need much conditioning of air, relative to a more traditional facility. As a result, many of those costs are massively deflated.

“We also fit very well across the distribution grid as we are not looking for MW of power per site but for a couple of hundred kW per site, and that is easy to get.

“Then you work from around $300,000-$400,000 to get the facilities in place; the equipment is usually very commoditised.”

Duet added: “Compared to the capital cost of data centres, the cost of developing micro data centres is massively lower, and I think that is not just a trend but something that is just going to continue to broaden.”