Wednesday, October 18, 2017


Edging away from the cloud? Why the new model of computing may not be as radical as we think  



by Martyn Davies, Director, Product Management, Rocket Software

Edge computing has been called the new ‘Next Big Thing’. Just as the world had adjusted to the idea that clouds were full of data rather than raindrops, we may need to re-think the way in which we manage our data processing. Gartner, Forrester and IDC have all produced reports discussing Edge computing and an article recently appeared in the media examining whether the cloud may soon be taking a back seat.

 

The drivers behind the edge

The move towards the Edge has been driven largely by the need for speed generated by the ever-increasing proliferation of mobile devices.

This includes not only the attention-grabbing Internet of Things (IoT) applications such as drones and self-driving cars, but also those on our phones and smart watches, from fitness apps to sports results. The small time delay that comes from sending a request thousands of miles to a cloud server on another continent is not acceptable to today's customers.

As a result, more processing is taking place at the “edge” of the network, including the creation of small data repositories, which ensure that transactions can take place without needing to take into account long-distance data communications.
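The pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not a real product API: the class and function names (`EdgeNode`, `cloud_origin`) are hypothetical, and a simple dictionary stands in for the small edge-side data repository.

```python
# Illustrative sketch: an edge node keeps a small local data repository
# so that repeat requests are served on the spot, without a long-haul
# round trip to the central cloud.

class EdgeNode:
    def __init__(self, origin):
        self.origin = origin      # callable simulating the distant cloud
        self.local_store = {}     # the small edge-side data repository

    def handle_request(self, key):
        # Serve locally when possible; only a cache miss pays the
        # long-distance cost of contacting the origin.
        if key in self.local_store:
            return self.local_store[key], "edge"
        value = self.origin(key)
        self.local_store[key] = value
        return value, "origin"

def cloud_origin(key):
    # Stands in for a data centre thousands of miles away.
    return f"content-for-{key}"

node = EdgeNode(cloud_origin)
print(node.handle_request("video-42"))  # served from the origin
print(node.handle_request("video-42"))  # served from the edge
```

In a real deployment the local store would also need eviction and consistency rules, but the principle is the same: the second request never leaves the edge.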

Computing at the edge also means that companies can ensure data is processed in line with end-user management and security requirements. For example, a large automotive manufacturer such as GM may require globally dispersed suppliers to send and receive computer-aided design (CAD) files; however, these files are often incredibly large and contain sensitive intellectual property (IP).

Edge computing can minimise the latency involved in moving these files, and ensure that data security on the supplier's side is up to scratch to protect the manufacturer's IP.

 

The issues with long distance

The problems with distance are real. For example, the streaming of content such as video and music is now central to our home entertainment, but it can create challenges.

In the US, there was a period when Phoenix, Arizona had only standard-quality streaming because it relied on content served from Los Angeles, California, and over that distance the necessary bandwidth could not be achieved. AOL addressed this kind of problem by installing a range of "micro data centres" to ensure that it could handle spikes in demand following its acquisition of Huffington Post and Engadget.

A great many more service providers will be offering more localised processing to ensure that customers receive the level of service they demand. AT&T has already made the move to put micro data centres in its central offices and masts, while Amazon Web Services recently launched its own Edge initiative, Greengrass, to provide customers with more localised processing power.

 

Ever more options for businesses

So, does this mean the end for the big data centres? I don't think so. Since computers first became mainstream, we have seen cycles in the way in which processing is handled.

The initial mainframe model, with a large computer and an array of "dumb terminals", was followed by client-server computing, in which everyone installed software packages onto their desktops and the office server. Cloud computing came next, built on the principle of giant data centres located in secure locations.

Each was proclaimed to be an entirely new concept, wiping out its predecessor. Yet none of these technologies has disappeared. Mainframes, for example, still form the backbone of many of our large corporations; the financial services industry has long understood the need for low latency and banks rely heavily on these giants to handle high volumes of transactions rapidly.

It is also unlikely that Edge computing will remove the need for large data centres. The contribution that mobile devices have made to business is to provide us with vast amounts of information which, if processed effectively, is incredibly valuable.

Analysing this data in order to understand customers, and, even more importantly, learning from it, is one of the most important challenges currently being tackled by the IT industry. Machine learning, whereby computers not only examine trends in data but adjust their response accordingly, is a technology that is developing fast.

For it to be effective it needs to access data on a major scale, so no matter how much Edge computing is taking place, information will still need to be fed back and aggregated to be analysed.
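That feed-back-and-aggregate pattern can be sketched simply. This is a hypothetical illustration, assuming each edge site reduces its raw readings to a compact summary that is then merged in the central data centre; the function names are invented for the example.

```python
# Illustrative sketch: edge sites pre-aggregate locally, then ship only
# compact summaries back to the centre for large-scale analysis.

def summarise_locally(readings):
    # Runs at the edge: raw readings never leave the site in full.
    return {"count": len(readings), "total": sum(readings)}

def aggregate_centrally(summaries):
    # Runs in the central data centre: merge the compact summaries.
    count = sum(s["count"] for s in summaries)
    total = sum(s["total"] for s in summaries)
    return {"count": count, "mean": total / count if count else 0.0}

site_a = summarise_locally([3.0, 5.0, 7.0])
site_b = summarise_locally([10.0, 20.0])
print(aggregate_centrally([site_a, site_b]))  # {'count': 5, 'mean': 9.0}
```

The edge handles the fast, local work, while the centre still sees enough of the picture to analyse and learn from it, which is why the two models complement rather than replace each other.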

 

The cloud is here to stay

Edge computing is undoubtedly shaking things up. We can expect to see many more transactions taking place at a local level, and as a result we will experience a more satisfactory service as consumers.

There will be concerns about data security as our information is held in yet more places, and more discussions about vanishing data centres. Yet, in reality, this is another exciting new development that adds to the options available to businesses rather than simply replacing old ones. The cloud is not going to remain unchanged, but nor is it about to disappear.