Edging away from the cloud? Why the new model of computing may not be as radical as we think
Published 21 August 2017
by Martyn Davies, Director, Product Management, Rocket Software
Edge computing has been called the new ‘Next Big Thing’. Just as the world had adjusted to the idea that clouds were full of data rather than raindrops, we may need to rethink the way in which we manage our data processing. Gartner, Forrester and IDC have all produced reports discussing Edge computing, and an article recently appeared in the media examining whether the cloud may soon be taking a back seat.
The drivers behind the edge
The move towards the Edge has been driven largely by the need for speed generated by the ever-increasing proliferation of mobile devices.
This includes not only the attention-grabbing Internet of Things (IoT) applications such as drones and self-driving cars, but also those on our phones and smart watches, from fitness apps to sports results. The small delay that comes from sending a request thousands of miles to a cloud server on another continent is not acceptable to today’s customers.
As a result, more processing is taking place at the “edge” of the network, including the creation of small data repositories, which ensure that transactions can take place without needing to take into account long-distance data communications.
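To put rough numbers on why distance matters, here is a back-of-the-envelope sketch of round-trip propagation delay. The distances are illustrative, and fibre is assumed to carry signals at roughly two-thirds the speed of light; real-world latency is higher still once routing, queuing and processing are added:

```python
# Back-of-the-envelope round-trip propagation delay over optical fibre.
# Assumption: signals travel at ~2/3 the vacuum speed of light in fibre.
SPEED_IN_FIBRE_KM_PER_MS = 300_000 / 1000 * (2 / 3)  # ~200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time for a request/response over the given distance."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

for label, km in [("Edge node in the same metro", 50),
                  ("Cloud region on another continent", 8000)]:
    print(f"{label:35s} ~{round_trip_ms(km):5.1f} ms round trip")
```

Even this best case gives tens of milliseconds for a trans-continental round trip, against well under a millisecond within a metro, which is the gap edge processing closes.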
Computing at the edge also means that companies can ensure data is processed in line with end-user management and security requirements. For example, a large automotive manufacturer such as GM may require globally dispersed suppliers to send and receive computer-aided design (CAD) files; however, these files are often incredibly large and contain sensitive intellectual property (IP).
Edge computing can minimise the latency involved in moving these files, and ensure that data security on the supplier’s side is up to scratch to protect the manufacturer’s IP.
The issues with long distance
The problems with distance are real. For example, the streaming of content such as video and music is now central to our home entertainment, but it can create challenges.
In the US, there was a period when Phoenix, Arizona could only receive standard-quality streaming because it relied on content served from Los Angeles, California, and the necessary bandwidth couldn’t be sustained over that distance. AOL addressed this problem by installing a range of “micro data centres” to ensure that it could handle spikes in demand following its acquisition of the Huffington Post and Engadget.
Many more service providers will offer localised processing to ensure that customers receive the level of service they demand. AT&T has already made the move to put micro data centres in its central offices and masts, while Amazon Web Services recently launched its own Edge initiative, Greengrass, to provide customers with more localised processing power.
Ever more options for businesses
So, does this mean the end for the big data centres? I don’t think so. Since computers first became mainstream, we have seen cycles in the way in which processing is handled.
The initial mainframe model, with a large computer and an array of “dumb terminals”, was followed by client-server computing, in which everyone installed software packages onto their desktops and the office server. Cloud computing came next, based on the principle of giant data centres located in secure locations.
Each was proclaimed to be an entirely new concept, wiping out its predecessor. Yet none of these technologies has disappeared. Mainframes, for example, still form the backbone of many of our large corporations; the financial services industry has long understood the need for low latency and banks rely heavily on these giants to handle high volumes of transactions rapidly.
It is also unlikely that Edge computing will remove the need for large data centres. The contribution that mobile devices have made to business is to provide us with vast amounts of information which, if processed effectively, is incredibly valuable.
Analysing this data in order to understand customers, and, even more importantly, learning from it, is one of the most important challenges currently being tackled by the IT industry. Machine learning, whereby computers not only examine trends in data but adjust their response accordingly, is a technology that is developing fast.
For it to be effective it needs to access data on a major scale, so no matter how much Edge computing is taking place, information will still need to be fed back and aggregated to be analysed.
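As a sketch of that feedback loop, assuming hypothetical edge sites that aggregate raw readings locally and ship only compact summaries back to a central store for analysis:

```python
from statistics import mean

def summarise_at_edge(site: str, readings: list[float]) -> dict:
    """Aggregate raw readings locally so only a compact summary crosses the WAN."""
    return {"site": site, "count": len(readings),
            "mean": mean(readings), "max": max(readings)}

# Invented per-site sensor readings, processed where they were generated.
edge_batches = {"phoenix": [12.1, 11.8, 13.0], "london": [9.4, 10.2, 9.9]}

central_store = [summarise_at_edge(site, data) for site, data in edge_batches.items()]

# Central analysis and machine learning still see the aggregated global picture.
overall = mean(s["mean"] for s in central_store)
print(f"{len(central_store)} site summaries aggregated centrally; overall mean {overall:.2f}")
```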
The cloud is here to stay
Edge computing is undoubtedly shaking things up. We can expect to see many more transactions taking place at a local level, and as a result we will experience a more satisfactory service as consumers.
There will be concerns about data security as our information is held in yet more places, and more discussions about vanishing data centres. Yet, in reality, this is another exciting new development that adds to the options available to businesses rather than simply replacing old ones. The cloud is not going to remain unchanged, but nor is it about to disappear.
Is Interconnection Maturity the Key to Scaling Digital Business?
Published 17 August 2017
by Tony Bishop, Vice President, Global Vertical Strategy & Marketing at Equinix
Global Interconnection Index helps companies forecast Interconnection capacity.
Direct Interconnection capacity between businesses for private data exchange is forecast to grow ten times faster than traditional MPLS networks, according to the recently released “Global Interconnection Index” market study, published by Equinix.
Driving this Interconnection proliferation is the fact that enterprises are now directly connecting to each other and service providers every day to exchange data traffic, create new business ecosystems and scale digital business. In fact, the state of an organization’s “Interconnection maturity” can be key to scaling its digital platform and growing as a digital business.
The Global Interconnection Index provides insights into the trends driving the massive need for Interconnection created by digital transformation and projects how Interconnection is increasingly growing in response to those trends worldwide.
Reinventing enterprise IT
The trends driving enterprise Interconnections worldwide include:
- Digital Technology Use, which creates the need to support real-time interactions requiring more Interconnection Bandwidth. According to Accenture’s “Digital Density Index,” digital technology use is projected to add $1.36 trillion in additional economic output in the world’s top 10 economies by 2020.
- Urbanization, which is transforming global demographics and creating a proximity need for digital services concentrated across metro centers globally. More than two billion people are expected to migrate to major cities by 2035, creating as many as 50 major urban metro hubs requiring dense Interconnection fabrics, according to “Connectography” by Parag Khanna.
- Data Sovereignty, which requires maintaining data locally while being used globally. More than 18 major countries globally block the transfer of data related to accounting, tax and financial information according to the Information Technology and Innovation Foundation’s “Cross Border Data Flow Report.”
- Cybersecurity Risk, which expands Interconnection consumption as firms increasingly shift to private data traffic exchange to bypass the public Internet and mitigate digital threats. As per Gartner’s August 30, 2016 “Special Report: Cybersecurity at the Speed of Digital Business,” by 2020, an estimated 60% of digital businesses will suffer major service failures as breaches permeate physical and digital platforms.
- Global Trade of Digitally Deliverable Services, which ushers in a new era of dynamic business processes and demand for Interconnection. Global digital workflows require a global mesh of Interconnected metros to fulfill demand. According to McKinsey’s “Digital Globalization” report, trade in digitally deliverable services now comprises 50% of total services exports globally, with an expected 9x increase by 2020.
Enterprises feel the impact of these trends as they strive to drive global collaboration and data exchange between businesses and geographic regions. If you don’t have a strong understanding of where your organization’s Interconnection maturity stands, you could fail to scale as a digital business.
What’s your Interconnection maturity?
In this climate of increasing digitization, if you are not scaling your Interconnection capabilities to digitalize your business, you are likely falling behind your competitors. So how do you know where you stand in a global Interconnection maturity model? The Global Interconnection Index helps you understand your company’s Interconnection maturity by assessing your current Interconnection Bandwidth consumption and projecting your future requirements.
It defines Interconnection Bandwidth as a measure of the total capacity provisioned to privately and directly exchange traffic with a diverse set of counterparties and providers at distributed IT exchange points. The Global Interconnection Index predicts that by 2020, an estimated 5,000 terabits per second (Tbps) of installed direct, private Interconnection Bandwidth capacity could be provisioned by enterprises and service providers, a fourfold increase from 2016, with double-digit growth across all industries and use cases.
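Working backwards from those headline figures alone, the implied 2016 baseline and annual growth rate fall out of simple arithmetic:

```python
# Implied 2016 base and growth rate, using only the Index's headline numbers:
# ~5,000 Tbps forecast for 2020, described as a fourfold increase on 2016.
forecast_2020_tbps = 5000
growth_multiple, years = 4, 2020 - 2016

base_2016_tbps = forecast_2020_tbps / growth_multiple  # ~1,250 Tbps in 2016
cagr = growth_multiple ** (1 / years) - 1              # compound annual growth rate

print(f"Implied 2016 base: ~{base_2016_tbps:,.0f} Tbps")
print(f"Implied compound annual growth: ~{cagr:.0%}")  # ~41% per year
```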
By looking at the Global Interconnection Index’s firmographic profiling of Interconnection Bandwidth consumption trends (see the diagram below), you can gain insight into enterprise Interconnection adoption across three dimensions — company size (employees and revenue), geographical presence and use of distributed IT services.
For example, it shows that requirements for greater Interconnection Bandwidth capacity are driven by:
- Users: Companies that are larger in revenue and size require a greater number of Interconnections for their users.
- Locations: Businesses that are focused on broadening their physical geographic presence need to increase the number of Interconnections used in dispersed locations.
- Services: Organizations that extensively employ various distributed IT services need to provide more Interconnections to service partners (e.g., SaaS) and users.
By identifying how your company aligns across these three Interconnection-driving dimensions, you can gauge what your Interconnection Bandwidth capacity requirements are today and project how you will need to scale your Interconnection capabilities to grow and expand your company as a digital business.
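To make the three dimensions concrete, here is a purely illustrative toy gauge; the weights and thresholds are invented for the example and are not the Global Interconnection Index’s methodology:

```python
# Toy demand gauge across the three dimensions above (users, locations,
# services). Every weight and threshold here is invented for illustration.
def interconnection_demand_score(employees: int, metros: int, saas_services: int) -> float:
    users = min(employees / 10_000, 1.0)     # company size (users dimension)
    locations = min(metros / 20, 1.0)        # breadth of geographic presence
    services = min(saas_services / 50, 1.0)  # use of distributed IT services
    return round((users + locations + services) / 3, 2)

# A hypothetical mid-size enterprise in 8 metros consuming 30 SaaS services.
print(interconnection_demand_score(employees=25_000, metros=8, saas_services=30))
```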
Remember that your Interconnection Bandwidth capacity alone is not enough to determine where you stand in terms of Interconnection maturity. It is a quantitative metric, not a qualitative one. To better understand your Interconnection maturity position, you also need to weigh your Interconnection capabilities against:
1) The problems you are trying to solve,
2) The purpose for the Interconnections, and
3) The value they provide to your business and users.
The diagram below illustrates how Interconnection maturity plays out as businesses scale.
For example, if you’re a manufacturing company and your goal is to enhance your North American supply chain partners’ experience when they interconnect with your enterprise resource planning (ERP) system, then you can solve for high latency by shortening the distance between those users and your ERP services via a single Interconnection or an Interconnection Node.
On the other hand, what if your company wants to expand its current supply chain globally to open new markets and will ultimately need to collaborate with many more partners and customers worldwide?
In this case, you may want to migrate your ERP systems to the cloud to deliver greater scalability and integration at a local country level, while deploying critical security and data services alongside your ERP service to satisfy compliance requirements.
This is a case that may require you to move up through the Interconnection maturity model into a cluster or an ecosystem of Interconnection Nodes that enable you to deploy new capabilities and monetize new business opportunities faster.
Interconnection scales digital business
By leveraging the Global Interconnection Index, you can gain deeper insight into the vital role that scaling your Interconnection capabilities can play in digitizing your enterprise’s operations, products and services for an improved competitive position, enriched customer experience and expanded distribution footprint.
You can scale your Interconnection to enable greater digital collaboration and engagement with your employees, partners and customers. And you can innovate new digital capabilities that will spark the growth and expansion of your digital business globally.
To start mapping out your way toward greater Interconnection scalability, read the Global Interconnection Index market study.
Machine Learning: More than just algorithms
Published 17 August 2017
by Brett Ley, Director, Data Center Sales, EMEA, Juniper Networks
Machine Learning (ML) and Artificial Intelligence (AI) are the new black in networking. The most common storyline is that algorithms will drive behaviour. The theory is that they represent another logical opportunity for vendors to move “up stack”.
And while it is true that algorithms will (at least initially) provide a proprietary means of making solutions better, they aren’t the only lucrative aspect of a move towards a Self-Driving Network™.
Machine Learning in three sentences
Machine learning is basically the idea that systems can learn new behaviour without being told explicitly by a programmer what that behaviour ought to be. The behaviour is expressed in terms of models, which are themselves the result of examining data. The data scientists that you probably see popping up all over LinkedIn are the ones who find ways of expressing the data (and its patterns) via algorithms.
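That idea fits in a few lines. Here is a minimal sketch using scikit-learn, with made-up traffic samples: nobody writes the congestion rule; the model derives it from the examples.

```python
from sklearn.linear_model import LogisticRegression

# Made-up examples: [requests_per_min, error_rate] -> 1 if the link was congested.
X = [[100, 0.01], [120, 0.02], [900, 0.20], [950, 0.25], [110, 0.01], [880, 0.22]]
y = [0, 0, 1, 1, 0, 1]

# The learned behaviour lives in the fitted model, not in hand-written rules.
model = LogisticRegression().fit(X, y)
print(model.predict([[905, 0.21]]))  # flags the unseen sample as congested: [1]
```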
What are the valuable elements of machine learning?
The most obvious answer here is that value will accrue in the algorithms. Put more simply, whoever can find reason in a sea of data will be able to monetise that reason.
Basically, organisations will fall into two major camps: those for whom algorithms provide competitive differentiation, and those for whom machine learning is really just a tool to make things ‘cheap and cheerful’. Depending on which of these better fits your objectives, the answer to where the value resides will be different.
Everybody intuitively understands the role of better search and tagging algorithms that improve Google’s ability to tune its results, target content, and monetise adverts. Most people also understand that algorithms can help large retailers make purchase recommendations and tweak pricing to maximise profits. Some people are aware that gaming companies track playing and purchasing behaviour, and then use machine learning to entice players to buy into their in-app purchase schemes.
But not every use case provides a direct link between the business and the algorithms. In fact, for the vast majority of companies and use cases, machine learning is more likely to be a tool than a core competency.
Consuming machine learning
If machine learning is relegated to a supporting role, it won’t be algorithms that companies must master. Algorithms will certainly be procured, but as part of broader solutions. And, if done well, the actual algorithms will be analogous to source code: important, but ideally invisible so long as the solution is functioning as desired.
Of course, algorithms are not what drives the eventual solution behaviour. The models that the algorithms produce will be the means by which generalised rules become contextualised and so enable more effective behaviour patterns.
In fact, in a networking environment, if the goal of machine learning is to automate workflows as part of adaptive or predictive operations, generalised algorithms are simply building blocks. Workflows are not ubiquitous. They will be hyper-contextual. And this means that generalised building blocks probably represent only 80 percent of the solution.
Data is valuable too
So how do you contextualise an algorithm? In machine learning speak, you train it. This is where data comes in. If the behaviour being trained is common and consistent across all or even many environments, then the data can come from many places and be aggregated as part of the networking solution.
But if the behaviour is specific to the actual deployment (both the devices and the surrounding infrastructure, applications and tools), then the generalised algorithm has to be fed highly contextualised data.
In this scenario, the data is almost as important as the algorithm. In fact, companies that don’t have a data strategy are going to find that the trend around machine learning and AI is particularly brutal. Imagine internally selling an effort to automate everything through machine learning, only to find out that it requires massive rip and replace across huge chunks of infrastructure. This won’t endear you or the concept to your business.
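As a sketch of what contextualising a generalised algorithm looks like in practice: the algorithm below is a stock, off-the-shelf anomaly detector, and everything it learns about “normal” comes from one site’s own latency telemetry (the numbers are invented).

```python
from sklearn.ensemble import IsolationForest

# Generic, off-the-shelf algorithm; nothing in it is site-specific.
detector = IsolationForest(contamination=0.1, random_state=0)

# Contextualised data: this deployment's own latency samples in ms (invented).
site_latency_ms = [[5.1], [4.9], [5.3], [5.0], [4.8], [5.2], [5.1], [4.9]]
detector.fit(site_latency_ms)  # the model now encodes THIS site's notion of normal

# A 48 ms sample is unremarkable elsewhere, but anomalous here (-1 = outlier).
print(detector.predict([[5.0], [48.0]]))
```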
Everything is a sensor and it ought to be streaming
Over the past few years, there has been a pretty strong push for streaming data in networking. Efforts around gRPC and message buses (RabbitMQ, ZeroMQ, etc.) have been fairly popular among the DevOps crowd. It turns out that solving data distribution is critical to moving to an event-driven infrastructure.
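A minimal sketch of the publishing side, assuming pyzmq; the topic name, port and the counters themselves are invented for the example:

```python
import json
import time

import zmq  # pyzmq: pip install pyzmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5556")  # collectors subscribe here and filter by topic

while True:
    # Invented interface counters standing in for real device telemetry.
    sample = {"device": "leaf-01", "if": "eth0", "rx_bytes": 123456, "ts": time.time()}
    pub.send_multipart([b"telemetry.ifcounters", json.dumps(sample).encode()])
    time.sleep(1)  # stream continuously rather than polling on demand
```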
Much of that same work will translate nicely to a world where machine learning plays a role. There will need to be ways to collect training data. And that is not going to be a one-time exercise; it has to be ongoing. If you do not update models as your infrastructure evolves, you will find that automation simply accelerates the rate at which you can shoot yourself in the foot.
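One hedged way to keep models honest is a crude drift check that flags when fresh telemetry has wandered away from what the model was trained on; the figures and the three-sigma threshold are invented:

```python
from statistics import mean, stdev

def has_drifted(training_sample: list[float], recent_sample: list[float],
                tolerance: float = 3.0) -> bool:
    """Crude drift check: has the recent mean left the training distribution?"""
    mu, sigma = mean(training_sample), stdev(training_sample)
    return abs(mean(recent_sample) - mu) > tolerance * sigma

baseline = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2]  # latency (ms) the model was trained on
live = [9.8, 10.1, 9.7, 10.4]              # after a topology change (invented)

if has_drifted(baseline, live):
    print("Distribution shifted: retrain the model before trusting automation")
```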
The bottom line
If you are listening to the siren song of machine learning but not considering how you are going to collect and use data dynamically over time, the next few years are going to be fairly disappointing. And if you are clinging to the hope that you can avoid the event-driven interim step in the automation journey, you are likely missing the value of the data in the future state.
While algorithms are going to be important, they are not going to do the work themselves – they are a means to an end. People should be planning now for how to contextualise more generalised rule sets.
And the clever organisations will realise that the data has value. This opens up opportunities to monetise in ways that traditional networking has not seen before.
Driving a data powered future – transformation within the insurance industry
Published 12 August 2017
by Maggie Game, Strategic Client Director, Experian
It’s a time of change for the insurance industry. The explosion of consumer data, advancement of technology and increased focus on digital bring both challenges and opportunities for organisations.
So what developments are shaping the insurance industry, and how are businesses adapting? What innovations can we look forward to, and how can we best concentrate efforts to maximise success?
The growth of open data and data sharing is the first factor to acknowledge. The banking industry is currently going through the Competition and Markets Authority’s retail banking review, meaning organisations will need to allow access to current account transaction information by the first quarter of next year.
While this will depend on customer consent, those who agree to share their information will benefit from greater transparency and increased access to innovative new products and services. This serves as a blueprint for other industries, including insurance. Similarly, increasing the control given to the consumer is something organisations across all sectors will have to adjust to.
Closely linked to this is the fact that we have access to more information than we ever have before. Legislation around data sharing, plus new data sources such as social media, connected cars and wearables, will bring exciting possibilities. The Internet of Things will allow organisations to collect permission-based, real-time data about how, when, and where consumers use their products.
Connecting data sources
Achieving a single view of the customer, with all their data in one place, is the ultimate goal for 97% of organisations. For insurers, having a comprehensive picture of customers’ accounts and activities is especially valuable.
Experian has invested in technology to make this possible, enabling organisations to link data across more than two billion records and create a single unified view by assigning a unique pin to almost every UK consumer.
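Experian’s matching technology is proprietary, but the underlying single-view idea can be sketched: normalise the identifying fields, then assign one persistent identifier per matched entity. This toy illustration is not Experian’s method, and the records are invented:

```python
# Toy single-customer-view linkage: NOT Experian's actual matching logic.
def match_key(record: dict) -> tuple:
    """Normalise identifying fields so superficial variants link together."""
    return (record["name"].lower().replace(".", "").strip(),
            record["postcode"].replace(" ", "").upper())

records = [
    {"source": "motor_policy", "name": "J. Smith", "postcode": "SW1A 1AA"},
    {"source": "home_policy",  "name": "j smith",  "postcode": "sw1a1aa"},
]

pins: dict[tuple, int] = {}
for rec in records:
    rec["pin"] = pins.setdefault(match_key(rec), len(pins) + 1)

print(records)  # both records share pin 1: a single view of the same customer
```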
Homing in on analysis and insight
Data is only valuable when it’s used to source insights which can help inform business decisions. The more information there is to process, the greater the role analytics will play. There’s a real need for more advanced decision-making capabilities, such as automated systems that inform decision-making in real time. This is now being linked with optimisation, which enables decisions to be made in less than 30 milliseconds, allowing the ultimate in next-best-action marketing.
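As a hedged sketch of how sub-30-millisecond decisioning is typically achieved: the heavy analytics and optimisation run offline, leaving the real-time path as little more than a lookup plus a rule or two. The offers and rules here are invented:

```python
import time

# Precomputed offline by batch analytics/optimisation; the real-time path
# only reads this table. Offers and segments are invented for illustration.
NEXT_BEST_ACTION = {("renewal_due", "low_risk"): "offer_multi_policy_discount",
                    ("renewal_due", "high_risk"): "route_to_underwriter",
                    ("browsing", "low_risk"): "show_quote_reminder"}

def decide(journey_stage: str, risk_band: str) -> str:
    return NEXT_BEST_ACTION.get((journey_stage, risk_band), "no_action")

start = time.perf_counter()
action = decide("renewal_due", "low_risk")
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"{action} decided in {elapsed_ms:.3f} ms")  # comfortably under 30 ms
```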
Keeping the customer at the core
In rethinking and updating strategies, organisations need to innovate, but they need to take guidance from what their customers want. As technology continues to influence attitudes and expectations, insurers have to deliver the fast, frictionless and personalised journey their customers demand. Data and analytics can pave the way.
New technologies open up exciting – potentially revolutionary – possibilities. By 2020, 90% of cars will be connected. How can this new means of data collection enhance products and services? Machine learning, the Internet of Things, wearables, connected homes: all add to the complexity, but all can make significant improvements to the way businesses interact with customers. Transformative times are upon us and there’s a lot to look forward to.
Dynamic customer journeys are another notable development. Many insurance companies are now looking to digitise their processes across the whole value chain. As well as helping acquisition, this gives a better understanding of customers, showing which are more likely to convert and which represent a risk. It can also link to call-centre capabilities, allowing teams to choose how best to allocate their resources.
As the industry continues to evolve, so do the solutions within it. The focus now is on partnering with insurers to understand every consideration and make the most of every opportunity.
In this changing landscape, insurance providers need to find ways to adapt their strategies and focus on new opportunities for growth. A collaborative approach will help the industry unlock the power of data to drive acquisition and growth, improve efficiency, combat fraud and build stronger customer relationships.