Cisco director leaves to become key alliances head for Commvault
By João Marques Lima Published: 19:20, 16 March, 2017 Updated: 17:33, 17 March, 2017
Provider seeks to strengthen its footprint across EMEA, where the storage and data markets are growing rapidly.
Data protection and information management company Commvault has appointed Ed Baker as director of alliances for Europe, Middle East and Africa to drive forward relationships in the region.
Baker’s experience includes roles at IBM and Brocade. In 2003 he joined Cisco as a sales leader for data centres in emerging markets.
He left Cisco in November 2016 as director for data centre incubation sales EMEAR. While at Cisco, he was responsible for building sales and technical teams across the region.
At Commvault, Baker will be responsible for developing new partner alliances and strengthening existing ones.
Bruce Park, Vice President of EMEA for Channels and Alliances at Commvault, said: “We consistently strive to provide the best solution for our customers by continuously improving the Commvault Data Platform.
“With Ed’s proven experience building teams and forging alliances with some of the biggest players in the industry, we will be able to offer even more data management solutions and tools that Commvault users need today and as they look forward.
“This will allow our customers to manage and ensure the compliance of their ever-growing amount of business-critical data, while delivering tangible competitive differentiation and the confidence to embrace the new opportunities of cloud and solution-as-a-service offerings.”
Baker said: “I’m excited to have this opportunity to work with Commvault and develop partner alliances with some of the most innovative technology companies in the world. This role provides an opportunity for me to add real value for our customers by simplifying the way Commvault’s offerings can be integrated into wider, partner-delivered service offerings.
“Commvault has a data platform that can fix real problems for businesses and help organisations make their journey to the cloud.
“Commvault’s disaster recovery and data protection offering is incredibly strong, but when combined with Commvault’s simple and efficient cloud migration capabilities, customers have a data platform of the future at their fingertips.”
Huawei bets storage as a service can really help enterprises deal with massive data surge
By João Marques Lima Published: 01:10, 22 March, 2017 Updated: 22:25, 21 March, 2017
CeBIT 2017: With a single pane to manage data and the ability to consolidate resources on- and off-premises, STaaS enters the as-a-service market.
Shenzhen-based Huawei is leaving no market untapped and has now upped the game in the storage arena with the introduction of Storage as a Service (STaaS) for the hybrid cloud world.
As enterprise data lakes grow at record levels, driven by the IoT, social media and other trends, the need to store, analyse and process data has never been greater – which is why the 27-year-old Huawei has bulked up its storage portfolio.
Unveiled at CeBIT 2017, STaaS for the Hybrid Cloud solution manages SAN, NAS, object, and other storage equipment in a single pane, to support a ‘Data on Demand’ approach.
Huawei claims STaaS can help enterprises reduce total cost of ownership by 20% through the consolidation of storage assets into resource pools, on-demand allocation and improved resource utilisation.
In addition, Huawei said the solution could improve data management efficiency by 50% through the use of self-service catalogues, a high level of automation in resource allocation, intelligent prediction capabilities, and decision-making support.
STaaS includes a unified architecture designed to enable users to choose from service catalogues and service level agreements (SLAs).
Meng Guangbin, President, IT Storage Product Line, Huawei, said: “As cloud enablement continues to accelerate, there is no doubt that IT services will become a more prominent part of data centre operations.
“Our STaaS solution consolidates on- and off-premise resources and delivers standardised, automated, and on-demand data storage services to increase agility in enterprise cloud transformations.”
Huawei also used the event for the global launch of its all-flash OceanStor Dorado V3, hardware built to handle data-intensive enterprise applications.
‘CEOs desperate to quantify the value of data and analytics,’ warns Information Builders chief
By João Marques Lima Published: 06:00, 21 March, 2017 Updated: 23:49, 20 March, 2017
In a world ever more built on data, eight in 10 CEOs say they are dissatisfied with the value generated from investments in data and analytics.
Data and analytics are this week drawing thought leaders to the British capital in search of answers to the widespread dissatisfaction surrounding data.
Gathering at Gartner Data & Analytics London, leaders will discuss the latest figures, which suggest that only 19% of CEOs report being satisfied with the value generated from their investments in data and analytics.
Data Economy (DE) sat down with Michael Corcoran (MC), chief marketing officer at business intelligence and data analytics company Information Builders, who is attending the event, to discuss the findings.
DE: Why are the vast majority of CEOs unsatisfied with the value generated from their investments in data and analytics?
MC: The good news is that most executives and organization leaders finally recognize that data is as valuable an asset as financial capital and human capital. Similarly to those other assets, data needs to be put to work in order to quantify its value.
Businesses have been collecting data for decades and we have finally reached the point where most people recognize that analysing data is more valuable than merely storing transactions.
CEOs are desperately seeking to quantify the value of data and analytics, and ultimately monetize it. Unfortunately, most organizations have made data and analytics available to less than 25% of their employees, and have yet to address the needs of business partners and customers.
The unserved 75% of employees, and the partners and customers outside the firewall, represent the best opportunities to operationalize and monetize their data and analytic investments.
DE: With only 19% of CEOs satisfied, the problem seems serious: what sort of board actions are needed to change this?
MC: There are some very positive developments occurring, such as the introduction of the Chief Data Officer and Chief Analytics Officer roles. These are top-level, sometimes Board-level, positions focused on how the organization manages, governs, and derives value from these data assets.
Executive teams need to adopt an aggressive strategy to make data and analytics pervasive throughout the extended enterprise.
The overwhelming majority of analytic activity is focused on back office analytics to generate management dashboards. Providing management with dashboards for insight is valuable, but on its own does little to alter employee performance or the bottom line directly.
Conversely, when analytics are operationalized for account executives, call centre operators, field technicians, police officers, nurses and others, we see immediate, dramatic improvements in individual performance and a corresponding impact on the bottom line.
When extended out to suppliers, distributors, service partners and customers, the impact on direct revenue is even more evident.
Analytics is often employed to analyze customer data, with the goal of “Know Your Customer”. If you really knew your customer, you would realize that they want direct access to self-service information and analytics as well.
DE: What will the role of the CIO and CDO be in that transformation, and how will this match CFOs’ and CISOs’ expectations of low costs and high security?
MC: We have witnessed a philosophical shift in the relationship between IT and the business as related to data and analytics. Traditional business intelligence tools were considered IT centric as they required extensive data modelling and longer time to value.
Many of the newer visual “self-service” analytic tools are considered more business centric as they emphasize ease of use without IT involvement and often without metadata.
The challenge is that these tools promote a silo approach to analytics, lacking centralized data and collaboration. This, combined with the lack of metadata, is generating a new level of information discrepancy within the enterprise that is difficult to audit, similar to spreadsheet-centric environments.
The pendulum seems to be finally shifting to a place where there is a greater balance and collaboration between IT and the business.
The CIO and CDO/CAO roles need to work together to ensure that data is easily accessible, accurate and consistently governed, and that analytics are faster and easier to generate and consume.
DE: What are the common pitfalls of big data and business intelligence programmes and how do you avoid these?
MC: Big Data and business intelligence/analytics suffer the same fate. Organizations tend to isolate these technologies and limit their exposure.
Most organizations are trying to figure out what to do with Big Data, so they create Data Lakes where specialized data scientists and business analysts can access them with analytic tools, in the hope of finding some “golden nuggets” of insight. Big Data assets need to be operationalized and monetized.
There is an important transition from “hype” to “reality” that often occurs in technology, and Big Data has reached that stage.
The issue that will legitimize Big Data is not technology, but the business potential of the Internet of Things (IoT). Big Data was first focused on clickstream and social media analytics, which is of value to e-business, consumer product, and entertainment industries.
The explosive growth is now coming from machine generated, location and home automation data. We are seeing exciting implementations in the automotive, manufacturing, transportation, logistics, local government and utilities industries.
More importantly, Big Data is also now being exposed to operational employees, partners, customers and citizens through online reporting and self-service analytic apps.
DE: What are the stages that organisations must follow in order to be able to rely on and generate insights and revenue from data?
MC: Data and analytics are a critical component of your Digital Transformation. There are four critical stages to the strategy that ensure the journey is successful.
Stage 1 is to Harmonize all data, including data warehouses, Big Data, operational data, social media and cloud data. In recent surveys, almost 60% of organizations stated they do not trust their data.
Data needs to be integrated, federated, cleansed for quality, and mastered for consistency across all systems. A strong Data Governance strategy should be employed, encouraging close collaboration between IT staff, data stewards, and business users.
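To make Stage 1 concrete, here is a minimal, illustrative sketch in Python of the kind of cleansing and mastering pass Corcoran describes. The source systems, column names and rules below are hypothetical stand-ins, not an Information Builders recipe: records from two systems are combined, normalised for quality, and collapsed to a single “golden” record per customer key.

```python
import pandas as pd

# Hypothetical customer records pulled from two source systems;
# the columns, values and cleansing rules are illustrative only.
crm = pd.DataFrame({
    "email": ["Ann@Example.com ", "bob@example.com"],
    "country": ["UK", "uk"],
})
billing = pd.DataFrame({
    "email": ["ann@example.com", "carol@example.com"],
    "country": ["GB", "US"],
})

# Integrate: pull the records into one combined view.
combined = pd.concat([crm, billing], ignore_index=True)

# Cleanse for quality: normalise case and whitespace,
# and map country-code variants to a single master value.
combined["email"] = combined["email"].str.strip().str.lower()
combined["country"] = combined["country"].str.upper().replace({"UK": "GB"})

# Master for consistency: keep one "golden" record per customer key.
golden = combined.drop_duplicates(subset="email", keep="first")
print(golden)
```

Real-world harmonisation adds federation across live systems and governance workflows on top, but the principle is the same: trust in data starts with consistency rules applied uniformly across every source system.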
Stage 2 is to Visualize the data in order to generate new insights about the business. Today’s new analytic tools should allow business analysts to easily generate data visualizations, reports, and dashboards to solve problems and identify business trends.
Analytical tools should scale and incorporate advanced analytical functions such as predictive analytics, search, text analytics, machine learning and AI.
Stage 3 is to Operationalize those insights by deploying easy-to-use analytic applications, portals and interactive documents to non-technical employees and business partners.
The focus here is to provide operational decision support for ongoing questions, as well as comparative performance metrics to motivate and drive performance. These deployments improve cost control, reduce waste and deter fraud.
Stage 4 is to Monetize data and analytics by deploying portals, embedded business intelligence, and interactive e-statements directly to customers.
These analytic deployments will generate increased revenue and customer loyalty. Some organizations are discovering they have a new opportunity to create direct revenue generating “data products” as well, providing new value for their customers and a new future for their traditional business.
Brexit turns overseas data centre rivals into partners
By João Marques Lima Published: 12:00, 15 March, 2017 Updated: 01:11, 15 March, 2017
As the UK moves closer to triggering Article 50 and igniting formal negotiations to leave the EU, providers are trying to protect themselves against whatever is coming.
Fears of Brexit complications over cross-border data management have led the European Business Reliance Centre (EBRC) to sign UK-based data centre operator Migsolv as a partner.
EBRC, which specialises in managing highly sensitive information from its base in Luxembourg, operates a portfolio totalling 17,000 sqm of server space across five data centres, three of them Tier IV-certified.
In addition, the organisation provides cybersecurity, cloud and managed services to 280 clients from 40 countries.
Uncertainty over Brexit remains an issue well beyond the world of data centres, however. EBRC’s move to partner with Migsolv comes days after UK Prime Minister Theresa May announced that the government will adopt the EU’s GDPR after the separation from the bloc.
Yet, by selecting Migsolv as its UK partner, EBRC will be able to guarantee its clients a wider selection of data centres under a single contract, “including a UK facility which some are seeking to address the uncertainties posed by Brexit,” the company said in a statement.
Migsolv, in turn, will be able to offer its customers additional facilities on mainland Europe.
EBRC also plans to appoint partners in other major European countries, including Germany, Switzerland and Belgium.
Alexander Duwaerts, International Client Development Director at EBRC, said: “The UK is especially important with Brexit on the horizon as customers are concerned about the implications it may have for cross-border data transfer.
“Many want to secure a presence in a UK facility to protect their options and ensure they can continue trading with Britain without disruption when Brexit takes effect.”
This is not, however, the first time two data centre providers have come together to head off complications following Brexit.
In October 2016, LuxConnect and Volta Data Centres forged the first alliance with the intention to create a future-proofed environment for customers to move their data in a post-Brexit UK.
There’s a global crisis around software code quality: Why enterprises are still making the mistake of not paying full attention to their lines of code in the data economy
By João Marques Lima Published: 18:14, 10 March, 2017 Updated: 17:09, 12 March, 2017
Data Economy speaks to Lev Lesokhin, EVP Strategy and Analytics at CAST, following a report that examined over one billion lines of code and exposed the naked truth about code quality worldwide.
The quality of business application software is crucial at a time when reliance on code, and on how it functions within an enterprise’s IT ecosystem to carry out the right tasks at speed and with the right value, has become the differentiator between winners and losers.
In a recent report, CAST found that despite businesses understanding the importance of their code, the overall quality of too many mission-critical functions across the globe is poor.
Financial services were found to be particularly susceptible to security risks; for an industry carrying large amounts of sensitive data, that leaves organisations in the sector exposed to severe regulatory fines, the CRASH Report warns.
The report was based on over one billion lines of code across almost 2,000 enterprise applications run by 300 organisations such as banks, insurers and government departments.
To discuss why enterprises are still failing to get their code in order and protect their own data, Data Economy (DE) spoke to Lev Lesokhin (LL), EVP Strategy and Analytics at CAST.
DE: Why are enterprises still making the mistake of not paying full attention to their lines of code when their business is becoming ever more dependent on IT/code?
LL: Paying attention to the quality of the engineering and construction of software systems falls through the cracks between developers and management in most IT organisations.
Most app dev managers believe the responsibility for software integrity is something developers should take care of themselves. At the same time, project managers are constantly pressed by the business to deliver more, and more quickly.
This leaves no time, nor incentive, for the development teams to pay attention to software structure and integrity with any rigour.
This problem is most pronounced in the UK, followed by the US. In continental Europe, there is a slightly higher consideration for software engineering, structure and integrity.
DE: How can enterprises (especially financial ones) mitigate risks associated with bad coding?
LL: The first thing enterprise IT departments need to do is establish ownership of the software structural quality problem and assign responsibility for managing software risk. Successful organisations have a senior-level Structural Quality Officer, at the level of an Enterprise Architect.
This senior-level individual should be given the right to veto releases to production and should run a Center of Excellence (CoE) that provides system-level analysis of applications as a service to all development teams.
The biggest risks to enterprise applications come from the way modern IT applications are built across multiple components, languages, frameworks and data stores.
No individual developer can see the issues they will introduce, even in the highest-quality code they write, when those issues are a function of multi-component interfaces that are abstracted away from that developer.
Levels of abstraction, multi-technology applications, legacy wrappers and service buses all contribute to a level of complexity that drives increased software risk and can only be analysed at the system level.
DE: Can you provide an example of a catastrophic scenario that could be sparked by a bad line of code?
LL: A common example would be an expensive statement made inside a loop, in a procedure written somewhere near the front end (the user interface) of a large, complex application.
If a loop contains a call to a method outside that procedure, to the developer it will seem a rather innocuous structure.
The piece of code might have the best code quality imaginable, but the call to the external method might mask a great deal of resource consumption.
The external method may call other methods or routines, which may encapsulate calls to old COBOL structures, which then reach back into a legacy DB2 database.
This generates a great deal of network traffic, Central Processing Unit (CPU) thrashing and MIPS consumption, slowing the system down drastically – and not just for the user invoking the offending interface, but for everyone else as well.
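As an illustration only (the function names, schema and use of SQLite in place of a remote DB2 system are hypothetical), a minimal Python sketch of the pattern Lesokhin describes might look like the following: the loop reads cleanly in isolation, but every iteration hides a separate round trip to the back end, while the set-based variant does the same work in a single query.

```python
import sqlite3  # stand-in here for a remote legacy database such as DB2

def get_order_history(conn, customer_id):
    # Looks innocuous to the caller, but each call is a separate query;
    # behind a real service layer it could mean a network hop, a legacy
    # COBOL bridge and mainframe MIPS consumption.
    cur = conn.execute(
        "SELECT id, total FROM orders WHERE customer_id = ?", (customer_id,)
    )
    return cur.fetchall()

def build_dashboard(conn, customer_ids):
    # The anti-pattern: an expensive statement inside a loop.
    # 10,000 customers means 10,000 round trips to the back end.
    return {cid: get_order_history(conn, cid) for cid in customer_ids}

def build_dashboard_set_based(conn, customer_ids):
    # The system-level fix: one set-based query instead of N calls.
    placeholders = ",".join("?" * len(customer_ids))
    cur = conn.execute(
        f"SELECT customer_id, id, total FROM orders "
        f"WHERE customer_id IN ({placeholders})",
        list(customer_ids),
    )
    report = {cid: [] for cid in customer_ids}
    for cid, order_id, total in cur.fetchall():
        report[cid].append((order_id, total))
    return report

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(i, i % 100, 9.99) for i in range(1000)],
    )
    ids = list(range(100))
    slow = build_dashboard(conn, ids)            # 100 separate queries
    fast = build_dashboard_set_based(conn, ids)  # 1 query
    assert {k: sorted(v) for k, v in slow.items()} == \
           {k: sorted(v) for k, v in fast.items()}
```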
Usually, that’s handled by throwing more iron at the problem, or lately by the elasticity of the Cloud, but it’s the kind of issue which limits vertical scalability and causes time-outs, outages and poor user experience.
This type of issue can only be detected when examining the interactions of components across the system.