What’s standing in the way of hybrid cloud?
With hybrid cloud uptake on the rise, the challenges involved in creating IT environments where hosted public and private clouds and on-premises infrastructure can work as one will become increasingly important. But what do these challenges mean for the race to provide enterprises with the most effective cloud services?
According to a recent study commissioned by Microsoft and produced by 451 Research, 68 per cent of organisations will adopt some kind of hybrid cloud model over the next two years, a 19 per cent increase over today’s hybrid cloud adoption rates.
The drive towards hybrid cloud clearly outpaces both on-premises private and public clouds, which are set to increase by nine per cent over the next two years, and hosted public or private clouds, which are forecast to grow by 11 per cent over the same period. The study, The New Era of Hosted Services, includes responses from more than 1,500 businesses of all sizes in ten countries.
If you were to measure the importance of integration simply by looking at the volume of investment going into it, the message is equally clear. Ravello closed a $26m funding round earlier this year to build up its hybrid offering. VMware invested $30m in Puppet Labs, a company it’s partnering with to help IT organisations better manage and automate their hybrid cloud infrastructure, and the company is also investing $120m in its Indian R&D unit – which was pivotal in developing the firm’s vCloud Hybrid Service. MuleSoft landed $37m last month. Accenture is investing over $400m in its own cloud capabilities in order to help businesses tackle hybrid cloud integration. And this is just a selection from Q1 and Q2 this year.
While cloud is certainly making headway in organisations big and small – particularly software as a service and infrastructure as a service (IaaS) – the vast majority of data still resides in-house (somewhere in the area of 90 per cent). Many organisations still take a conservative approach to hosted cloud services, and are increasingly selective about the types of data that they shift off-premises.
There’s also the matter of cost, which tends to escalate quite quickly for businesses using applications that are always on – like dynamically updated databases or resource-intensive applications. Ultimately, most companies strive to operate in some combination of both private and public cloud environments.
The study mentioned above also concludes that about two-thirds of organisations now use SaaS applications, like CRM or analytics, as replacements for software they used to run in-house. Most forms of hybrid cloud computing are not terribly complex, and mostly involve integrating SaaS offerings – say, Salesforce – with existing systems and software running in-house on legacy servers. Managing this kind of integration isn’t terribly challenging, or at least doesn’t have to be.
The problem for IT arises as increasing numbers of workloads become split between hosted and in-house environments. Vendors like Dell or IBM – and most recently VMware – are actively promoting a vision of cloud computing that sees workloads spread across these public and private environments in an elegant and, as often advertised, seamless way. The value proposition is immediately appealing: the security and hands-on management capability of on-premises infrastructure combined with the scalability and virtually unlimited compute and storage capacity of the public cloud.
Hitting A Few Snags Along The Way
It’s an appealing vision, but it’s not without issues. “Public and private clouds are really two different operating environments. The closer they are in terms of technology compatibility, the more likely hybrid cloud scenarios will work. The more divergent, the more difficult,” Ed Anderson, research director at Gartner, told Business Cloud News. “At the current time, hybrid cloud computing is still pretty immature.”
There are solutions out there which help manage the gap between public and private environments: VMware’s vCloud Hybrid Service, Dell’s recently acquired Enstratius, IBM’s WebSphere range of products, MuleSoft’s widely used integration platform – and the list goes on. There seems to be an almost weekly evolution in the solutions available for self-service and automation. But the fact that the virtual machine ecosystem has become increasingly fragmented also means that it’s becoming more difficult to fit some of these pieces together, and speaks to some of the fundamental incompatibilities between public and private clouds.
This raises the value of APIs, VM encapsulation and software-defined storage for enterprise cloud service providers and vendors. Still, while there are some solutions out there to help move workloads from machines operating in one environment to another (from VMware to Xen or Ravello, for instance), the majority of public/private cloud integration offerings are quite young, so it’s too early to say whether the gap has truly been bridged.
“We don’t advise hybrid models often because they add an unnecessary level of complexity and because integration is challenging – with the exception of particular applications in telecoms for mediation systems and live-updated analytics, where it’s often necessary,” said Alex Fuller, chief technical officer at CloudSense, a cloud software provider and IT consultancy. “The hybrid situation is really representative of a transitional stage, it’s half-hearted – sometimes it’s a genuine need for legacy, and sometimes it’s a matter of company attitudes,” Fuller said. Transitional as it may be, the growing near-term momentum behind hybrid cloud adoption seems unavoidable – at least for the next few years.
Is Data In The Cloud Safe?
Data security is among the top reasons why organisations are moving towards a hybrid cloud framework over fully public solutions, and it no doubt causes some headaches: data is crossing between two different zones, and much of it often has differing security policies surrounding it. “Once you cross operating boundaries, you risk compromising your ability to meet strict regulatory requirements because you cede control of part of your operations to a third-party,” Anderson said.
This also ties into compliance issues, which often have to do with the legislation applicable to the place where the data actually resides. “Things will change and in the UK regulations on data residency and sovereignty will catch up with the mass move to the cloud but it’s hard to say when,” Fuller said.
While solutions like those mentioned above aim to address security and compliance issues as well as automation, auditing and the like, enterprise cloud solution providers are also moving to address security and data residency compliance by building out new data centres in the regions where their customers are based, giving them more choice over where their data sits. Oracle and Salesforce are both attending to this in the UK at present. This helps cloud solution providers and enterprises work around the lack of new, “cloud-ready” data residency policies needed to accompany increased public cloud adoption.
Security and compliance are also key areas of interest for telecoms carriers in particular, who tend to deploy their cloud services across vast network and data centre footprints, often spanning multiple countries and regions. Carriers moving beyond stock enterprise cloud solutions have cited it as a huge concern, and many are partnering with specialists who can help them address it on behalf of their customers. When Orange Business Services partnered with Nexthink IT to improve its monitoring capability for its enterprise cloud solutions, Khaled Khondker, VP Global Services Europe – IT Services and Solutions at Orange Business Services, cited security and compliance risk as key factors motivating the partnership.
Rather than building new data centres in every country in which their services are offered, which can dramatically impact time to market and costs among other things, solution providers may be better off bundling sophisticated data obfuscation software into their enterprise offerings. This would help address data residency, security and privacy issues in one fell swoop, complementing the security technologies already built into data centre infrastructure and public and virtual private clouds. If you look past the fear-mongering of security solution vendors, you might actually notice that these technologies are becoming more effective. Data from this year’s Verizon Data Breach Investigations Report suggests that while the number of attacks on data centres has increased dramatically – to over 47,000 reported security incidents – the number of confirmed data breaches has actually decreased, from 855 to 621.
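To see why obfuscation can stand in for data residency, consider how field-level tokenization works in principle. The sketch below is a minimal, hypothetical example (the class and field names are illustrative, not drawn from any vendor’s product): sensitive values stay in an on-premises vault, and only opaque tokens travel to the public cloud, so the original data never leaves its jurisdiction.

```python
import secrets

class Tokenizer:
    """Illustrative field-level tokenizer: real values live in an
    on-premises vault; only meaningless tokens go off-site."""

    def __init__(self):
        # token -> original value; this mapping never leaves the premises
        self._vault = {}

    def tokenize(self, value):
        """Replace a sensitive value with an opaque, random token."""
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token):
        """Recover the original value (only possible on-premises)."""
        return self._vault[token]

# Example: a customer record headed for a public cloud database.
t = Tokenizer()
record = {"name": "Alice Smith", "order_total": 42.50}
outbound = {
    "name": t.tokenize(record["name"]),  # obfuscated before leaving site
    "order_total": record["order_total"],  # non-sensitive, sent as-is
}
# `outbound` can now be stored in a public cloud in any region;
# the real name is only recoverable via the on-premises vault.
```

In a real deployment the vault would be a hardened on-premises service rather than an in-memory dictionary, but the residency argument is the same: what the public cloud holds is useless without the mapping kept at home.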
The Need For Speed
It’s clear that building new data centres is an impractical solution to the data residency and security problem. It’s one of the key reasons many telcos prefer to partner with third-party data centres to deploy enterprise cloud solutions, focusing instead on improving the networks connecting them. Security, like automation and auditing, can often be tackled through software, but no combination of software can fully address an issue central to all public and virtual private cloud services: latency (although that is slowly changing with software-defined networking). Speaking at the DataCentres Europe conference recently, Morgan Stanley’s managing director of enterprise infrastructure proclaimed latency to be the number one issue for the cloud sector, particularly for applications demanding real-time big data crunching. Tackling latency in many ways requires capital-intensive network and data centre upgrades, but are there ways to address it without embarking on a costly multi-year upgrade plan?