This article was published on September 11, 2013

5 reasons enterprises are frightened of the cloud


This post is brought to you by Comcast Business.

In the first of a series of articles on enterprise cloud computing, TNW takes a look at why many corporates still fear the cloud and what it will take to fix this.

Cloud computing has been a hot topic for several years, but while startups have been quick to adopt it, many corporates have been more cautious about making the switch.

According to a recent survey, 49% of executive-level managers think cloud computing will transform their business, yet many businesses have been slow to adopt it because of a range of concerns.

But why are enterprises fearful of the cloud? Here are 5 of the top reasons large corporates are worried about making the move:


1. Losing control

When systems go down, it’s invariably the IT Director who takes the blame, so it’s no surprise that they are nervous about handing over responsibility for infrastructure to somebody they don’t even know.

Traditionally, if a hard drive fails or a CPU overheats, it’s the IT department who get the 3am call to fix it. It’s up to them to order the replacement hardware, power down the machine and get it back into production as soon as possible.

Even if you’re renting space in a hosting centre and paying for support, chances are you are physically allowed into the building and have the peace of mind that comes from knowing that, if all else fails, you can go in and fix the hardware yourself.

By moving to cloud computing, the IT Director or CIO can feel like they’re handing over control to a third party. What happens if one of the drives fails or there’s a networking issue? Who decides what hardware to buy?

However, it’s important to distinguish between “doing the donkey work” and “having control”. It’s certainly true that some things that might be possible with a traditional setup, such as sending a crashed hard drive to a data recovery service, are no longer possible when you don’t physically own the hardware. But APIs and resources that can be scaled quickly, and even automatically, can lead to a different, and arguably greater, level of control.
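To make that concrete, here is a minimal sketch of what that programmatic control can look like, assuming AWS and the boto3 library; the Auto Scaling group name and capacity figures are hypothetical placeholders, not a recommendation.

    # Minimal sketch: changing capacity through a cloud API instead of a purchase order.
    # Assumes AWS credentials are already configured and that an Auto Scaling group
    # called "web-fleet" exists -- both are hypothetical placeholders.
    import boto3

    autoscaling = boto3.client("autoscaling")

    # Scale out ahead of an expected traffic spike...
    autoscaling.set_desired_capacity(
        AutoScalingGroupName="web-fleet",
        DesiredCapacity=20,
        HonorCooldown=False,
    )

    # ...and scale back in afterwards. No 3am trip to the data centre required.
    autoscaling.set_desired_capacity(
        AutoScalingGroupName="web-fleet",
        DesiredCapacity=10,
    )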

The flexibility of cloud pricing models can actually lead to more overall control, according to Ryan Stenhouse, a freelance cloud engineering specialist formerly with cloud accounting solution FreeAgent.

“If anything, you have more control over what you’re deploying on, since you have no fixed allocation of resources – you use as much or as little as you need and that leads to savings.”

2. Security

Along with loss of control, security is arguably the biggest concern many large organizations have. Can other customers access our data? Have all the security patches been kept up to date?

By definition, public clouds share resources between different customers and rely heavily on virtualization, and this does create additional security vulnerabilities, both at the access-control level and through exploits in the virtualization software itself.

“The biggest security concerns are around access to your data on your VM; you should carefully investigate the controls providers have in place to secure your environment,” says Stenhouse. “Big providers such as Amazon and Rackspace make this information available and are accredited to the highest industry standards.”

Arguably the biggest security risk in any infrastructure is overlooking serious security flaws because of constraints on time, expertise and resources. No system is perfect, and the reality is that a well-staffed cloud provider, with highly trained staff dealing with security every day, is likely to reduce the chances of a breach compared with an overworked and under-resourced corporate IT department. It’s a simple economy of scale.

A cloud host is more likely to have robust and well-configured load balancers, firewalls and up-to-date patches than an average enterprise, as it’s the focus of their business.

Aside from VM-related data and access problems, the majority of potential security issues are the same on cloud and traditional servers, says Stenhouse. “Your best way to mitigate security risks is to hire a competent sysadmin, and for critical applications, an external security tester.”

Moving to the cloud can mean your IT department spends more time checking and overseeing security, and less time on mundane day-to-day setup work. However, there are big challenges around infrastructure.

One of the big challenges for IT departments over the coming years will be how to give developers access to a speedy and solid network infrastructure that’s set up so internal services can talk to cloud-based applications, and developers can get access to cloud servers, without compromising the network’s security.

In the cloud era, the IT department’s role may well evolve. Cloud hosts will take care of traditional SysAdmin tasks like running backups, configuring VMs and keeping security patches up to date.

In turn, the IT department’s focus will shift to dealing with the big networking challenges. For example, setting up and maintaining DMZs (demilitarized buffer zones, designed to allow cloud APIs to talk to internal applications indirectly) to increase security, prioritizing network access requests from developers (like adding cloud IP ranges to firewall whitelists) within agreed SLAs, or even automating these updates so there’s no manual step required.
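As a rough illustration of how such updates might be automated, here is a minimal sketch; the ranges.json file, its layout, the service tag and the iptables-style output are all hypothetical stand-ins for whatever your provider and firewall actually use.

    # Minimal sketch: turn a provider-published list of IP ranges into firewall
    # whitelist rules, so developer access requests don't wait in a ticket queue.
    # The file name, JSON layout and iptables-style output are hypothetical.
    import json

    def load_cloud_ranges(path="ranges.json", service="API"):
        """Read the provider's published IP ranges, keeping only one service."""
        with open(path) as fh:
            data = json.load(fh)
        return [p["cidr"] for p in data["prefixes"] if p.get("service") == service]

    def whitelist_rules(cidrs, port=443):
        """Emit one allow rule per CIDR block for the chosen port."""
        return [f"-A INPUT -p tcp -s {cidr} --dport {port} -j ACCEPT" for cidr in cidrs]

    if __name__ == "__main__":
        for rule in whitelist_rules(load_cloud_ranges()):
            print(rule)

Run from a scheduled job, a script along these lines keeps the whitelist in step with the provider without a manual change request each time.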

Far from being a threat to the IT director, the cloud could actually increase their importance to the company.

3. Data Protection

Closely tied in with security, enterprises are concerned about data protection. Many governments place strict data protection requirements on large companies, and standards audit schemes such as ISO 9001 place additional restrictions on firms.

For example, the UK’s Data Protection Act requires that personal data not be transferred to any country outside the EU unless that country “ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.” In practice, that means it’s often easier for many European businesses to have their servers physically located in the EU.

Data protection is definitely a big consideration for larger companies. However, it’s not necessarily a barrier to cloud computing, just something for businesses to take into account when evaluating suppliers.

It makes sense to start by moving non-critical services which don’t contain sensitive customer data to the cloud first, giving the company’s legal team time to perform due diligence on suppliers.

4. Performance and uptime

Performance and uptime have a direct effect on the bottom line, and corporates are still worried about this aspect of cloud computing. It’s no secret that a fraction of a second added to load time can lead customers to leave a site, costing sales. And if your site is down for even a few minutes, it can affect not just the direct bottom line but also have a longer-term impact on SEO and brand reputation.

A recent survey by Research in Action [PDF] found that 43.5% of global IT managers were worried about losing revenue because of cloud service problems, while nearly 80% were worried about hidden costs such as damage to their brand due to downtime and poor performance.

However, reliability issues aren’t normally the fault of hosting providers, cloud or otherwise. “Most of the causes of downtime and slow performance are down to bad code in your application,” says Ryan Stenhouse. “Something you have direct control over.”

Obviously downtime and performance issues are a major consideration, which highlights the need for a different approach to engineering and infrastructure when switching to cloud computing.

Netflix has taken this change in attitude to an extreme by building a system, called the “Chaos Monkey”, which deliberately takes down web services in production. This forces engineers to build systems that anticipate the worst, rather than hoping for the best.

“Netflix’s Chaos Monkey is a brilliant example of how you have to change your thinking to succeed in the cloud,” according to cloud sysadmin John Daniels. “It’s not enough to build a robust system that doesn’t fail, because failure’s out of your hands. Instead you have to embrace failure as a normal part of the system.”

Rather than building a system that’s designed never to fail, Netflix have forced themselves to build one which is designed to cope with failure without interrupting the experience of the viewer.

“We’re designing each distributed system to expect and tolerate failure from other systems on which it depends,” says Netflix’s John Ciancutti. “The Chaos Monkey’s job is to randomly kill instances and services within our architecture. If we aren’t constantly testing our ability to succeed despite failure, then it isn’t likely to work when it matters most – in the event of an unexpected outage.”

It’s an innovative approach which highlights how cloud computing is about more than just a change in how enterprises pay for computing power and disk space.
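Netflix has open-sourced the real Chaos Monkey; purely to illustrate the idea, here is a minimal sketch in Python using boto3, not Netflix’s implementation. The chaos=optin tag, the region and the dry-run default are all assumptions made for the example.

    # Minimal sketch of the Chaos Monkey idea: randomly terminate one running
    # instance so the surrounding system is forced to tolerate failure.
    # This is NOT Netflix's tool; the "chaos=optin" tag and region are assumptions.
    import random
    import boto3

    REGION = "us-east-1"

    def pick_victim(ec2):
        """Choose a random running instance that has opted in to chaos testing."""
        reservations = ec2.describe_instances(
            Filters=[
                {"Name": "tag:chaos", "Values": ["optin"]},
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )["Reservations"]
        instances = [i["InstanceId"] for r in reservations for i in r["Instances"]]
        return random.choice(instances) if instances else None

    def unleash(dry_run=True):
        ec2 = boto3.client("ec2", region_name=REGION)
        victim = pick_victim(ec2)
        if victim is None:
            print("No opted-in instances running; nothing to do.")
            return
        print(f"Would terminate {victim}" if dry_run else f"Terminating {victim}")
        if not dry_run:
            ec2.terminate_instances(InstanceIds=[victim])

    if __name__ == "__main__":
        unleash(dry_run=True)  # flip to False only when you really mean it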

And it’s not just application performance that firms need to think about. When using cloud services, the bottleneck can often be network speed, especially in large enterprises where mission-critical applications have traditionally been hosted inside the firewall.

It’s important for companies to optimize the network if they want their cloud to be performant and secure.

In a world where more and more of a firm’s mission-critical internal apps are likely to be in the cloud, IT infrastructure managers need to consider the speed and ease of external access as a priority. Staff will need a fast network connection to the wider world if they’re using mission-critical apps hosted in the cloud. So, the days of aggressively locking down access to the Internet, and considering web browsing as a drain on time and resources, are numbered.

This is especially true when internal development teams need to develop applications in the cloud.

In-house development teams need high-speed access to cloud services, especially as rapid development cycles become more common with the adoption of Agile. That means not just a quick connection, but stable access to multiple ports and protocols, and all with a quick turnaround.

Taking a few days to respond to a support ticket to open up SSH access to an AWS server might have been acceptable in the past, when it wasn’t needed often and waterfall projects took years to deliver. But as more development moves to the cloud and switches to Agile timescales, slow turnaround times simply won’t be good enough, as any delay becomes a significant and expensive bottleneck.

5. Fear of being tied into one provider

Many corporates are worried they’ll be tied to a single vendor, either contractually or because software has been built to expect a certain architecture. Obviously, if performance, security or uptime become issues, your firm needs to be able to switch providers.

This is partly a legacy of pre-cloud arrangements, where firms would often be tied into expensive long-term contracts and systems were developed with an expectation of the architecture they would be running on.

However, cloud providers are far more likely to operate on shorter-term contracts, even pay-as-you-go, as flexibility is one of the major advantages of the model. Virtualization means that porting applications between providers should be comparatively trivial, especially if the application has been well engineered.

“The best way to ensure you’re not tied to one provider is to automate as much of your setup process as possible,” says freelance cloud engineering specialist Ryan Stenhouse, “using system orchestration tools such as Chef or Puppet to control your ideal configuration. This goes hand-in-hand with a sensible disaster recovery procedure to manage your data – the same things you’d need to have if you were using your own hardware.”

It’s also possible to use a “cloud abstraction layer” (similar to database abstraction layers like Active Record), which makes it even easier to switch between providers.
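To make the idea concrete, here is a minimal sketch using Apache Libcloud, one open-source example of such an abstraction layer; the credentials and region are placeholders, and the point is simply that the same calls work regardless of which driver you plug in.

    # Minimal sketch of a cloud abstraction layer, using Apache Libcloud.
    # The same list/inspect calls work across providers; only the driver and
    # credentials change. The keys and region below are placeholders.
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    def get_cloud(provider=Provider.EC2):
        """Return a connected driver; swap the Provider constant to change vendor."""
        driver_cls = get_driver(provider)
        return driver_cls("ACCESS_KEY", "SECRET_KEY", region="us-east-1")

    if __name__ == "__main__":
        cloud = get_cloud()
        for node in cloud.list_nodes():
            print(node.name, node.state, node.public_ips)

Because the application talks to the abstraction rather than to one vendor’s API, moving between providers becomes mostly a matter of changing the driver and credentials rather than rewriting code.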

More than three quarters of enterprises that are already using the cloud have a multi-cloud approach, meaning they’re using multiple suppliers and models, according to a report by RightScale (PDF).

“Avoiding cloud lock-in is an important goal for many companies,” says RightScale’s Kim Weins. “Companies are using multi-cloud management platforms to work with different cloud APIs and different cloud behaviors. This gives them a choice of cloud and portability of their applications over time.”

The competition this multi-cloud approach creates should drive down costs, as vendors are forced to price competitively. And as cloud abstraction makes it easier to switch seamlessly between providers, it should put strong pressure on infrastructure companies to keep performance and reliability high.

How to overcome fear of the cloud

Being afraid can be a useful emotion if it’s managed correctly, so don’t worry about a healthy amount of fear. After all, it kept our ancestors from drowning or being eaten. However, it can also lead to missed opportunities for your business if it turns from fear into phobia.

As with any type of fear, the only way to overcome a fear of cloud computing is to confront it, ideally in a safe and gradual way. If you’re scared of water, you probably wouldn’t go swim the English Channel right away, so it makes sense to dip your toe into cloud computing too.

Start by trialling a redeployment of an internal system that doesn’t contain any confidential customer data. Then gradually take the plunge with a non-critical public system. This will allow all parts of the business, from IT to legal, to grow comfortable with the move to cloud computing.

The benefits cloud computing brings are too real to ignore, but that doesn’t mean the risks aren’t real too. Ultimately, though, the biggest risk is being outcompeted by rivals who are quicker to adopt the cloud.

Image credits: Thinkstock
