Is Private Cloud Dead?

Chris Evans

Anyone who followed AWS re:Invent last week will have seen a clear theme running through most of the announcements made at the event.  AWS is focused on bringing services to the public cloud that concentrate on machine learning, AI and generating insight from large volumes of data.  The AWS team seems able to deliver new services at a rate no one else in the industry can match, although Microsoft and Google are following close behind.  With so much focus on services in the public cloud, is private cloud set to die off?

Predictions

IDC has just announced its Worldwide Whole Cloud Forecast, 2017-2021.  This is paid research (to which I have no access); however, there are some nuggets of information available in the press release.  By 2021, spending on cloud will reach $554 billion, double the figure for 2016.  The public cloud portion of that spend will increase from 41% in 2016 to 48% in 2021, and the hyper-scalers will account for 76% of the hardware and software spend on infrastructure.  There are plenty of other examples of the trend towards public cloud adoption.  Morgan Stanley, for example, predicted last year that 30% of Microsoft's revenue would come from cloud by 2018.

Numbers can be cut in lots of different ways, but the message is clear: we're headed towards cloud in a big way.  Where does private cloud fit into all this, and does it have a future?  At NetApp Insight in Berlin, I spoke to Anthony Lye, Senior VP and GM of NetApp's Cloud Business Unit.  You may have seen him on stage with Microsoft in the day-two keynote, where Joe CaraDonna presented Cloud Orchestrator.  Mr Lye's radical prediction is that private cloud will die out completely.  This seems at odds with the position NetApp currently finds itself in, namely that of an infrastructure provider.

Public Cloud Innovation

The premise for the prediction of the death of private cloud is the rate of innovation we see from public cloud providers.  New data-centric services are being developed and delivered rapidly by public cloud vendors.  As an example, AWS announced the following new services at re:Invent (a minimal usage sketch follows the list):

  • SageMaker – an online tool for developers to build machine learning software.
  • Rekognition Video – an API for facial recognition and object recognition in live video.
  • Transcribe – an API to convert audio into punctuated text.
  • Comprehend – an API for performing sentiment analysis on text.
  • Translate – an API for performing language translations.

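To make the "API" nature of these services concrete, here is a minimal sketch of how a developer might call one of them using the AWS SDK for Python (boto3).  This is my own illustration rather than anything from the announcements, and it assumes AWS credentials and a region are already configured:

    import boto3

    # Comprehend exposes sentiment analysis as a simple API call.
    comprehend = boto3.client("comprehend", region_name="us-east-1")

    result = comprehend.detect_sentiment(
        Text="The re:Invent announcements were impressive.",
        LanguageCode="en",
    )

    # The service returns POSITIVE, NEGATIVE, NEUTRAL or MIXED,
    # along with a confidence score for each category.
    print(result["Sentiment"])
    print(result["SentimentScore"])

The point is that consuming machine learning this way requires no models, no training infrastructure and no specialist hardware on the customer side.
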
As an example of the speed of innovation, look at AWS Polly, a text-to-speech service that was announced at re:Invent last year.  Then look at the AWS AI blog for Polly to see how the technology is being used and what new features are being released, pretty much on a monthly basis.  Amazon is able to deploy new features and releases faster than any business could manage that process on-premises.
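
For reference, a basic Polly request via boto3 looks something like the sketch below (again my own illustration; the voice ID and output file name are just examples):

    import boto3

    # Polly turns text into spoken audio in a choice of voices.
    polly = boto3.client("polly", region_name="us-east-1")

    response = polly.synthesize_speech(
        Text="Is private cloud dead?",
        OutputFormat="mp3",
        VoiceId="Joanna",  # one of Polly's built-in voices
    )

    # The synthesised speech comes back as a binary stream.
    with open("speech.mp3", "wb") as audio_file:
        audio_file.write(response["AudioStream"].read())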

Time to Market

This is the key point.  AWS, Google, Azure & Co. can bring these innovations to market faster than any IT department could, so why bother running them on-premises yourself?  That assumes the AI software is even available for installation in the private data centre; in many cases, I suspect AWS is leading the market with some of its product features.

What About The Data?

Now here’s where things get tricky.  The hyper-scalers would love you to commit completely to the public cloud, but putting all your eggs in one basket is a risky business.  By eggs, of course, I mean data.  Data is becoming a core asset for many companies, and whether through paranoia, regulation or compliance, many organisations will want to keep their data on-site.  This is one direction in which private cloud is headed: a repository for the core assets and services businesses don’t want to trust anywhere else, with data simply made available to public cloud services for analysis.  The other area, I think, is bespoke services the public cloud chooses not to provide.  That might be high-speed trading or some other form of HPC that is unlikely to be used by all businesses.

The Architect’s View

I can see a future where private cloud and private data centres continue to shrink.  As with the mainframe, the private side won’t be eliminated entirely, but it will exist in a niche form.  Nothing ever truly dies in IT; there’s always someone, somewhere using punched cards.  The private data centre will become just that: a place to keep our data, with a few niche services thrown in for good measure.

What do you think?  Is the timescale for public cloud adoption too aggressive?  Do you think there will be a swing back to on-prem, or is the private data centre really doomed?

Update (December 2019)

I re-read this post with a mix of irony and surprise.  In the two years since it was written, AWS has announced and released Outposts, an on-premises implementation of the AWS infrastructure, along with two related features, Local Zones and Wavelength.  This is a tacit acceptance that, for whatever reason (latency and ownership challenges among them), data and applications can’t all live in the AWS data centre.

However, these services are not private cloud as we would expect, but rather a workaround that moves applications onto physical infrastructure owned by AWS.

It is still true, though, that AWS innovates at a rapid pace, this year (2019) announcing a whole set of updates to SageMaker, its ML/AI framework for developers.  The premise of this post remains true, but as usual, a more complex picture is emerging.  The public/private debate is not over yet.
