The announcement of Protégé from HYCU puts into focus the ongoing transition from data protection to data management across public and private clouds. We’re starting to see different strategies emerge: either all-SaaS solutions, or platforms like Protégé that offer a mix of consumption models. This raises the question: exactly how will we consume data protection and management services in the future? Is there a clear direction to take?
HYCU Protégé is an integration of the data protection services already offered by HYCU. The solution started as data protection for Nutanix environments and has since expanded to cover VMware and the Google Cloud Platform. HYCU is a pure software solution that can be driven entirely via API, making it practical to implement alongside application deployments in a multi-tenant configuration.
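To make the API-driven point concrete, here is a minimal sketch of how a provisioning pipeline might attach a backup policy to a newly deployed VM as part of the same automation run. The endpoint path, field names and policy name below are hypothetical illustrations, not HYCU's actual REST API; a real integration would follow the vendor's published API reference.

```python
import json

# Hypothetical controller URL and API path -- for illustration only.
HYCU_API = "https://hycu-controller.example.com/rest/v1.0"

def assign_policy_request(vm_uuid: str, policy_name: str) -> dict:
    """Build a request describing 'protect this VM with this policy'.

    Returns the method, URL and JSON body a deployment pipeline would
    send (e.g. with requests.post(url, json=body, headers=auth)) right
    after the VM itself is created, so protection is never an afterthought.
    """
    return {
        "method": "POST",
        "url": f"{HYCU_API}/vms/{vm_uuid}/policy",  # hypothetical path
        "body": {"policyName": policy_name},
    }

req = assign_policy_request("vm-uuid-1234", "Gold-4h-RPO")
print(json.dumps(req, indent=2))
```

Because everything is expressed as API calls, each tenant in a multi-tenant environment can run the same step with its own credentials and policies, which is the practical benefit of an API-first product.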
Protégé provides a unified dashboard for managing all HYCU deployments, with the added ability to move workloads between clouds, either for migration or disaster recovery purposes.
We’re starting to see multiple strategies emerging in hybrid multi-cloud data protection. HYCU deploys software on multiple platforms, with aggregated management. Vendors including Cohesity and Rubrik have followed a similar path, starting with on-premises appliances and transitioning towards solutions with more emphasis on software.
- Commvault Announces Metallic SaaS Data Protection
- Gaps in Cloud Native Data Protection
- Rubrik, Cohesity and the battle for NoSQL Backup
Druva and Clumio have headed down the pure SaaS route in the public cloud, with little or no software deployed on-premises, except for some local proxy code. Commvault has yet to provide a joined-up solution but has entered both the appliance and SaaS market in recent years.
For businesses with highly centralised IT organisations (such as one or two large data centres), the SaaS model isn’t as attractive as other solutions, because all of the backup data has to move on/off-premises, demanding big network pipes. These companies will probably see more benefit in modernising their on-premises solutions before looking into public SaaS offerings.
Of course, if your company strategy is to move to the public cloud, it makes sense to move data protection there early, and to use it as a route to migration.
For more diversified businesses with many data centre locations (including small branch-type operations), the SaaS model centred on the public cloud looks to be a great solution. In this configuration, each site or location needs a connection to the local cloud point of presence. The SaaS provider exposes the same interface and features to each site while handling the back-end consistency of the backed-up data.
The software-based model works well here too, especially in virtualised environments where the data protection solution runs as a virtual machine or instance.
There’s more to technology architectures than simply being centralised or dispersed. Long-established businesses are likely to have deployed a range of technologies that today span physical, virtual and now containerised frameworks. Even for relatively new companies, the choice to use a variety of application solutions is seen as essential to agile delivery.
This means data protection has to address the heterogeneous nature of most IT organisations. Should that be done through a single monolithic platform based in the public cloud or through a backup solution that provides a more dispersed implementation, with localised backup/recovery capabilities?
- Nutanix Mine Puts backup Software Vendors on a Level Playing Field
- Exploiting secondary data with NDAS from NetApp
- Data Protection in a Multi-Cloud World
This specific differentiation is significant because IT organisations use a range of different deployment scenarios. In 2018, we talked about how HCI data protection differs from protecting virtual server environments. Essentially, the features offered by the HCI vendor allow more efficient backups. This efficiency is exemplified by the announcement of Mine from Nutanix. This level of depth in data protection also applies to application platforms like databases. Application developers will want the ability to back up and restore data from database platforms, whether that database runs on-premises on a physical machine, in a VM, in a container, or as a managed service in the public cloud.
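The database point can be illustrated with PostgreSQL: a logical backup taken with the standard pg_dump utility is built the same way whether the database runs on bare metal, in a VM, in a container, or as a managed cloud service (only the endpoint changes). A minimal sketch, assuming network access and credentials are in place (hostnames and names below are placeholders):

```python
import shlex

def pg_dump_command(host: str, port: int, database: str, user: str,
                    outfile: str) -> list:
    """Build a pg_dump invocation for a database at any location.

    The same command protects the application's data regardless of
    the platform underneath -- only 'host' and 'port' differ between
    a physical server, a container's published port, or a managed
    service endpoint.
    """
    return [
        "pg_dump",
        "--host", host,
        "--port", str(port),
        "--username", user,
        "--format", "custom",   # compressed archive, restorable with pg_restore
        "--file", outfile,
        database,
    ]

# Placeholder endpoint; swap in a container port or managed-service DNS name.
cmd = pg_dump_command("db.example.com", 5432, "orders", "backup_user",
                      "orders.dump")
print(shlex.join(cmd))
```

This application-level consistency is exactly what makes a platform-spanning data protection product attractive: the differences between deployment scenarios can be absorbed by the tool rather than by the developer.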
Clearly, the choice of platform depends on the requirements of the business. Solutions such as Druva and Clumio allow companies to move away from running an on-premises data protection infrastructure entirely. If the long-term goal for the business is to move to public cloud (or use on-premises IaaS), then these solutions can work well.
If backup mobility is an important criterion, then a solution like HYCU is potentially more useful because the data in the backup target can be moved around and restored elsewhere on demand. This data could continue to be used, for example, to move historical backups with an application into (or out of) the public cloud.
The Architect’s View
In 2020 I can see both primary data and secondary (backup) data becoming increasingly mobile and platform-independent. Whether IT organisations use centralised cloud-based SaaS offerings or not, there’s still a degree of lock-in or dependence on specific data protection solutions, because there is no independent and universally accepted format for backup images.
Perhaps this is a gap that needs closing in 2020, either through standards for backup formats or through tools that convert backups between formats. Even if independent backup standards don’t emerge this year, the coming decade will definitely be about data management and data standards, rather than the infrastructure on which that data sits.
Post #ad8d. Copyright (c) 2020 Brookend Ltd. No reproduction in whole or part without permission.
Disclaimer: HYCU, Druva and NetApp have been or are currently clients of Brookend Ltd.