veritas-blog-2

The Future of Data Protection

Enterprises to spend 56% more of their IT budgets on cloud technologies by 2019.
The cloud momentum

As I meet with customers, most of whom are large global enterprises, the topic of the cloud continues to come up. Getting cloud right gives them new ways to stay competitive and stand out in their respective markets. For example, moving test/dev operations to the cloud has allowed many organizations to reap the benefits of increased productivity, rapid product delivery and accelerated innovation. Another benefit the cloud provides is on-demand infrastructure that can serve as a landing zone for business operations in the event of a disaster.

No longer do IT staff have to spend countless hours installing a set of SQL Server, DB2 or Oracle servers to run in-house databases, CRM or analytics platforms. Databases are offered as services that are ready for the largest, most demanding data warehouse needs, and the ability to add analytics capabilities on top gives organizations more opportunities to gain insights from their data. Additionally, companies have choice. Subscribing to multiple services from multiple cloud vendors simultaneously to test products or services in real time, paying only for the resources that are actually consumed, is hugely beneficial.

It’s this increased agility companies are after, and it is what allows them to grow faster and better meet the needs of their customers.

Persisting concerns

But of course, there’s still quite a bit of uncertainty when it comes to cloud, which causes concern. Some of the most common concerns I hear about are related to data protection and service interruptions. There’s a fear of accidentally deleting critical data, being held hostage to ransomware, and the risk of application or resource failure. There’s also a general misunderstanding regarding how much of the responsibility for addressing these concerns sits with customers versus cloud providers.

Traditionally, the perception was that because servers and data were ‘tucked away’ safe and sound within the confines of the on-premises data center, those concerns were more easily addressed. In the cloud, that’s not the case. When the data center moves to the cloud, rows and rows of 42U racks filled with blades and towers transform into on-demand cloud instances that can be spun up or down at will. For many, this creates a sense of ‘losing control’.

Some argue that the risks actually increase when you move to the cloud and no longer own the resources, but we believe those risks can be minimized, without sacrificing the rewards.

The trick here is to keep things simple, especially for IT teams that are responsible for protecting company data – wherever that data is stored. And that’s an important point, because it’s not an either/or conversation. According to RightScale’s 2018 State of the Cloud survey, 51% of enterprises operate with a hybrid strategy and 81% are multi-cloud. This supports the view that, for most large enterprise customers, clouds will exist alongside an existing on-premises data center strategy. Adding more point solutions that create silos is a losing strategy. So are platform-specific technologies that are inflexible and do not account for the persistently heterogeneous, hybrid nature of enterprise IT environments.

Veritas has you covered

In the midst of this cloud evolution, Veritas has taken its years of data management expertise and leadership and developed Veritas CloudPoint, a data protection technology that is cloud-native, lightweight and flexible, yet robust, with core enterprise-grade data protection capabilities that can be extended to protect workloads in public, private, and hybrid cloud infrastructures. Veritas CloudPoint can easily be introduced to your AWS, Google Cloud, Microsoft Azure, or data center environments. Using the available cloud infrastructure APIs, CloudPoint delivers an automated and unified snapshot-based data protection experience with a simple, intuitive, and modern UI. Figure 1 below shows the basics of how it works.

Figure 1 
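
To make the idea concrete, here is a rough sketch of the kind of cloud infrastructure API call that snapshot-based protection automates, using the AWS EC2 API via boto3. This is not CloudPoint code; the volume ID, region and tags are hypothetical placeholders.

    # Illustrative only: the kind of snapshot call a cloud-native data
    # protection tool automates. Not CloudPoint code; the volume ID and
    # region are hypothetical placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.create_snapshot(
        VolumeId="vol-0123456789abcdef0",  # hypothetical EBS volume
        Description="Nightly crash-consistent snapshot",
        TagSpecifications=[{
            "ResourceType": "snapshot",
            "Tags": [{"Key": "policy", "Value": "daily-7d-retention"}],
        }],
    )
    print("Snapshot started:", response["SnapshotId"])

CloudPoint layers automation and a unified UI on top of per-cloud APIs like this one, so the example only hints at the lowest-level building block.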

But that is just the tip of the iceberg…

With the recent Microsoft and Google press releases announcing version 2.0 of Veritas CloudPoint, we have expanded the reach of CloudPoint to VMware environments and added support for high-performance, on-premises databases such as MongoDB.

We are already working on our next release of CloudPoint, targeted for availability in the coming quarters, in which we plan to add cloud support for VMware Cloud on AWS and IBM Cloud. For private cloud environments, we plan to offer VM-level and application-level support for Microsoft’s private cloud platform, Azure Stack. We already announced in-guest support for Azure Stack with Veritas NetBackup earlier this year.

And, staying consistent with my comment above about point solutions and platform-specific solutions being a losing strategy, we plan to integrate CloudPoint with the next release of Veritas NetBackup (see Figure 2 below). This should be welcome news for NetBackup customers in particular, as they will have an integrated way to address data protection requirements in the most optimized way possible, without adding more silos, no matter where their workloads run. But I’ll save the details and specifics for my next blog!

Figure 2 

Be on the lookout for more news in the coming months.

[1]Forward-looking Statement: Any forward-looking indication of plans for products is preliminary and all future release dates are tentative and are subject to change at the sole discretion of Veritas. Any future release of the product or planned modifications to product capability, functionality, or feature are subject to ongoing evaluation by Veritas, may or may not be implemented, should not be considered firm commitments by Veritas, should not be relied upon in making purchasing decisions, and may not be incorporated into any contract. The information is provided without warranty of any kind, express or implied.

suse

5 Steps to Getting Started with Open Source Software Defined Storage and Why You Should Take Them

Executive Summary

Back in 2013, analyst group IDC calculated that the total amount of data created and replicated in the world had edged beyond 4.4 zettabytes – a staggering number. The statement made headlines and was widely repeated across media websites dealing with Big Data and the related storage issues. At the time, IDC attributed the enormous growth to approximately 11 billion connected devices, all generating and transmitting data, many containing sensors that also generate data.

IDC also predicted that the number of connected devices would triple to 30 billion by 2020, before nearly tripling again to 80 billion a few years later. If you’ve ever wondered what analysts mean by ‘exponential’ data growth, this is what they are talking about, and the growth keeps on coming; even the forecasts for data growth are growing. Three years later, in 2016, IDC revised its predictions upwards, forecasting that by 2025 the total volume of data stored globally would hit 180 zettabytes. Divide 180 by 4.4 and you have a staggering growth rate of roughly 40x in little more than a decade.

Of course, not all of that data is made by enterprises, but IDC says they are responsible for 85% of it at some point in its lifecycle. So, whilst enterprises might not make all the data, and might not drive all its growth, they still have to architect and manage storage systems that can cope with the multiple challenges it brings.

OPERATIONAL CHALLENGES: VOLUME GROWTH, DIGITAL TRANSFORMATION AND ANALYTICS

Storage costs may have come down a lot in recent years, but the operational issues associated with managing storage keep piling up. Systems reach capacity and must be replaced. The surrounding architecture is shifting as organisations undergo digital transformation and migrate to hybrid and public cloud environments. Decisions must be made about what data should be kept and what should be deleted – decisions which must stay on the right side of the law, and which revolve not only around the data itself but around the value of that data to the enterprise. That is a bigger challenge than some might think, because the financial potential in data is not always clear to the IT team, who are, after all, better placed to understand volume than value – a shortcoming which can lead to the enterprise equivalent of judging the complete works of Shakespeare by the number of pages in the book.

There are also substantial problems that come from moving large data sets over limited network links: backup routines with shrinking windows, replication and recovery challenges that grow as disk failures become more frequent, the sheer volume of unstructured data such as video, security and compliance demands, making data available for analytics, and, for many, the ongoing cost of the skilled technical staff needed to manage it all.

These challenges aren’t going away: like your data, they are only going to get bigger. Unsurprisingly, enterprises are turning to software defined storage as the solution. Indeed, IT Brand Pulse predicts that not only will SDS overtake traditional storage by 2020, but that 70 to 80% of storage will run on less expensive or commodity hardware managed by software in the same timeframe. If software defined storage is the answer to this challenge, why SUSE?

SEVEN REASONS WHY YOU SHOULD CHOOSE OPEN SOURCE SDS FROM SUSE

Open source software defined storage on the Ceph platform offers several key advantages:

• Cost reduction through elimination of proprietary software licensing costs
• Avoidance of proprietary vendor lock-in
• Reduction of hardware costs by moving to commodity hardware
• Support for Object, Block and File and key protocols on a single platform
• Scale out infrastructure – simply add new servers and nodes as capacity increases
• Service, support and management to mitigate risks and control operational cost
• Consistent innovation and first-to-market roadmap improvements

GETTING STARTED WITH OPEN SOURCE SOFTWARE DEFINED STORAGE

1. Start small. Storage administrators are rightly risk-averse, so choose a first deployment where you can prove the value in terms of cost reduction without putting mission-critical data or processes at risk.

2. Find the right use cases. Good applications for Ceph Jewel include unstructured data such as video footage, where the sheer volume of data presents challenges in cost, capacity, backup and retention – simply keeping video files available into the mid-term can be a struggle. Another good example is the cold store, where Ceph can be cheaper than services like Amazon Glacier in terms of dollars per GB, yet remains on-premises and avoids hidden retrieval costs should you need your data back quickly (see the object-storage sketch after this list).

3. Scale your usage with your skillset. As with any new technology, it takes time to become familiar with Ceph and to build skills and confidence – both your own and your organisation’s. Grow your deployment in line with your knowledge and capability.

4. Align your strategy for storage with your strategy for the data centre – it’s not only storage that is moving to software defined. Consider what your infrastructure will look like in the future as enterprises move towards software-defined everything. How will your data centre look in five years’ time?

5. Seek expert help when and where you need it. As you move from the periphery to the centre, complexity and risk increase – manage that risk and maximise the benefits by working with skilled third parties.
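
As referenced in step 2, Ceph exposes an S3-compatible object interface through the RADOS Gateway, so archiving bulky unstructured data such as video footage can be done with standard S3 tooling against your own cluster. The sketch below is illustrative only; the endpoint, bucket, credentials and file name are hypothetical placeholders.

    # Illustrative sketch: archiving a video file to a Ceph cluster via the
    # S3-compatible RADOS Gateway. Endpoint, bucket, credentials and file
    # name are hypothetical placeholders for your own deployment.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://rgw.storage.example.local:7480",  # RADOS Gateway
        aws_access_key_id="CEPH_ACCESS_KEY",
        aws_secret_access_key="CEPH_SECRET_KEY",
    )

    s3.create_bucket(Bucket="video-archive")
    with open("camera-07-footage.mp4", "rb") as footage:
        s3.put_object(Bucket="video-archive",
                      Key="2018/06/01/camera-07.mp4",
                      Body=footage)

The same interface works for a cold store; only the bucket layout and retention approach change.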

Veritas-NetBackup-2

Top Reasons to Use Veritas NetBackup 8.1 Data Protection for Nutanix Workloads

The continual growth of data increases the use of virtualization and drives the need for highly scalable data protection and disaster recovery solutions. As a result, organizations are turning to hyperconverged solutions as a way to keep deployment and management of their infrastructure simple, by managing the entire stack in a single system. And as more and more organizations adopt hyperconverged infrastructure, they are moving their mission-critical data and applications to it.

Read how you can protect modern workloads in hyperconverged environments with Veritas NetBackup™ 8.1, including the Parallel Streaming Framework, which simplifies modern workload backup and recovery and delivers the performance required to accelerate the transformation to the digital enterprise.

1. DATA PROTECTION FOR SIMPLE, EFFICIENT HYPERCONVERGED INFRASTRUCTURES.

According to Stratistics MRC1, the Global Hyperconverged Infrastructure (HCI) Market accounted for approximately $1,460 million in 2016 and is expected to reach $17,027 million by 2023, growing at a CAGR of 42.0 percent from 2016 to 2023. Nutanix is the clear market leader in the HCI space.

Hyperconverged is about keeping IT simple. Data protection should be too. Veritas NetBackup 8.1 with the Parallel Streaming Framework takes a multi-node infrastructure running Nutanix Acropolis and AHV and streams backups from all nodes simultaneously. This is a unique way of backing up Nutanix. In fact, we have partnered with Nutanix to certify protection of those workloads on HCI.

2. ELIMINATE POINT PRODUCTS IN HIGHLY VIRTUALIZED NUTANIX AHV ENVIRONMENTS.

NetBackup, the market leader in enterprise backup and recovery software, delivers unified data protection for Nutanix AHV virtual environments to enterprises of any size, with proven enterprise scalability, automated VM protection, and strong performance. Together, Veritas and Nutanix deliver an integrated, hyperconverged solution that eliminates silos.

3. ON-DEMAND, AGENTLESS, DOWNLOADABLE PLUGIN ARCHITECTURE.

Commvault and Veeam require dedicated resources on a Nutanix server. NetBackup Parallel Streaming technology, with scale-out, agentless workload plugins, can be used to efficiently protect virtual machines in Nutanix HCI or other hyperconverged cluster environments. The backup environment can be scaled in the same fashion as the production environment it is protecting. The Nutanix plugin is available on-demand for as many backup hosts as you select. No agents, clients, or software are installed on the cluster itself.

4. REDUCED RISK WITH RECOVERY OF POINT-IN-TIME HISTORICAL DATA.

Unlike other major competitive products, NetBackup 8.1 with Parallel Streaming technology enables customers to perform point-in-time backups while eliminating the need for an extra replication cluster, and at lower cost. Snapshots alone do not give you true point-in-time history, so you need a data protection solution that helps you quickly retrieve historical data without worrying about replicating human errors. Ensure that you can consistently meet SLAs and compliance mandates.

5. CHOICE OF HARDWARE, HYPERVISORS, AND CLOUD CONNECTORS.

Veritas protects petabyte-scale workloads running on hyperconverged infrastructure and offers a choice of hardware, hypervisor or cloud vendors.

Simplify backup with our Veritas Flex appliance to create a streamlined solution, or use the cloud as another storage tier for data. NetBackup has 40+ fully tested cloud connectors, which enable customers to leverage multiple clouds for long-term retention.

gemalto-cloud-security

Cloud Security: How to Secure Your Sensitive Data in the Cloud

In today’s always-connected world, an increasing number of organisations are moving their data to the cloud for operational efficiency, cost management, agility, scalability, etc.

As more data is produced, processed, and stored in the cloud – a prime target for cybercriminals who are constantly looking to get their hands on organisations’ sensitive data – protecting the sensitive data that resides in the cloud becomes imperative.

Data Encryption Is Not Enough

While data encryption definitely acts as a strong deterrent, merely encrypting the data is not enough in today’s perilous times, when cyber attacks are getting more sophisticated with every passing day. Since the data physically resides with the cloud service provider (CSP), it is out of the direct control of the organisations that own it.

In a scenario like this, where organisations encrypt their cloud data, storing the encryption keys securely and separately from the encrypted data is of paramount importance.
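
To make that separation concrete, here is a minimal, generic sketch of envelope encryption, in which the per-object data key is itself wrapped by a key-encryption key (KEK) that stays with the organisation rather than the CSP. It is illustrative only, not a Gemalto/SafeNet API, and the keys and payload are placeholders.

    # Illustrative envelope-encryption sketch (not a Gemalto/SafeNet API):
    # the data key encrypts the payload, only the *wrapped* data key is
    # stored alongside the ciphertext, and the key-encryption key (KEK)
    # remains with the organisation, outside the cloud provider.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    kek = AESGCM.generate_key(bit_length=256)       # held in your own key manager
    data_key = AESGCM.generate_key(bit_length=256)  # per-object data key

    # Encrypt the payload with the data key.
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, b"sensitive record", None)

    # Wrap the data key with the KEK; store only the wrapped key in the cloud.
    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(kek).encrypt(wrap_nonce, data_key, None)

    # To decrypt: unwrap with the KEK (never stored with the data), then decrypt.
    recovered_key = AESGCM(kek).decrypt(wrap_nonce, wrapped_key, None)
    assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"sensitive record"

Without the KEK, the ciphertext and the wrapped key reveal nothing, which is exactly why keeping the keys apart from the data matters.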

Enter BYOK

To ensure optimal protection of their data in the cloud, an increasing number of organisations are adopting a Bring Your Own Key (BYOK) approach that enables them to securely create and manage their own encryption keys, separate from the CSP that hosts their sensitive data.

However, as more encryption keys are created for a growing number of cloud environments like Microsoft Azure, Amazon Web Services (AWS), Salesforce, etc., efficiently managing the encryption keys of individual cloud applications and securing access to them becomes very important. That is why many organisations use External Key Management (EKM) solutions to cohesively manage all their encryption keys in a secure manner, protected from any unauthorised access.

Take the example of Office 365, Microsoft’s on-demand cloud suite that is widely used by organisations across the globe to support employee mobility by facilitating anytime, anywhere access to Microsoft’s email application, Outlook, and business productivity applications like Word, Excel and PowerPoint.

Gemalto’s BYOK solutions (SafeNet ProtectApp and SafeNet KeySecure) for Office 365 not only ensure that organisations have complete control over their encrypted cloud data, but also seamlessly facilitate efficient management of the encryption keys of other cloud applications like Azure, AWS, Google Cloud and Salesforce.

Below is a quick snapshot of how SafeNet ProtectApp and SafeNet KeySecure work with Azure BYOK; an illustrative code sketch follows the steps:

1. SafeNet ProtectApp and KeySecure are used to generate an RSA key pair of the required key size using the FIPS 140-2 certified RNG of KeySecure.

2. The SelfSignedCertificateUtility.jar utility (a Java-based application) then interacts with KeySecure over a TLS-protected NAE service to fetch the key pair and create a self-signed certificate.

3. The key pair and self-signed certificate are stored securely in a PFX (P12) container that encrypts its contents using a Password-Based Encryption (PBE) key.

4. The PFX file (an encrypted container protected by the PBE key) is then uploaded to Azure Key Vault using the Azure REST API.

5. The transmission of the PFX file to Azure Key Vault is protected by the security mechanisms Azure implements on its web API (TLS/SSL, etc.).

6. Since the PFX files will be located on the same system on which the SelfSignedCertificateUtility.jar utility is executed, industry-best security practices, such as requiring pre-boot approval and enabling two-factor authentication (2FA), should be followed on that system.

7. Once the keys are loaded into Azure Key Vault, all encryption operations happen on the Azure platform itself.
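
As promised above, the steps can be approximated with generic tooling. The sketch below is illustrative only: it is not the SafeNet utilities themselves (in the real flow the key pair comes from KeySecure’s FIPS 140-2 certified RNG), and the vault URL, names and passphrase are hypothetical placeholders.

    # Illustrative approximation of steps 1-5 using generic libraries.
    # Not SafeNet ProtectApp/KeySecure code; all names are placeholders.
    import datetime
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa
    from cryptography.hazmat.primitives.serialization import pkcs12
    from cryptography.x509.oid import NameOID
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.certificates import CertificateClient

    # Step 1: generate an RSA key pair (stands in for KeySecure's hardware RNG).
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Step 2: create a self-signed certificate for the key pair.
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"byok-demo")])
    now = datetime.datetime.utcnow()
    cert = (
        x509.CertificateBuilder()
        .subject_name(name).issuer_name(name)
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .sign(key, hashes.SHA256())
    )

    # Step 3: package key and certificate in a password-protected PFX (PKCS#12).
    pfx_password = b"use-a-strong-passphrase"
    pfx_bytes = pkcs12.serialize_key_and_certificates(
        b"byok-demo", key, cert, None,
        serialization.BestAvailableEncryption(pfx_password),
    )

    # Steps 4-5: upload the PFX to Azure Key Vault over TLS.
    client = CertificateClient(
        vault_url="https://example-vault.vault.azure.net",
        credential=DefaultAzureCredential(),
    )
    client.import_certificate(
        certificate_name="byok-demo",
        certificate_bytes=pfx_bytes,
        password=pfx_password.decode(),
    )

In the production flow described above, the key pair originates from KeySecure’s certified RNG rather than a local library, and the system running the utility must be hardened, as step 6 notes.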