netapp-blog8

Reinvent Your Modern IT with the latest AFF NVMe All-Flash Storage

As you embark on your digital transformation journey, modernizing and simplifying IT to accelerate business-critical applications is the new imperative. NetApp has been at the forefront of helping you modernize your data center and has been recognized as a storage leader. Although this recognition is an outstanding achievement, NetApp is not resting on its laurels. Instead, we are raising the standard with the launch of the new NetApp® AFF A400 system.

Primary storage that’s simple, fast and intelligent. Discover why NetApp has been named a Leader in the 2019 Gartner Magic Quadrant for Primary Storage. #DataDriven https://t.co/HYdxxlElNO

— NetApp (@NetApp) September 23, 2019

The new AFF A400 offers the power of secure data acceleration in an AFF storage system for the first time. The AFF A400 data acceleration capability offloads storage efficiency processing, thus delivering significantly higher performance. This capability enables you to capitalize on the exploding data growth from emerging technologies such as artificial intelligence (AI), machine learning (ML), and big data. You can also harness the benefits of this data acceleration technology for enterprise apps and databases.

In addition to the new data acceleration technology, the AFF A400 system enables you to modernize SAN deployments with ultralow latency, end-to-end NVMe. The front-end NVMe/FC makes it possible to achieve the performance, scale, and operational efficiency goals of emerging workloads such as AI/ML, real-time analytics, and MongoDB. AFF A400 self-encrypting disk (SED) technology allows you to easily incorporate data-at-rest encryption in all your deployments with either the onboard key manager (OKM) or Key Management Interoperability Protocol (KMIP) servers. Finally, while enabling the latest end-to-end NVMe benefits, the AFF A400 also provides investment protection for AFF customers who want to continue to use existing SAS-attached SSD storage.

Today, we are also taking a giant leap forward in simplifying storage array management with the new NetApp ONTAP® System Manager. ONTAP System Manager is easy to use and insightful: With an intuitive GUI and smart defaults, it gives IT generalists an extremely simplified way to accomplish key storage tasks such as day-zero setup, storage provisioning by service levels, data protection, and performance awareness. Workflow optimizations such as the performance dashboard allow a quick and easy view of performance data for up to 1 year. And an intuitive global search, sort, and filtering capability at scale enables you to quickly find useful information.

In fact, we are simplifying storage across the board, starting with how the AFF A400 system is sold, configured, managed, and supported. The new ONTAP software offerings available with the AFF A400 are designed to allow you flexibility to acquire software based on your deployment requirements. We are also introducing new support offerings with flat pricing for the life of the system that also include a new digital advisor with predictive capabilities and a high-touch support tier, setting a new standard.

As part of a digital transformation to modernize mission-critical applications running on databases such as Oracle and SQL Server, customers are looking for a cost-effective, no-compromise resiliency solution that is future-proof and has an easy path to the cloud. The new NetApp AFF All SAN Array offers the rich data management capabilities of ONTAP, and its continuous data availability, delivered through symmetric active-active host connectivity and the instantaneous failover capabilities of ONTAP 9.7, makes it the premier choice for anyone looking for investment protection with future-proof products and services. The All SAN Array is a new AFF configuration tuned for mission-critical SAN applications, bringing resiliency capabilities traditionally found in high-end SAN arrays to the new midrange system.

After testing the new NetApp AFF A400 storage system, World Wide Technology Solutions Architect Chad Stuart summarized its capabilities perfectly: “The AFF A400 combines NetApp’s leadership in end-to-end NVMe storage performance with simple data management of the new ONTAP System Manager. This makes the A400 a top choice for our customers as they upgrade their IT infrastructure for their critical workloads running in a hybrid multicloud world.”

With the AFF A400 system, NetApp is introducing a storage array that lets you modernize your IT. It merges the latest data acceleration technology with the ultralow latency of end-to-end NVMe storage. You’ll continue to get the benefits of the industry-leading cloud integration and KeyStone consumption model to advance your journey into digital transformation. You can confidently start using AFF A400 systems for all applications and workloads without compromising on the reliability, availability, and rich data management capabilities that you’ve come to expect from NetApp technology. To learn more about NetApp AFF NVMe all-flash storage, visit the AFF A-Series All Flash Arrays page.

Source: https://blog.netapp.com/reinvent-your-modern-it-with-the-newest-aff-nvme-all-flash-storage/?linkId=100000008767891

netapp-blog7

FlexPod: The Converged Infrastructure Swiss Army Knife for Your Data Center

More than a hundred years ago (1880), the Swiss army no longer wanted to send its soldiers into the field with a multitude of individual tools such as knives, saws, screwdrivers (for dismantling rifles), can openers, and awls, because they were cumbersome, time-consuming, and ultimately easy to lose. This problem led to the idea of designing an extremely practical, resilient, and flexible multi-tool that would support soldiers in their daily exercises and any military tasks. Thus the validated Swiss Army Knife from Victorinox was born, and it has since been developed further in many variants.

We see similar requirements in the adventure of digital transformation, which has drastically changed the way companies serve markets with their products or services. When you look at the number of applications, frameworks, and software variants in IT, it quickly becomes clear that, as with a complex adventure, you can no longer use a multitude of tools in cumbersome silo architectures. A kind of Swiss Army Knife for IT is needed.

This is where flexible converged infrastructures come into play. IDC recently conducted a survey to evaluate the usability and market acceptance of these converged infrastructure architectures—in particular, FlexPod® from NetApp and Cisco. Both the operational and the economic characteristics were considered. According to the study, the concept of converged infrastructures for IT architectures is highly relevant and useful, especially in regard to all topics around digital transformation, like hybrid cloud environments, artificial intelligence, or new ERP systems.

According to the IDC study, the most important features for a multi-tool like converged infrastructure are:

  • High availability. The converged infrastructure solution must have built-in fault tolerance that guarantees maximum availability of applications.
  • Rapid deployment. The system should enable a faster rollout of new applications and services.
  • High performance. The solution must support flash drives, graphics processing units (GPUs), and new memory classes to enable high-performance applications such as databases and analytics.
  • All-flash arrays for sustainable and predictable performance. Because of the increased requirement profiles (for example, for AI/ML), all-flash arrays for workloads have become necessary for many enterprise applications.
  • Software-defined infrastructure. Companies are increasingly demanding software-defined compute, storage, and networking; converged infrastructure models must take this concept into account.
  • Automation. The solution should provide mature software that assists IT administrators in automating complex or repetitive tasks such as backup, migration, replication, or resource provisioning through a self-service portal.
  • Integration with multiple public clouds. The solution should support the right protocols and APIs to enable it to work with various public cloud service providers—Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, and so on.
  • Common data structure for private and public clouds. As the hybrid cloud becomes the gold standard for cloud implementation, there must be a unified data structure that seamlessly connects all public and private clouds. Such a common data structure will provide complete visibility, data mobility, robust security, easy management, and a significant improvement in return on investment (ROI).

FlexPod exceeds the requirements set forth by IDC, helping companies to survive the new “adventures” in the market.

netapp-blog6

How to Lower Your Storage Costs by Tiering Hot & Cold Data

According to IDC’s Worldwide Global DataSphere Forecast, 2019–2023: Consumer Dependence on the Enterprise Widening, the amount of data stored in the core data center is forecast to double between 2014 and 2023.

While organizations are planning capacity builds to accommodate this growth, let’s look at how to determine the right balance of core capacity and cloud capacity, as well as the measures to take when budgets are reallocated or reprioritized.

IT leaders need to continue to modernize their on-premises data centers to handle the growth of their on-prem data. In addition, that data needs to be quickly accessible to the internal teams who use it to deliver value to the organization. However, a recent survey conducted by IDC with 500 decision makers found that 30% of those surveyed identified budget constraints on capital requirements as the main inhibitor of IT modernization.

Herein lies the proverbial conundrum of doing more with less. With data capacity requirements growing at the core, IT leaders will need to make smarter decisions to ensure that their infrastructure can handle the growth in a tight budget environment.

Storage Efficiencies – your first move

Like any other storage, data can be stored inefficiently without ongoing housekeeping. Through this process, organizations might realize that the amount of space they actually need is far less than what they have purchased.

As a first step, storage admins can switch to NetApp clusters running ONTAP® data management software and reclaim capacity at ratios of more than 30:1. This is possible by leveraging the built-in storage efficiency technologies in ONTAP®, such as inline data deduplication, compression, and compaction, as well as by applying space-saving snapshots.
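The idea behind deduplication and compression can be sketched with a toy block-store model. This is purely illustrative; the block size, hash choice, and sample data are assumptions, not ONTAP internals:

```python
import hashlib
import zlib

def physical_usage(blocks: list[bytes]) -> tuple[int, int]:
    """Return (logical_bytes, physical_bytes) after naive inline
    deduplication (content hashing) and compression (zlib)."""
    logical = sum(len(b) for b in blocks)
    seen = set()
    physical = 0
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest in seen:
            continue  # duplicate block: store only a reference
        seen.add(digest)
        physical += len(zlib.compress(block))
    return logical, physical

# 100 logical 4 KiB blocks, but only 4 distinct patterns:
# dedupe and compression combine into a large efficiency ratio.
blocks = [bytes([i % 4]) * 4096 for i in range(100)]
logical, physical = physical_usage(blocks)
print(f"efficiency ratio: {logical / physical:.1f}:1")
```

Real arrays do this inline on fixed-size blocks with far more engineering, but the savings arithmetic (logical capacity divided by physical capacity) is the same.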

Data Tiering makes data efficiencies more efficient

In addition to storage efficiencies, capacity can be freed up through data tiering: moving inactive, or cold, data to lower-cost secondary object stores available in both private and public clouds. As a result, highly performant capacity on the all-flash array is freed up for additional low-latency use cases or applications accessing hot data.

The results of tiering data are significant. Online reports, such as Evaluator Group’s simulation, show that you can achieve 30% TCO savings by using data tiering capabilities like FabricPool, the data tiering feature in ONTAP®, to tier 80% of inactive data to a public cloud over a five-year period.
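As a rough illustration of where such savings come from, the underlying math is simple blended-cost arithmetic. The per-GB prices below are placeholders, not Evaluator Group’s figures:

```python
def blended_cost(total_gb: float, cold_fraction: float,
                 flash_per_gb: float, object_per_gb: float) -> float:
    """Monthly storage cost when a fraction of data is tiered to object storage."""
    hot = total_gb * (1 - cold_fraction)
    cold = total_gb * cold_fraction
    return hot * flash_per_gb + cold * object_per_gb

total_gb = 100_000  # 100 TB of logical data
all_flash = blended_cost(total_gb, 0.0, flash_per_gb=0.10, object_per_gb=0.02)
tiered = blended_cost(total_gb, 0.8, flash_per_gb=0.10, object_per_gb=0.02)
savings = 1 - tiered / all_flash

# prints: monthly cost: $10,000 -> $3,600 (64% savings)
print(f"monthly cost: ${all_flash:,.0f} -> ${tiered:,.0f} ({savings:.0%} savings)")
```

Real TCO models also account for egress, retrieval, and operational costs, which is why published figures like 30% are lower than this raw capacity-cost delta.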

FabricPool now provides even more options for hybrid cloud deployments and further simplifies deploying and managing data tiering.

Data tiering in a hybrid cloud

With FabricPool, storage admins have more choices across more cloud providers to determine the best costs. Leveraging partnerships with all the major cloud service providers, ONTAP 9.6 adds Google Cloud Storage and Alibaba Cloud Object Storage Service as tiering targets, alongside NetApp® StorageGRID®, Microsoft Azure Blob Storage, and Amazon S3, which were previously available. Flexible options include tiering as a pay-as-you-go model through NetApp’s Cloud Tiering Service, which simply extends an OPEX cost model to data tiering.

Simplicity in the modern data center

FabricPool identifies inactive data and provides an automated report that gives an immediate view of potential space savings across Tier 1 aggregates. Once policies are set for that inactive data, tiering to the cloud is fully automated, freeing up precious time and resources.

In the ONTAP design, metadata associated with datasets is kept on Tier 1 storage so that data on secondary storage can be quickly accessed when requested by the application. This also allows storage admins to effectively scale tiering capacities to the cloud, leveraging up to 50 times the amount of secondary storage beyond the primary tier.
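Conceptually, this style of tiering boils down to tracking when each block was last accessed and moving blocks that exceed a cooling period to the capacity tier, while metadata stays on the performance tier. A minimal sketch, in which the tier names and the 31-day threshold are illustrative assumptions rather than FabricPool’s exact policy engine:

```python
from dataclasses import dataclass

COOLING_DAYS = 31  # blocks untouched this long are considered cold

@dataclass
class Block:
    name: str
    days_since_access: int
    tier: str = "performance"  # hot tier; metadata stays here regardless

def apply_tiering(blocks: list[Block]) -> list[Block]:
    """Move blocks past the cooling period to the capacity (object) tier."""
    for b in blocks:
        if b.tier == "performance" and b.days_since_access >= COOLING_DAYS:
            b.tier = "capacity"
    return blocks

blocks = [Block("db-index", 2), Block("old-report", 90), Block("snapshot", 45)]
apply_tiering(blocks)
# prints: [('db-index', 'performance'), ('old-report', 'capacity'), ('snapshot', 'capacity')]
print([(b.name, b.tier) for b in blocks])
```

Because only cold blocks move, a read of hot data never touches the object store, while a read of tiered data is resolved through the metadata kept on the performance tier.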

Your tiered data also remains safe and protected. No one wants to experience a disaster or blackout, and getting operations back online need not be a complex exercise. By mirroring ONTAP® volumes, new destination volumes can be restored quickly, keeping the recovery process simple.

FabricPool and the Data Fabric

The amount of data stored on-premises is growing and will continue to grow, so it’s very important to modernize and future-proof your data center with integration into a hybrid cloud world. NetApp’s Data Fabric provides consistent data services to support the unique demands of your business across a choice of endpoints spanning on-premises and multiple cloud environments.

First introduced in 2017 as part of the ONTAP 9.2 release, FabricPool, together with the Data Fabric, has driven wide adoption of ONTAP as the leading storage resource management platform, as reported by IDC. Hybrid cloud is here to stay, and tiering data to the cloud can help free up high-performance all-flash storage cost-effectively, lowering your storage TCO in the process.

To find out more about how FabricPool can help you modernize your data center with data tiering, visit our website or the NetApp Document Center.

netapp-blog5

Learn How You Can Start Building Your Data Fabric Today

There seems to be a lot of noise in the IT industry today about the fact that hybrid cloud is the only way to go, that it is quickly becoming the new norm. But is anyone actually doing it rather than just talking about it? And if they are, how successful were they, and how difficult was it to implement?

To succeed in pivoting to a hybrid cloud architecture, there are several hurdles to clear. You need to think about which cloud to use and which crucial decisions need to be addressed. This is a key aspect of the NetApp Data Fabric that many overlook.

Over the last several years, NetApp has been making it easier and faster for customers to utilise the benefits of cloud whilst keeping governance over their data. That effort shows in the number of customers adopting the solutions that NetApp brings to the market.

During this period, NetApp has been promoting its Data Fabric strategy to its customers, an approach that becomes more of a reality with every day that passes. More and more success stories are emerging from around the globe. As NetApp becomes the data authority for the hybrid cloud, it is helping businesses with traditional infrastructure achieve their business objectives with modern and next-gen data centre capabilities, whilst at the same time helping CIOs and cloud architects fully realise the benefits of the cloud.

We are now moving into the era of “cloud-native,” where application stacks are designed for the cloud and focus on containers and microservices to improve delivery and functionality. Companies that adopt this model not only get to market faster than their competition but also grow at a faster rate.

What if your infrastructure could provide a simple, seamless platform that you could automate to flexibly integrate a hybrid cloud strategy across various hyperscalers? Or what if you want or need to be cloud-native, yet run on premises?

NetApp has made a raft of announcements that again further expand its Data Fabric strategy. There are now two new ways of consuming cloud services, the first of which is NetApp Cloud Consumption for NetApp HCI.

Cloud Consumption for NetApp HCI is a way to easily build out a private cloud with monthly billing. Based on the H400 and H600 lines of NetApp HCI products, it starts with a minimum of four storage and two compute nodes. You can then mix and match the various models within these lines to meet your desired configuration. It is charged per node per month with a 12-month minimum contract, and you can add nodes at a fixed price with a term extension.

This just expands the capabilities that NetApp HCI already brings to the table: flexible design, simple operations, and predictable performance, tied together with the NetApp Data Fabric. NetApp Cloud Volumes provide high-performance, persistent storage through a streamlined and simplified user experience in all major public clouds, but it doesn’t stop there. With NetApp Kubernetes Services (NKS), you too can start to consume hybrid cloud services from the likes of AWS, Azure, and GCP.

NetApp also announced Cloud Data Services on NetApp HCI. With NKS acting as an orchestrator, automator, and marketplace, NetApp HCI can be used as a deployable region to which you can provision Cloud Volumes on premises. This shared, common API for both public and private clouds enables NAS as a service, powered by ONTAP and managed simply for self-service, to be coded against and consumed regardless of location.

You get a service that can reside on premises, giving you on-demand elasticity accessed via a self-service portal or consumer APIs combined within an operational expenditure model (OPEX). NetApp HCI is hybrid cloud infrastructure that’s delivered by the Data Fabric.

NetApp has also announced Fabric Orchestrator. This is a single control plane connecting all of your data production with your data consumption. It is an extensible user interface for the Data Fabric services you consume, allowing you to convert intention into action with no need for advanced administration skills. Under the hood, it connects to your data wherever it is via the Fabric API Services; then, through the Data Hub, it can monitor, automate, and optimise your resources and provide actionable insights.

Looking at NetApp’s cloud portfolio, there are a vast array of products to consume: from Cloud Volumes in their various forms, to control planes like NKS and Fabric Orchestrator, as well as analytics such as Active IQ and Cloud Insights, and tools like Cloud Sync and SaaS Backup. This is a division within the portfolio that is moving just as fast as a startup and delivering products and services that are not only at the cutting edge, but in high demand.

netapp-okada-article

Okada Manila

Okada Manila delights guests by delivering a data-driven customer experience with NetApp

This year, over 7.3 million guests will experience a new level of personalization at Okada Manila as they enjoy leisure in one of the largest casino resorts in the world. NetApp’s hybrid flash storage solutions ensure uninterrupted availability of Okada Manila’s business applications and reliable access to mission critical data.

"NetApp solutions not only simplify data management, but also provide a robust and flexible IT infrastructure that can adapt to changing business needs. With these capabilities, Okada Manila will be able to effectively cope with the growing volume and complexity of data, and continue being a data-driven organization."
Dries Scott
Chief Technology Officer, Okada Manila

Okada Manila realized early on that consistently providing such a top-notch guest experience is only possible with a data-driven approach to running all of its operations. However, this initially posed a challenge. Since it has a multitude of customer touchpoints across its facilities, Okada Manila needed an IT backbone that allows the right data to be delivered to the right channel at the right time.

With NetApp’s hybrid flash storage solutions, Okada Manila’s systems run optimally 24 hours a day, seven days a week, combining highly reliable hardware and software with sophisticated service analytics that alert IT teams to possible issues that could lead to service outages.

netapp-blog1

Hybrid Multi-Cloud Experience: Are You Ready for the New Reality?

Determining the right way to deliver a consumption experience that public cloud providers offer, regardless of location or infrastructure, is top-of-mind for many IT leaders today. You need to deliver the agility, scale, speed, and services on-premises that you can easily get from the public cloud.

Most enterprises can’t operate 100% in the public cloud. Between traditional applications that can’t be moved from the datacenter and regulatory compliance, security, performance, and cost concerns, it’s not realistic. But there is a way to have the best of both worlds. You can deliver an experience based on frictionless consumption, self-service, automation, programmable APIs, and infrastructure independence. And deploy hybrid cloud services between traditional and new applications, and between your datacenters and all of your public clouds. It’s possible to do cloud your way, with a hybrid multi-cloud experience.

At NetApp Insight™ 2018, we showed the world that we’re at the forefront of the next wave of HCI. Although HCI typically stands for hyperconverged infrastructure, our solution is a hybrid cloud infrastructure. With our Data Fabric approach, you can build your own IT, act like a cloud, and easily connect across the biggest clouds:

Make it easier to deploy and manage services.

You can provide a frictionless, cloudlike consumption experience, simplifying how you work on-premises and with the biggest clouds.

Free yourself from infrastructure constraints.

You can automate management complexities and command performance while delivering new services.

Never sacrifice performance again.

Scale limits won’t concern you. You can use the public cloud to extend from core to cloud and back and move from idea to deployment in record time.

When you stop trying to stretch your current infrastructure beyond its capabilities to be everything to everyone and adopt a solution that was created to let you meet – and exceed – the demands of your organization, regardless of its size, you’re able to take command and deliver a seamless experience.

Command Your Multi-Cloud Like a Boss

If you’re ready to unleash agility and latent abilities in your organization, and truly thrive with data, it’s time to break free from the limits of what HCI was and adopt a solution that lets you enable what it can be.

With the NetApp hybrid multi-cloud experience, delivered by the Data Fabric and hybrid cloud infrastructure, you’ll drive business success, meeting the demands of your users and the responsibilities of your enterprise. You’ll deliver the best user experiences while increasing productivity, maintaining simplicity, and delivering more services at scale. You won’t be controlled by cloud restrictions; you’ll have your clouds at your command.

And isn’t that the way it should have always been?

Start Your Mission.

Your Clouds at Your Command with NetApp HCI.