netapp-blog11

Enhance and Accelerate VDI with NetApp HCI and NVIDIA

With the increasing demand for graphical content, delivering high-performance end-user services can be a challenge. NetApp® HCI powered by NVIDIA T4 GPUs and Quadro Virtual Data Center Workstation (Quadro vDWS) software accelerates your virtual desktop infrastructure (VDI) workloads. Your business can improve the end-user experience and cost-effectively scale VDI.

This week at VMworld, we’re launching the new NetApp HCI H615C compute node with NVIDIA T4 GPU cards. Along with this release, we have created a technical report to help you get started: NetApp HCI for VDI with VMware Horizon 7. With this solution, you can handle virtualized 3D graphics workloads on various products from Autodesk, Dassault Systèmes, Siemens, and others. With NetApp HCI, you can overcome the challenges of virtualized 3D graphics workloads and quickly explore deep learning with NetApp Kubernetes Service and NVIDIA GPU Cloud (NGC). And by using your data fabric delivered by NetApp technology, you can gain even more value from NetApp HCI.

Deliver Realistic Images with NVIDIA T4

The NVIDIA T4 GPU accelerates your diverse cloud workloads, including high-performance computing, deep learning training and inference, machine learning, data analytics, and graphics.

The NVIDIA T4 has 40 RT Cores that give you the computational power you need to deliver real-time ray tracing. Combined with Quadro vDWS software, the NVIDIA T4 enables artists to create photorealistic imagery that shows light bouncing off surfaces just as it would in real life. This RTX-capable GPU delivers real-time ray tracing performance of up to 5 Giga Rays per second. With RTX, artists who work with Quadro Virtual Workstation can create photorealistic designs with accurate shadows, reflections, and refractions. And they can do it on any device, from anywhere.

Tensor Cores enable you to run deep learning inference workloads. Combined with accelerated containerized software stacks from NGC, T4 gives you exceptional performance at scale. T4 also introduces the innovative Turing Tensor Core technology with multi-precision computing to handle your diverse workloads. By powering breakthrough performance at FP32, FP16, INT8, and INT4 precisions, T4 delivers up to 40 times higher performance than CPUs. When you use a T4 powered by Quadro vDWS software to run deep learning inference workloads, you can get up to 25 times faster performance than with a virtual machine driven by a CPU-only server. For your graphics- and compute-intensive workloads, a NetApp HCI H615C node with three NVIDIA T4 GPUs in one rack unit is an optimal solution.

Check out the following information about NVIDIA GPU cards to compare features.

Improve Performance and Lower Your TCO

With NetApp HCI, you remove the expensive graphics workstation from the desk. Early internal testing indicates that this solution offers some impressive performance advancements:

  • More than 2 times higher performance than the prior generation for certain CAD applications
  • More than 6 times higher performance than the prior generation for certain medical imaging applications
  • Comparable performance to NVIDIA Pascal, using only a third of the power and in only half the size

Whether you’re in the oil and gas, manufacturing, media, or ISP industry, your organization can achieve economic and performance benefits from NetApp HCI with NVIDIA T4 cards. You can easily meet your mobility, security, and performance needs while you lower your TCO.

Start accelerating your VDI today. Review the H615C specifications or learn more in-depth information from the newly released technical report.

Source: https://blog.netapp.com/enhance-and-accelerate-vdi-with-netapp-hci-and-nvidia/

netapp-blog10

Why NetApp HCI is Essential to Your Digital Transformation

How do you stay competitive in the new era of IT? You have to transform. Nearly every enterprise in every industry is exploring digital transformation to drive new pathways to their customers, create new business opportunities, and improve operational efficiency. Modern applications require faster, more flexible infrastructure to meet the needs of today’s customers.

Traditional infrastructure was designed for a world where each application is built as a vast monolith, with a new version rolled out once or twice a year in a large change window. Six months of changes would accumulate before a new version could be released, which made it possible to plan infrastructure well in advance of each upgrade.

Today’s enterprises want to be able to roll out new versions several times a day. A new means of development is needed, where each application is broken down into microservices, small parts of an application that fulfill their roles individually. At the same time, enterprises are introducing container platforms where small microservices are ideally suited to run as containers independently of each other.

In a traditional infrastructure, a company would have a development department for a large application, and once or twice a year a complete version would be handed over to the operations department. Operations would then make sure that the application ran properly and securely. To manage multiple applications, the company standardized on a small number of platforms and databases, and the operations department had significant influence on standards and infrastructure.

With microservices, things work differently. Instead of 2 departments, development and operations, a small team is built up around 1 or 2 microservices. The team is responsible for developing the microservice and for ensuring the code is up to the expected quality. Each team features a combination of development and operations, which is now commonly referred to as DevOps. It’s up to each team to roll out new versions of its microservice in a secure manner, and the team makes its own decisions on rolling back changes, should something go wrong.

So how does this affect the teams and the way they work?

  • Each team is isolated from the other teams. This means that if a team does encounter problems or if developer resources become unavailable, the other teams are able to continue developing their microservices.
  • There is no longer a strong operating organization to impose standards and force teams to choose a specific technology. Instead, there is usually a security team that helps the teams adhere to security policies, but without dictating technology choices.
  • The teams are autonomous and can choose tools and technologies themselves, as long as their microservice does what it should, is accessible, and works with other microservices. A team can choose to build its microservice using Python in AWS with the finished services available there, while another team uses .NET and Microsoft SQL Server as the database in Azure, and a third team develops on-premises in a private cloud because they require integration with traditional systems.
  • To manage the entire chain from code, through testing, to production, these teams use “infrastructure as code”: they write code that describes how the environment should look, and that code can then be used to create any number of identical environments. Because the code is under version management, it also serves as up-to-date documentation of the environment (see the sketch after this list).
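
As a minimal illustration of the infrastructure-as-code idea, the following Python sketch treats an environment definition as version-controlled data and stamps out identical environments from it. All names and the provision function are hypothetical; this is not any particular IaC tool:

    # Minimal "infrastructure as code" illustration: the environment is
    # described as data, and the same definition is applied repeatedly to
    # produce identical environments. All names here are hypothetical.

    ENVIRONMENT_SPEC = {
        "name": "orders-service",
        "runtime": "python3.11",
        "replicas": 3,
        "database": {"engine": "postgres", "size_gb": 50},
    }

    def provision(spec: dict, target: str) -> dict:
        """Simulate creating an environment from a versioned spec.

        In a real pipeline this would call a cloud or container API;
        here we simply return the resolved configuration.
        """
        return {**spec, "target": target, "status": "created"}

    if __name__ == "__main__":
        # One spec yields identical dev, test, and prod environments, and the
        # spec itself lives in version control as up-to-date documentation.
        for env in ("dev", "test", "prod"):
            print(provision(ENVIRONMENT_SPEC, env))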

To successfully implement digital transformation and create new applications, a private cloud is needed: one that is multitenant and can run all types of applications and workloads at the same time without their affecting each other. Enterprises can no longer plan several years in advance; instead, they must be able to scale in small steps, on demand. An enterprise must be able to control the private cloud with API calls and code, and strong integration with container platforms is also vital. Finally, enterprises must be able to connect their private cloud to public clouds, provision a microservice in any of these clouds, and make it work together with other microservices.

This is what is driving enterprises to build private clouds with HCI. But it’s no longer enough to look at HCI as “hyperconverged infrastructure”: a tool for sharing compute and storage resources that can start small and grow into something medium-sized. Instead, HCI should be seen as “hybrid cloud infrastructure”: the private cloud in your multicloud environment. An HCI solution needs to be ready to receive all kinds of applications and microservices with guaranteed performance, integrated with the public clouds and container platforms.

NetApp HCI with the NetApp Kubernetes Service provides a private cloud that adheres to these new requirements. The NetApp Kubernetes Service meets the modern challenges of DevOps organizations by providing a global control plane for container platforms in AWS, Azure, and Google Cloud Platform. No matter where your container platforms are, you can connect them and allow applications and microservices to interface between various public clouds.

NetApp HCI is the Hybrid Cloud Infrastructure required for success in your digital transformation. And with new services like NetApp Kubernetes Service, you can be sure that your infrastructure is ready to face the challenges of DevOps while exceeding customer expectations.

netapp-blog9

Need to Simplify Your Hybrid Multicloud Experience? NetApp Keystone Delivers.

Today’s business environment demands continuous IT transformation. Our customers are turning to NetApp to help them modernize traditional IT infrastructure, build private clouds, and harness the power of public clouds. But along with this IT transformation, they expect simple and flexible cloud-like experiences whether they choose to build their own infrastructure or buy as a service, on premises or in the public cloud.

At NetApp, we see hybrid multicloud as the de facto operating model for every customer. The first thing that I would advise if you are looking to consume NetApp technology as a cloud service is: look to the public clouds. NetApp’s Cloud Data Services—natively integrated enterprise storage services in Microsoft Azure, Google Cloud, and AWS—give you the performance and availability to power even the most demanding apps.

Native Integration with Major Cloud Services

Of course, we understand that not every customer or application is moving to the public cloud. So, with NetApp Keystone we now offer two new ways to acquire and operate NetApp technology that reinvent the NetApp customer experience and deliver the innovation that our customers and partners need to thrive in a hybrid multicloud world. 

What are the key things that we announced today with NetApp Keystone?

Subscription Services. With NetApp Keystone we now give customers the ability to consume NetApp systems and software on premises with cloud-like subscription models. You get NetApp solutions delivered as a service for a monthly fee, the ability to burst capacity when needed and the flexibility to grow into the public cloud when necessary. Simple, flexible, affordable. Exactly what our customers have been asking for. The NetApp Keystone subscription services are really as simple as 1, 2, 3.

1. Choose a performance tier (not a system).
2. Choose a storage service type—block, file, or object.
3. Choose who manages it—you or us.

All with a simple 1-year commitment. It doesn’t get much easier than that.

Ownership Experience. For customers who want to continue to own and operate NetApp technology in traditional purchasing models, we have radically simplified the experience. It is now easier to buy, optimize and grow our solutions with: streamlined configuration and quoting processes; cloud-like service guarantees for efficiency, performance and availability; AI-driven insights to optimize system health; and dynamic scaling of systems node-by-node or to the public cloud. This is a truly modern purchasing and ownership experience that will set us apart in the industry.

So, what does this new ownership experience mean for you?

  1. Buy with confidence. Doing business with NetApp is easier than ever. We’ve dramatically streamlined our ordering and quoting process. Our unique combination of efficiency, performance and availability, backed by a guarantee, helps protect your storage investment.
  2. Optimize with ease. Once you’re up and running, AI-driven insights help you optimize the health of your system while predicting capacity and performance bottlenecks. Use NetApp Active IQ® Monitoring to keep tabs on storage usage, data reduction ratios, node utilization and data availability—all from one easy-to-use dashboard. AI-powered Active IQ assessments trigger proactive parts delivery, predictive alerts, optimal storage setup and configuration recommendations. This gives you maximum performance and efficiency with lower costs and less downtime.
  3. Scale your infrastructure and data services. NetApp’s unique scale-out architecture lets you dynamically grow your data center infrastructure. Or you can scale to the public cloud with our cloud-activated systems. The first 10TB of tiering is free! Use our intelligent Active IQ product recommender and flexible, budget-friendly upgrade credits to invest in anything from small upgrades to a whole new infrastructure strategy or public cloud migration.

Our new offerings, combined with our existing portfolio of Cloud Data Services, deliver on a simple promise to our customers:

Whatever your preferences, NetApp Keystone connects your hybrid cloud environment in a common experience—so you can operate like a cloud everywhere.

My parting question to you is:  What is your biggest concern in moving to the cloud or cloud-like experience? Let us know here.

To learn more about NetApp’s flexible data consumption models and new ownership experience, watch the 1-minute video below or visit our website for more details and content.

Source: https://blog.netapp.com/simplify-your-hybrid-multicloud-experience-with-netapp-keystone/

netapp-blog8

Reinvent Your Modern IT with the Latest AFF NVMe All-Flash Storage

As you embark on your digital transformation journey, modernizing and simplifying IT to accelerate business-critical applications is the new imperative. NetApp has been at the forefront of helping you modernize your data center and has been recognized as a storage leader. Although this recognition is an outstanding achievement, NetApp is not resting on its laurels. Instead, we are raising the standard with the launch of the new NetApp® AFF A400 system.

Primary storage that’s simple, fast and intelligent. Discover why NetApp has been named a Leader in the 2019 Gartner Magic Quadrant for Primary Storage. #DataDriven https://t.co/HYdxxlElNO

— NetApp (@NetApp) September 23, 2019

The new AFF A400 offers the power of secure data acceleration in an AFF storage system for the first time. The AFF A400 data acceleration capability offloads storage efficiency processing, thus delivering significantly higher performance. This capability enables you to capitalize on the exploding data growth from emerging technologies such as artificial intelligence (AI), machine learning (ML), and big data. You can also harness the benefits of this data acceleration technology for enterprise apps and databases.

In addition to the new data acceleration technology, the AFF A400 system enables you to modernize SAN deployments with ultralow latency, end-to-end NVMe. Front-end NVMe/FC makes it possible to achieve the performance, scale, and operational efficiency goals of emerging workloads such as AI/ML, real-time analytics, and MongoDB. AFF A400 self-encrypting disk (SED) technology allows you to easily incorporate data-at-rest encryption in all your deployments, with either the onboard key manager (OKM) or Key Management Interoperability Protocol (KMIP) servers. Finally, while enabling the latest end-to-end NVMe benefits, the AFF A400 also provides investment protection for AFF customers who want to continue to use existing SAS-attached SSD storage.
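
For illustration, here is a hedged Python sketch of enabling each key management option through the ONTAP REST API. The /api/security/key-managers endpoint and the payload shapes are our assumptions based on the public API documentation; confirm them against your ONTAP release before use:

    # Sketch: enable encryption key management on an ONTAP cluster via REST.
    # Endpoint and payload shapes are assumptions; verify against your release.
    import requests

    CLUSTER = "https://cluster-mgmt.example.com"  # hypothetical address
    AUTH = ("admin", "password")                  # placeholder credentials

    def enable_onboard_key_manager(passphrase: str) -> None:
        # Onboard Key Manager (OKM): keys are kept on the cluster itself.
        r = requests.post(
            f"{CLUSTER}/api/security/key-managers",
            json={"onboard": {"passphrase": passphrase}},
            auth=AUTH, verify=False,  # lab only: skips TLS verification
        )
        r.raise_for_status()

    def enable_external_key_manager(kmip_servers: list) -> None:
        # External key management: keys are served by KMIP servers.
        r = requests.post(
            f"{CLUSTER}/api/security/key-managers",
            json={"external": {"servers": [{"server": s} for s in kmip_servers]}},
            auth=AUTH, verify=False,
        )
        r.raise_for_status()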

Today, we are also taking a giant leap forward in simplifying storage array management with the new NetApp ONTAP® System Manager. ONTAP System Manager is easy to use and insightful: With an intuitive GUI and smart defaults, it gives IT generalists an extremely simplified way to accomplish key storage tasks such as day-zero setup, storage provisioning by service levels, data protection, and performance awareness. Workflow optimizations such as the performance dashboard allow a quick and easy view of performance data for up to 1 year. And an intuitive global search, sort, and filtering capability at scale enables you to quickly find useful information.

In fact, we are simplifying storage across the board, starting with how the AFF A400 system is sold, configured, managed, and supported. The new ONTAP software offerings available with the AFF A400 are designed to allow you flexibility to acquire software based on your deployment requirements. We are also introducing new support offerings with flat pricing for the life of the system that also include a new digital advisor with predictive capabilities and a high-touch support tier, setting a new standard.

As part of a digital transformation to modernize mission-critical applications running on databases such as Oracle and SQL Server, customers are looking for a cost-effective, no-compromise resiliency solution that is future-proof and has an easy path to the cloud. The new NetApp AFF All SAN Array offers the rich data management capabilities of ONTAP, and its continuous data availability through symmetric active-active host connectivity, together with the instantaneous failover capabilities of ONTAP 9.7, makes it the premier choice for anyone looking for investment protection with future-proof products and services. The All SAN Array is a new AFF configuration tuned for mission-critical SAN applications, and it brings resiliency capabilities traditionally found in high-end SAN arrays to the new midrange system.

After testing the new NetApp AFF A400 storage system, World Wide Technology Solutions Architect Chad Stuart summarized its capabilities perfectly: “The AFF A400 combines NetApp’s leadership in end-to-end NVMe storage performance with simple data management of the new ONTAP System Manager. This makes the A400 a top choice for our customers as they upgrade their IT infrastructure for their critical workloads running in a hybrid multicloud world.”

With the AFF A400 system, NetApp is introducing a storage array that lets you modernize your IT. It merges the latest data acceleration technology with the ultralow latency of end-to-end NVMe storage. You’ll continue to get the benefits of the industry-leading cloud integration and Keystone consumption model to advance your journey into digital transformation. You can confidently start using AFF A400 systems for all applications and workloads without compromising on the reliability, availability, and rich data management capabilities that you’ve come to expect from NetApp technology. To learn more about NetApp AFF NVMe all-flash storage, visit the AFF A-Series All Flash Arrays page.

Source: https://blog.netapp.com/reinvent-your-modern-it-with-the-newest-aff-nvme-all-flash-storage/?linkId=100000008767891

netapp-blog7

FlexPod: The Converged Infrastructure Swiss Army Knife for Your Data Center

More than a hundred years ago (1880), the Swiss army no longer wanted to send its soldiers into the field with a multitude of individual tools such as knives, saws, screwdrivers (for dismantling rifles), can openers, and awls, because they were cumbersome, time-consuming, and ultimately easy to lose. This problem led to the idea of designing an extremely practical, resilient, and flexible multi-tool that would support soldiers in their daily exercises and any military tasks. Thus the Swiss Army Knife from Victorinox was born, and it has since been developed further in many variants.

We see similar requirements in the adventure of digital transformation, which has drastically changed the way companies serve markets with their products or services. When you look at the number of applications, frameworks, and software variants in IT, it quickly becomes clear that, as on any complex adventure, you can no longer rely on a multitude of tools in cumbersome silo architectures. A kind of Swiss Army Knife for IT is needed.

This is where flexible converged infrastructures come into play. IDC recently conducted a survey to evaluate the usability and market acceptance of these converged infrastructure architectures—in particular, FlexPod® from NetApp and Cisco. Both the operational and the economic characteristics were considered. According to the study, the concept of converged infrastructures for IT architectures is highly relevant and useful, especially in regard to all topics around digital transformation, like hybrid cloud environments, artificial intelligence, or new ERP systems.

According to the IDC study, the most important features for a multi-tool like converged infrastructure are:

  • High availability. The converged infrastructure solution must have built-in fault tolerance that guarantees maximum availability of applications.
  • Rapid deployment. The system should enable a faster rollout of new applications and services.
  • High performance. The solution must support flash drives, graphics processing units (GPUs), and storage-class memory to enable high-performance applications such as databases and analytics.
  • All-flash arrays for sustainable and predictable performance. Because of increased requirement profiles (for example, for AI/ML), all-flash arrays have become necessary for many enterprise workloads.
  • Software-defined infrastructure. Companies are increasingly demanding software-defined compute, storage, and networking; converged infrastructure models must take this concept into account.
  • Automation. The solution should provide mature software that assists IT administrators in automating complex or repetitive tasks such as backup, migration, replication, or resource provisioning through a self-service portal.
  • Integration with multiple public clouds. The solution should support the right protocols and APIs to enable it to work with various public cloud service providers—Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform, and so on.
  • Common data structure for private and public clouds. As the hybrid cloud becomes the gold standard for cloud implementation, there must be a unified data structure that seamlessly connects all public and private clouds. Such a common data structure will provide complete visibility, data mobility, robust security, easy management, and a significant improvement in return on investment (ROI).

FlexPod exceeds the requirements set forth by IDC, helping companies to survive the new “adventures” in the market.

netapp-blog6

How to Lower Your Storage Costs by Tiering Hot & Cold Data

According to IDC’s Worldwide Global DataSphere Forecast, 2019–2023: Consumer Dependence on the Enterprise Widening, the amount of data stored in the core data center is forecast to double between 2014 and 2023.

While organizations are planning capacity builds to accommodate this growth, let’s look at how to determine the right balance of core capacity and cloud capacity, as well as the measures to take when budgets are being reallocated or reprioritised.

IT leaders need to continue to modernize their on-premises data centers to handle the growth of their on-prem data. In addition, that data needs to be quickly accessible to the internal teams who use it to deliver value to the organization. However, in a recent survey conducted by IDC of 500 decision makers, 30% of those surveyed identified budget constraints on capital requirements as the main inhibitor of IT modernization.

Herein lies the proverbial conundrum of doing more with less. With data capacity requirements growing at the core, IT leaders will need to make smarter decisions to ensure that their infrastructure can handle the growth in a tight budget environment.

Storage Efficiencies – your first move

Just like any other resource, storage can be managed inefficiently without ongoing housekeeping. Through this housekeeping, organizations might realize that the amount of space they actually need is far less than what they have purchased.

As a first step, Storage Admins can switch to NetApp clusters running ONTAP® data management software to free up more than 30 times the required storage capacity. This is possible by using the built-in storage efficiency technologies in ONTAP®, such as inline data deduplication, compression, and compaction, as well as by applying space-saving snapshots.
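
To put an efficiency ratio in concrete terms, here is a small back-of-the-envelope calculation in Python. The dataset size and the combined efficiency ratio are invented for illustration; real ratios depend heavily on how deduplication- and compression-friendly the workload is:

    # Back-of-the-envelope effect of storage efficiencies on required capacity.
    logical_data_tb = 300     # data as the applications see it (hypothetical)
    efficiency_ratio = 4.0    # combined dedup + compression + compaction (assumed)

    physical_needed_tb = logical_data_tb / efficiency_ratio
    freed_tb = logical_data_tb - physical_needed_tb

    print(f"Physical capacity needed: {physical_needed_tb:.0f} TB")
    print(f"Capacity freed by efficiencies: {freed_tb:.0f} TB "
          f"({freed_tb / logical_data_tb:.0%} of logical data)")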

Data Tiering makes data efficiencies more efficient

In addition to storage efficiencies, you can free up storage capacity through data tiering. Data tiering is the tiering of inactive data, or cold data, to lower-cost secondary object stores available in both private and public clouds. As a result, highly performant storage capacity on the all-flash array is freed up for additional low-latency use cases or applications accessing hot data.

The results of tiering data are significant. Online reports, such as Evaluator Group’s simulation, show that you can achieve 30% TCO savings by using data tiering capabilities like FabricPool, the data tiering feature in ONTAP®, to tier 80% of inactive data to a public cloud over a five-year period.
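
As a rough sanity check on that kind of claim, the arithmetic behind a tiering TCO estimate looks like the sketch below. The 80% cold share and five-year period come from the simulation cited above; the per-terabyte prices are invented placeholders, not Evaluator Group’s figures:

    # Rough model of tiering savings: keep hot data on all-flash, move cold
    # data to an object tier. Prices are placeholders.
    total_tb = 100
    cold_fraction = 0.80              # share of data that is inactive
    flash_cost_tb_month = 25.0        # hypothetical all-flash $/TB/month
    object_cost_tb_month = 15.0       # hypothetical object store $/TB/month
    months = 60                       # five-year period

    def tco(tier_cold_to_object: bool) -> float:
        cold_tb = total_tb * cold_fraction
        hot_tb = total_tb - cold_tb
        if tier_cold_to_object:
            return (hot_tb * flash_cost_tb_month +
                    cold_tb * object_cost_tb_month) * months
        return total_tb * flash_cost_tb_month * months

    baseline, tiered = tco(False), tco(True)
    print(f"All-flash only: ${baseline:,.0f}")
    print(f"With tiering:   ${tiered:,.0f} ({1 - tiered / baseline:.0%} lower)")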

FabricPool now provides even more options for hybrid cloud deployments and makes data tiering simple to deploy and manage.

Data tiering in a hybrid cloud

With FabricPool, Storage Admins have more choices across more cloud providers to determine the best costs. Leveraging partnerships with all the major cloud service providers, ONTAP 9.6 adds Google Cloud Storage and Alibaba Cloud Object Storage Service as tiering targets, alongside NetApp® StorageGRID®, Microsoft Azure Blob Storage, and Amazon S3, which were previously available. Flexible options include pay-as-you-go tiering through NetApp’s Cloud Tiering service, which extends an OPEX cost model to data tiering.

Simplicity in the modern data center

FabricPool identifies inactive data and provides an automated report that gives an immediate view of potential space savings across Tier 1 aggregates. Once policies are set for that inactive data, tiering to the cloud is fully automated, freeing up precious time and resources.
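
As a hedged sketch of what setting a tiering policy can look like programmatically, the following Python uses the volume resource of the ONTAP REST API. The endpoint shape and field names are our assumptions from the public API documentation (the documented FabricPool policies include none, snapshot-only, auto, and all); verify them against your ONTAP release:

    # Sketch: set a FabricPool tiering policy on an ONTAP volume via REST.
    # Endpoint and field names are assumptions; verify against your release.
    import requests

    CLUSTER = "https://cluster-mgmt.example.com"  # hypothetical address
    AUTH = ("admin", "password")                  # placeholder credentials

    def set_tiering_policy(volume_name: str, policy: str = "auto") -> None:
        # Look up the volume UUID by name.
        lookup = requests.get(
            f"{CLUSTER}/api/storage/volumes",
            params={"name": volume_name},
            auth=AUTH, verify=False,  # lab only: skips TLS verification
        )
        lookup.raise_for_status()
        uuid = lookup.json()["records"][0]["uuid"]

        # Apply the tiering policy to the volume.
        patch = requests.patch(
            f"{CLUSTER}/api/storage/volumes/{uuid}",
            json={"tiering": {"policy": policy}},
            auth=AUTH, verify=False,
        )
        patch.raise_for_status()

    # Example: tier cold blocks of a hypothetical volume automatically.
    # set_tiering_policy("projects_vol", "auto")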

In the ONTAP design, metadata associated with datasets is kept on Tier 1 storage so that data on secondary storage can be quickly accessed when requested by the application. This also allows Storage Admins to scale tiering capacities effectively, using up to 50 times the amount of secondary storage beyond the primary tier.

Your tiered data also remains safe and protected. No one wants to experience a disaster or blackout, and getting operations back online need not be a complex exercise. By mirroring ONTAP® volumes, new destination volumes can be restored quickly, making the recovery process simple and fast.

FabricPool and the Data Fabric

The amount of data stored on-premises is growing and will continue to grow, so it’s very important to modernize and future-proof your data center with integration into a hybrid cloud world. NetApp’s Data Fabric provides consistent data services to support the unique demands of your business across a choice of endpoints spanning on-premises and multiple cloud environments.

First introduced in 2017 as part of the ONTAP 9.2 release, FabricPool, together with the Data Fabric, has driven wide adoption of ONTAP as the leading storage resource management platform, as reported by IDC. Hybrid cloud is here to stay, and data tiering to the cloud can help you free up high-performance all-flash storage cost-effectively and lower your storage TCO in the process.

To find out more about how FabricPool can help you modernize your data center with data tiering, visit our website or the NetApp Document Center.

netapp-blog5

Learn How You Can Start Building Your Data Fabric Today

There seems to be a lot of noise in the IT industry today about the fact that hybrid cloud is the only way to go, that it is quickly becoming the new norm. But is anyone actually doing it rather than just talking about it? And if they are, how successful were they, and how difficult was it to implement?

To succeed in pivoting to a hybrid cloud architecture, you have several hurdles to clear. You need to think about which cloud to use and about the crucial decisions that need to be addressed. This is a key aspect of the NetApp Data Fabric that many overlook.

Over the last several years, NetApp has been making it easier and faster for customers to utilise the benefits of cloud whilst keeping governance over their data. It shows in the number of customers adopting the solutions that NetApp brings to the market.

During this period, NetApp has been promoting its Data Fabric strategy to its customers. This approach is becoming more of a reality with every day that passes, and more and more success stories are coming out from around the globe. As NetApp becomes the data authority for the hybrid cloud, it is helping businesses with traditional infrastructure achieve their business objectives with modern and next-gen data centre capabilities, whilst at the same time helping CIOs and cloud architects fully realise the benefits of the cloud.

We are now moving into the era of “cloud native,” where application stacks are designed for the cloud and focus on containers and microservices to improve their delivery and functionality. Companies that adopt this model not only get to market faster than their competition but also grow at a faster rate than their competitors.

What if your infrastructure could provide a simple and seamless platform that you could automate to deliver flexible integration of hybrid cloud strategy leveraging various hyperscalers? Or what if you want or need to be cloud-native, yet run on the premises?

NetApp has made a raft of announcements that again further expand its Data Fabric strategy. There are now two new ways of consuming cloud services, the first of which is NetApp Cloud Consumption for NetApp HCI.

Cloud Consumption for NetApp HCI is a way to easily build out a private cloud with monthly billing. Based on the H400 and H600 lines of NetApp HCI products, NetApp Cloud Consumption for NetApp HCI starts with a minimum of 4 storage and 2 compute nodes. You can then mix and match the various models within these lines to meet your desired configuration. It is charged per node per month with a 12-month minimum contract, and you can add nodes at a fixed price with a term extension.
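
To make the consumption model concrete, here is a minimal cost sketch in Python. The minimum configuration and the 12-month term come from the description above; the per-node monthly rates are invented placeholders, not NetApp pricing:

    # Illustrative monthly-cost model for Cloud Consumption for NetApp HCI.
    RATES = {"storage_node": 2000.0, "compute_node": 1500.0}  # hypothetical $/node/month
    MIN_CONFIG = {"storage_node": 4, "compute_node": 2}       # minimum from the post
    CONTRACT_MONTHS = 12                                      # minimum term

    def monthly_cost(config: dict) -> float:
        return sum(RATES[node] * count for node, count in config.items())

    base = monthly_cost(MIN_CONFIG)
    print(f"Minimum config: ${base:,.0f}/month, "
          f"${base * CONTRACT_MONTHS:,.0f} over the 12-month term")

    # Adding a node happens at a fixed price with a term extension (per the post).
    grown = dict(MIN_CONFIG, storage_node=MIN_CONFIG["storage_node"] + 1)
    print(f"With one extra storage node: ${monthly_cost(grown):,.0f}/month")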

This is an expansion of the capabilities that NetApp HCI already brings to the table: flexible design, simple operations, and predictable performance, tied together with the NetApp Data Fabric. NetApp Cloud Volumes provides high-performance, persistent storage through a streamlined and simplified user experience in all major public clouds, but it doesn’t stop there. With NetApp Kubernetes Service (NKS), you too can start to consume hybrid cloud services from the likes of AWS, Azure, and GCP.

NetApp also announced Cloud Data Services on NetApp HCI. With NKS acting as an orchestrator, automator, and marketplace, NetApp HCI can be used as a deployable region to which you can provide Cloud Volumes on premises. This shared, common API for both the public and private clouds enables NAS as a service, powered by ONTAP and managed simply for self-service, to be coded against and consumed regardless of location.

You get a service that can reside on premises, giving you on-demand elasticity accessed via a self-service portal or consumer APIs within an operational expenditure (OPEX) model. NetApp HCI is hybrid cloud infrastructure that’s delivered by the Data Fabric.

NetApp has also announced Fabric Orchestrator, a single control plane connecting all of your data production with your data consumption. It is an extensible user interface for the Data Fabric services you are consuming, allowing you to convert intention into action with no need for advanced administration skills. Under the hood, it connects to your data wherever it is via the Fabric API Services; then, through the Data Hub, it can monitor, automate, and optimise your resources and provide actionable insights.

Looking at NetApp’s cloud portfolio, there is a vast array of products to consume: from Cloud Volumes in their various forms, to control planes like NKS and Fabric Orchestrator, to analytics such as Active IQ and Cloud Insights, and tools like Cloud Sync and SaaS Backup. This is a division within the portfolio that is moving just as fast as a startup, delivering products and services that are not only at the cutting edge but also in high demand.

netapp-blog4

Flash Storage Made Easy: How C190 Helps Small Enterprises Future-Proof Their Business

With technology advancing at a rapid rate, investing in the right IT solution can be challenging, especially for a small enterprise or for branch locations of a larger enterprise, which often lack the necessary IT specialists. Important investment questions surface, such as: How can you be sure that the IT solution you invest in today will serve you well tomorrow? And does a more affordable solution mean that it’s less secure, less agile, or less scalable?

When it’s time for your small enterprise to invest in storage and data management, you want a solution that is simple, smart, and highly secure: built for your needs today and ready to adapt for tomorrow. You also want to align your business with an industry-leading provider that you can trust. Introducing the new NetApp AFF C190 all-flash storage, an affordable, scalable storage solution from the data authority, NetApp.

The Future is Inundated with Data

In a recent survey concerning data storage plans, approximately half of all respondents claimed a data growth of 1-99 TB over the last two years, with an additional 14 percent of respondents claiming growth in the range of 100-999 TB. Fifty-five percent of survey respondents project a data increase anywhere from one to 999 TB over the next two years. Current and projected data growth are significant enough to present a formidable challenge to outdated storage platforms. This means companies need to look at investing in secure, simple, and smart storage solutions.

Invest in Security

The increase in ransomware incidents and data theft over the past few years has highlighted the vulnerability of many enterprise environments. Add to that the European Union General Data Protection Regulation and similar legislation around the world, which require significant changes in how companies manage personal data. NetApp AFF C190 offers a value-priced solution for simple and secure storage. With C190, more affordable does not mean less secure.

If an incident occurs, with synchronous replication for zero data loss, you’ll never lose your most critical data. Even pending transactions are saved thanks to integrated application-consistent backup. Your application data is encrypted and secure, and you can control data access, which is vital to comply with data protection and privacy laws.

Invest in Simplicity

A small enterprise often means a leaner IT team, perhaps even a single IT “go-to” generalist. The C190’s intuitive management software is easy to deploy without a specialized team and easy to manage, letting you provision storage in under ten minutes. With C190, you can consolidate and manage all of your SME’s file and block data from a single system, increasing efficiency. Connect to your preferred cloud service with ease and automatically back up, tier, or migrate your data.

Invest in Scalability

Small enterprises need storage solutions that are scaled to fit their needs. Paying for too much storage is throwing away money, yet, at the same time, you want a solution that can grow as your business grows. C190 allows you to store more data for less with inline data reduction. Flash storage also means a ten-times faster application response time. Increasing efficiency in your SME means more time and more money to put towards other priorities, like growing your business.

Invest Smartly

They say that hindsight is 20/20; however, having accurate foresight also gives a clear picture. C190 can help you see into the future. Data-driven insights help you identify and correct problems before they occur, which can save your organization time, money and unnecessary headaches. With the new NetApp AFF C190 all-flash storage’s capabilities, IT professionals who want a low-cost flash solution to modernize, future-proof and simplify their infrastructure have access to a scalable, efficient, and high-performing file storage system.

Don’t wait until you need more storage. Start future-proofing your company’s IT now. Look into your SME’s future; you should see the NetApp AFF C190 there.

netapp-blog3

NetApp Channel Partners Have a Simple Entry into Flash for Managing Data in Hybrid Multi-cloud

NetApp debuts new channel-led AFF C190 entry-level all-flash storage

NetApp (NASDAQ: NTAP), the data authority for hybrid cloud, today reinforced the company’s commitment to the channel with the announcement of the NetApp® AFF C190 system, a simple, smart and secure all-flash storage solution. The new entry-level solution was developed with the channel in mind. It enables NetApp channel partners to aggressively and self-sufficiently expand their market share by helping smaller organizations modernize their IT infrastructure as part of their data fabric strategy.

“NetApp is focused on making it simpler for our partners to do business with us,” said Jeff McCullough, vice president of Americas partner sales at NetApp. “This new entry-level all-flash storage solution represents our commitment to our channel partners and how we are expanding their opportunities to reach new markets.”

Announced in May, the latest enhancements to the company’s award-winning Unified Partner Program make it easy for NetApp partners to offer customers a seamless data management experience across private, public, and hybrid cloud environments. The new NetApp AFF C190 system extends this support by providing partners with more growth opportunities to deliver an all-flash system that is priced for smaller enterprises and emerging companies. The ordering, configuring, and quoting process for the AFF C190 is also simple, so partners can close deals quickly and efficiently. In addition, it qualifies for NetApp’s program incentives, adding to the multiple ways that partners can be profitable while acquiring new customers.

The NetApp AFF C190 system makes the benefits of all-flash storage with NetApp ONTAP® data management software accessible to organizations that previously thought all-flash was beyond their budget. Simple to deploy and manage, the AFF C190 bundles ONTAP premium software to deliver enterprise-class data services: effortless connectivity to the cloud, storage capacity efficiency for great value, and integrated data protection and security.

“We provide channel partners with reach, efficiency and expertise to meet the needs of organizations of all sizes,” said Jessica Yeck, vice president, vendor solutions, Tech Data. “The AFF C190 supports this mission by providing an entry-level all-flash solution developed with the channel in mind. NetApp is making the value of data fabric accessible to all customers, helping them enter the hybrid multicloud world.”

The AFF C190 system is a flexible flash storage solution that is designed to support business applications, virtual machines (VMs), file systems and mixed workloads. The solution offers:

  • A single system to manage both file and block data, enabling consolidation of workloads for ease of use
  • Comprehensive data protection, including data encryption, helping to secure data and prevent data loss
  • Effortless connection to the major public clouds for simple tiering and backup, enabling organizations to more easily and affordably future-proof their IT environments

netapp-okada-article

Okada Manila

Okada Manila delights guests by delivering a data-driven customer experience with NetApp

This year, over 7.3 million guests will experience a new level of personalization at Okada Manila as they enjoy leisure in one of the largest casino resorts in the world. NetApp’s hybrid flash storage solutions ensure uninterrupted availability of Okada Manila’s business applications and reliable access to mission critical data.

"NetApp solutions not only simplify data management, but also provide a robust and flexible IT infrastructure that can adapt to changing business needs. With these capabilities, Okada Manila will be able to effectively cope with the growing volume and complexity of data, and continue being a data-driven organization."
Dries Scott
Chief Technology Officer, Okada Manila

Okada Manila realized early on that consistently providing such a top-notch guest experience is possible only with a data-driven approach to running all of its operations. However, this initially posed a challenge: with a multitude of customer touchpoints across its facilities, Okada Manila needed an IT backbone that allows the right data to be delivered to the right channel at the right time.

With NetApp’s hybrid flash storage solutions, Okada Manila’s systems run optimally 24 hours a day, seven days a week. The solutions combine highly reliable hardware and software with sophisticated service analytics that alert IT teams to possible issues that could lead to service outages.