Keeping Your Data Protected During Chaotic Times

The world events of the past weeks have given us a clear view of what not having a plan looks like. We are learning that current infrastructures cannot support a pandemic like the one we are living through, and we are managing it in crisis mode. Learning this lesson came at a great cost, but it is challenging us to rethink our preparedness. As I sit here under a stay-at-home order from our local leadership, doing my due diligence to protect myself and others, I can't help but draw parallels (being in the technology space) between these life-altering events and the digital cyber-criminal activity occurring right now as I write this. In my last blog, I wrote about the importance of testing your IT network and pointed out some strategies to ensure you are well prepared should ransomware or another cyber-attack infiltrate your data center and cause irreparable damage. In this blog, I want to discuss – no, stress once again – the importance of testing your backup strategies and business continuity plans.

Training & Preparedness

With much of the workforce working remotely, it is crucial that employees are trained to be alert to activity that targets regular users like you and me – watch out for those coronavirus emails being used as bait by phishers! There are sites using COVID-19 and Coronavirus as a lure to get victims to 'click the link'. Paul Chichester, Director of Operations at the NCSC, said: "We know that cybercriminals are opportunistic and will look to exploit people's fears, and this has undoubtedly been the case with the Coronavirus outbreak."

Time and time again we've heard that cyber-attacks come in different forms: data breaches, ransomware, phishing campaigns, and even advanced hacking attacks. Investing in excellent cybersecurity software plus employee training will play a major role in averting a disaster. Persistent criminals will take advantage of any opportunity to infiltrate your network, so let's learn from previous incidents that caused millions of dollars in damage (see previous blog – link) and avoid the same fate as best we can. Just last week, a report on the NCSC site stated that a global network of bots was taken down and dismantled; the criminals behind it are believed to have infected more than nine million computers worldwide.

The right mix of technology

Let's start by asking the right questions. First, assess your cyber risk – the NCSC website offers guidance. Is your organization prepared to weather a cyber-attack? Is your network not only protected but resilient, able to predictably recover stolen, encrypted, or lost data? What are the RPOs/RTOs that need to be met, and can they be met with your current data protection technology? If your network backup copies are compromised, do you have a copy that is offline and air-gapped? These and many more questions need to be asked. Whatever data protection solution you choose, test your Business Continuity (BC) and Disaster Recovery (DR) plans to understand their efficiency, resiliency, and predictability, so you have the peace of mind that your data is protected.
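
To make the RPO question concrete, here is a minimal, product-agnostic sketch in Python (all names are illustrative, not taken from any vendor tool) of the comparison behind an RPO check: is your newest recoverable copy younger than the maximum data loss you can tolerate?

```python
from datetime import datetime, timedelta

def rpo_met(last_backup: datetime, now: datetime, rpo: timedelta) -> bool:
    """True if the newest recoverable copy is within the RPO window."""
    return now - last_backup <= rpo

# Example: a 4-hour RPO means at most 4 hours of data may be lost.
now = datetime(2020, 3, 30, 12, 0)
print(rpo_met(datetime(2020, 3, 30, 9, 0), now, timedelta(hours=4)))   # 3-hour-old copy: within the window
print(rpo_met(datetime(2020, 3, 29, 12, 0), now, timedelta(hours=4)))  # day-old copy: misses the target
```

The same one-line comparison applies to RTO, except the quantity being measured is how long a tested restore actually takes rather than how old the copy is.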

Experts highly recommend that you apply the time-tested 3-2-1-1 rule of backup best practice to be safe. Keep copies on both disk and tape to ensure a reliable copy is available when you need it. Whether you use cloud or on-prem hardware – from fast-performance technology to quickly process your hot data to cold storage technologies for long-term retention – the most cost-effective way to tier off your data as it shifts in value is to leverage the different technologies available. Here is an example from Quantum with DXi and object storage for enterprise backup, where cost-effectiveness, scalability, and management of unstructured data are of extreme importance.

All these technologies combined will help you meet your RPOs/RTOs. In addition, should you need to call on your backup copy for any reason and find your copy on spinning disk compromised, your insurance will be the copy that is offline and air-gapped.
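
To see what the 3-2-1-1 rule checks in practice (at least 3 copies of your data, on 2 different media types, 1 copy offsite, 1 copy offline/air-gapped), here is a hypothetical Python sketch that audits a backup inventory against each of the four conditions. The field names are made up for the example; they are not from any particular backup product:

```python
def meets_3_2_1_1(copies):
    """Audit a backup inventory against the 3-2-1-1 rule:
    3+ copies, on 2+ media types, 1+ offsite, 1+ offline/air-gapped."""
    return (
        len(copies) >= 3
        and len({c["media"] for c in copies}) >= 2
        and any(c["offsite"] for c in copies)
        and any(c["offline"] for c in copies)
    )

inventory = [
    {"media": "disk", "offsite": False, "offline": False},  # primary backup on-prem
    {"media": "cloud", "offsite": True, "offline": False},  # replicated to cloud
    {"media": "tape", "offsite": True, "offline": True},    # air-gapped tape vault
]
print(meets_3_2_1_1(inventory))  # all four conditions hold
```

Dropping the tape copy from that inventory fails the check twice over – no offline copy and only two copies total – which is exactly the gap ransomware exploits.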

If we knew when disaster would strike, everyone would prepare. The reality is we never know, so test and practice your response to a cyber-attack. Whether you are a small or large organization, testing your resiliency is critical. Create practice scenarios in a safe environment where you can test your network and backup strategies; there are plenty of online tools available if your organization does not have IT professionals to handle this type of exercise. Be prepared to handle a crisis scenario. If you're in the public sector and funds are tight, leverage organizations like the NCSC, whose Exercise in a Box tool lets you practice your response.

These times call on us to provide you, the IT professional, with all the tools and information necessary to help you make the best decision for your organization. Crisis or no crisis, preparedness is key!


Four Reasons Why you Need a Modern File System for Media Workflows

In media and entertainment, it’s often difficult to meet even the basic requirements for data storage. You need to ingest content quickly, provide fast access to that content, facilitate collaboration, and cost-effectively preserve and protect content over the long term.

But as viewing habits change and the creative process continues to evolve, meeting the escalating requirements for storage is becoming more difficult than ever. You need greater performance and flexibility to handle 4K and higher-resolution content from a growing number of sources. You also need to balance this with capacity and cost. And last but not least, you somehow need to streamline management of an increasingly complex storage environment.

Within the technology ecosystem of a modern storage implementation, a number of crucial aspects influence the dynamic interplay of performance, capacity, management, and cost. Chief among these is the file system. So, let's take a quick look at four key areas where a modern file system can really make a difference to your overall workflow efficiency.


Enhanced Performance

It wasn't long ago that organizations could make do with storage performance that was "just good enough." However, as the industry has moved to higher-resolution formats, much more performance is required for real-time editing, color grading, transcoding, and other tasks.

This is where the right file system can substantially improve the performance of the underlying storage hardware – whether it's composed of solid-state drives (SSDs), hard-disk drives, or any combination of the two. For example, a file system that separates user data and metadata operations, and runs those operations in parallel, enables both tasks to complete faster while helping ensure data transfer is unimpeded. In addition, a file system that slices logical unit numbers (LUNs) and enables the creation of stripe groups can help better match high-performance workloads to high-performance storage.


Simple, Flexible Access

To enhance the efficiency of workflows and facilitate collaboration among team members, storage solutions must enable simple, flexible access to content. By creating a single, global namespace, a modern file system can give team members direct access to files, images, and other content at high speed—no matter where users are located.

The right file system will also support several types of storage systems and connectivity options. This means that the file system should natively support shared storage area network (SAN) environments with Fibre Channel connections as well as scale-out network-attached storage (NAS) environments over Ethernet. Such flexibility in storage types and connectivity gives organizations the ability to architect the ideal infrastructure for their specific use case. And since most organizations will need to expand and grow as their business becomes more successful, they also have the option to migrate to a different topology as future needs dictate.

Optimization of Storage Resources

How can you effectively balance performance, capacity, and cost for your storage environment? One tried-and-true method is to architect a multi-tier environment that seamlessly integrates multiple types of storage: from Tier 0 flash-based storage through object-based systems, data tape archives, and cloud archives. This kind of structure allows the hottest data to be placed on the fastest storage pools, while cold or less frequently accessed data is placed on capacity-optimized storage pools.

Another crucial capability is automated data movement, so data flows seamlessly across those storage tiers. Ultimately this translates into the ability to keep current, time-sensitive projects on high-performance arrays while moving older, less frequently accessed content to cost-effective media and large-capacity archives.
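
One simple way to picture automated tiering is as an age-based placement policy. The Python sketch below is purely illustrative – the tier names and cutoffs are assumptions for the example, not StorNext policy syntax – and shows data being assigned to a pool based on how recently it was accessed:

```python
from datetime import datetime, timedelta

# Hypothetical tiers and cutoffs -- illustrative values only.
TIERS = [
    (timedelta(days=7), "flash"),        # hot: active, time-sensitive projects
    (timedelta(days=90), "object"),      # warm: recent but idle content
    (timedelta.max, "tape-archive"),     # cold: long-term preservation
]

def pick_tier(last_access: datetime, now: datetime) -> str:
    """Return the storage pool for a file based on its access age."""
    age = now - last_access
    for cutoff, tier in TIERS:
        if age <= cutoff:
            return tier
    return TIERS[-1][1]  # fallback; unreachable with the cutoffs above

now = datetime(2020, 3, 30)
print(pick_tier(now - timedelta(days=1), now))    # active project stays on flash
print(pick_tier(now - timedelta(days=400), now))  # old content goes to tape
```

In a real multi-tier system this policy runs continuously and transparently, so users keep a single view of their files while the data itself migrates between pools.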

Streamlined Management

Very few media organizations have large, dedicated administrative teams to handle storage. As storage environments grow in size and complexity, organizations need ways to simplify management so their IT teams can more easily meet the requirements of their users. The right file system will streamline administration by offering an array of capabilities—from remote management options to analytics tools. This means an entire multi-tier environment can be managed with a small staff or even a single person.

Move forward with StorNext

So as you can see, there are a handful of exceptionally good reasons why investing in a storage infrastructure powered by an industry-leading file system can make a real impact on your business. To learn more about how StorNext can deliver the performance, flexibility, balance, and streamlined management I just described, download the new eBook, "Modern File System Functionality That Can Supercharge Your Media Workflows."