The NAS Is Back and It Is Bigger Than Ever
The external storage industry is undergoing considerable churn right now, spurred by waves of disruptive developments from flash to hyperconvergence to cloud. As enterprises seek to modernize storage, they are increasingly looking at scale-out NAS to meet pressing needs for large-scale file storage.
For years the epitome of big storage for most data centers was the storage area network (SAN) array. A foundation for data center consolidation, the SAN array started as a platform for block-level storage over a dedicated Fibre Channel (FC) network. Over the past decade, the SAN evolved into an all-singing, all-dancing unified storage solution, typically hosting multiple network transports (FC, IP Ethernet, Fibre Channel over Ethernet) as well as block and file storage.
Concurrent with the evolution of unified storage, Info-Tech’s enterprise storage strategy projects typically were about helping the organization assess needs and evaluate requirements for their next big SAN purchase. But a funny thing happened on the road to unified storage bliss. Many have concluded that their current SAN array is likely to be their last.
Here are three examples from recent Info-Tech engagements around storage modernization:
1. An aerospace technology manufacturer assessing their current and future storage needs found that total capacity requirement for one purpose – file storage for product engineering quality assurance requirements – was more than what was needed for all other applications and services combined.
Figure 1: Testing QA Outstrips All Other Storage Needs (Source: Info-Tech Research Group)
2. A university was finding that research files were stored in a vast hodgepodge of PC hard drives, NAS filers, and removable media. There was no capability to set standards for availability and recoverability for research storage and no way to manage predictable capacity growth.
3. A national multi-location law firm needed a storage platform for a large and growing repository of legal documents. The need was for enterprise class storage that was also “cheap and deep.”
In all three of the above cases, the organization was looking to a scale-out NAS solution. They all have traditional SAN arrays deployed but see those eventually being replaced by storage closer to the processing, such as flash-storage-based hyperconverged appliances.
Not Your Dad’s NAS
Network-attached storage has been around for decades. In fact, most organizations' first external storage – that is, storage outside of a server or workstation where the data was accessible over a network – was likely a NAS array. One of the early leaders in selling these network-attached storage appliances basically named the company after the product type (NetApp).
Saying the NAS is “back” is a bit misleading for a technology that has never left. What is “back” is the focus on NAS to address critical needs. Today’s enterprise NAS solutions have evolved in a number of ways while continuing to provide all the enterprise-class resiliency bells and whistles such as hardware redundancy, replication, and data snapshotting. Some defining characteristics include:
- Scale-Out: Instead of one storage controller, the platform is defined by a grid or cluster of hardware nodes. The platform presents on the network as a single namespace, but behind that single namespace are many nodes. Capacity and performance scale with the addition of nodes.
- Software Defined: Further to scale-out, the storage management is abstracted – that is, it is not hardwired into one device but runs across the cluster. The nodes can be commodity storage/server devices, lowering the capital (Capex) cost of the storage platform.
- Cloud Storage Tier: Incorporating cloud gateway technology into the NAS enables the treatment of a public cloud target as a tier within the NAS platform.
- Big Storage Meets Big Data: Large-scale NAS is an example of big storage, not big data. Big data is about analytics on high-volume, high-velocity unstructured data; it is not a storage architecture in itself. However, large-scale NAS can be leveraged as a data lake for big data analytics.
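The scale-out idea above – one namespace, many nodes, with capacity growing node by node – can be sketched with a toy hash-based placement scheme. This is a deliberate simplification for illustration; real platforms such as Isilon, Qumulo, or GlusterFS use far more sophisticated data layouts.

```python
import hashlib

class ScaleOutNamespace:
    """Toy model of a scale-out NAS: one namespace, many nodes.

    Hash-based file placement is an illustrative assumption, not
    how any particular vendor implements its file system.
    """

    def __init__(self, nodes):
        self.nodes = list(nodes)

    def place(self, path):
        # Deterministically map a file path to one of the nodes.
        digest = hashlib.sha256(path.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def add_node(self, node):
        # Scale out: aggregate capacity (and throughput) grows per node.
        self.nodes.append(node)

ns = ScaleOutNamespace(["node-1", "node-2", "node-3"])
node = ns.place("/projects/qa/run-001.dat")  # one namespace, one owning node
ns.add_node("node-4")  # expand the cluster without changing the namespace
```

The client sees a single mount point throughout; the cluster decides internally which node holds which file.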
Product Example: Isilon
A number of veteran and upstart storage providers are vying for your scale-out NAS business. NetApp continues to be a player in the NAS game, of course. Others include Panasas and HPE Scality. For purely software-defined scale-out NAS on commodity server clusters, Red Hat comes up a lot with its Gluster File System (GlusterFS). An example of a recent player that has come up often in storage modernization engagements is Seattle, Washington-based Qumulo.
A leader in this field has to be Dell EMC Isilon. Isilon first came on the scene as a high-performance, high-capacity scale out NAS filer targeted at media companies. As use cases for high-capacity file storage broadened to other industries, Isilon became a go-to for enterprise NAS. All three example organizations listed above evaluated Isilon (as well as some of the others including Qumulo).
Isilon exhibits all of the characteristics listed above. It is scale-out. It has cloud gateway capability (thanks to EMC’s acquisition a few years ago of cloud gateway pioneer TwinStrata). And it can run the big data Hadoop Distributed File System (HDFS) natively. It is an enterprise-class solution, so cost can be prohibitive. Storage is getting cheaper, but enterprise storage remains expensive. Isilon is always on the shortlist, but it isn’t always the final choice.
Isilon was first acquired by EMC in 2010 and has now landed with Dell as part of the acquisition of EMC. Interestingly, the recent player example above, Qumulo, was founded by a group of ex-Isilon engineers.
And Then There Was Object Storage
I would be remiss when discussing big storage buckets if I didn’t also mention object storage. If the search for “cheap and deep” is leading toward complete software abstraction on lowest common denominator commodity hardware, object storage is a contender.
This is how storage is done in public cloud services such as Microsoft Azure storage and Amazon S3. Data is chunked up into file-like objects that are replicated and stored across a grid of storage devices. Some object competitors in the file storage space include Dell EMC Elastic Cloud Storage and HPE Scality (again), as well as OpenStack Swift.
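The object model described above – data chunked into objects, each replicated across a grid – can be sketched in a few lines. The chunk size and three-way replication policy are illustrative assumptions, not any vendor's defaults:

```python
import hashlib

CHUNK_SIZE = 4   # bytes, for illustration; real systems use megabytes
REPLICAS = 3     # assumed replication factor

def put_object(data: bytes, nodes: list) -> dict:
    """Chunk data and record which nodes hold each replica.

    Returns a manifest mapping chunk key -> list of holding nodes.
    """
    manifest = {}
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        start = int(key, 16) % len(nodes)
        # Place replicas on successive nodes around the grid.
        manifest[key] = [nodes[(start + r) % len(nodes)] for r in range(REPLICAS)]
    return manifest

grid = ["n1", "n2", "n3", "n4", "n5"]
manifest = put_object(b"legal-document-contents", grid)
```

Losing any one node leaves two copies of every chunk intact, which is the resiliency trade that makes commodity hardware viable.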
In planning your future roadmap, recognize that the big SAN array may not be the single foundation for your storage.
- Start with an evaluation of what you are storing. Start with the data. What business processes is it serving? Is it structured data, system data, file data? What are the applications and lines of business that use the data? What are their availability and performance requirements?
- Model capacity and growth. Model how fast the data is growing and relate that to capacity of current storage devices. Are you going to run out of capacity before end of life? When are you going to need to refresh or replace current storage?
- Analyze for best fit. Look to making a best fit between the data and business requirements with various storage structures. NAS should be on the table, but also traditional SAN arrays, all-flash arrays, hyperconverged appliances, backup storage appliances, and cloud storage.
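The capacity-modeling step above reduces to simple compounding arithmetic. The growth rate and capacity figures below are illustrative placeholders, not benchmarks:

```python
def quarters_until_full(current_tb, capacity_tb, quarterly_growth):
    """Project compound data growth and return the number of quarters
    until stored data exceeds platform capacity.

    Returns None if growth is zero or negative (capacity never runs out
    under this simple model).
    """
    if quarterly_growth <= 0:
        return None
    quarters = 0
    while current_tb < capacity_tb:
        current_tb *= 1 + quarterly_growth
        quarters += 1
    return quarters

# Example: 300 TB stored on a 500 TB array growing 10% per quarter.
runway = quarters_until_full(300, 500, 0.10)
```

Comparing the projected runway against the array's end-of-life date tells you whether the next purchase is a refresh decision or a capacity emergency.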
Network-attached file storage has been around for decades, but it is having a renaissance with scale-out and software-defined technologies. Enterprises are moving away from big unified SAN storage arrays, but the need for big file repositories remains. Evaluate whether scale-out NAS or large object storage grids are in your future.