Comprehensive Software Reviews to make better IT decisions
NetApp Is the Storage Comeback Kid. Should You Care?
NetApp is back. A storage vendor that was increasingly dismissed as yesterday’s technology is making waves in two areas where it appeared it had missed the boat: flash storage and hyperconverged infrastructure (HCI).
Just over two years ago, NetApp was in trouble. The company was losing money and market share, and it looked vulnerable as the storage market was shaken up by an overall decline in sales and by innovative newcomers. But in 2017 NetApp turned things around. It is growing its business and its market share again.
- According to figures from market watcher IDC, NetApp grew its revenue for external storage systems by 14.5% and grew its share of that market from 10% to 11.2%. This was better growth than any other of the top five vendors (Dell, HPE, NetApp, IBM, and Hitachi).
- A key factor in NetApp’s growth has been flash storage. At the beginning of 2016, NetApp acquired all-flash array (AFA) start-up SolidFire. In addition to the boost from SolidFire, NetApp rolled out flash storage across its core product lines. Since 2016 it has been the second-largest vendor of flash storage, after Dell/EMC but ahead of HPE, IBM, and Pure.
- In October 2017 NetApp rolled out its hyperconverged infrastructure (HCI) product based on SolidFire storage. It is still early days for this product but NetApp is saying, to the financial analyst community at least, that it is off to a good start, including a 2018 “seven figure deal” with an automotive manufacturer where HCI figured prominently.
Things didn’t look nearly so promising at the end of 2015. From the point of view of the storage industry analyst, it seemed like nobody was putting NetApp on their shortlists anymore. When flash took off, it was Pure and XtremIO and “hybrid” disk/flash startups like Nimble, Tegile, and Tintri. Then, as HCI gained steam, it was Nutanix and SimpliVity on the shortlists.
NetApp was nowhere in these discussions. In fact, NetApp was only mentioned in terms of legacy technology, as in, “We are looking to replace our old NetApp storage. What do you think of Nutanix?”
Like entertainment reporters keeping death watch on an aging movie star, many analysts started working on their NetApp obituary. Here is an example of the “NetApp is old technology” narrative. In 2016 Chuck Dubuque, Senior Product and Solution Marketing Director at Tintri, said that NetApp’s Data Fabric vision sounded future-looking but was just dressing up the past.
“NetApp is trying to dress-up ONTAP, but the reality is that it’s a legacy solution ill-suited to the modern data center,” wrote Dubuque. “And while SolidFire is the latest addition to the wardrobe, its operating system is built on the same LUNs and volumes as NetApp — originally designed for physical workloads. Rather than being a natural fit, it seems to us that NetApp’s Data Fabric is more like 1970’s polyester.”
NetApp Got Nailed by the Future
So what happened to NetApp, and how did the company come back? Consider this three-phase view of the development of enterprise storage from Info-Tech’s Modernize the Data Center with Software-Defined Infrastructure blueprint.
NetApp was a leader in Phase 1 and Phase 2, but it got nailed by Phase 3.
The story of enterprise storage over the past decade has been very much the story of infrastructure consolidation and server virtualization (Phase 1 and Phase 2 above). The dominant storage for the data center was the big storage area network (SAN) or network attached storage (NAS) disk array. NetApp, with its innovative ONTAP operating system and unified SAN and NAS arrays, was a leader for much of this history.
Manufacturing was a major constituency for NetApp storage but also healthcare, higher education and, in fact, any midsize to large enterprise needing a resilient foundation for a consolidated infrastructure (typically with blade servers and VMware virtualization). The epitome of NetApp’s Phase 2 success is FlexPod, a converged infrastructure architecture that combines NetApp storage with virtualized Cisco servers.
But then two disruptive waves of change — solid state (flash) and hyperconverged infrastructure (HCI) — crashed through the storage market. NetApp did not handle the move to solid state well, initially focusing on solid state as a caching technology in support of, rather than replacing, spinning disks. When hyperconvergence became a thing NetApp did not have a horse in that race.
“Reports of my death have been greatly exaggerated” – Mark Twain
Since mid-2016 NetApp has proven the technology vendor axiom: if you can’t beat it, you have to embrace it. (Is that really an axiom? It is now!) NetApp embraced the very trends that were killing it, and those trends are now fueling its growth.
Data Fabric was an embrace of another disruptive trend: software-defined storage. Here too NetApp struggled at first, when it introduced the approach in version 8 of its Data ONTAP operating system. But it was on the right track. Abstraction is the future, not the past. This is not polyester.
A final note: another disruptive Phase 3 trend not mentioned above is public cloud storage. Here again NetApp has embraced the future. It has had a relationship with Amazon Web Services (AWS) for several years, beginning when NetApp started colocating its storage arrays in AWS data centers. Current offerings (on the Amazon and Azure public clouds) include Cloud Volumes (Network File System [NFS] on public clouds), Cloud ONTAP, Cloud Sync, and Cloud Backup.
The basic recommendation here is that if you are looking to acquire storage (whether a traditional external array, a converged system, or a hyperconverged system), NetApp is a legitimate competitor for your shortlist. Here are three considerations:
- Disruption will continue.
I’ve put away my NetApp obituary for now, but that does not mean the churn in the storage market has ended. It will continue. NetApp has so far avoided becoming a victim of these major changes; in fact, it has profited from them. In Europe, NetApp ran a successful “Run from EMC” campaign to entice EMC customers uneasy with the Dell acquisition. It has since broadened that into a “Run to NetApp” campaign to go after other vendors such as HPE.
- How do you like your NetApp?
Over the years I’ve encountered many NetApp fans and also some who were anxious to move on. This is still NetApp, and past experience should be an input into future acquisitions. I’m sure there are loyal customers who just wanted to consider HCI (and suddenly the phone would ring from Nutanix; it was kind of spooky). NetApp calls its products based on newer technologies “Strategic.” The “old NetApp,” that five-year-old array you are looking to replace, is part of its “Mature” product line. Currently, Strategic products account for 70% of revenue, while Mature products comprise the remaining 30%.
- NetApp HCI quacks like a duck.
Some claim that NetApp HCI isn’t “real” hyperconvergence. Strictly speaking, HCI is defined by abstracted storage and processing combined in a single appliance unit, with the system scaling out as units are clustered together. NetApp’s approach is to put separate processing and storage nodes in one box (there is room for four nodes). The processing nodes are servers running VMware; the storage nodes are flash arrays running SolidFire’s operating system. NetApp argues that this approach is more flexible and addresses a limitation of HCI: what if your storage and processing needs are growing at different rates? Functionally this is HCI even if technically it isn’t. If it walks like a duck and quacks like a duck . . .
Though definitely late to the party, NetApp has managed to weather the storms of hyperconvergence, flash storage, software-defined storage, and cloud. Rather than being crushed or marginalized by these trends, it has managed to embrace change. While the whole storage industry remains in disruption (and some say decline), NetApp has performed better than any of the other “usual suspect” top-line enterprise storage vendors (IBM, HPE, Dell, Hitachi).
Want to Know More?
Veeam to be acquired by Insight Partners for US$5 billion: On January 9, 2020, Veeam announced that it has entered into a definitive agreement of purchase with Insight Partners.
Osano recently released its SaaS privacy solution aimed at simplifying compliance and vendor assessments. The product feels familiar, but Osano’s ethical commitment sets it apart from the crowd.
Zerto 7.5 adds support for Azure Active Directory Managed Service Identity (MSI), which simplifies authentication to Azure services while improving security.
Quest Software’s NetVault Backup v12.4 release adds support for OneDrive to its existing Office 365 (O365) backup capabilities, enabling its customers to better align backup strategy to business requirements, and helping Quest Software keep up with other O365 backup offerings.
DataStealth is a difficult product to classify. It resembles DLP and privacy software but doesn’t fit neatly in either category. DataStealth focuses on data obfuscation, using a novel approach aimed at limiting sensitive-data acquisition.
Veeam has rolled out new backup features that integrate with Amazon S3 to help users save on storage costs for their backups. This tech is some serious secret sauce.
TrustArc has announced the acquisition of its Canadian counterpart Nymity, a more boutique-style vendor known for its very high standard of privacy research, expertise that manifests in its product offering.
HPE has ported the InfoSight Predictive Analytics platform from its 2017 acquisition of Nimble over to its SimpliVity line, adding AI to its hyperconverged infrastructure (HCI), as announced in an Oct. 28 press release.
Privacy by Design (PbD) is a General Data Protection Regulation (GDPR) requirement, but effective implementation requires deep insight into the operation and interconnection of various data collection processes. Thus, PbD can be difficult to document and demonstrate. However, Proteus may help.