
Oh, 3PAR. One moment you're gliding along. The next, you're in the rain as HPE woos Nimble

It's been a good 20 years. Time to move on?

By Chris Evans, 9 Mar 2017

Comment This week, HPE offered to acquire Nimble Storage for around $1.09bn, plus another $200m in share options.

Nimble sells all-flash and hybrid storage solutions, with a lot of intellectual property focused around storage analytics in the form of its "InfoSight" SaaS platform. Commentators are seeing this as a good deal for both Nimble and HPE, but is it really as good as it seems?

HPE storage

HPE’s storage portfolio is currently focused heavily around the 3PAR platform, acquired by the then-HP in 2010. 3PAR was founded in 1999 and found success with both a new architecture and features such as the embedded system ASIC, which allowed things like zero detect to be implemented at “wire speed.” HPE acquired 3PAR for $2.35bn after a bidding war with Dell, which today, ironically, has more storage systems than it knows what to do with.

There’s no doubt that the 3PAR asset has been sweated: it replaced the ageing EVA and was extended both up to the high end and down to the low end. At the outer edges of the HPE storage portfolio, we have MSA for entry-level systems and XP for high-end availability and mainframe.

Since 2010, HPE has continued to add features to the 3PAR platform (disclosure: I have done work for HPE documenting the new features as they have been released). Most recently, space-saving technology was enhanced with compression as part of its “Adaptive Data Reduction” feature set. Throughout the evolution of the 3PAR platform, the system has continued to use a dedicated hardware ASIC for some performance-sensitive processes, including zero detect and some of the deduplication tasks.
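For readers unfamiliar with the mechanics, content-based deduplication as generally implemented (this is a generic sketch, not 3PAR's specific design) fingerprints each incoming block with a hash and stores identical blocks only once:

```python
import hashlib

def dedupe_store(blocks):
    """Generic illustration of block deduplication, not 3PAR's design:
    identical blocks are detected by fingerprint and stored once."""
    store = {}   # fingerprint -> single physical copy of the block
    layout = []  # logical volume layout: ordered list of fingerprints
    for block in blocks:
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)  # keep only the first copy
        layout.append(fp)
    return store, layout
```

In a real array the hashing and lookup happen in the data path, which is why offloading some of this work to an ASIC can be attractive.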

Having dedicated hardware is both a blessing and a curse. In 1999, the ASIC was a game-changer; however, some 18 years later, processor speeds are much higher, and one does wonder how much of the ASIC's functionality is still required and how much could now be handled by modern general-purpose processors.
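"Zero detect" itself is conceptually simple, which illustrates the point. A minimal sketch (not HPE's implementation) of what the operation amounts to in software:

```python
def is_zero_block(block: bytes) -> bool:
    """Illustrative sketch, not HPE's design: "zero detect" means
    spotting all-zero blocks on ingest so they can be recorded as
    unallocated metadata rather than written to physical media."""
    # On a modern CPU this scan runs at close to memory bandwidth,
    # which is the crux of the "is the ASIC still needed?" question.
    return not any(block)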

Nimble Storage

Let’s move over and look at Nimble. The company was founded in 2008 and launched its first product in 2010 at a Tech Field Day event in Seattle. I had my first briefing in 2011 at the EMEA launch. Looking back at the launch deck, it’s interesting to see that the platform was promoted as converged storage, meeting primary, backup and DR needs.

The company's founders have backgrounds at NetApp and Data Domain, so it's not really surprising that there are some technical similarities to NetApp's Data ONTAP, with data stored in NVRAM before being committed to disk and kept in flash for fast subsequent reads. The lawsuits that arose between the two companies were settled in 2015.

A nice feature of the Nimble architecture is the way it uses flash: not as a standard storage tier or as a write cache, but differently. Over time the platform has developed to be highly scalable, and of course the company has introduced InfoSight, a SaaS service that provides deep analytics on the activity of applications and data on the Nimble platform. Many people see this as one of, if not the, key pieces of technology the company has developed.
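The NVRAM-first write path described above can be sketched as follows. This is a deliberately simplified model; the class and method names are invented for illustration and are not Nimble's or NetApp's actual code:

```python
class HybridArray:
    """Simplified model of an NVRAM-first hybrid array write path.
    Invented for illustration; not Nimble's or NetApp's actual design."""

    def __init__(self):
        self.nvram = {}        # battery-backed write buffer
        self.disk = {}         # persistent backing store
        self.flash_cache = {}  # read cache for recently written blocks

    def write(self, addr, data):
        # The write is acknowledged as soon as it lands in NVRAM...
        self.nvram[addr] = data

    def flush(self):
        # ...and later destaged to disk, with a copy kept in flash
        # so subsequent reads of the same block are fast.
        for addr, data in self.nvram.items():
            self.disk[addr] = data
            self.flash_cache[addr] = data
        self.nvram.clear()

    def read(self, addr):
        # Serve from NVRAM or flash before falling back to disk.
        for tier in (self.nvram, self.flash_cache, self.disk):
            if addr in tier:
                return tier[addr]
        return None
```

The design choice this models is latency hiding: writes complete at NVRAM speed, while flash absorbs the read traffic that would otherwise hit disk.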

Nimble had an initial public offering of shares in December 2013, opening at around $31 from a launch price of $21. However, after rising further, the shares tanked towards the end of 2015, and exactly a year ago today were reported at $7.67, well below the IPO level. Prior to the HPE announcement, shares were trading around $8.50, around an 11 per cent gain over the year.

Unfortunately the storage array market isn’t a growth business (as I discussed last year), so competition is tough, with vendors fighting to effectively buy business from each other.

20-year architectures

On many occasions I’ve started putting fingers to keyboard to write about 20-year architectures, based on the premise that, at some point, it’s more practical to start again than to rewrite or amend existing storage array technology. The reason for this is simple: software and hardware designs were based on the technology available at the time. Starting with a clean slate means not carrying the baggage of an existing design, and so more effective solutions can be developed.

Looking at the 3PAR architecture, it could be argued that this point is being reached. 3PAR is not truly scale-out (although it can scale to eight nodes) and was designed in the age of disk. The technology has adapted well to flash; however, deduplication and compression weren’t native features, so there is (without giving away trade secrets) some degree of compromise in the design of these features in the platform. At this stage, these compromises aren’t enough to stop the platform being competitive.

Incidentally, many other storage vendors are in the 20-year scenario. NetApp has/had the issue with Data ONTAP (the original version), forcing the company to acquire SolidFire and Engenio. Dell EMC acquired XtremIO as an all-flash solution in preference to an all-flash VMAX, although that strategy seems to have been reversed, at least in the short term, with all-flash VMAX. Dell EMC Unity (ex-CLARiiON, ex-VNX) was an eventual rewrite of the original single-threaded code to take advantage of multi-threaded processors.

The interesting dynamic in the market is that start-ups can create new solutions without having any baggage to worry about. In contrast, the incumbent vendors need to manage the transition for customers from one platform to another, which creates its own set of issues. Conversely, new startup vendors somehow need to grab market share from an entrenched set of traditional hardware products that may have more maturity and a bigger customer ecosystem. This is a hard fight for any company.

So vendors want to get the most out of their hardware solutions and be as non-disruptive to customers as possible, while knowing that their hardware storage solutions will at some stage be superseded. Twenty years seems to be about the right lifespan in today’s market.

HPE be Nimble, HPE be quick

Getting back to the acquisition, why would HPE want to make an acquisition at this point in time? There are a few possibilities:

  • Age – as already discussed, 3PAR has had a good run and at some point will need to be replaced. Better to start early and look at customer transitions than get to the point seen with EVA, where the product was no longer fit for purpose. There’s still life left in 3PAR, but succession planning is important.
  • Market – 3PAR fits the middle- to large-enterprise market well. I never really thought it scaled down as successfully. Nimble’s platforms fit the small to medium market well, with overlap, but that’s not a bad thing.
  • Customers – Nimble has been gaining customers, many of which must have been at the expense of HPE. This brings those customers back into the fold (assuming they don’t jump ship again).
  • Technology – InfoSight is a good product which can easily be extended to the 3PAR platform. It’s another “value add” for customers that creates further stickiness for HPE storage.
  • Price – Nimble was relatively cheap, at less than half the acquisition cost of 3PAR some seven years ago. The share price was depressed and nowhere near its market highs.

I think there’s also a question as to how long Nimble could have continued to lose money. Although revenue was growing, so were losses, and that’s an unsustainable position for any company.

The not-so positives

So now for some negative sentiment. The acquisition isn’t a good outcome for those who bought into the company at IPO or subsequently. Some people will have made a significant loss.

There will (naturally) be rationalisation of sales and support teams, so some people will lose out there. One other area is what this acquisition (and the recent SimpliVity one) says about HPE’s future. Over the past few years, the company has closed down its public cloud and sold off services and software divisions. The focus is squarely on hardware and hardware solutions, which is further emphasised by acquiring Nimble. The question has to be asked as to HPE’s storage software strategy. Where will StoreVirtual fit within this deal? Why hasn’t HPE acquired Scality yet?

The Architect’s View

Storage (hardware) is not a growth business and is in a period of consolidation. This process is a reaction aimed at reducing costs and increasing customer share.

The problem for vendors like HPE is that we continue to produce exabytes of new data every year, with only a fraction of that new information heading towards traditional storage. It’s getting increasingly hard to see how HPE and others will ever be able to increase their share of the storage market without having a strong portfolio of software products.

The problem here, of course, is that there’s little margin in that business, unlike the high-end hardware market, where margins float around the 60 per cent mark (one of the attractions of acquiring Nimble). That is clearly a problem for the likes of HPE.

I see this acquisition as no more than a holding position and preplanning for the retirement of 3PAR at some stage in the future. So HPE, what is your long term storage strategy? ®

The Register - Independent news and views for the tech community. Part of Situation Publishing