Nutanix and the Problem of Performance Testing
As I wrote about recently, hyperconvergence vendor Nutanix and VMware have been battling over testing of their products. That battle took an interesting turn last week, when one storage testing site took Nutanix to task over its hesitancy to let the site test its appliances.
 
Performance benchmarking and testing, as you may know, is a sensitive topic for many vendors. They can fear testing for a number of reasons: apples-to-oranges comparisons, pristine test environments that don't account for real-world scenarios, bad testing methodologies and so on. And they're right to be cautious; testing can be skewed to give the results a certain vendor wants. In addition, many reviewers aren't very sophisticated about testing and don't understand how to do it right.
 No Cluster for You!
But Nutanix didn't do itself any favors with its recent behavior toward StorageReview.com. In an article titled "Why We Don't Have a Nutanix NX-8150 Review," Brian Beeler discusses the problems he had trying to review that particular Nutanix appliance. Here's his summary of the events:
  
    "Nutanix sent an NX-8150 cluster for  review, our testing revealed performance issues, Nutanix made several software  updates over six months to improve performance, in June they asked to update  our cluster with newer Haswell-based systems, we agreed, Nutanix then  backtracked and refused to send the promised replacements."
It's a long saga that details how Nutanix became increasingly wary of publishing the results of any performance testing unless it had final approval of StorageReview's article. Nutanix also asked StorageReview not to use industry-standard benchmarks like VMmark and Sysbench for testing. "Until we have a mutually agreed upon plan, we ask that you not undertake any performance testing of the Nutanix product, or publish results of prior performance testing," Nutanix said in an email, a snippet of which StorageReview posted.
Mistakes Were Made
Lukas Lundell, global director of Solutions and Performance Engineering at Nutanix, responded in StorageReview's forums. (Lundell had fired the main shots at VMware over its attempts to test Nutanix vs. VMware products.) To his great credit, he apologized for Nutanix's failure to properly handle its interactions with StorageReview: "… it appears like we definitely started off on the wrong foot, and it's very clear we mismanaged this situation. We didn't treat them like a customer, and that was a big mistake."
That's absolutely the right way to do it. Take your lumps where you should; no one thinks less of a company for admitting its mistakes. On the contrary, people tend to trust that company more, not less.
It's also important to point out that numerous Nutanix customers responded on the same forum thread, and they were uniformly happy with their Nutanix experience. Here's a typical example, from "nathanmace": "Our Nutanix experience has been excellent, rock solid stability, great performance, and some of the best support I've ever dealt with."
But StorageReview's experience, combined with Nutanix's vitriol toward VMware and its attempts to control what's actually published, makes me wonder why the company appears so worried about anyone publishing test results of its products, and so eager to quash any attempt to evaluate their performance or compare them to others. It's especially puzzling in light of the excellent reputation Nutanix generally has in the industry; it's not like it's some fly-by-night startup with no track record of happy customers.
  Questions That Need Answers
Bob Plankers, an analyst with The Virtualization Practice, has some of the same questions I do, which he posted on his personal blog: "Why can't I run a standard benchmark like VMmark on a Nutanix cluster? Why can't people share performance results? If I bought one of these would I be able to talk about my performance? Why is Nutanix uncomfortable with performance results?"
The reality is that nearly any test anyone wants to run could be blasted as somehow invalid or biased, if one digs into the testing details to the nth degree. Yes, there's a right way and a wrong way to test, and the wrong ways should be called out. But nitpicking the results of good tests, from good testers like StorageReview, over slight variations here or there seems unnecessarily defensive.
The Beginning -- Not the End
The last thing to say about performance testing is that test results are a starting point, not an end point, when evaluating storage, networks, infrastructure or anything else in IT. They should never be the final word for your datacenter; they are one factor of many to consider. Nutanix has many fans. So does VMware. They have them for a reason.
Posted by Keith Ward on 08/10/2015 at 1:09 PM