Lies, damned lies and independent competitive test reports

When the friendly sales guy from your favorite vendor honors you with an “independent test lab” report on the newest wonderful gadget he’s trying to sell you, there’s one thing you can be sure of: the box behaves as described in the report. The “independent” labs are earning too much money verifying the test results to participate in outright lies. Whether the results correlate with your needs is a different story, but we’ll skip this discussion.

However, when you’re faced with a competitive report from an independent test lab “sponsored” (read: paid) by one of the vendors, rest assured it’s as twisted as it can be (you should also suspect the sponsoring vendor has some significant issues he’s trying to cover). The report will dutifully list the test configurations and the test results ... without mentioning that the configurations and the tests were cherry-picked by the sponsoring vendor. You don’t believe me? Put on your most cynical glasses and read the About us statement from the premier independent test lab.

You still want me to prove my point? Look at the latest HP-versus-Cisco blade server test results (paid for by HP). They took an oversubscribed UCS chassis (it had 4 10GbE uplinks and up to 8 servers) and compared it to an HP chassis with 8 servers and 8 10GbE uplinks. Furthermore, Kevin Tolly himself admitted in the comments that they'd really tested the bandwidth between the servers within the chassis (absolute kudos for being so frank), which you might suspect could be somewhat irrelevant in a typical deployment scenario.
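
To put the mismatch in numbers, here's a minimal back-of-the-envelope sketch (my assumptions, not the report's: each server has a single 10GbE port and all servers push traffic out of the chassis at the same time):

```python
# Back-of-the-envelope oversubscription math for the two chassis as tested.
# Assumption (mine, not the report's): each server offers one 10 Gbps flow
# leaving the chassis simultaneously.

def oversubscription(servers: int, uplinks: int, port_gbps: float = 10.0) -> float:
    """Ratio of offered server bandwidth to available uplink bandwidth."""
    return (servers * port_gbps) / (uplinks * port_gbps)

ucs = oversubscription(servers=8, uplinks=4)  # UCS chassis as tested
hp = oversubscription(servers=8, uplinks=8)   # HP chassis as tested

print(f"UCS: {ucs:.0f}:1 oversubscribed (at most 5 Gbps per server)")
print(f"HP:  {hp:.0f}:1 oversubscribed (full 10 Gbps per server)")
```

Under that assumption, the tested UCS configuration can never deliver more than half the per-server uplink bandwidth of the HP setup, no matter how well either box behaves.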

5 comments:

  1. Funny, I knew what you were talking about even before I read what it was about :)

    "Oh, they are blaming my favorite toy! Shame on them!" >:o

    Sorry, but you've got a Cisco seal on your forehead.
  2. You've got it all wrong. UCS is not my favorite toy. I have never seen or touched one; I just happen to understand its architecture and the underlying design goals.

    As I said, I'm positive all vendor-financed "independent" "competitive" tests are (by design) biased enough to be useless; I just happen to understand what this particular vaporware was all about.

    Also, if you had been reading my blog, you'd have realized I happily bash Cisco (for example, for the lack of IPv6 support in their data center products).
  3. OK, you may be right about the technical things, like them not using the right methodology.

    But when I first saw that report in the headlines, I thought, "Hmmm, this will really upset the Cisco guys!" (and I must admit I haven't read the report), and what a surprise, you were the one who reacted angrily (as we say, the goose that got hit is the one that cackles).

    As an independent reporter, I would expect you to react more like, "Hmm, interesting results, but they used the wrong methodology..."
  4. Well, you haven't seen me reacting angrily yet ;) ... and I'm not an "independent reporter" but an old extroverted opinionated GONER 8-)

    You've also missed the point; it's never been a question of "methodology" (I could argue individual technical points if that were the case), it's a question of whether the test setup makes any sense in real life or whether it's totally irrelevant to what we use outside of ivory towers (ehmm ... labs) and was designed only to prove that one architecture is better tailored to the test than the other.

    I've had to endure these stunts (sponsored by every imaginable vendor) ever since these "independent testers" entered the industry (looking at their web site, I've been working with routers longer than they've been running tests), and while I managed to ignore them during the last few years, the latest one reminded me of my past grudges :-E
  5. Thanks for including my blog post (http://bit.ly/bUNe8b) in your post. It's interesting to see how passionate Cisco is about their product line!