Beware of (or ignore) commissioned tests
One of the leading Fibre Channel component vendors recently released to the press and prospective customers a comparison test, performed by an "independent" third-party test outfit, that compared its products to those of two of its competitors. (I won't mention the vendor or the testing outfit because I don't want to call attention to the study, for the reasons outlined below.)
Not surprisingly, the sponsoring vendor received all A's in the report-card summary, while its competitors received C's and D's. The press release did not state that the benchmark tests were commissioned by the vendor (that was acknowledged only near the end of the 100+-page online report), and neither document mentioned that the sponsoring vendor had paid for the study.
According to representatives from the competing vendors, they weren't contacted regarding the study. Of course, they had plenty of complaints about the test methodology and the specific product models chosen for comparison.
If this maneuver sets a precedent, it'll be bad news for the fledgling Fibre Channel industry. Any vendor can commission and pay for a report that puts that vendor's products in a favorable light, and any such report's conclusions should be taken with a grain of salt. Or ignored.
It reminds me of the workstation market before the industry-standard SPEC benchmarks were created, or the systems/database market before the adoption of the TPC benchmarks.
As Fibre Channel and SANs catch on in the IT community, users will need comparative testing from independent companies, because few organizations have the money or resources to test on their own. But vendor-sponsored tests aren't the answer.
To be fair, much of the report in question could well have been accurate and unbiased. I'm in no position to judge the test suites or methodologies. But I wouldn't use a paid-for test to make a product selection, nor would I cover the results in a trade publication. (With the exception of BusinessWeek Online, I don't think that any publications did cover it.)
Put simply, we need industry-standard tests for comparing Fibre Channel devices, and maybe even for SANs themselves. Of course, this would have to be done under the auspices of an organization such as the Storage Networking Industry Association (SNIA) or the Fibre Channel Industry Association (FCIA).
An alternative would be to have a university handle the testing. The University of New Hampshire and University of Minnesota are already doing excellent independent testing, and if the benchmarks were standardized, we'd have a testing arrangement that could be accepted by all vendors. Yet another possibility is to have an independent testing lab handle the job, such as Medusa Labs.
All of these alternatives to commissioned tests pose sticky problems, but the alternative is a never-ending barrage of paid-for test results that prove nothing but the gamesmanship capabilities of individual vendors. Personally, I'm opposed to commissioned comparison tests, but at the very least vendors should disclose that they paid for the tests.