Does HPC Matter to the Commercial World?

Posted on July 23, 2012 By Henry Newman


Around 20 years ago, there was a saying in high performance computing (HPC) that whatever was done in HPC would show up in the commercial world a few years later. That saying held true for a long time, but in the 1990s things changed. The commercial world and commodity hardware and software upended the pattern, and HPC found itself chasing the open source community and commodity technologies, from CPUs and memory to SATA disk drives and, of course, Linux.

So the question today is: is this all about to change again, given Intel's purchases of Fulcrum, QLogic (the InfiniBand part of the business), and now Whamcloud? Intel has bought two networking companies and a file system company, all with significant HPC ties. Does this mean Intel is going to use HPC to push and prove out new technologies and then move them into the commodity commercial market?

Many before me have speculated that Intel is trying to put the network interconnect directly onto the CPU silicon. If that is true, the first deployments of this new interconnect will likely be in HPC. Whamcloud has nothing to do with chips, but it has everything to do with high performance I/O for mostly HPC applications. Yes, there are a few commercial installations, but HPC is by far the main focus. Large I/O systems are becoming increasingly common in many types of environments, and maybe Lustre could lead the way via its HPC roots.

For the past 10 years, HPC has been playing catch-up to the commercial world. I think things are about to change, with HPC leading the way for new technologies in the commercial world. The cycle begins again.

