Henry Newman's Storage Blog Archives for July 2012

Does HPC Matter to the Commercial World?

Around 20 years ago, there was a saying in high performance computing (HPC) that what was done in HPC would be seen in the commercial world a few years later. That saying held true for a few decades, but in the 1990s it changed. The commercial world, with its commodity hardware and software, altered all of that, and HPC found itself chasing the open source community and commodity technologies, from CPUs and memory to SATA disk drives and, of course, Linux.

So the question today is: is this all about to change again, given Intel's purchases of Fulcrum, QLogic's InfiniBand assets, and now Whamcloud? Intel has purchased two networking companies and a file system company, all with significant HPC ties. Does this mean Intel is once again going to use HPC to push and try out technologies and then move them to the commodity commercial market?

As many before me have speculated, Intel may be trying to add the network interconnect directly onto the CPU silicon. If this is true, the first deployments of this new interconnect will likely be in HPC. Whamcloud has nothing to do with chips, but it has everything to do with high-performance I/O, mostly for HPC applications. Yes, there are a few commercial installations, but HPC is by far the main focus. Large I/O systems are becoming increasingly common in many types of environments, and maybe Lustre could lead the way via its HPC roots.

For the past 10 years, HPC has been playing catch-up to the commercial world. I think things are about to change, with HPC leading the way for new technologies in the commercial world. The cycle begins again.

Labels: networking, Intel, InfiniBand, HPC

posted by: Henry Newman

In Memory of Dr. Allan Snavely

There was a flurry of email in the HPC community this weekend about the loss of one of HPC's pre-eminent computational scientists. I have worked with and around Allan occasionally over the past 10 or so years and have seen the impact he had on the industry. More than 10 years ago, Allan made some bold statements about application performance, stating in effect that you could determine an application's performance by understanding the memory references and bandwidth it required, because memory bandwidth was the limiting factor for application performance on a local CPU.
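
To make the idea concrete, here is a minimal sketch of what a bandwidth-bound performance estimate looks like. This is my own illustration of the principle, not Allan's actual framework; the kernel, the function name, and the hardware numbers are all assumptions:

```python
# A minimal sketch of a bandwidth-bound performance model.
# This is an illustration of the general idea, not the PMaC framework;
# the kernel, function name, and bandwidth figure are all assumptions.

def predicted_runtime(bytes_moved, bandwidth_bytes_per_sec):
    """If memory traffic dominates, runtime ~= bytes moved / bandwidth."""
    return bytes_moved / bandwidth_bytes_per_sec

# Example: a STREAM-style triad, a[i] = b[i] + s * c[i], over 100M doubles.
# Each element reads two 8-byte values and writes one, so roughly 24 bytes
# move per element (ignoring cache and write-allocate effects, which a
# real model must capture).
n = 100_000_000
bytes_moved = n * 24

bandwidth = 10e9  # assume ~10 GB/s sustained memory bandwidth (circa 2012)

print(f"predicted runtime: {predicted_runtime(bytes_moved, bandwidth):.2f} s")
# Prints 0.24 s: if the CPU can issue the arithmetic faster than memory
# can feed it, bandwidth alone sets the runtime.
```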

Allan took this idea and developed a performance prediction framework, and with the help of Dr. Laura Carrington and others, the Performance Modeling and Characterization (PMaC) Laboratory was born at the San Diego Supercomputer Center (SDSC). Allan also realized that SDSC, which had been one of the pre-eminent high performance computing centers in the late 1980s and early 1990s, had lost its way. He boldly proposed a new HPC architecture during a National Science Foundation HPC grant competition, and with the SSD-based Gordon system he won the competition and put SDSC back on the map after years of neglect. Allan had just moved to a new job earlier this month, helping set the direction for Lawrence Livermore National Laboratory, which as one of its critical responsibilities ensures the safety and quality of our nuclear stockpile.

Allan and his team worked with all of the major CPU vendors to look at potential performance improvements based on application requirements. Therefore, the performance improvements you see today are likely a result of some of the basic research Allan did. He will be missed, not only by his family and friends, but also by the industry.

Labels: storage management, HPC

posted by: Henry Newman

Is Holographic Storage Really on Its Way?

From the "what are they thinking" column --

I am not sure what motivates venture capital firms to keep investing in technology that has yet to come to market. For example, I recently saw this article about holographic storage in The Register. My issue is not with holographic storage per se, but with the fact that it has been "just a short time away" from being available for 20 years or so. What market research are these people doing? In my opinion, there are four issues with long-term storage:

  1. With density increases, you reduce the floor space you need. As storage grows, floor space is a critical constraint, and media whose density is frozen consumes more and more of it (see the sketch after this list).
  2. As I have said time and time again, it is the interface. Who has an FC-AL interface working in a modern operating system, with both the HBA and the driver still supported? FC-AL left us about 10 years ago, and it is nowhere to be found. Interfaces do not last as long as the storage.
  3. The robotics around the storage must be serviced, and who has working robotics that are still supported 20 years after release?
  4. Last, but not least, is the software interface (e.g., backup or HSM) and the format of the data. What stands the test of time? I surely do not know.
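
To put rough numbers on the first issue, here is the sketch referenced above. The growth rates, rack capacity, and starting size are assumptions I picked for illustration, not vendor data:

```python
# A back-of-the-envelope floor-space model; every number here is an
# assumption chosen for illustration, not a measurement.

archive_pb = 10.0       # starting archive size, in petabytes
capacity_growth = 1.40  # assume the archive grows 40% per year
density_growth = 1.00   # frozen media density (a 20-year-old format)
pb_per_rack = 1.0       # assume 1 PB fits in one rack today

for year in range(0, 21, 5):
    size = archive_pb * capacity_growth ** year       # PB stored
    density = pb_per_rack * density_growth ** year    # PB per rack
    racks = size / density
    print(f"year {year:2d}: {size:9.1f} PB -> {racks:7.0f} racks")

# With frozen density, floor space grows 40% per year; after 20 years the
# same archive needs over 800x the racks it started with.
```

Set density_growth above 1.0 to model media whose density keeps improving; the floor-space problem goes away only when density grows at least as fast as the archive.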

This is not to say that I am not intrigued by the storage technology, and I do hope that holographic storage is successful in the market. However, like the author of The Register article, I, too, have my doubts. What I do not understand is why people invest significant capital without a good understanding of the market they are entering and its requirements. All the large archival sites I deal with or have heard about are concerned with the four areas I have outlined.

I do not get it.

Labels: venture capital, storage technologies, holographic storage

posted by: Henry Newman