If you have not seen the announcement, Japan is now home to the world's fastest computer. The United States held that honor until last November, when China took the lead; now Japan has taken the top spot.
This got me thinking about what really matters. Does pure speed matter? In my opinion, not really for most applications. Sure, there are some that can take advantage of pure speed, but they are few and far between. What does matter is how usable the system is. Does the system stay up with only a few failures, so that you need less I/O bandwidth to checkpoint your application and restart it after a failure? Does the system have high-performance, low-latency communications hardware and software that gives applications easy access? Does the system have a scalable compiling and debugging environment that allows rapid code development? Does the system have enough I/O bandwidth for pre- and post-processing of the information generated?
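The link between failure rate and checkpoint I/O bandwidth can be made concrete with Young's classic approximation for the optimal checkpoint interval, sqrt(2 × checkpoint-write-time × MTBF). The sketch below uses entirely hypothetical numbers (100 TB of memory, 1 TB/s of checkpoint bandwidth, a 24-hour mean time between failures); it is illustrative only, not a description of any real system.

```python
import math

def optimal_checkpoint_interval(memory_tb, io_bandwidth_tbs, mtbf_hours):
    """Young's approximation: interval = sqrt(2 * checkpoint_time * MTBF).

    memory_tb        -- memory image to write out, in terabytes (assumed)
    io_bandwidth_tbs -- sustained checkpoint I/O bandwidth, TB/s (assumed)
    mtbf_hours       -- mean time between failures, in hours (assumed)
    Returns the suggested interval between checkpoints, in seconds.
    """
    checkpoint_time_s = memory_tb / io_bandwidth_tbs  # time to write one checkpoint
    mtbf_s = mtbf_hours * 3600.0
    return math.sqrt(2.0 * checkpoint_time_s * mtbf_s)

# Hypothetical system: 100 TB of memory, 1 TB/s of checkpoint I/O, 24-hour MTBF.
interval = optimal_checkpoint_interval(100, 1.0, 24)
print(f"Checkpoint roughly every {interval / 60:.0f} minutes")
```

The point of the formula is the trade-off the post describes: halve the MTBF or the I/O bandwidth and the machine must checkpoint more often, spending more of its time writing state instead of computing. A machine that stays up, or that can drain memory to storage quickly, gets more science done per peak FLOP.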
All of these issues and many more will make or break the amount of real science that can be generated from a system. Remember, these systems are often used for basic research into things like materials science, large earthquake simulation, weather and climate, basic chemistry, and a variety of topics in the human body, from the brain to aging. So are these systems important to the nation that has them? Yes, as basic research often moves from the lab to industry. I have seen it happen over and over for almost 30 years. Is developing a system that is more usable more important than pure speed? From what I have seen, YES.
Labels: high performance computing, speed, supercomputer
posted by: Henry Newman