Future Development

By Henry Newman

Since the 1940s, the U.S.A. has been dominant in the development of most of the basic computational hardware technologies used around the world. Examples include CPUs, memory, disk drives, and tape drives. We all know that much of the manufacturing no longer happens in our country, and most major companies, such as Intel, AMD, Micron and Seagate, manufacture their products outside the U.S.A.

My question is, will the U.S.A. continue to be the major center for the ideas that go into computers? Will the U.S.A. be the home for, say, the development, not the manufacturing, of CPUs from 2020 to 2100? I honestly think this question has something to do with politics. Sadly, I am going to step in a mess here.

We must invest heavily in education, allow foreign PhDs to stay, and encourage more U.S. citizens to earn PhDs. We need more basic research, as companies rarely fund it themselves. Do you think we would have gotten the Internet without basic research and U.S. Government support? No, we would have gotten many different networks that did not communicate well with each other. Just look back to the 1970s and see what IBM, DEC, CDC, HP and others were doing.

We do not need to look far to find countries that realize that basic research is not funded by companies, and that the tools for that research are not just hardware but people and the educational system. The days when we could take for granted that the U.S.A. would continue to lead are over. We must look at why we lead, and in my opinion, having the best educational system, people who want to use it, and sustained investment in basic research will make or break our future.

This article was originally published on September 21, 2011.