I believe that we need to update what we are teaching college computer science students. We must start teaching students how hardware and software interact. This curriculum should include areas such as hardware memory allocation, page coloring, and how data moves to and from the PCIe bus and how that bus works. For example, I think we should teach about things like 8b/10b encoding. How does a 10 Gb Ethernet NIC or a SAS adapter work? What is a CRC error on a channel, or what is SECDED? If you do not know these terms, Google them, as there are a number of good explanations on the net.

The class I envision would start as a one-semester required class for sophomore CS students; it is important that this type of information be taught early on. The next required class would be a senior-level class that would address some of the same issues in more detail and look at areas such as reliability engineering for silent data corruption, as well as some of the standards bodies and the hardware that they control, for example SAS disk drives, Ethernet, and other hardware technologies that have well-defined standards.
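To make concrete the kind of material such a class could cover: SECDED (single-error-correct, double-error-detect) is the scheme behind most ECC memory, and its core idea fits in a few lines of code. The sketch below is illustrative only, not any particular hardware implementation; it uses a Hamming(7,4) code extended with one overall parity bit, so a 4-bit value becomes an 8-bit codeword.

```python
# Illustrative SECDED sketch: Hamming(7,4) plus an overall parity bit.
# Positions 1..7 hold the Hamming codeword; position 0 holds overall parity.
# This is a teaching example, not a model of any specific memory controller.

def encode(nibble):
    """Encode a 4-bit value (0..15) into an 8-bit SECDED codeword (list of bits)."""
    d = [(nibble >> i) & 1 for i in range(4)]  # data bits d0..d3
    # Each parity bit covers the Hamming positions whose index has that bit set.
    p1 = d[0] ^ d[1] ^ d[3]   # covers positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]   # covers positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]   # covers positions 4, 5, 6, 7
    bits = [0, p1, p2, d[0], p3, d[1], d[2], d[3]]
    bits[0] = sum(bits[1:]) % 2  # overall parity makes total parity even
    return bits

def decode(bits):
    """Return (value, status) from an 8-bit codeword, correcting one flipped bit."""
    syndrome = 0
    for pos in range(1, 8):          # XOR of set-bit positions = error position
        if bits[pos]:
            syndrome ^= pos
    overall = sum(bits) % 2
    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:               # odd parity: exactly one bit flipped
        bits[syndrome if syndrome else 0] ^= 1
        status = "corrected"
    else:                            # even parity but nonzero syndrome: two flips
        status = "double error detected"
    value = sum(b << i for i, b in enumerate([bits[3], bits[5], bits[6], bits[7]]))
    return value, status
```

Flipping any single bit of a codeword still decodes to the original value with status `"corrected"`, while flipping two bits is flagged as an uncorrectable double error, which is exactly the behavior that makes silent data corruption a design concern.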
I have actually suggested this to a number of computer science professors I know, and I even offered to give an overview lecture on the topic. I was told that no one in the computer science department really cares about these topics. This rejection came from the computer science departments of two pretty large state universities (unnamed, of course). There must be a way for industry to partner with universities, to help them understand what we need from graduates and to trade ideas on the curriculum.
I may be a single voice, but I am loud and I rant a lot. Still, I need some help here, folks.