Why do the IETF, ANSI T10, T11, and T13, and a whole host of other standards bodies make progress in terms of updating standards for new technology, while we have archaic standards for user-level I/O and file systems? I asked myself that question and have been pondering the potential reasons. Here is a list of possible reasons why other standards make progress while user-level I/O remains stuck in the past:
User-level POSIX I/O standards are part of the operating system and the C library. These standards were controlled by, if I remember correctly, in order: Bell Labs, UNIX System Laboratories, UNIX International, and The Open Group. There were, again if I remember correctly, at most 10 vendors that wrote operating systems and worked on the standards. Today, there are far fewer (my list is MVS, AIX, Solaris, Windows, BSD/MacOS, and, of course, Linux). In reality, in terms of volume, there are three today: Windows, Linux, and BSD/MacOS. Since the OS vendors are the ones that have to do all the work to develop the standards, make the changes, and do the testing, what is their incentive, especially in hard economic times? There is no incentive without a significant reason, and the user community has not demanded one.
posted by: Henry Newman