Why do the IETF, ANSI T10, T11, and T13, and a whole host of other standards bodies make progress in terms of updating standards for new technology, while we have archaic standards for user-level I/O and file systems? I asked myself that question and have been pondering the potential reasons. Here is a list of possible reasons why other standards make progress while user-level I/O is stuck in the past:
- New standards are required for new hardware. The ANSI T1x groups are outgrowths of hardware development: new technologies that vendors need to sell. There is a large group of vendors developing these technologies, from Fibre Channel to SAS to SATA, and from disk and tape drives to connectivity.
- The IETF is trying to address many factors, including performance, security, and hardware changes, and it involves hundreds of organizations.
User-level POSIX I/O standards are part of the operating system and the C library. These standards were controlled by, if I remember correctly, in order: Bell Labs, UNIX System Laboratories, UNIX International, and the Open Group. There were, again if I remember correctly, at most 10 vendors that wrote operating systems and worked on the standards. Today, there are far fewer (my list is MVS, AIX, Solaris, Windows, BSD/MacOS, and, of course, Linux). In reality, in terms of volume, there are three today: Windows, Linux, and BSD/MacOS. Since the OS vendors are the ones that have to do all the work to develop the standards, make the changes, and do the testing, what is their incentive, especially in hard economic times? There is no incentive without a significant reason, and the user community has not demanded one.