Last year, Microsoft shipped about two million copies of NT Server, up 25% from 1998. This meteoric growth is expected to continue in 2000, despite the availability of Windows 2000 and the increasing popularity of Linux.
By John Webster
Though NT has clearly taken a place alongside Unix as a major platform for critical applications, IT managers still face major challenges with the operating environment, including cost, management, capacity, and security issues:
- Since its inception, NT has been positioned as a commodity alternative to Unix and OS/390. Yet from a management standpoint, NT has become an expensive computing environment: on no other platform are management expenses so far out of line with hardware and software costs. IT managers must now consolidate their NT servers to reduce costs.
- IT's mandate "Do More With Less" means that managers have to consolidate, centralize, and automate NT management functions.
- The requirement to enhance corporate value will drive e-business, data warehousing, mobile computing, ERP, and CRM projects. These service-oriented initiatives will in turn create a need to capture, integrate, and manage burgeoning volumes of data generated by NT-based applications.
- The security of NT-based data continues to be an issue. It is still difficult for IT managers to ensure reliable and prescribed backups of departmental and remote-office NT servers. Even less assurance can be given to the speed and reliability of the restore process in the event of a server outage or data corruption.
Though IT managers tend to think of consolidation as a server issue, it's really more of an information and management issue. Therefore, storage should be one of the first areas IT managers turn to when facing such issues.
Large storage arrays, for example, can reduce management complexity, centralize valuable corporate information, and minimize exposure to disruption during major changes.
NT is perceived to be a low-cost, commodity-oriented operating environment. As such, its popularity has driven the proliferation of small, departmental/remote office NT servers.
However, the tremendous appeal of "free" MIPS has resulted in a massive duplication of server hardware, software license fees, and management tasks, ironically driving up the annual cost of NT computing.
Relatively recent advances in server and storage technologies, however, make NT server consolidation an entirely practical proposition:
- High-density, high-availability servers. Intel-based NT server hardware has advanced dramatically in the last two years, evolving from four-way 400MHz SMP to eight-way 600MHz SMP (and soon 800MHz to 1000MHz processors). System buses, CPU caches, and memory stores have all doubled. Fibre Channel host bus adapters have begun replacing SCSI I/O channels, yielding additional performance gains. Dense packaging has yielded eight-way SMP systems in stackable enclosures, so that NT servers supporting thousands of users can now be stacked in a 19-inch rack.
- Storage area networking. A Fibre Channel SAN-based storage infrastructure now serves as a foundation for building capabilities more common with large data-center environments, including shared disks for automated fail-over protection and load balancing, simultaneous online backup of multiple servers to multiple tape libraries, and dynamic scaling and reconfiguring of server and storage environments. SAN software that enables administrators to automate storage management functions is also coming on-line and is expected to advance rapidly this year.
The advent of Fibre Channel SANs has reduced I/O bottlenecks and storage scalability problems. Meanwhile, NT 4.0 has matured and settled into a more predictable and reliable state with the shipment of service packs 4 and 5.
Therefore, we believe the platform is now ready to serve as a focal point for IT projects aimed at eliminating NT server redundancy and lowering software and management costs. Departmental NT servers are normally implemented with relatively small (50GB to 100GB) internal storage systems containing fragments of corporate data. Capturing these data fragments using existing LAN connections and middleware between servers can introduce intolerable system latency.
To meet service delivery requirements and maintain a competitive edge, total system latency must be minimized. SAN technology can be used to eliminate server-to-server data movement: centralized data can be made immediately visible to many servers participating in a SAN without moving a single bit from server to server over the network.
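The data-movement argument above is easy to quantify with back-of-the-envelope arithmetic. The sketch below uses illustrative assumptions (a 100GB departmental data set, a shared 100Mbps LAN, 1Gbps Fibre Channel, 50% usable bandwidth); none of these figures come from the article, which cites only the 50GB-100GB server range.

```python
# Back-of-the-envelope sketch: time to copy a departmental NT server's
# data over a shared LAN, versus SAN-attached storage, where the data
# is visible to every participating server with no copy at all.
# All throughput figures are illustrative assumptions, not measurements.

def transfer_hours(data_gb, link_mbps, efficiency=0.5):
    """Hours to move data_gb gigabytes over a link_mbps link,
    assuming only the stated fraction of raw bandwidth is usable."""
    bits = data_gb * 8 * 1000**3              # decimal GB to bits
    usable_bps = link_mbps * 1000**2 * efficiency
    return bits / usable_bps / 3600

# A departmental server in this scenario holds 50GB-100GB; take 100GB.
print(f"100GB over 100Mbps LAN:        {transfer_hours(100, 100):.1f} hours")
print(f"100GB over 1Gbps Fibre Channel: {transfer_hours(100, 1000):.1f} hours")
# With shared SAN storage the data never moves, so the cost is zero.
```

Even under generous assumptions, repeatedly shipping data fragments between servers over the LAN consumes hours per pass, which is the latency the consolidation strategy avoids.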
In distributed NT environments, disk storage is typically backed up to local, server-embedded tape devices. This practice creates multiple problems, including increased management complexity, excessive reuse of tapes, and security issues resulting from the haphazard storage of tapes.
A centralized backup strategy for NT storage, administered by trained IT staff rather than departmental personnel, will ensure consistent backups and reliable restores. Online, non-disruptive backups to automated tape libraries can also be introduced to NT environments.
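One practical benefit of streaming consolidated storage to an automated tape library is a predictable backup window, which can be estimated with simple arithmetic. The sketch below is illustrative only; the server count, drive count, and per-drive throughput are assumptions, not figures from the article.

```python
# Sketch: estimating the nightly backup window for consolidated NT
# storage streamed to an automated tape library. Drive counts and
# per-drive throughput are illustrative assumptions only.

def backup_window_hours(total_gb, drives, drive_mb_per_s):
    """Hours to stream total_gb to `drives` tape drives running in
    parallel at drive_mb_per_s each (ignores load/rewind overhead)."""
    aggregate_mb_per_s = drives * drive_mb_per_s
    return (total_gb * 1024) / aggregate_mb_per_s / 3600

# Twenty 50GB departmental servers consolidated onto central storage:
total_gb = 20 * 50
print(f"1 drive  @ 10MB/s: {backup_window_hours(total_gb, 1, 10):.1f} h")
print(f"4 drives @ 10MB/s: {backup_window_hours(total_gb, 4, 10):.1f} h")
```

The point of the exercise is that a multi-drive library backing up centralized storage in parallel can fit inside a nightly window, where per-server tape drives scattered across departments cannot be verified at all.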
To move to this new environment, NT RAID storage must be externalized and, in many cases, SAN-attached so that it can respond quickly to demanding workloads.
Data replication techniques can be used to create test copies of production data stored either on the same or different subsystems and refreshed as required. Storage-based replication can help isolate the test system from the operational environment and reduce the risk of performance degradation during the replication process. A set of attributes similar to those required to support more traditional mission-critical environments should be used when selecting NT storage platforms (see table).
Y2K experiences will also be a valuable asset when migrating NT-based applications to Windows 2000. However, we recommend consolidating before migrating to simplify the process. The same high-density, high-availability NT servers and SAN-based storage components used to consolidate NT servers can be effectively used to non-disruptively test and move applications to Windows 2000.
John Webster is a senior analyst with the Illuminata research and consulting firm (www.illuminata.com) in Nashua, NH.