IT Requirements Drive Information Sharing
Enterprise storage and information-sharing software promise to provide unfettered access to corporate data, regardless of platform or technology.
Shrinking product life cycles. Deregulation and resulting new competition. Unprecedented numbers of mergers and acquisitions. Explosive use of new technologies such as data warehouses and electronic commerce. This deluge of business and technology forces is having a cumulative and sometimes paralyzing effect on corporations across all industries. Strategic efforts to step up competitive programs and to implement growth plans have in many cases stopped short, stymied by the inability to share information across the enterprise.
Case in point: As the number of U.S. households conducting online banking increases by 80% this year, to more than four million (according to Jupiter Communications, NYC), a bank's ability to compete will become inextricably linked to its ability to quickly feed applications on UNIX and NT servers with customer account information from mainframes.
According to The Meta Group, a consulting firm in Stamford, CT, the number of data warehouses with 500 or more seats will triple this year. As companies struggle to shrink the time it takes to extract, analyze, and act on competitive information from mainframe sources, these new data warehouses continue to be built and deployed primarily on open-systems platforms, further exacerbating the problem.
Due to these and other trends, successful IT strategies are moving rapidly away from a processor-centric or server-centric model, to one in which the corporate information required to advance the business is the primary concern. Organizations are now placing information at the center of their IT and business planning efforts. Information, regardless of its host source, is being captured and managed to best serve the business. IT architectures are being designed and technology is being purchased to provide maximum flexibility and unrestricted access to information.
This information-centric model transcends LANs, WANs, UNIX, Windows NT, AS/400, and MVS mainframe environments. The challenge is to integrate data across all these environments and to provide a single view of enterprise information.
The first steps of this "information-sharing" capability have already been taken and will ultimately provide timely and accurate access to state-of-the-business information by all authorized users and applications across the enterprise.
The "end game" in information sharing--an integrated view of all corporate data regardless of the underlying technology--promises to be one of the most critical IT goals and one of the most daunting challenges facing the IT industry over the next several years.
Central to this issue today is the need to move hundreds of gigabytes--even terabytes--of data between heterogeneous systems. For example, mainframe OLTP data must be shared for analysis on open-systems data warehouses. Since mainframes continue to be the primary platform for large corporate databases, most data about customers, sales transactions, flight operations, inventory status, prices, discounts, and payment plans remains in databases created and maintained on the mainframe.
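At these volumes, effective throughput dominates the cost of data movement. A back-of-envelope sketch makes the point; the bandwidth figures below are illustrative assumptions, not vendor measurements:

```python
# Rough transfer-time estimate for bulk data movement between systems.
# Assumed effective throughputs (illustrative only): a 100 Mb/s LAN
# shared with production traffic might sustain ~5 MB/s, while a
# dedicated storage channel might sustain ~15 MB/s.

def transfer_hours(size_gb, throughput_mb_per_s):
    """Hours needed to move size_gb of data at the given effective MB/s."""
    size_mb = size_gb * 1024
    return size_mb / throughput_mb_per_s / 3600

# Moving a 500 GB warehouse load:
lan_hours = transfer_hours(500, 5)       # over the shared LAN
channel_hours = transfer_hours(500, 15)  # over a storage channel

print(f"Shared LAN:      {lan_hours:.1f} hours")
print(f"Storage channel: {channel_hours:.1f} hours")
```

Even under these generous assumptions, a single 500 GB load ties up a shared LAN for more than a day, which is why the refresh windows described below stretch into weeks.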
Traditional networks, CPUs, databases, and middleware have fallen short of adequately bridging the technical gaps between the multitude of mainframe and open-systems platform architectures. Although host-based software solutions exist to address certain aspects of information sharing, these complex products fail to deliver complete solutions, and they consume CPU cycles and valuable network bandwidth.
IT managers, therefore, have been forced to move data in a circuitous and cumbersome fashion through the mainframe, across the network, to the open systems, and down to the open-systems database. With this approach and its associated overhead and costs, weeks--sometimes months--elapse from the point-of-sale capture of information to the time the data is available for marketing analysis and competitive action.
As a result, companies have been forced to make a decision: Overburden the mainframe and network with data transfers and slow down the business, or load the open-systems data warehouse infrequently and ask business managers to make strategic decisions based on stale information.
Technology based on intelligent enterprise storage systems is now available to, among other things, enable organizations to move data efficiently between heterogeneous systems.
Storage systems of the past offered little value, if any, from their intelligence. Thought of only in terms of their capacity, separate storage systems were purchased for each platform--in most cases automatically from the CPU vendor without a second thought--and were readily discarded when storage needs exceeded their capacities or new servers were deployed. With the advent of intelligent controllers in recent years, storage systems took massive leaps in functionality, performance, availability, and manageability.
Forward-looking IT organizations have turned to intelligent storage systems as the hub of information-centric computing. With highly intelligent resident software and the internal processing power of a supercomputer, enterprise storage systems are fast becoming the preferred alternative for delivering the information-sharing solutions that no other piece of the IT mix has been able to achieve. Functionality previously reserved for servers has rapidly begun to migrate off the CPU and onto the central storage subsystem.
Historically, mainframes and open servers were relied on to execute activities such as performance management, disk optimization, storage management, error recovery, disk mirroring, job scheduling, caching algorithms, and data movement across the network. With these functions off-loaded to the enterprise storage system, servers now are liberated to do what they do best--running and managing applications, databases, networks, and systems services.
As a result, investments in storage solutions are paying off in ways that were unheard of only a few years ago. Intelligent enterprise storage systems are now being evaluated on their merits--that is, how they affect the corporate bottom line. Advanced storage capability such as information sharing is becoming central to a company's ability to improve its time to market, to drive more revenue, to extract more value from information, and to improve decision making.
The resulting cost/benefit ratio has earned intelligent enterprise storage systems the respect once reserved for specialized processors, along with a place as a line item in strategic IT decision making.
Enterprise storage is characterized as high-performance, host-independent, bulletproof technology that allows an organization to be as flexible and distributed with applications and servers as it needs to be. Enterprise storage allows all of a company's information to reside physically and logically in the same place, under the same level of availability, security, management, and backup required for a company's most important business data.
In addition, enterprise storage allows concurrent storage and retrieval of data to and from all major computing platforms, including mainframes, open systems, and enterprise networks. Enterprise storage must concurrently provide mission-critical availability, dynamically optimized high performance, and scalable capacity.
And enterprise storage must provide real-time business continuance in the event of a disaster, rapidly and nondisruptively migrating data from one system to another and sharing information across platform boundaries.
For example, high-performance enterprise storage solutions exist today for directly accessing mainframe MVS/ESA DB2 relational database information from, say, an HP-UX server and loading it into an Oracle database.
Storage-based software also exists for high-speed bulk file transfers among a wide range of mainframe and open-systems servers. Mainframes and open systems connected to a common storage device use the high-speed channel connections of the storage system to transfer data from one environment to another. In both examples, file transfer speed is significantly improved and network bandwidth is freed up for other business-critical activities.
As the technology matures, the future will likely comprise a blend of enterprise storage platforms and information sharing software with new, creative ways to deliver information to the applications and people who need it.
In some instances, this will involve multiple users accessing a single copy of the data. In others, it will involve high-speed replication between different servers. As long as manageability, performance, rapid data access, and reliability are maintained, the means to the end becomes less relevant.
Now that enterprise storage is an equal partner in the IT architecture, the distinctions between storage systems, server technologies, and networks will continue to blur. Integration of information will happen in previously unlikely places. Enterprise storage is the place where consolidated information will reside from all sources and in all formats and become rapidly accessible from any person or application in any location or any type of computer.
Information-sharing technology will enable companies to read and write information freely between applications on any host CPU and will eventually provide a single view of all enterprise information with integration across applications, processors, and management tools.
Global networks made obsolete the case for consolidating all IT resources in a single location, and the same applies to information sharing. Storage leaders are building the path to a single view of corporate data, with consolidation happening where it makes most sense--where the information lives--on the enterprise storage system.
John Howard is senior product manager, information sharing solutions, at EMC Corp. in Hopkinton, MA.