By Russ Fellows, Evaluator Group
November 29, 2010 -- The next decade will bring challenges in many areas of information technology. Although new ideas and technologies are constantly emerging, only those that can deliver real value to businesses and consumers will be successful. In working with IT consumers, Evaluator Group has identified several issues facing businesses -- the so-called ‘big IT problems’ of the next decade.
The issues outlined below are topics that Evaluator Group has discussed in detail with both IT consumers and IT producers. Nearly every business relies on information technology and, as a result, will be affected by the arrival of these changes to the IT landscape.
The three big challenges facing information technology over the coming decade are:
· Data Center Transformation – Enabling businesses to efficiently manage and deploy IT
· Data Analytics – Deriving value and business insights from data that is captured
· Integrated Data Management – Intelligently managing data placement, protection and archiving
Solving these challenges will require significant technology, business, and operational expertise. One reason these problems are proving difficult is that existing technologies cannot solve them cost-efficiently. In some cases, the technology required is still emerging and will require integration with existing products and processes.
Evaluator Group has published studies on each of these topics, and the papers are available at no charge on Evaluator Group's website.
Solving business challenges will require new technologies that are now emerging. Data center transformation, data analytics and integrated data management all require technologies that optimize cost and performance while enabling massive scale and security. Solving these challenges has been attempted in the past with varying degrees of success. However, the scope and scale of the problems have outgrown what existing technologies can deliver.
Next-generation solutions will require multiple technologies in order to be successful. Conventional wisdom says IT departments should standardize and consolidate their equipment to improve management efficiency. Next, they should virtualize components to improve efficiency without adding management complexity. With these moves, it is possible to create standard business catalogs of service offerings at specific quality and price levels. Finally, these services can be integrated into ongoing operations through automation in order to maintain efficiency and productivity gains.
A few of the technologies needed to solve these challenges include:
· Virtualized infrastructure: The key technology that enables scale, efficiency and flexibility
· Scale-out and scale-up: Required to support the growth in data and information processing
· Efficiency: Achieved through process and product standardization and management
· Flexibility: Ability to support changing requirements
· Security: A requirement in hosted, cloud, or distributed work environments
· Multi-tenant design: Supports multiple clients/tenants simultaneously
The concept of virtualization is nearly as old as computing itself. It has been applied, with varying degrees of success, to computer memory access, processing, storage and networking. Virtualization is now being exposed at a higher, externally visible level, however, and is being used to transform major elements of IT.
Other design requirements include efficiency, which depends on standardized interfaces and operational automation. The ability to scale both up and down to meet workloads that change over time requires flexibility in both the technology and the management of the infrastructure.
Securing information in a multi-user, distributed environment is both challenging and necessary. Without adequate security, the promise of hosted cloud computing cannot succeed. Even within public cloud settings, security and data governance are growing issues that must be solved in order for the next wave of IT solutions to deliver value.
Management of information must be performed holistically across the enterprise, regardless of time or place. Data governance, protection levels, placement and security of information must be enforced by policies that can span the virtualized environment.
Finally, the ability to request, configure, manage and consume IT resources will require a new wave of tools designed to allow for “virtual system management,” encompassing logical elements rather than physical products. So-called multi-tenant management tools must support securely managing multiple clients and administrators, all with separate logical views, while using common infrastructure.
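The idea of separate logical views over common infrastructure can be illustrated with a minimal sketch. The inventory structure, tenant names, and `tenant_view` function below are hypothetical, invented purely to show the concept; real multi-tenant management tools add authentication, role-based access, and auditing on top of this kind of filtering.

```python
# Hypothetical shared inventory: every resource on the common
# infrastructure is tagged with the tenant that owns it.
INVENTORY = [
    {"id": "vm-01", "tenant": "acme",   "cpus": 4},
    {"id": "vm-02", "tenant": "acme",   "cpus": 8},
    {"id": "vm-03", "tenant": "globex", "cpus": 2},
]

def tenant_view(tenant):
    # Each tenant (or tenant administrator) sees only its own logical
    # slice; the full physical inventory is never exposed directly.
    return [r for r in INVENTORY if r["tenant"] == tenant]

acme_resources = tenant_view("acme")
# acme_resources contains only vm-01 and vm-02
```

The key design point is that isolation is enforced in the management layer, so multiple clients and administrators can operate concurrently against the same underlying hardware.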
Data Center Transformation
At its core, transforming the data center is as much about business transformation as it is about technology. Perhaps the biggest component driving this change is the movement to IT as a Service (ITaaS). The most visible example of this is the emergence of the terms “cloud” and “cloud computing.”
Evaluator Group began talking with our clients about the business drivers behind cloud computing and ITaaS, along with the emerging technological changes, and quickly realized that what companies were seeking was a way to transform their data centers and operations, a “data center transformation.”
Updating corporate data centers has been an ongoing process since the inception of IT. Evaluator Group began to use the term “data center transformation” in late 2008 as a way of explaining the fundamental shifts that were emerging in the way IT departments and CIOs were looking to deliver IT to their constituents. The term “cloud computing” began to emerge during this time as well; however, the term meant very little to most people.
What is clear is that business users are looking to gain flexibility in how, when and where they consume IT resources. Businesses in particular are now looking for a better alignment between the needs of their business and the cost and service offerings that IT departments can deliver.
Data Analytics
Traditional data processing and data warehousing are narrow examples of an emerging area known as data analytics. The techniques commonly in place are relatively slow and unable to scale to the analytical processing of thousands of data streams in near real time. Business users are now looking to process information from multiple data sources concurrently.
“Big data” requires the ability to scale out information processing and management, while still providing information protection and security. Standardization of components along with virtualization can help drive efficiencies. Moreover, the technologies outlined are all required to meet the challenge of data analytics.
The challenge is to break problems into pieces that can be processed or analyzed in parallel. Techniques such as MapReduce have emerged as methods for efficiently processing these types of problems.
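The decomposition behind MapReduce can be sketched in a few lines. The example below is a single-process word count, an assumption-laden toy rather than a real framework: the map phase emits independent (key, value) pairs, a shuffle groups them by key, and the reduce phase aggregates each group. Because both the map and reduce steps operate on independent pieces of data, a real framework can distribute them across many machines.

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word; each document can be
    # mapped independently, so this step parallelizes naturally.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group all emitted pairs by key (the word).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Each key's list is reduced independently, again in parallel
    # in a distributed setting.
    return {key: sum(values) for key, values in grouped.items()}

documents = ["big data big analytics", "data streams"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
# counts == {'big': 2, 'data': 2, 'analytics': 1, 'streams': 1}
```

The same map/shuffle/reduce structure underlies far larger workloads, such as aggregating thousands of data streams, because no map or reduce task needs to see any other task's data.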
These techniques include massively parallel processing and high-speed data access, coupled with integrated hardware and software appliances. The first wave of products was typically based on commodity hardware and software, with a significant amount of tuning and integration required. More recently, integrated solutions that rely in part on proprietary hardware or software are coming to market.
Integrated Data Management
Data protection, tiering, and archiving have all been topics of discussion within companies and IT organizations for decades. Often, these discussions are independent, focused on solving a particular problem. Business application owners and IT workers alike are now looking for a way to solve these challenges in an integrated fashion.
What is needed is a strategy that encompasses all of these topics holistically. Evaluator Group began using the term “Integrated Data Management” (IDM) to discuss these areas of interest. Past efforts recognized that these aspects were related but placed too much emphasis on particular tools or techniques to solve the problem.
Within IDM are three areas of focus:
· Data protection (backup, point-in-time copies, replication, security, etc.)
· Tiering (moving data within a system and between systems for cost, performance, and efficiency)
· Archiving (storing information for long-term preservation)
These three aspects of IDM are all related. They are part of a bigger picture of managing data cost-effectively while meeting business objectives. Understanding their importance and relationships is critical for building and operating optimal IT operations.
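One way the tiering aspect of IDM can be expressed is as a policy that spans tiers rather than as a per-system setting. The sketch below is a minimal illustration under assumed rules: the tier names, the age thresholds, and the `place` function are all invented for this example; a real IDM policy would also weigh protection level, access frequency, and compliance requirements.

```python
from datetime import datetime, timedelta

# Hypothetical tiers, ordered from fastest/most expensive to cheapest.
# Each entry pairs a tier name with the maximum data age it serves;
# None marks the final, long-term archive tier.
TIERS = [
    ("performance", timedelta(days=30)),   # recently active data
    ("capacity",    timedelta(days=365)),  # colder, cost-optimized data
    ("archive",     None),                 # long-term preservation
]

def place(last_accessed, now):
    """Return the tier name for data last accessed at `last_accessed`."""
    age = now - last_accessed
    for tier, max_age in TIERS:
        if max_age is None or age <= max_age:
            return tier
    return TIERS[-1][0]

now = datetime(2010, 11, 29)
tier = place(now - timedelta(days=90), now)
# 90-day-old data lands on the "capacity" tier under these thresholds
```

Because the policy is a single ordered list rather than per-device configuration, the same rules can be applied consistently across a virtualized environment, which is the holistic behavior IDM calls for.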
The three areas outlined will certainly not be the only challenges within IT. However, these are the topics that we feel have the potential to revolutionize how IT organizations provide and deliver information and how people consume and leverage that information within both personal and business settings.
Russ Fellows is a senior partner with the Evaluator Group research and consulting firm.