The Future of IT Is Automation

By CIO Update Staff


The one thing you can count on to be constant is change, especially in business. And, as the pace of business change continues to accelerate, organizations need to be increasingly agile in the face of sudden market shifts, new competition, changing customer behaviors, cascading mergers and acquisitions, and fast-evolving technologies and standards.

Agility requires changing business processes and rules more frequently, and constantly finding ways to leverage new types of information and complex new combinations of information -- including existing enterprise information from mainframes, legacy applications and metadata, plus an unprecedented flow of information from outside the enterprise in the form of supplier, industry and customer data.

New data collection technologies up the ante even further. Web services have dramatically enhanced our ability to collect new types of data from outside the enterprise, while Radio Frequency Identification (RFID) initiatives promise new magnitudes of unique real-time data that somehow has to be consolidated, cleansed, transformed and deciphered.

Keeping Up

Here's the problem: IT complexity is outpacing the ability of enterprises to keep up. Companies have more sources of information to harness, more data to integrate, and more people clamoring for more types of information (faster, better and cheaper) than ever before.

IT departments are already challenged to react. Changes in information conditions, requirements and environments are coming too quickly, and resources are often too stretched, for companies to adapt fast enough. IT departments have reached the point where the only recourse is to shift the burden onto the systems themselves, with people providing the direction. IT needs software that can do the nuts-and-bolts adjusting for itself -- not just to enable people to focus on higher value tasks, but also to create lower cost, more reliable, and more adaptive information infrastructures.

Enter 'Intelligence'

To address this growing concern, developers are focusing on next-generation information infrastructures that will, in essence, be intelligent. This new technology will "understand" how to adapt optimally to changes in systems and data processes, and will make the necessary adjustments automatically -- and thus much more quickly than infrastructures that rely heavily on human intervention.

As a result, fewer systems and solutions will break or unnecessarily slow down, and more changes can be effected without major impacts on systems, users, time, capital or personnel budgets. This is because change will revolve around the infrastructure, but the infrastructure will stay "glued-in," adapting to whatever happens around it in terms of new information demands, platforms, standards and users.

Moreover, next-generation infrastructures will have the ability to learn as they go. This will help ensure their optimal performance, and the enterprise's agility, over time.

DI Darwinism

Data integration (DI) resides at the nexus of this agility. Without data integration, there is no information infrastructure, not today, not tomorrow. DI moves, consolidates and transforms the most fundamental IT building block -- data. It feeds increasingly voracious business intelligence (BI) systems, executive information systems, and all kinds of data warehouse and data store-based initiatives. It is integral to real-time enterprise computing, ZLE (zero latency enterprise), eCRM, business activity monitoring (BAM), and other high-value initiatives. And it will be critical to RFID as the way to amalgamate and manage the projected torrents of data.

In short, DI is in the information infrastructure trenches, 24/7. Hence it is critical that DI platforms and processes be able to adapt to change intelligently, quickly and without requiring manual coding.

Adaptive DI covers a lot of territory. It starts with detecting and adapting automatically to minute-to-minute changes in data -- transactional, operational and metadata. This means detecting and adapting to changing data volumes and to changing patterns in the data in order to optimize its processing. Is the data real-time, batch or changed-data? An adaptive DI platform should be able to adjust intelligently to capture and integrate it all.
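
To make this concrete, here is a minimal sketch, in Python, of how a DI platform might pick a processing mode from observed feed characteristics. It is not a description of any vendor's product; the FeedStats fields and the thresholds are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class FeedStats:
        rows_per_minute: float      # observed arrival rate
        pct_changed_rows: float     # share of rows that differ from the last load
        arrives_continuously: bool  # steady trickle vs. periodic bulk drops

    def choose_processing_mode(stats: FeedStats) -> str:
        """Pick real-time, changed-data-capture or batch handling for a feed."""
        if stats.arrives_continuously and stats.rows_per_minute > 1000:
            return "real-time"             # stream it as it lands
        if stats.pct_changed_rows < 0.10:
            return "changed-data-capture"  # move only the deltas
        return "batch"                     # bulk-load on a schedule

    # A fast, continuous feed gets streamed; a mostly-static one gets CDC.
    print(choose_processing_mode(FeedStats(5000, 0.8, True)))   # real-time
    print(choose_processing_mode(FeedStats(10, 0.02, False)))   # changed-data-capture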

Adaptive DI similarly entails adapting automatically to ongoing operating environment changes. This means automatically detecting new servers in the environment, determining which ones are available to share workloads in a server-grid arrangement, and seamlessly picking up processing on different servers in the event of server failure.

Depending on loads, an adaptive DI platform should be able to decide whether to execute on a mainframe, a UNIX-based server, a Linux box or a Windows machine, or any combination. In terms of data sourcing and transforming, adaptive integration also means detecting different versions of application software and adjusting accordingly.
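
One hedged illustration of that kind of environment awareness: a toy server registry in which machines announce themselves via heartbeats, new servers are discovered automatically, work goes to the least-loaded live machine regardless of platform, and a server that goes silent simply stops receiving work. The server names, heartbeat window and load metric are all invented for the example.

    import time

    class ServerGrid:
        HEARTBEAT_TIMEOUT = 30  # seconds without a heartbeat => unavailable

        def __init__(self):
            self.servers = {}   # name -> {"platform": str, "load": float, "seen": float}

        def heartbeat(self, name, platform, load):
            """Servers announce themselves; new machines are discovered automatically."""
            self.servers[name] = {"platform": platform, "load": load, "seen": time.time()}

        def pick_server(self):
            """Return the least-loaded live server, whatever its platform."""
            now = time.time()
            live = [(s["load"], name) for name, s in self.servers.items()
                    if now - s["seen"] < self.HEARTBEAT_TIMEOUT]
            if not live:
                raise RuntimeError("no servers available")
            return min(live)[1]

    grid = ServerGrid()
    grid.heartbeat("mainframe-1", "z/OS", load=0.7)
    grid.heartbeat("linux-4", "Linux", load=0.2)
    print(grid.pick_server())  # linux-4; if it stops beating, work shifts elsewhere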

Integrating Standards and Error Reduction

Not to be downplayed in a standards-dominated IT world, adaptive integration also involves adjusting automatically to emerging standards while minimizing the operational impact. This includes avoiding the downstream ripple effects that can easily accompany the adoption of new standards. The idea is to implement the new standard in just one place and let the software make all the necessary downstream adjustments automatically.

Adaptive integration as just described will result in substantial reductions in errors and in implementation and maintenance costs. It will also supercharge an organization's reaction time-to-change. Reduced errors and instantaneous time-to-change are critical for DI platforms underpinning real-time enterprise data hubs or ZLE environments, and will become even more critical as these constructs grow.
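
As a concrete illustration of implementing a standard "in just one place," consider a sketch along these lines, where every downstream job reads data through a single translate() function. Adopting a new message standard then means registering one converter, with no ripple into the consumers. The standard names and record shape here are assumptions made up for the example.

    import xml.etree.ElementTree as ET

    CONVERTERS = {}

    def register(standard):
        """Decorator that adds a converter to the central registry."""
        def wrap(fn):
            CONVERTERS[standard] = fn
            return fn
        return wrap

    @register("legacy-csv")
    def from_legacy_csv(raw: str) -> dict:
        fields = raw.split(",")
        return {"id": fields[0], "amount": float(fields[1])}

    @register("new-xml-standard")      # adopting the new standard = one new entry
    def from_new_xml(raw: str) -> dict:
        root = ET.fromstring(raw)
        return {"id": root.findtext("id"), "amount": float(root.findtext("amount"))}

    def translate(standard: str, raw: str) -> dict:
        return CONVERTERS[standard](raw)   # every downstream job goes through here

    print(translate("legacy-csv", "42,19.95"))
    print(translate("new-xml-standard", "<r><id>42</id><amount>19.95</amount></r>"))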

A slowdown or momentary halt in such a hub -- where thousands of transactions can flow through each second -- can ripple across multiple business processes and put hub operations hopelessly behind in no time.

Pure-play business intelligence environments benefit from adaptive integration as well, particularly during this era of compliance and governance. Rapidly changing end-user information requirements, the increasing real-time nature of business intelligence, and the growing complexity of information source and target environments (fueled partly by the advent of Web services) all argue for an adaptive DI platform supporting the business intelligence front end. And the ability to be adaptive also needs to extend to that business intelligence front end.

Adaptive business intelligence is primarily about flexibility. It means delivering business intelligence to any information device or appliance specified by the end user. It means being able to readily insert business intelligence capabilities into portals and applications, often through Web services. And it means being able to run a business intelligence solution on top of any application server, right out of the box, and being able to switch application servers at will, without having to make costly and time-consuming changes to the business intelligence software.

Interestingly, business intelligence has an enabling role to play inside the workings of an adaptive infrastructure. It all has to do with visibility.

Setting The Stage

If change is truly to revolve around an adaptive infrastructure while the infrastructure stays "glued-in," then that infrastructure had better be super-glued -- and that demands visibility. Adapting quickly to change is only possible when you thoroughly understand the environment you operate in.

Hence, you need deep and ongoing visibility into the total environment so you can assess the impact of any adjustments before allowing the infrastructure to adapt. You need to be able to clearly visualize the complex dependencies between data, processes and applications before proper changes can be made. And you need to be perpetually wise to which data movements and integrations have to be in real-time and which need to be in batch or changed-data-capture mode.

Ongoing visibility can be achieved by turning business intelligence tools inward to illuminate the complex inner workings of the infrastructure. Personalized information dashboards, real-time alarms and alerts, and other functionality found in state-of-the-art business intelligence solutions all come into play. So too does data profiling. Robust profiling capabilities, embedded in the DI platform and the business intelligence solution, can provide built-in pattern recognition to help visualize data entity relationships and analyze data-corruption issues.
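
A toy version of that profiling idea: reduce each value to a coarse character pattern, then flag values that deviate from the column's dominant pattern as corruption candidates. This is only a stand-in for the richer pattern recognition an embedded profiler would provide.

    import re
    from collections import Counter

    def pattern(value: str) -> str:
        """Map 'AB-1234' to 'AA-9999': letters become A, digits become 9."""
        return re.sub(r"[A-Za-z]", "A", re.sub(r"[0-9]", "9", value))

    def profile_column(values):
        """Return the dominant pattern and the values that break it."""
        counts = Counter(pattern(v) for v in values)
        dominant, _ = counts.most_common(1)[0]
        outliers = [v for v in values if pattern(v) != dominant]
        return dominant, outliers

    dom, bad = profile_column(["AB-1234", "CD-5678", "EF-90XY"])
    print(dom, bad)  # 'AA-9999' ['EF-90XY'] -- a likely corruption candidate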

Graphical cross-system visibility into enterprise metadata also lets you better see how to adapt. Metadata expresses information lineage and usage, and a robust metadata capability is a prerequisite for adaptive integration. Visibility into metadata through the use of business intelligence tools enables you to continually assess the impact of system and process changes, improve operational performance, and pinpoint data redundancies and opportunities for re-use.
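
For instance, if lineage is recorded as simple source-to-consumer edges, impact assessment becomes a graph traversal. The sketch below, with invented table and report names, answers the basic question: if this artifact changes, what sits downstream of it?

    from collections import defaultdict

    lineage = defaultdict(set)  # artifact -> artifacts that consume it

    def record(source, consumer):
        lineage[source].add(consumer)

    def impact_of(artifact):
        """All downstream artifacts reachable from the changed one."""
        seen, stack = set(), [artifact]
        while stack:
            for nxt in lineage[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    record("orders_table", "daily_sales_feed")
    record("daily_sales_feed", "revenue_dashboard")
    print(impact_of("orders_table"))  # {'daily_sales_feed', 'revenue_dashboard'}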

Becoming smart about what needs to be real-time and what does not can be accomplished by using business intelligence functionality to help visualize and analyze workflows and decision trees. Once you have this understanding, you can let the adaptive DI software dynamically suggest the right processing paths.
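
One simple way such a suggestion could be encoded: let each feed inherit the tightest latency tolerance among its downstream consumers, and choose a processing path from that. The consumer names and SLA values below are illustrative assumptions, not derived from any real workflow.

    def suggest_path(downstream_slas_seconds: dict) -> str:
        """The tightest downstream SLA decides the feed's processing path."""
        tightest = min(downstream_slas_seconds.values())
        if tightest <= 5:
            return "real-time"
        if tightest <= 3600:
            return "changed-data-capture"
        return "nightly batch"

    print(suggest_path({"fraud_alerts": 2, "exec_dashboard": 300}))  # real-time
    print(suggest_path({"weekly_report": 7 * 24 * 3600}))            # nightly batch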

A Self-Perpetuating Cycle

There are other aspects of integration that can be adaptive as well. Security, for example, must now be pervasive in both government and enterprise information environments. This should start with secure Web services processing and RSA encryption for data transmission, but can also extend to such techniques as LDAP authentication for log-ins, automatically adapting security measures to changing user environments.
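
By way of illustration, a minimal log-in check against a directory might look like the following, using the open-source ldap3 Python library -- one common way to do LDAP authentication, not a reference to any particular product. The host and DN pattern are assumptions.

    from ldap3 import Server, Connection

    def authenticate(username: str, password: str) -> bool:
        """Bind as the user; a successful bind means the credentials are valid."""
        server = Server("ldaps://directory.example.com", use_ssl=True)
        user_dn = f"uid={username},ou=people,dc=example,dc=com"
        conn = Connection(server, user=user_dn, password=password)
        try:
            return conn.bind()   # True on success, False on bad credentials
        finally:
            conn.unbind()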

Integration development should also adapt to accommodate increasingly geographically dispersed development resources. Team development over wide areas needs to be enabled through secure versioning and configuration management.

Today, our information environments are human directed. Tomorrow, automated solutions will detect and respond to change as it happens. The more readily our information environments can adapt, the more we can do as people in terms of being creative, strategically oriented, and productive.

Similarly, the more our environments can adapt, the more benefits they will bring to us in terms of enhanced business process effectiveness, holding the line on costs and risks, and increased business agility. Everything changes (that's a fact), but the more adaptive your infrastructure, the less its core -- the adaptive DI solution -- will need to change to accommodate changes elsewhere in the environment.

Harriet Fryman is group director of Data Integration Product Strategy for Informatica, a provider of data integration and business intelligence software whose customers include 83 of the Fortune 100 and all four branches of the Armed Services.