The Trend Skeptic: Virtualization Is a Sure Bet?
Virtualization is all the rage these days. VMware's IPO last summer netted the company nearly $1 billion, the biggest IPO since Google's.
Fast-forward a mere five and a half months, and VMware's stock had plunged, leading to headlines such as "VMware Smashed," "The Party's Over at VMware," and "VMware: A Wall Street Chainsaw Massacre."
What changed between August 2007 and January 2008? Not much, truth be told. The market in general was down, and VMware did miss its projected Q4 revenue mark. Yet revenues were still up, way up, over Q4 2006. So what was all the fuss about? I don't claim to be a stock analyst, but I believe one of the variables that hurt VMware, and virtualization in general, is that the technology is being over-hyped. It will help usher in green IT. It enables disaster recovery and business continuity. It hardens security. It reduces operating costs.
All true, but as any IT pro knows from hard-earned experience, the adoption of new technologies always comes with growing pains. Hyping virtualization as a silver-bullet, plug-and-play technology is false advertising. Successful virtualization projects are vastly more complicated than vendors admit, and the risks associated with a poorly implemented effort are serious.
Risk and New Technology
What, then, are the risks? "Security is an issue," said Gary Chen, senior analyst.
Most analysts agree, believing that security, while not something to ignore, won't be a huge issue. The real issue is performance and management. "With virtualization, performance takes a hit," Chen said. "This will improve over time. Hardware is adapting. Operating systems are becoming virtualization-aware, but issues like I/O and application compatibility are real problems."
A corresponding problem is that many of these performance issues are hard to pinpoint. From an end user's perspective, why is the application underperforming? It's a mystery. End users just know that it's not on par with what it used to be. Of course, end users aren't expected to figure these things out. They have IT for that.
But what if IT can't figure it out either? Today's virtualization monitoring solutions are blunt tools that can miss key performance variables. Incompatible applications may reside side by side on the same server. Applications may have synchronized traffic peaks that go unnoticed, resulting in micro-saturation. Yet diagnostic tools will show nothing.
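To see why average-based monitoring can miss micro-saturation, consider a minimal sketch (the function name and utilization samples below are illustrative, not from any real monitoring tool): two co-resident VMs that each look healthy on average can still spike past the host's capacity at the same moments.

```python
# Hypothetical sketch: flag moments when co-resident VMs' load peaks
# coincide, briefly saturating the host even though each VM's average
# utilization looks fine. Sample data is invented for illustration.

def synchronized_saturation(vm_a, vm_b, capacity):
    """Return sample indices where combined load exceeds host capacity."""
    return [i for i, (a, b) in enumerate(zip(vm_a, vm_b)) if a + b > capacity]

# Two VMs, each averaging roughly 50% utilization...
vm_a = [30, 90, 40, 85, 35]
vm_b = [25, 80, 30, 90, 20]

# ...yet they spike together past a host with 100% total capacity.
print(synchronized_saturation(vm_a, vm_b, capacity=100))  # [1, 3]
```

A tool that reports only per-VM averages (about 56% and 49% here) would show nothing wrong, which is exactly the blind spot described above.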
Without better performance monitoring, we'll all be nostalgic for the traditional approach of over-provisioning and dedicating single servers to single applications. Moreover, if virtual environments aren't properly planned, a single server crash could take down multiple business-critical applications at once.
"You have to plan on an application-by-application basis," said Richard Jones, VP and service director, Data Center Strategies, for the Burton Group. "Some applications aren't ready, such as Oracle databases."
In fact, any I/O-intensive application tends to be problematic.
The key word in the performance/reliability discussion, then, is planning. "Don't just plan from the perspective of the OS or hardware, as we did in the past. Plan from the perspective of the application or service you intend to deliver," Jones added.
In other words, what sorts of I/O requirements do applications have? How many instances must you have before you can consider the architecture fault-tolerant? How easily will you be able to migrate an application if its server crashes?
"It boils down to management," Jones concluded. "You need good processes. You must follow ITIL, because once you shift to a virtual environment, it's so easy to spin up a server that if you don't have a process in place for server lifecycle management, server sprawl will overwhelm you."
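The lifecycle bookkeeping Jones describes can be sketched in a few lines. This is a hypothetical illustration, not a real ITIL tool: each VM is recorded with an owner and a review date at creation, and anything past its date gets flagged before sprawl sets in.

```python
# Hypothetical sketch of server lifecycle tracking: every VM gets an
# owner and a review date when it is spun up; VMs past their review
# date are flagged for decommissioning. Inventory data is invented.
from datetime import date

def stale_vms(inventory, today):
    """Return names of VMs whose review date has passed."""
    return [name for name, (owner, review) in inventory.items() if review < today]

inventory = {
    "web-01":  ("alice", date(2008, 6, 1)),
    "test-07": ("bob",   date(2008, 1, 15)),  # spun up for a demo, then forgotten
}

print(stale_vms(inventory, today=date(2008, 3, 1)))  # ['test-07']
```

The point is less the code than the discipline: without some record of who owns each virtual server and when it should be reviewed, the ease of spinning up new VMs guarantees sprawl.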
Then There's Legacy
All warnings aside, I'm still all for virtualization, as are nearly all the analysts I spoke with. Virtualization is coming. It'll be an improvement, and it will surely be an enabling technology. However, as with any new technology, deciding when to take the leap is critical. How many of you gadget geeks out there wish you'd waited a few months before you bought that HD DVD player?
When should you take the plunge? According to Jones, it's not a bad idea to take a wait-and-see approach. "Most people should plan to move to virtualization when they do a hardware upgrade," he advised.
There's no need to rush, and waiting could have advantages. Hardware is being designed with virtualization in mind. Software is being architected not only to play well with the virtualization layer but to take advantage of it. And new vendors are flooding into the space.
Today, real-world deployments are synonymous with VMware. However, competitors are entering the fray this year. Microsoft's Hyper-V is due out in the fall. Citrix continues to improve its Xen-based suite, and a number of other large vendors, from Sun to Oracle to Novell, have virtualization products either hitting the market or in the pipeline.
In fact, the very nature of the enterprise IT-vendor relationship could change as virtualization gains traction. "Virtualization will shift the way vendors provide ongoing support for products," said Matt Healey, senior research analyst at IDC. "Once you can move applications from one server to another, you don't need a server vendor. What you need is a service provider who can look across platforms to see whether or not your business-critical applications are running in an optimized way."
Jeff Vance is the president of Sandstorm Media, a writing and marketing services company that focuses on emerging technology trends. If you have ideas for future Trend Skeptic columns, contact him at email@example.com or visit www.sandstormmedia.net.