According to Gartner research, data center server utilization in 2006 was around 18%, meaning that roughly 80% of the servers in data centers around the world were either idle or used only part of the time. These numbers helped spur on the virtualization movement, which seemed like an unstoppable force, plowing across data centers and leaving efficiency in its wake. But it was not to be. It is now 2010, and Gartner has revisited its numbers only to find that server utilization is still at 18%.
So what went wrong? One could blame a lack of funds, but that is not entirely true. The main reason turns out to be something that has plagued the computer industry since its inception: companies simply don't want to take the plunge. Another side of the issue is poorly updated software. A great many applications covering things such as order processing and ERP were never written for virtual environments, which makes tracking performance and diagnosing availability problems difficult.
The services that have been virtualized include email, websites, and test servers, but these account for very little of a business's server hardware usage. In each case, these applications have been built and optimized for virtual environments, and in most cases they are important but not mission critical.
Indeed, it would seem that until more IT systems are written to fully exploit virtualized environments, the virtualization movement may never truly take root. This, however, does appear to be one of the reasons for the growth of the dedicated server hosting space.