The use of virtualized servers in business is now a mainstream approach to improving efficiency while lowering costs. When such servers were first deployed in the middle of the new millennium's first decade, they were used mostly to handle testing and development workloads; over time, the core technologies underlying virtualization were refined to deliver higher levels of reliability, performance, and compatibility with other systems.
By 2008, virtualization was moving into its second major phase, with more businesses trusting it to handle critical information and production workloads. This second phase was also characterized by increasing consolidation of virtualized resources, which saved companies money on hardware, on the square footage needed to house IT resources, and on the power consumed by both CPU cycles and cooling. By 2010, virtual workloads among business entities outnumbered physical ones.
Now, virtualization is moving into its third major phase, which can be summed up in a single word: cloud. The objective is to take advantage of the on-demand nature of virtualized resources to provision companies over the Internet in a way that drives costs down even further. In this era of virtualization, entire data centers can be outsourced to managed services providers, IT companies that remotely manage and administer IT resources, in some cases replacing entire in-house IT departments with skilled personnel whose total cost to the business represents a significant savings over the traditional approach.