Microsoft is broadening its efforts in the highly technical area of virtualization — a practice that essentially decouples various parts of a computer system, including hardware and software, to give information-technology managers more flexibility.

The company is acquiring San Jose, Calif.-based Calista Technologies, which works on desktop-virtualization technology.

Among other moves, Microsoft is changing how it prices and licenses its software to encourage virtualization; rolling out new tools for managing virtualization in different situations; and broadening its partnerships with other companies in the area.

Enthusiasm for virtualization, which has been in use for decades, has grown recently and was highlighted when VMware went public last summer in one of the most successful tech IPOs of 2007.

The Palo Alto, Calif., company’s shares more than doubled in its first two months on the stock market and finished last week at $80.27, up nearly 41 percent. It announced an acquisition of its own last week, buying Thinstall, another desktop-virtualization company.

Microsoft executives, while acknowledging the technology can be complex and expensive, are touting virtualization as an answer to a broad range of IT issues, including security, remote access for increasingly mobile workers, disaster recovery and cost containment.

“It is still early for this important technology,” Bob Muglia, senior vice president of Microsoft’s server and tools business, wrote in an e-mail outlining the strategy.

“Ultimately, virtualization will play an important role in improving business agility by making IT systems more flexible and more responsive to changing business needs,” he wrote.

Virtualization uses software to isolate various layers of a computer system, including hardware (desktop computers or servers); software (operating systems and applications); and other elements (data, networks and storage).

“Altering one layer often affects the others, making changes difficult to implement,” Muglia wrote.

One use of virtualization involves separating the software running on a data-center server from the physical machine. That way, if the data center were threatened by a natural disaster, the server's applications could easily be moved to a different data center out of harm's way.

Virtualization also can be used to divide a single server into individual "virtual" machines, each insulated from the others and capable of running its own operating system and applications. This improves efficiency by using more of each server's computing power, 85 to 95 percent of which goes unused in most IT departments.

Microsoft has built these capabilities into its forthcoming Windows Server 2008 software and is partnering with Citrix on tools to make using either company’s server software easier.

Yet another virtualization scenario involves hosting a desktop operating system and applications on a remote server and transmitting the desktop to a workstation.

It’s a setup that eliminates the need for computing horsepower on the desktop machine and recalls the pre-PC era of mainframe computing, when so-called dumb terminals accessed the computing power of a central machine.

Desktop virtualization also can make telecommuting easier by recreating a worker’s desktop computer on a home machine.

Calista’s technology enhances the transmission speed of that desktop “image” over low-bandwidth connections.

To prod more companies toward virtualization, Microsoft lowered the price it charges corporate customers for running Windows Vista on virtual machines. It also changed licensing rules to allow virtualization of consumer versions of Vista.

Benjamin J. Romano: 206-464-2149