Perhaps it’s no wonder that organizations have rushed to implement server virtualization in recent years. The technology, in which software creates multiple servers, or virtual machines (VMs), inside a single physical server, can cut energy costs, boost efficiency, and save space.
But security concerns have haunted the popular trend. Many managers fear that virtualization (sometimes called internal cloud computing, as distinct from cloud computing that involves third-party providers) could leave them blindsided by an attack. Much of the concern has centered on the strength of the hypervisor, the software layer that creates and runs the VMs; breaching it could expose far more data than breaching a single physical server.
But such an exploit has yet to occur. One reason could be that hypervisors contain relatively little code, which makes them harder to hack, according to Dr. Daniel Menasce, senior associate dean and computer science professor at Virginia’s George Mason University. Menasce, speaking on a panel at a recent Washington, D.C., symposium hosted by Symantec, said that virtualization is far more secure than many people realize. Other panelists agreed, noting that the most important security step is adjusting existing policies and processes to fit the new environment.
Jack Nichols, director of enterprise operations, Office of the Chief Administrative Officer of the U.S. House of Representatives, said that when he began shifting to VMs a few years ago, he moved cautiously. One of his first steps was deciding what information could be virtualized. Any personally identifiable information (PII) would be held in separate physical servers, he decided.
Such an approach might not always be needed for reasonable security. Many organizations can secure certain VMs by surrounding them with virtual versions of traditional firewalls, say some analysts.
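The virtual-firewall idea can be pictured as a rule set that filters traffic between VMs on the same host, traffic that never crosses the physical network. The sketch below is illustrative only; the VM names, ports, and rule format are hypothetical, not any vendor’s actual product:

```python
# Hypothetical per-VM firewall rules: (src_vm, dst_vm, port, action).
# A default-deny rule "surrounds" the sensitive VM, mirroring how a
# traditional firewall would ring-fence a physical server.
RULES = [
    ("web-vm", "db-vm", 5432, "allow"),  # only the web tier may reach the DB
    ("*",      "db-vm", "*",  "deny"),   # everything else is blocked
]

def check(src, dst, port):
    """Return the action of the first matching rule (first match wins)."""
    for r_src, r_dst, r_port, action in RULES:
        if (r_src in ("*", src) and r_dst in ("*", dst)
                and r_port in ("*", port)):
            return action
    return "deny"  # implicit default: no rule means no access

print(check("web-vm", "db-vm", 5432))   # → allow
print(check("mail-vm", "db-vm", 25))    # → deny
```

First-match-wins ordering matters here: the specific "allow" must precede the wildcard "deny", just as in conventional firewall rule tables.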
A primary reason Nichols separated the data was that he felt access control was not as layered on VMs as on their physical counterparts. This may be changing, says Chris Wolf, a senior analyst at the Utah-based Burton Group, as access control and other vendors roll out new software to help integrate VMs into traditional data centers.
Still, lapses in access control carry far higher risks in a virtual environment, Wolf says. VMs pack more data into much less space, and entire VMs can be copied onto removable media and carried off the premises. As in any physical server environment, it’s important to avoid giving any one individual broad access to systems without appropriate controls.
Training staff on best practices is also critical, say analysts. VMs are relatively easy to set up, but that very ease can cause problems: machines created too rapidly, without sufficient planning, may lack proper configuration. VM template software, available from a number of virtualization vendors, can help create machine “groups” with standardized configurations and the latest patches, management agents, and virus signatures.
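The template approach amounts to cloning every new VM from a hardened “golden” baseline and then auditing the fleet for drift. A minimal sketch, with entirely hypothetical configuration keys and version strings:

```python
import copy

# Hypothetical golden template: every VM cloned from it starts with the
# same hardened baseline instead of an ad-hoc, hand-built configuration.
GOLDEN_TEMPLATE = {
    "os_patch_level": "2009-06",
    "management_agent": "agent-3.1",
    "av_signatures": "sig-5820",
    "firewall_enabled": True,
}

def provision_vm(name, template=GOLDEN_TEMPLATE):
    """Clone the template so each new VM inherits the standardized baseline."""
    vm = copy.deepcopy(template)
    vm["name"] = name
    return vm

def audit(vms, template=GOLDEN_TEMPLATE):
    """Report the names of VMs that have drifted from the template."""
    return [vm["name"] for vm in vms
            if any(vm.get(k) != v for k, v in template.items())]

fleet = [provision_vm(f"web-{i}") for i in range(3)]
fleet[1]["av_signatures"] = "sig-5407"   # simulate one VM falling behind
print(audit(fleet))                      # → ['web-1']
```

The audit step is what turns templates from a convenience into a control: it catches the rapidly created, under-planned machines the analysts warn about.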
With VMs’ rapid growth, some companies frequently take machines offline, according to analysts. Updating those machines alongside active VMs has been an ongoing challenge: patching dormant machines typically requires booting each one individually, but activating unpatched or misconfigured machines is itself risky. New programs, however, such as VirusScan Enterprise for Offline Virtual Images, from McAfee, are helping streamline the process.
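The difference between the two patching routes can be sketched abstractly. This toy model is not McAfee’s API; the class, field names, and version strings are invented to contrast booting a dormant VM with updating its stored image in place:

```python
from dataclasses import dataclass

CURRENT_SIGNATURES = "sig-5820"  # hypothetical latest virus-signature version

@dataclass
class VMImage:
    name: str
    online: bool
    signatures: str
    boot_count: int = 0  # how many times the VM had to be powered on

def patch_by_booting(vm):
    """Traditional route: boot the dormant VM to update it.
    The window while it runs unpatched is the risk described above."""
    vm.boot_count += 1
    vm.signatures = CURRENT_SIGNATURES

def patch_offline(vm):
    """Offline-image route: write the update into the stored image
    without ever powering the VM on."""
    vm.signatures = CURRENT_SIGNATURES

inventory = [
    VMImage("app-1", online=True, signatures=CURRENT_SIGNATURES),
    VMImage("archive-1", online=False, signatures="sig-5407"),
    VMImage("archive-2", online=False, signatures="sig-5103"),
]

# Bring every stale dormant image up to date without a single boot.
for vm in inventory:
    if not vm.online and vm.signatures != CURRENT_SIGNATURES:
        patch_offline(vm)

print([(vm.name, vm.signatures, vm.boot_count) for vm in inventory])
```

The point of the model is the `boot_count` staying at zero: the exposure window created by powering on an unpatched machine never opens.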
Nichols, who now has 12 staff certified in virtual security, said he began emphasizing training early on. Virtualization is highly effective and secure, he says, as long as companies “keep people ahead of the project.”