Virtual Machines are software emulations of computers (real or imaginary) that run within an actual computer. A virtual machine may be used to emulate a computer that is no longer in production, to rationalize the behavior of a machine, to simplify programming by making a machine (or family of machines) appear more consistent than it really is, or to allow one computer to emulate a substantially different computer. VMs may also be used to allow multiple operating systems (e.g. Windows and Unix) to run simultaneously on the same computer.

Virtual Machines have been used for thirty years or more -- primarily to extend the life of application software implemented on computers that are no longer available. They are increasingly being used to allow software to run on a wide range of hardware without expensive modifications of the software for each new host.

Emulation often carries a severe performance penalty relative to native execution on the host CPU, and a VM may have difficulty accurately reproducing the timing characteristics of the emulated device.

Although VMs often emulate an actual device, they may also be used to emulate an imaginary device. The Java language runs on an imaginary computer that is emulated on the target machines. This allows one set of Java code to run on a variety of (often very) different hardware. Virtual machines have also been used to combine eclectic mixes of inexpensive personal computers into virtual supercomputers.
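The idea of an imaginary machine can be sketched with a toy interpreter. The stack-based instruction set below is invented purely for illustration (it is not Java bytecode); the point is that the "program" targets an instruction set no physical CPU implements, and a small interpreter emulates that machine on whatever host runs it:

```python
def run(program):
    """Interpret bytecode for a tiny hypothetical stack machine."""
    stack = []
    pc = 0
    while pc < len(program):
        op = program[pc]
        pc += 1
        if op == "PUSH":        # push the next literal onto the stack
            stack.append(program[pc])
            pc += 1
        elif op == "ADD":       # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":       # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "HALT":      # stop; top of stack is the result
            return stack.pop()
    return None

# (2 + 3) * 4 expressed as a program for the imaginary machine
result = run(["PUSH", 2, "PUSH", 3, "ADD", "PUSH", 4, "MUL", "HALT"])
```

The same bytecode runs unchanged on any host that can run the interpreter, which is how a VM insulates software from differences in the underlying hardware.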

Three well-known examples of virtualization are Java, which allows one set of software to run on a wide variety of hardware platforms; VMware, which allows a Windows OS and its software to run on a Unix computer; and Windows' execution of MS-DOS programs within virtual 8086 machines.

Copyright 1994-2002 by Donald Kenney.