Many applications today perform so well, even in virtualized environments, that anyone would be pleased with the results for small documents and everyday activities. We therefore focused on larger tasks: ones not only big enough to measure, but that would also stress the systems to some extent.
For Microsoft Word, we took a very large document and performed a global search and replace on about 95,000 items. For Microsoft Excel, we ran a macro that generated a large quantity of random numbers and filled cells with them. And for PowerPoint, we had the application render slides with transitions as quickly as possible (all the other PowerPoint tests ran too quickly to measure).
PowerPoint View slide transitions
XP: VMware Fusion 2% faster (range: 0.2 seconds slower to 0.2 seconds faster, fastest: 5.5 seconds)
Vista: Both performed almost the same (range: 4.7-5.6 seconds)
Figure 6: Windows Application Performance
The tests used the same web page ... with and without SSL.
Figure 7: Internet Explorer Application Performance
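The article does not name its test page or say how page-load times were captured, so both the URL and the use of curl below are assumptions; this is just a minimal sketch of timing the same page over HTTP and HTTPS.

```shell
# Hypothetical sketch: fetch the same page with and without SSL and
# report curl's total transfer time. example.com stands in for the
# article's (unnamed) test page.
for url in http://example.com/ https://example.com/; do
  curl -s -o /dev/null -w "$url -> %{time_total}s\n" "$url" \
    || echo "$url -> fetch failed (no network?)"
done
```

In practice each fetch would be repeated many times and averaged, as with the other tests in the article.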
One of the most interesting things in the virtualization market is how little overhead virtualization imposes today compared to what it used to. To assess this, we measured in several ways, focusing on CPU usage (overall for the Mac), real memory usage (overall for the Mac), and how long the battery would last. CPU and memory usage were measured with "top" (a command-line tool that is part of UNIX), averaging a minimum of 50 continuous samples for each result.
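The sample-and-average step can be sketched in shell. The article does not give the exact top invocation used, so the macOS batch-mode flags shown in the comment are an assumption; only the averaging helper is spelled out here.

```shell
# avg_busy: read one idle-percentage per line on stdin and print the
# average busy percentage (100 minus the mean idle value).
avg_busy() {
  awk '{ idle += $1; n++ } END { printf "%.1f\n", 100 - idle / n }'
}

# On macOS, 50 one-second samples could be fed through the helper like
# this (top's -l/-s batch flags are an assumption, not from the article):
#   top -l 50 -s 1 | awk '/CPU usage/ { gsub("%",""); print $7 }' | avg_busy
```

For example, idle readings of 95%, 97%, and 96% average out to 4.0% busy.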
We ran three CPU usage tests. The first booted Windows and let it sit for a few minutes to finish its startup activities; once done, CPU use was measured while both Mac OS X and Windows sat idle. The second measured the same thing, except with Microsoft Word and Microsoft Outlook launched and left sitting idle. The last measured CPU usage while playing a DVD.
(Note: Testing DVD playback presented a bit of a challenge. Parallels Desktop supports the default DVD-playing application that comes with Windows (Windows Media Player), but we had problems with WMP under VMware Fusion; VMware's web site also discusses these issues. As a workaround, VMware suggests the open source VLC Player, but we had trouble getting VLC Player to work in Parallels Desktop. It was important for both virtualization environments to use the same application, so we used Media Player Classic for the tests.)
For the two idle tests, both virtualized environments did very well, using only 2-5% of the CPU. So while one may use a third or more less CPU than the other, in real terms the difference is not significant to the user.
CPU use for sitting idle (in %)
CPU use for VM sitting idle with both Word/Outlook (in %)
CPU use for playing a DVD (in %)
Figure 8: Virtual Machine CPU Usage
Memory footprint was measured for the two idle tests, similar to the CPU usage tests above. Here, however, the differences were more meaningful, and would be noticeable to the user. Note that we are looking at both 1GB and 2GB virtual machines.
Figure 9: Virtual Machine Memory Usage
Note that Parallels does something interesting here with memory allocation: Parallels Desktop only takes memory from Mac OS X when Windows needs it. In other words, if you have a 2GB virtual machine, it will initially take less than 2GB of memory from the Mac, until you have opened enough applications or documents that Windows needs the RAM. By contrast, VMware Fusion appears to allocate all of the virtual machine's memory at launch.
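One rough way to observe this lazy-versus-upfront difference (an assumption on our part; the article does not describe how it was measured) is to watch a VM process's resident memory against its allocated address space:

```shell
# Sketch: print a process's resident set (RSS) next to its virtual size
# (VSZ), in KB. With lazy allocation, RSS starts well below the VM size
# and grows as Windows touches memory; with upfront allocation it is
# near the full VM size from launch. Substitute the virtualization
# app's PID; $$ (this shell itself) is used here only as a stand-in.
ps -o rss=,vsz= -p $$
```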
For the battery-exhaustion test, we ran an "endless loop" macro in Excel that generated random numbers. As the battery nears exhaustion, Parallels Desktop pauses the virtual machine, and the Mac ultimately goes into hibernation, saving its state. VMware Fusion suspends the virtual machine before the Mac sleeps.
Figure 10: Virtualized Machine Battery Performance
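The article's load generator was an endless Excel macro producing random numbers; a comparable CPU-bound loop can be sketched from the shell with awk (this is a stand-in for the macro, not the article's actual code, and it is bounded here so the sketch terminates).

```shell
# Synthetic CPU load: generate random numbers in a tight loop, roughly
# analogous to the article's Excel macro. An actual battery-drain run
# would loop endlessly; this sketch stops after a fixed count.
awk 'BEGIN { srand()
             for (i = 0; i < 5000000; i++) x = rand()
             print i, "random numbers generated" }'
```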
File and Network I/O Tests
Originally, we ran file copy tests in all the environments. In analyzing the results, we realized there was a big problem: Mac OS X and Windows were interfering with the results (in a way that benefits typical users). Both operating systems have sophisticated caching schemes, which made file and network I/O tests unpredictable. As just one example, sometimes a MacBook was faster than a Mac Pro, and other times it was not. In the end, we threw out several hundred test times and re-tested.
Due to time constraints, the retests focused solely on the MacBook Pro. To avoid the interference from Mac OS X and Windows, we used data sets roughly the same size as physical memory, or larger, which prevented either operating system from caching the data. The data sets we copied from one location to another totaled 3.7GB (two 1.85GB files, because we needed to stay below the 2GB file limit).
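A toy-scale sketch of this kind of copy test (the article's tooling is not specified; file names, sizes, and the timing method here are our own, shrunk so the sketch runs quickly rather than using the 1.85GB files the article needed to defeat caching):

```shell
# Create two source files, time a copy between directories, clean up.
# Real runs would use files totaling more than physical RAM.
src=$(mktemp -d); dst=$(mktemp -d)
dd if=/dev/zero of="$src/part1" bs=1M count=8 2>/dev/null
dd if=/dev/zero of="$src/part2" bs=1M count=8 2>/dev/null
start=$(date +%s)
cp "$src/part1" "$src/part2" "$dst"
echo "copy took $(( $(date +%s) - start )) seconds"
rm -rf "$src" "$dst"
```

At this toy size the copy is cached and near-instant, which is exactly the effect the article's large data sets were chosen to avoid.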
The same set of files was used for all the copies so that the differences between the copy methods can be compared directly. Most fell in the same relative time frame, except for copying to a USB flash drive; see the chart.
Networking was done via the default NAT setup in both virtualized environments, and each application's default disk setup was used as well.
Here are the results:
File copy - duplicate on local virtual hard drive
XP: Parallels Desktop 31% faster (75 seconds faster, fastest: 165 seconds)
Vista: Parallels Desktop 43% faster (137 seconds faster, fastest: 181 seconds)
File copy - to local Mac hard drive
XP: Parallels Desktop 11% faster (18 seconds faster, fastest: 149 seconds)
Vista: VMware Fusion 8% faster (14 seconds faster, fastest: 181 seconds)
File copy - from local Mac hard drive
File copy - to LAN Server
File copy - from LAN Server
File copy - to USB Flash Drive
File copy - from USB Flash Drive
Or, presented more clearly as a graph:
Figure 11: Virtual Machine File and Network I/O Performance