Neil Ticktin wrote the article "Head-to-Head: How do VMware Fusion 4 and Parallels Desktop 7 for Mac compare?" for MacTech. Excerpts from the original article are presented here with permission from MacTech. The entire article can be found on the MacTech website.

VMware Fusion 4 vs Parallels Desktop 7

by Neil Ticktin (Oct, 2011)
Copyright by MacTech

When we were choosing computer models, we chose current Mac models that give a good representation of what most people may have. Certainly, faster configurations of these computers will perform even better.

We chose five Mac models to compare alongside each other: the "White" MacBook, MacBook Air, MacBook Pro, iMac, and Mac Pro. Given the RAM requirements of Windows 7, the minimum configuration tested (including on the MacBook) was 4 GB.

  • 4 GB MacBook, Intel Core 2 Duo processor ("White MacBook")
    Specifically: MacBook 2.4 GHz 2 GB/250 GB White Unibody (Mid-2010) upgraded to 4 GB
  • 4 GB MacBook Air 13-inch, Core i5 processor
    Specifically: 1.7 GHz Core i5 4 GB/128 GB, Intel HD Graphics 3000
  • 4 GB MacBook Pro, Intel i7 processor ("Unibody MacBook Pro")
    Specifically: MacBook Pro 15.4 in. 2.0 GHz i7 4 GB/500 GB 5400-rpm
    Intel HD Graphics 3000 and AMD Radeon HD 6490M with 256 MB GDDR5
  • 4 GB iMac 27", Intel i5 processor
    Specifically: iMac 27 in. 2.7 GHz Intel Core i5 4 GB/1 TB 7200-rpm
    AMD Radeon HD 6770M with 512 MB
  • 6 GB Mac Pro, Quad-Core Intel Xeon processors
    Specifically: 6 GB/1 TB 7200-rpm / ATI Radeon HD 5770 with 1 GB GDDR5
    Two 2.4 GHz Quad-Core Intel Xeon "Westmere" processors

Memory for virtual machines can be configured with a wide array of settings. As a general rule, VMware Fusion's and Parallels Desktop's default memory settings for each configuration (of physical RAM and "guest" OS) were the same. Windows 7 virtual machines ran with 1 GB of virtual machine RAM; Lion ran with 2 GB of virtual machine RAM.

Similarly, for disk allocation we set up all of the virtual machines the same way. We used 40 GB expanding disks in a single file (i.e., not 2 GB chunks), but the disks were pre-expanded so that expanding actions wouldn't affect results. The virtual hard drive was placed in a similar physical location on each computer's disk, as location can make a significant difference in disk performance.

The tests compared VMware Fusion 4.0.2 with Parallels Desktop for Mac 7.0.14924.699487, running Mac OS X 10.7.1 with all updates as of the end of September. All Windows updates as of the end of September were installed for Windows 7.

There are a variety of frequently cited and widely used benchmarking suites in the computer industry, including SPEC tests, PCMark, WorldBench, PerformanceTest, UnixBench, and others. Each suite uses a series of tests that measure in a consistent way to assess specific types of performance. The key to each of these tests is appropriateness, repeatability, and accuracy.

We are sometimes asked how we select the tests we have in benchmarks. The goal is to represent the types of actions that virtualization users are doing on a regular basis. In particular, we focus more on user tasks rather than installation or configuration (because they are typically only done once or infrequently).

In the PC market, PC World (magazine) uses WorldBench as their benchmarking tool of choice. If you don't know about WorldBench, you can see more at http://www.worldbench.com. WorldBench 6 uses a combination of applications bundled together to determine benchmarks, including applications like Adobe Photoshop CS2, Autodesk 3ds Max, Mozilla Firefox 2, Microsoft Office 2003, and others. This may be a very good approach for PC World, as it allows older machines to be part of the mix and avoids possible compatibility issues with newer versions of software, but obviously it doesn't reflect the latest versions of software. It also doesn't reflect the virtualization part of the experience.

There are a variety of other benchmarks available, and we've looked at many of them as possible tools to include in our mix. Often, we find their measurements simply don't reflect a true user experience. Other times, they don't reflect how a virtualization user may be using Windows, or they simply report erroneous results. For example, in previous attempts to use PassMark, we found its graphics scores were not at all representative of real-life graphics performance in games or other graphics uses. Those tests showed items as faster when, in fact, they were much slower. Benchmarks are only useful if they reflect the real world.

Rather than use WorldBench or others, we focus on the types of measurements we believe best represent the experience we see (in terms of speed). And while it takes practice and some skill, we test virtualization operations with a stopwatch — the closest representation of how a real user would see it. CPU crunching ability, for example, is measured through tests like zipping files directly in Windows. And for Office users, we use the most up-to-date version of Microsoft Office for Windows (2010), again with all updates through the end of September.

There are two exceptions to this: graphics/gaming and CPU utilization. In these two cases, we found that testing utilities not only work well but are necessary to give the most repeatable and concrete results. This time, we had the opportunity to use a tool that measures frames per second to verify the numbers reported by 3DMark06 (the industry standard for tuning gaming systems).

Remember, benchmarks are not a review — they are benchmarks. They are meant to just tell you which runs faster. If you are interested in a specific feature, support, the user interface, or any of the other criteria for deciding on a product, that's a different article.

We won't keep you in suspense. When we look at the "big picture" of all the top-level test results, Parallels is the clear winner. If you count up the general tests (including the top 3D graphics scores), Parallels won 60% of the tests by 10% or more. And, if you include all the tests where Parallels was at least 5% faster, as well as the balance of the 3DMark06 graphics tests, Parallels increased the lead further. In other words, Parallels Desktop 7 beat VMware Fusion 4.0.2 in 74.9% of the general tests we ran, and Parallels was double the speed or more in almost a quarter of the top-level tests.

Test Tally: General Virtualization Tests

If you focus exclusively on 3D graphics, as measured by 3DMark06 version 1.2, Parallels won by an even larger margin. Specifically, Parallels won 71% of the tests by 10% or more, was also a bit faster on another 8% of the tests, and tied on the rest. In other words, Parallels Desktop 7 beat or tied VMware Fusion 4.0.2 in all of the 3D graphics tests we ran.

3DMark06 has three main aggregate scores, the most important of which is the "3DMark Score." In addition, the SM2.0 Score measures 3D Shader Model 2.0 performance, and the HDR/SM3.0 Score measures 3D Shader Model 3.0 and HDR performance.

Test Tally: 3D Graphics Tests

There are a handful of places where VMware Fusion was consistently faster than Parallels Desktop. For example, doing a full Shut Down of Windows 7 was faster in VMware Fusion. And there were a couple of test configurations for File I/O tests and application launches in which VMware was definitely faster, though Parallels was faster in other instances of similar tests.

Overall, of the top-level tests, VMware Fusion won 7.8% of the tests by at least 10%. For the 3D tests, VMware Fusion never beat Parallels, but tied about 20% of the time. There's no doubt that VMware Fusion 4 is significantly better in graphics than version 3.1, but it still has not caught up to Parallels Desktop.

Parallels Desktop has new power management features that stretch your battery life. On the MacBook Pro, we saw about 25% more battery time on an idling virtual machine (which results in 81 additional minutes of use) than we did on VMware Fusion in the same test.

One of the best ways to visualize the huge number of data points is through MacTech's well-known "Colored Cell Worksheet Overview." In these worksheets, a cell represents the best result for each test for each version of Windows for each virtualization product. These are then colored according to which product was faster.

Green cell coloring means Parallels Desktop was faster than VMware Fusion. A blue cell coloring indicates VMware Fusion was faster than Parallels Desktop. Scores that were within 5% of one another are considered a tie. Coloring darkness has four thresholds: >5%, >10%, >25% and double the speed or more. (Note: Not all tests were run on all configurations, hence the empty cells.)
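The tie-and-threshold logic described above can be sketched as a small helper. This is our own illustrative sketch, not MacTech's actual tooling; the function and shade names are hypothetical:

```python
def color_cell(parallels_time, fusion_time):
    """Bucket a pair of timings the way the worksheet coloring does.

    Times are in seconds; lower is faster. Results within 5% of one
    another count as a tie. Returns a (winner, shade) pair.
    """
    if parallels_time <= 0 or fusion_time <= 0:
        raise ValueError("timings must be positive")
    slower = max(parallels_time, fusion_time)
    faster = min(parallels_time, fusion_time)
    ratio = slower / faster  # how many times faster the winner was
    if ratio <= 1.05:
        return ("tie", None)
    winner = "Parallels" if parallels_time < fusion_time else "Fusion"
    if ratio >= 2.0:
        shade = "darkest"   # double the speed or more
    elif ratio > 1.25:
        shade = "dark"      # more than 25% faster
    elif ratio > 1.10:
        shade = "medium"    # more than 10% faster
    else:
        shade = "light"     # more than 5% faster
    return (winner, shade)
```

For example, timings of 10 s vs. 10.4 s fall within the 5% band and come back as a tie, while 5 s vs. 11 s lands in the darkest bucket.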

Colored Cell Worksheet Overview: Virtual Hard Drive


Obviously, when you look at the amount of green on the worksheet, you can see that Parallels was faster in the vast majority of tests that we ran.

In the sections below, we'll walk you through what we tested, and the results for each. These tests are designed to arm you with the information so you can make the best decision for your type of use.

For each set of results, you can see the analysis for each model of computer running Windows 7. If you want to see more detail on the tests or on an individual Mac model, you may want to review the spreadsheet for those details.

As you look through the charts below, pay attention to whether the better result is represented by taller or shorter bars (see the lower left corner of each chart).

For the launch tests (launching the VM, Booting Windows, and Suspend/Resume), we had the option of an "Adam" test and a "Successive" test. Adam tests are run after the computer has been completely restarted (hence avoiding both host and guest OS caching). Successive tests are repeated without restarting the machine in between, and can benefit from caching. Both can mimic real use situations depending on a user's work habits.

The tests used were selected specifically to give a real-world view of what VMware Fusion and Parallels Desktop are like for many users. We didn't focus on tests that were so short (i.e., so fast) that we could not produce statistically significant results, or whose differences were imperceptible.

For some of the analysis, we "normalized" results by dividing the result by the fastest result for that test across all the machine configurations. We did this specifically so that we could make comparisons across different groups, and to be able to give you overview results combining a series of types of tests, and computer models.
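That normalization step can be sketched in a few lines of Python. This is our own illustrative sketch under the description above, not MacTech's actual analysis code:

```python
def normalize(results):
    """Normalize a group of timings for one test.

    Divides each timing by the fastest (smallest) timing in the group,
    so the best configuration scores 1.0 and every other result is a
    multiple of it. Lower is faster throughout, which is what makes
    results comparable across different machines and test groups.
    """
    best = min(results.values())
    return {name: t / best for name, t in results.items()}
```

For example, `normalize({"MacBook": 30.0, "Mac Pro": 15.0})` yields `{"MacBook": 2.0, "Mac Pro": 1.0}`, making it easy to say the MacBook took twice as long regardless of the test's absolute duration.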

Instead of a plain "average" or "mean," overall conclusions are drawn using a "geomean," a specific type of average that focuses on the central results and minimizes the effect of outliers. Geomean is the same averaging methodology used by SPEC tests, PCMark, UnixBench, and others, and it helps guard against skewing by outlying results. (If you are interested in how it differs from a mean: instead of adding the set of numbers and dividing the sum by the count of numbers in the set, n, the numbers are multiplied together and the nth root of the resulting product is taken.)
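The geomean described in that parenthetical can be written directly. A minimal sketch (computed in log space, which is equivalent to multiplying and taking the nth root but avoids overflow on long lists):

```python
import math

def geomean(values):
    """Geometric mean of a list of positive numbers.

    Mathematically this is the nth root of the product of the n values;
    summing logs and exponentiating gives the same answer without the
    product growing unmanageably large.
    """
    if not values or any(v <= 0 for v in values):
        raise ValueError("geomean requires a non-empty list of positive numbers")
    return math.exp(sum(math.log(v) for v in values) / len(values))
```

As a quick illustration of the outlier-damping the article mentions: `geomean([2, 8])` is 4.0 (vs. an arithmetic mean of 5), and a single 100x outlier in `[1, 1, 1, 100]` pulls the geomean only to about 3.16, while the arithmetic mean jumps to 25.75.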

For those interested in the benchmarking methodologies, see the more detailed testing information in Appendix A. For the detailed results of the tests used for the analysis, see Appendix B. Both appendices are available on the MacTech web site.

There are three situations in which users commonly launch a virtual machine:

  • Launch the virtual machine from "off" mode, including a full Windows boot and ending with launching of an application. For testing purposes, we chose the NotePad application.
  • Launch the virtual machine from a suspended state, and resume from suspend (Adam).
  • Launch the virtual machine from a suspended state, and resume from suspend (Successive).

For the first test, we started at the Finder and launched the virtualization applications, which were set up to immediately launch the virtual machine. The visual feedback is fairly different between Parallels Desktop and VMware Fusion when Windows first starts up. Windows actually continues its startup work for quite some time after reaching the desktop, and in some cases the full boot process takes a while to complete. Most users don't care if things continue in the background so long as they aren't held up.

As a result, we focused on timing to the point of actually accomplishing something. In this case, we configured NotePad to automatically launch. The test ended when the window started to render. This gave us a real world scenario of being able to actually do something as opposed to Windows just looking like it was booted.

The primary difference between the two types of VM launch test is that the computer is fully rebooted (both the virtual machine as well as Mac OS X) in between the "Adam" tests. The successive tests are launching the virtual machines and restoring them without restarting the Mac in between.

Successive tests benefit from both Mac OS X and possibly virtual machine caching and are significantly faster. However, you may only see these types of situations if you are constantly switching in and out of your virtual machine.

As with all of our tests, we performed these tests multiple times to handle the variability that can occur. Of those results, we took the best results for each product.

Virtual Machine Performance

Clearly, virtual machines with more VM memory take longer to restore, so "more" is not necessarily better here. Use the smallest amount that does what you need to do well. In our case, we focused on 1 GB virtual machines in Windows 7.

Most benchmarking suites measure CPU performance using file compression as at least one part of their analysis. We did the same. As a matter of interest, we used compression instead of decompression, because with today's fast computers, decompression is actually much closer to a file copy than it is to CPU work. Compression requires the system to perform analysis to do the compression, and is therefore a better measurement of CPU.
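The idea of using compression as a CPU proxy can be sketched in Python. The article timed zipping files directly in Windows; this is just an illustrative in-memory analogue, with names of our own choosing:

```python
import io
import time
import zipfile

def time_compression(data: bytes) -> float:
    """Time how long it takes to deflate a blob of data in memory.

    Deflate compression must analyze the input to find redundancy, so
    for a fixed input the elapsed time tracks CPU speed. Keeping the
    archive in a BytesIO buffer keeps disk I/O out of the measurement.
    """
    buf = io.BytesIO()
    start = time.perf_counter()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("payload.bin", data)
    return time.perf_counter() - start
```

Decompression of the same archive, by contrast, runs close to memory-copy speed on modern hardware, which is exactly why the article prefers compression as the CPU measure.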

The two solutions are very close to one another, with Parallels having the slightest edge in compression speed, but nothing substantial.

Here, we tested two of the most common applications used in virtualized desktop environments: Microsoft Office 2010's Word and Outlook. Users often go in and out of Word and Outlook, so we focused on successive launches. Similar to the OS launch tests, a successive launch test is done repeatedly without restarting Windows.

Most applications, including these, launch very quickly, with Parallels having a slight edge. Even so, launches are so fast on both virtualization products that even the worst case was very usable.

Windows Application Launch Performance

In all cases, launch times were quite fast from a user experience. It's nice to see that all the Microsoft Office applications launch (typically) in a few seconds under both Parallels Desktop and VMware Fusion. Often, it felt nearly instantaneous to a user on the successive launches.

In many cases, applications today perform so well and so fast, even under virtualized environments, that anyone would be pleased for small documents and activities. We focused our efforts on one of the more demanding and widely used applications: Internet Explorer 9.

Microsoft has a series of public benchmarks for seeing how well a browser performs. See http://ie.microsoft.com/testdrive/Views/SiteMap/ for all kinds of demos and performance tests. We chose a set of tests that took enough resources to measure well and had a consistent method of measurement.

Internet Explorer Performance

Across the board, Internet Explorer 9 performed significantly better on Parallels Desktop than it did on VMware Fusion. Only the Sudoku test was close at all. All the rest were a virtual blowout (sorry, couldn't resist).

Because the web is becoming more and more an application platform, we chose to focus on more advanced Web 2.0 technologies rather than simple html.

VMware tells us "IE9 specifically disables hardware acceleration of graphics with VMware and other select Microsoft third party compliant hypervisors. See Microsoft blog here from March 2011: http://windowsteamblog.com/windows/b/bloggingwindows/archive/2011/03/13/internet-explorer-9-hardware-acceleration-and-third-party-hypervisors.aspx. While Microsoft believes they fixed this IE9 hardware acceleration issue in Microsoft certified third party compliant hypervisors, it is still unresolved and we are working through this issue with them. This does not affect non-Microsoft certified hypervisors and is an IE/Microsoft limitation. If you run Google Chrome and run the same HTML5 tests in a virtual machine, you will see hardware acceleration of it in a browser in Windows in a VMware virtual machine." In other words, this may change for VMware in the future, but at the moment, the above is what we are seeing for IE9 performance.

We're always on the lookout for new ways to measure graphics in particular. And this time, again, we did look at some of the results of other benchmark suites, and found that their assessment of graphics was very clearly wrong (we could clearly see visually that they weren't right).

As a result, we again used 3DMark06 (version 1.2) by FutureMark. 3DMark06 is a globally recognized and comparable measurement of 3D performance. Gamers use 3DMark to test their machines and tweak them for performance. See http://www.futuremark.com/ for more information about 3DMark06.

Clearly, if you are going to play games and be serious about it, then running Windows natively in Boot Camp is your best choice. However, Parallels Desktop is fairly close, and with a virtualization solution you don't have to reboot, deal with driver issues, and more. We were pleased to see that VMware Fusion 4.0.2 no longer has the rendering problems that it did in prior versions.

Shader and Fill Graphics Performance

Without a doubt, Parallels Desktop's greatest advantage over VMware Fusion is in 3D graphics. Not only is the speed difference huge (Parallels Desktop is often double or more the speed of VMware Fusion), but everything just seems to run more smoothly.

Based on past experiences, we expected things to run well on the Mac Pro and MacBook Pro. We were very pleased to see how well all the Mac models run graphics and games—a noticeable difference from just a couple of years ago.

Obviously, the results for 3DMark06 vary greatly by the hardware they are on because of the graphics capabilities of each machine. As such, the best way to look at these is in the detailed spreadsheet where you can see results for each model for each OS.

Overall, these result charts will give you an idea of some of the things that VMware Fusion did well. The overwhelming winner in this graphics competition, though, is Parallels Desktop.

HDR, CPU and GT Tests 3DMark06 Suite

There are additional tidbits that we learned through the incredibly complex process of testing virtualization products. Here are some additional insights.

64-bit and Virtual Processors

We chose 64-bit because it has become the standard on Windows to give users the additional addressability they would like. In our experimentations, however, we found the 64-bit default Windows 7 installation was about 2.5 GB larger than the 32-bit in hard disk space. Not a huge deal, but something to note.

These days, the decision is fairly simple. If you have an application that can make use of multiple virtual processors, and this includes 3D Graphics, and your Mac has enough horsepower, then you should use them. Otherwise, it's not necessary.

Then again, if speed is that important to you, you should ask yourself whether to run the app natively on your Mac instead of in a virtual machine. But sometimes, like with some industry-specific software, you may not have an option.

Apple has finally changed their license with Mac OS X Lion so that users can run a virtual machine with Lion as the guest OS. As a result, both Parallels Desktop and VMware Fusion tout that they can run Lion in a virtual machine. Of course, it was a moral imperative for us to check this out and see how it runs.

Parallels Desktop 7 Running Mac OS X Lion

In short, they run pretty well. Parallels booted Lion a bit faster, but we also wanted to see about virtual machine speed. Without doing comprehensive benchmarks, we took a quick run of Geekbench. In short, they ran comparably to each other. Clearly, neither company has optimized yet for Lion and I expect we'll see improvements on this front.

VMware Fusion 4.0.2 Running OS X Lion

What would you use this for? One of the greatest uses may be for developers testing different builds of Lion and their product, and wanting to do so in a protected way. Beyond that, well, it's a bit of a strange experience to say the least.

Both VMware Fusion and Parallels Desktop for Mac are excellent products, and both allow you to run many OSes quite well (including Lion now). In the end, when you decide which product to use, you should take into account what's most important to you.

Windows 7 is such a pleasurable experience that unless there's some driving reason otherwise, you should use it rather than XP under either virtualization product.

When it comes to whether you should use multiple processors or 64-bit virtual machines, it depends on your use. If you have a real need for either, and can articulate a reason for it, then use them. They do work well. That said, if you don't have a specific need, then don't bother with multiple virtual CPUs. As for 64-bit, you should use it (especially in Windows 7) unless you have a driving reason not to.

Many people have the feeling of "more is better," but when it comes to RAM in the virtual machine, that is not necessarily the case. More RAM means longer virtual machine launch times, suspends and resumes. For most users, 1 GB of virtual machine RAM will work best for Windows 7. Use more than that only if you really know you need it. Gaming may do best with 1.25-1.5 GB of RAM if you can spare it.

In the vast majority of our overall tests, Parallels Desktop 7 won. Again, if you count up the general tests (including the top 3D graphics scores), Parallels won 60% of the tests by 10% or more. And, if you include all the tests where Parallels was at least 5% faster, as well as the balance of the 3DMark06 graphics tests, Parallels increased the lead further. In other words, Parallels Desktop 7 beat VMware Fusion 4.0.2 in 74.9% of the general tests we ran, and Parallels was double the speed or more in almost a quarter of the top-level tests.

If you focus exclusively on 3D graphics, as measured by 3DMark06 version 1.2, Parallels won by an even larger margin. Specifically, Parallels won 71% of the tests by 10% or more, was also a bit faster on another 8% of the tests, and tied on the rest. In other words, Parallels Desktop 7 beat or tied VMware Fusion 4.0.2 in all of the 3D graphics tests we ran.

If you are a traveler, Parallels Desktop has new power management features that stretch your battery life. On the MacBook Pro, we saw about 25% more battery time on an idling virtual machine (81 additional minutes of use) than we did on VMware Fusion in the same test.

To be clear, this article is not a product review; it's a benchmarking analysis. You should use it as part of your decision combined with other factors, such as product features, user interface, which OS you want to run, graphics capabilities and support to make your product choice.

One thing is clear: virtualization for the Mac works well. Really well. And I expect that we'll see virtualization products keep getting better and better.