Conclusive Thoughts
After testing 8 games with a multitude of anti-aliasing settings under both XP and Vista 64-bit, we can finally draw an informed conclusion on gaming with these high-end cards.
Anti-Aliasing Performance Scaling
Putting together all our results for the games tested under both operating systems, we can extract the following findings:
Going from 0xAA to 4xAA
- GeForce GTX 280 under XP: -19.1%
- Radeon HD 4870 X2 under XP: -23.1%
- GeForce GTX 280 under Vista: -21.0%
- Radeon HD 4870 X2 under Vista: -10.6%
Under XP the drops for both cards are pretty much on par; under Vista the HD 4870 X2 stands out, as its performance only drops by ~10% on average!
Going from 0xAA to 8xAA
- GeForce GTX 280 under XP: -26.0%
- Radeon HD 4870 X2 under XP: -25.8%
- GeForce GTX 280 under Vista: -28.7%
- Radeon HD 4870 X2 under Vista: -15.4%
When going from 0xAA to 8xAA the HD 4870 X2 now comes out ahead under both XP and Vista (if only marginally under XP). Under Vista its performance drop remains quite small at ~15%, while the GTX 280 loses nearly double that.
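Curious how such averages are derived? Take each game's FPS drop relative to its own 0xAA baseline, then average those per-game percentages. The short Python sketch below is purely illustrative; the game names and FPS values are made-up placeholders, not our measured results.

```python
# A minimal sketch of how the scaling averages above are derived.
# All FPS values below are hypothetical placeholders.
fps_0xaa = {"Game A": 90.0, "Game B": 120.0, "Game C": 75.0}
fps_4xaa = {"Game A": 74.0, "Game B": 96.0, "Game C": 60.0}

def average_change(baseline, other):
    """Average per-game FPS change, in percent, relative to the baseline."""
    changes = [(other[g] - baseline[g]) / baseline[g] * 100.0 for g in baseline]
    return sum(changes) / len(changes)

print(f"0xAA -> 4xAA: {average_change(fps_0xaa, fps_4xaa):+.1f}%")  # e.g. -19.3%
```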
For those wondering if their next gaming machine should be running Windows XP or Windows Vista, the following numbers will be most interesting:
Performance Scaling XP to Vista
On average the GeForce GTX 280 drops a mere 1.6% when switching to Vista 64-bit, so in short: practically identical performance!
The Radeon HD 4870 X2 does even better. If we leave out the numbers for Stalker and Tomb Raider (as CrossFire failed to run properly under XP), we see a +5.8% boost in average FPS going from XP to Vista. If we include those two titles and let them enjoy CrossFire scaling, the number jumps to +27.7%!
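To make the outlier effect concrete, here is a tiny sketch of how excluding two non-scaling titles shifts an average; every per-game percentage in it is a made-up placeholder, not our measured data.

```python
# A minimal sketch of the outlier effect: titles where CrossFire was broken
# under XP show huge XP -> Vista gains and pull the average up.
# All per-game percentages are hypothetical placeholders.
gains = {
    "Game A": 4.0,
    "Game B": 7.0,
    "Game C": 6.0,
    "Stalker": 70.0,      # CrossFire broken under XP inflates the gain
    "Tomb Raider": 85.0,  # likewise
}

avg_all = sum(gains.values()) / len(gains)
scaling_ok = {g: v for g, v in gains.items() if g not in ("Stalker", "Tomb Raider")}
avg_excl = sum(scaling_ok.values()) / len(scaling_ok)

print(f"Average incl. the two CF-broken titles: {avg_all:+.1f}%")   # e.g. +34.4%
print(f"Average excl. the two CF-broken titles: {avg_excl:+.1f}%")  # e.g. +5.7%
```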
This is by no means a definitive performance overview, only a snapshot in time using these video cards under XP and Vista. Many of the performance differences can be attributed to driver issues, as both cards are relatively new on the market.
From a gamer’s perspective there should be no reason why you wouldn’t want to use Windows Vista 64-bit. While our test setup with 2 GB was sufficient to run all current-generation games, there’s no doubt that having access to more system memory will prove useful in the future, and with a 64-bit OS you can benefit from it.
We hope this article will be helpful for those looking to upgrade their gaming system with a more capable 3D card! Until next time.
Update 07-09-2008: Some readers pointed out that NVIDIA uses different AA rendering modes than ATI, and for a fair comparison the following needs to be taken into account:
- NVIDIA: 8xAA = 4xMSAA (CSAA mode), 16xAA = 4xMSAA (CSAA mode), 16xQAA = 8xMSAA (CSAA mode)
- ATI: 16xAA is a SuperAA mode where each GPU renders the same frame with a different AA pattern, resulting in superior image quality.
This puts the performance of the GeForce GTX 280 in perspective: when the high-quality AA levels are forced to match ATI's, the GTX 280 really trails. Of course, you have to consider how much AA you need for a game to look smooth to you; that varies from person to person.
We'd like to thank the readers who detailed the NVIDIA & ATI rendering modes, much appreciated!
...8xAA on ATI should be compared to 8xQAA on nV, not the 8xAA which is a 4xMSAA-based CSAA mode
...16xAA on ATI effectively turns the card into a single-chip card which can do 16xMSAA, since both chips render the same frame with different AA patterns
...16xAA on nV is a 4xMSAA-based CSAA mode and 16xQAA on nV is an 8xMSAA-based CSAA mode
So the 16x and 8x comparisons in your graphs are far from 'fair' or 'apples-to-apples': the 8xAA comparison should pit ATI 8xAA against nV 8xQAA (8xQAA = 8xMSAA), and the 16xAA comparison shouldn't even exist, since the GTX 280 can't do 16xMSAA, which is (practically) what the HD 4870 X2 is doing by blending the same frame rendered twice with different AA patterns.
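To pull the corrected mode mapping into one place, here is a small lookup sketch in Python. It is purely illustrative (the table and helper function are our own construction, not vendor API names), but it makes clear which pairings are actually comparable.

```python
# A minimal lookup sketch of the effective MSAA sample count behind each
# vendor's AA label, following the mode descriptions above.
EFFECTIVE_MSAA = {
    ("NVIDIA", "8xAA"):   4,   # 4xMSAA-based CSAA mode
    ("NVIDIA", "8xQAA"):  8,   # true 8xMSAA
    ("NVIDIA", "16xAA"):  4,   # 4xMSAA-based CSAA mode
    ("NVIDIA", "16xQAA"): 8,   # 8xMSAA-based CSAA mode
    ("ATI",    "8xAA"):   8,   # true 8xMSAA
    ("ATI",    "16xAA"):  16,  # SuperAA: both GPUs render the same frame with
                               # different patterns, ~16xMSAA in practice
}

def apples_to_apples(mode_a, mode_b):
    """Two AA modes are comparable if their effective sample counts match."""
    return EFFECTIVE_MSAA[mode_a] == EFFECTIVE_MSAA[mode_b]

print(apples_to_apples(("ATI", "8xAA"), ("NVIDIA", "8xQAA")))  # True
print(apples_to_apples(("ATI", "8xAA"), ("NVIDIA", "8xAA")))   # False
# No NVIDIA entry has an effective count of 16, which is exactly why a 16xAA
# graph has no fair GTX 280 counterpart.
```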