Skyrim at 120 and 240 FPS
by Scott Wasson
— 3:52 PM on December 12, 2012
Our recent article comparing the Radeon HD 7950 to the GeForce GTX 660 Ti in many of the season's top new games has attracted some new attention to our latency-focused game testing methods. Some folks are skeptical about whether there's added value in testing with any more granularity than an FPS average provides. Others have wondered whether the tool we're using to grab frame time data, Fraps, really captures an accurate reflection of how frames are arriving at the display. There are some interesting questions to explore along those lines, but our intention here is simpler: to illustrate the impact of high-latency frames on animation smoothness visually.
And we can do that quite easily, thanks to the high-speed camera I bought a while back for just that purpose. I've waited much too long to put it to use.
One of the test scenarios with the starkest difference between the GeForce and Radeon in our recent tests is our new Skyrim sequence, where we take a walk through the countryside. You can see the data and graphs we've reported from it here. Neither card performs poorly in this test—the 7950 averages 69 FPS, while the GTX 660 Ti averages 74 FPS—but frame delivery is generally uneven on the Radeon, punctuated by occasional hitches where frames take 60 milliseconds or more to arrive. Such spikes are relatively rare on the GeForce. Here's a look at the frame time plot, which tells the story:
The difference in smoothness between the two cards was obvious as we conducted our play-testing. Since folks were wondering, we figured we might as well capture some high-speed video to show you the difference between the two.
We have a couple of videos to share. The first one was filmed at 120 FPS, twice the refresh rate of our 60 Hz IPS display. I recommend hitting the "view full screen" button to get a better sense of motion.
I think the occasional hitches on the Radeon are pretty easy to see. The big, obvious slowdowns only happen every so often, but the GeForce avoids them—just as the test results told us.
Remember, the Radeon HD 7950 turns in an average of 69 FPS in this very test run, a rate that has been considered "good" in FPS terms for years. This is why measuring frame latencies, not just average rates, is so crucial. FPS averages don't capture what's happening from moment to moment.
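To see how an average can mask hitches, here's a minimal sketch in Python. The frame times are hypothetical, loosely modeled on the scenario above: mostly quick frames with a handful of 60 ms stutters mixed in. Even so, the FPS average still comes out looking respectable.

```python
# Hypothetical frame times (ms): 95 smooth ~14 ms frames plus 5 big hitches.
frame_times_ms = [14.0] * 95 + [60.0] * 5

# The FPS average smears the hitches across the whole run.
total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s

# The 99th-percentile frame time exposes the worst moments instead.
sorted_times = sorted(frame_times_ms)
p99 = sorted_times[int(0.99 * len(sorted_times)) - 1]

print(f"average FPS: {avg_fps:.0f}")        # ~61 FPS, "good" on paper
print(f"99th-pct frame time: {p99} ms")     # 60 ms, a visible stutter
```

Five frames at 60 ms barely dent the average, yet each one is a hiccup you can see on screen, which is exactly what the frame time plots capture and the FPS number hides.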
The next video was shot at 240 frames per second, four times the speed of the display.
This one is a little more tedious to watch, I'll admit. However, it should provide many hours of entertainment to those who want to do fine-grained visual comparisons between the two cards. The big hitches are still apparent on the Radeon, and here it may also be possible to see the superior moment-by-moment smoothness on the GeForce. I dunno. In some ways, I think it's easier to get a sense of the smoothness at full speed than it is in slow motion.
For what it's worth, Cyril recommends staring at the border between the two videos and unfocusing your eyes a bit in order to best monitor motion on both sides at once. Sounds like a recipe for a headache to me.
Anyhow, you now have a little bit of visual evidence to go with the mountains of data we've provided. Make of it what you will.