When it comes to PC games, there’s a not-so-subtle war being waged between AMD and Nvidia for the continued loyalty of their users. The battlegrounds involve things like driver optimization and the implementation of proprietary features, software, and tools to give each graphics card manufacturer a competitive advantage.
Ubisoft’s Watch Dogs is the latest PC title to take advantage of Nvidia’s GameWorks, a robust collection of tools that allows game developers to produce a visual experience epitomizing Nvidia’s rallying cry: “The Way It’s Meant To Be Played.” Developers license proprietary Nvidia technologies like TXAA and ShadowWorks to deliver a wide range of realistic graphical enhancements to things like smoke, lighting, and textures, and Nvidia engineers typically work closely with the developers on the best execution of the final code.
Recent examples of Nvidia GameWorks titles include Batman: Arkham Origins, Assassin’s Creed IV: Black Flag, and this week’s highly anticipated Watch Dogs.
As you may suspect from the headline, Nvidia’s GameWorks is only good news for Nvidia, its development partners, and its GPU users. That’s logical, and it serves a sizable slice of the market. According to AMD’s Robert Hallock, however, it’s terrible news for the PC gaming ecosystem as a whole.
“Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products,” Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: “Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.”
So a partner studio like Ubisoft can suggest or write enhancements to the GameWorks libraries, but AMD isn’t allowed to see those changes or suggest their own.
“The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines,” Hallock continues. “This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”
In my opinion there’s nothing inherently wrong with a company choosing to closely guard the keys which may unlock a competitive advantage. AMD is upset because they adopt the opposite approach. “Our work with game developers is founded concretely in open, sharable code, all of which we make available on our developer portal,” Hallock says. “We believe that enabling a developer with obvious and editable code that can be massaged to benefit everyone not only helps AMD hardware, but makes it possible for all gamers to benefit from our partnerships with a developer. As TressFX Hair runs equally well on AMD and NVIDIA hardware, for example, you can see this is true.”
I believe Hallock isn’t just offering lip service here. AMD’s “FreeSync” technology aims to improve the working relationship between display and GPU by tapping into the open “Adaptive-Sync” specification recently adopted into the DisplayPort standard. Nvidia’s solution, G-Sync, is proprietary and requires custom hardware built into otherwise standard monitors. (I haven’t seen FreeSync in action, and I admittedly love what G-Sync offers. But that doesn’t change the facts surrounding the technologies.) Similarly, AMD says its low-level Mantle API isn’t locked to the company’s GCN architecture and would work equally well on Nvidia cards. The company clearly waves a banner of open-source development and ideals.
With that admittedly verbose background out of the way, let’s dig into Watch Dogs specifically. I’ve been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It’s evident that Watch Dogs is optimized for Nvidia hardware, but it’s staggering just how un-optimized it is on AMD hardware. I guarantee that when the game is released, a swarm of upset gamers will point fingers at AMD for the sub-par performance. That anger will be misplaced.
I asked Robert Hallock about this specifically, and he explained that AMD has had “very limited time with the title and [we’ve] been able to implement some respectable performance improvements thanks to the skill of our driver engineers. Careful performance analysis with a variety of internal tools have allowed us to profile this title, despite deliberate obfuscation attempts, to improve the experience for users.”
AMD will release a new driver to the public this week that reflects those improvements. (It’s the same driver I conducted my testing with.) Unfortunately, my conversation with Hallock didn’t end with a silver lining: “I am uncertain if we will be able to achieve additional gains due to the unfortunate practices of the Gameworks program,” he remarked.
Tech journalist Joel Hruska of ExtremeTech summarized why Nvidia’s GameWorks could end up providing a poor experience for consumers and creating potentially dangerous long-term obstacles for developers in a stellar investigative piece he wrote last year:
“AMD is no longer in control of its own performance. While GameWorks doesn’t technically lock vendors into Nvidia solutions, a developer that wanted to support both companies equally would have to work with AMD and Nvidia from the beginning of the development cycle to create a vendor-specific code path. It’s impossible for AMD to provide a quick after-launch fix. This kind of maneuver ultimately hurts developers in the guise of helping them.”
Whether or not you agree with that assessment, Hruska successfully prophesied this very day back in December, saying: “And while we acknowledge that current Gameworks titles implement no overt AMD penalties, developers who rely on that fact in the future may discover that their games run unexplainably poorly on AMD hardware with no insight into why.”
What you’re seeing in the benchmarks above is a $500 AMD video card (Radeon 290X) struggling to keep up with a $300 Nvidia card (GTX 770) at one of the lowest levels of anti-aliasing, since Nvidia’s TXAA isn’t available to Radeon users. And the performance deficiencies scale down accordingly at lower settings. Both of the cards tested were reference boards, and the test system is an Intel Core i7-4770K with 16GB of 1866MHz RAM running Windows 8.1 and this week’s game-ready drivers from both Nvidia and AMD.
To further put this in perspective, AMD’s 290X performs 51% better than Nvidia’s 770 in one of the most demanding PC titles around, Metro: Last Light, which also happens to be an Nvidia-optimized title. As you would expect given their respective prices, AMD’s flagship 290X can and should blow past Nvidia’s 770 and compete with Nvidia’s 780 Ti in most titles. To really drive the point home, my Radeon 290X can hit 60fps in Metro: Last Light with High quality settings and 4x anti-aliasing at the higher resolution of 1440p. All of this points to a poorly optimized game all around, but one that runs substantially worse on AMD hardware, as Joel Hruska accurately predicted several months ago. (For those wondering, AMD CrossFire and Nvidia SLI scaling are almost nonexistent at this point, especially at higher resolutions like 1440p and 4K.)
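For clarity on where a figure like “51% better” comes from: relative performance here is just the ratio of average frame rates between the two cards. The snippet below is a minimal sketch of that math, using hypothetical fps values purely for illustration; they are not my measured results.

```python
# Minimal sketch: deriving a relative-performance percentage from
# average frame rates. The fps numbers are hypothetical placeholders,
# not the benchmark results discussed in this article.

def relative_gain(fps_a: float, fps_b: float) -> float:
    """Percentage by which card A outpaces card B on average fps."""
    return (fps_a / fps_b - 1.0) * 100.0

radeon_290x_fps = 68.0  # hypothetical average fps in a given benchmark
gtx_770_fps = 45.0      # hypothetical average fps on identical settings

print(f"290X vs 770: {relative_gain(radeon_290x_fps, gtx_770_fps):.0f}% better")
# Output: 290X vs 770: 51% better
```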
I’ll leave you with the fact that Unreal Engine 4, the latest graphics engine from Epic, has Nvidia’s GameWorks built into its core. Epic’s prior version, Unreal Engine 3, was widely adopted, and we should expect no less from the newest iteration, which frankly looks stunning. Whether future titles built on it will suffer this dramatically on AMD hardware is only conjecture at this point, but I think cause for concern is warranted.
Update: I’ve posted benchmarks for Watch Dogs across a wide range of AMD and Nvidia hardware.
[Fanboy deflection: I own a wide variety of Nvidia hardware and a G-Sync monitor, and am a vocal proponent of products like Shield and software like ShadowPlay. I simply believe this could be bad for the industry in the long run.]