The NVIDIA GeForce GTX 980 Review: Maxwell Mark 2
by Ryan Smith on September 18, 2014 10:30 PM EST

Crysis: Warhead
Up next is our legacy title for 2014, Crysis: Warhead. At over five years old, the stand-alone expansion to 2007’s Crysis can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. We’ve only finally reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.
At the launch of the GTX 680, Crysis: Warhead was rather punishing of the GTX 680’s decreased memory bandwidth relative to the GTX 580. The GTX 680 was faster than the GTX 580, but the gains weren’t as great as what we saw elsewhere. For this reason, the fact that the GTX 980 can hold a 60% lead over the GTX 680 is particularly important: it means that NVIDIA’s 3rd generation delta color compression is working, and working well. This has allowed NVIDIA to overcome quite a bit of memory bandwidth bottlenecking in this game and push performance higher.
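As a rough illustration of why delta color compression saves bandwidth, the sketch below encodes a tile of pixel values as one anchor value plus narrow per-pixel deltas. This is strictly a toy model under our own assumptions; NVIDIA’s actual scheme is proprietary fixed-function hardware, and the tile size, delta widths, and function names here are illustrative rather than anything NVIDIA has documented.

```python
# Toy sketch of delta color compression: store one anchor value per tile
# plus small signed per-pixel deltas. Purely illustrative -- not NVIDIA's
# real (undisclosed) algorithm.

def compress_tile(tile):
    """Try to encode a tile of 8-bit values as anchor + signed deltas.

    Returns (anchor, deltas, bits_per_delta) on success, or None if the
    deltas are too large and the tile must be stored uncompressed.
    """
    anchor = tile[0]
    deltas = [v - anchor for v in tile]
    max_mag = max(abs(d) for d in deltas)
    for bits in (2, 4):  # candidate signed-delta widths
        if max_mag < (1 << (bits - 1)):
            return anchor, deltas, bits
    return None  # incompressible: fall back to raw, saving no bandwidth

# A smooth gradient tile compresses well; a noisy tile would return None.
tile = [118, 119, 119, 120, 120, 121, 121, 122]
result = compress_tile(tile)
if result is not None:
    anchor, deltas, bits = result
    raw_bits = len(tile) * 8
    packed_bits = 8 + len(tile) * bits  # 8-bit anchor + packed deltas
    print(f"raw: {raw_bits} bits, compressed: ~{packed_bits} bits")
```

The point of the toy model is that smooth regions (skies, gradients, shadows) move far fewer bits across the memory bus, which is exactly where a 256-bit card like the GTX 980 needs the help.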
That said, since the GTX 780 Ti has a full 50% more memory bandwidth (336GB/s on a 384-bit bus versus the GTX 980’s 224GB/s on a 256-bit bus), it’s telling that the GTX 780 Ti and GTX 980 are virtually tied in this benchmark. Crysis: Warhead will still gladly take whatever memory bandwidth it can get from NVIDIA’s cards.
Otherwise, against AMD this is the other game where the GTX 980 can’t cleanly defeat the R9 290XU. The two cards are virtually tied, with AMD edging out NVIDIA in two of our three tests. Given their differing architectures I’m hesitant to chalk this up to memory bandwidth as well, but if it were a factor, the R9 290XU would hold a sizable memory bandwidth advantage here.
When it comes to minimum framerates the story is much the same, with the GTX 980 and AMD trading places. It’s interesting to note that the GTX 980 does rather well against the GTX 680 here; that memory bandwidth advantage appears to really be paying off in minimum framerates.
274 Comments
Sttm - Thursday, September 18, 2014 - link
"How will AMD and NVIDIA solve the problem they face and bring newer, better products to the market?"My suggestion is they send their CEOs over to Intel to beg on their knees for access to their 14nm process. This is getting silly, GPUs shouldn't be 4 years behind CPUs on process node. Someone cut Intel a big fat check and get this done already.
joepaxxx - Thursday, September 18, 2014 - link
It's not just about having access to the process technology and fab. The cost of actually designing and verifying an SoC at nodes past 28nm is approaching the breaking point for most markets; that's why companies aren't jumping onto them. I saw one estimate of $500 million for development of a 16/14nm device. You better have a pretty good lock on the market to spend that kind of money.

extide - Friday, September 19, 2014 - link
Yeah, but the GPU market is not one of those markets where the verification cost will break the bank, dude.

Samus - Friday, September 19, 2014 - link
Seriously, nVidia's market cap is $10 billion; they can spend a tiny fortune moving to 20nm and beyond... if they want to.

I just don't think they want to saturate their previous products with such leaps and bounds in performance while also absolutely destroying their competition.
Moving to a smaller process isn't out of nVidia's reach; I just don't think they have a competitive incentive to spend the money on it. They've already been accused of becoming a monopoly after purchasing 3dfx, and it'd be painful if AMD/ATI exited the PC graphics market because nVidia's Maxwell parts, being twice as efficient as GCN, were priced identically.
bernstein - Friday, September 19, 2014 - link
atm. it is out of reach for them, at least from a financial perspective.

While it would be awesome to have Maxwell designed for & produced on Intel's 14nm process, Intel doesn't even have the capacity to produce all of their own CPUs until fall 2015 (the Broadwell Xeon-EP release)...
kron123456789 - Friday, September 19, 2014 - link
"it also marks the end of support for NVIDIA’s D3D10 GPUs: the 8, 9, 100, 200, and 300 series. Beginning with R343 these products are no longer supported in new driver branches and have been moved to legacy status." - This is it. The time has come to buy a new card to replace my GeForce 9800GT :)bobwya - Friday, September 19, 2014 - link
Such a modern card - why bother :-) The 980 will finally replace my 8800 GTX. Now that's a genuinely old card!!

Actually I mainly need to do the upgrade because the power bills are so ridiculous for the 8800 GTX. For pity's sake, the card only has one power profile (high power usage).
djscrew - Friday, September 19, 2014 - link
Like +1

kron123456789 - Saturday, September 20, 2014 - link
Oh yeah, modern :) It's only 6 years old :) But it can handle even Tomb Raider at 1080p with 30-40fps at medium settings :)

SkyBill40 - Saturday, September 20, 2014 - link
I've got an 8800 GTS 640MB still running in my mom's rig, and it's far more than what she'd ever need. Despite getting great performance from my MSI 660 Ti OC 2GB Power Edition, it might be time to consider moving up the ladder, since finding another identical card at a decent price for SLI likely wouldn't be worth the effort.

So, either I sell off this 660 Ti, give it to her, or hold onto it for an HTPC build at some point down the line. Decisions, decisions. :)