Benchmarked - Metro: Last Light Redux
by Jarred Walton on October 2, 2014 3:20 AM EST
Posted in: GPUs, Gaming, AMD, NVIDIA, Benchmarks, 4A Games, Benchmarked
Last month 4A Games released updated versions of the two earlier games in the Metro series, Metro 2033 Redux and Metro: Last Light Redux. Both games have been remastered using the latest version of the 4A Engine, with updates for the latest generation of console hardware among other things. Fundamentally that means fewer changes for Metro: Last Light than for Metro 2033, but there are still some visual differences, and that potentially means performance changes as well. We've been using Metro: Last Light as one of our gaming performance benchmarks almost since it first came out in May 2013, and it remains one of the most demanding games around. Part of that stems from the use of super-sampling anti-aliasing (SSAA) at the highest quality settings, but even without SSAA Metro: Last Light can be a beast.
Something we've wanted to do more of in the past is to provide smaller updates looking at the performance of recent game releases. Our GPU reviews do a good job of giving a broad overview of how the latest graphics cards perform on a subset of games, but it's basically impossible to test every new GPU on every game at the time of launch. If you're in the market for a new GPU, however, you probably want to use it for playing games, which means seeing how new games perform on a selection of hardware is useful. To be clear, we're not replacing our GPU reviews; we hope to supplement them with increased coverage of recent gaming releases.
It's worth noting that testing gaming performance at the time of launch may also tell an interesting story about the state of drivers from the various GPU companies. AMD and NVIDIA are the two obvious participants, but with Intel continuing to increase the performance of their Processor Graphics solutions it's also important to see how they fare with new releases. In some cases we may see serious performance issues or rendering errors early on, and if/when that happens we may elect to revisit the performance of certain games a month or two after launch to see what has changed. We've encountered instances in the past where drivers tended to target and fix issues with the most commonly benchmarked games, and while things are certainly better these days it's always good to look at empirical data showing how the various companies stack up.
With that out of the way, let's see what has changed with Metro: Last Light Redux, both in terms of graphics and performance. Starting with the former, in most areas you'll be hard pressed to see substantial differences. The most noteworthy exception is the use of red lights and smoke in place of white lights/smoke in some areas; this is particularly apparent in the built-in benchmark. There also appears to be more tessellation in some areas, and at the end (when the "train" gets blown up) you can see more deformation/destruction of the concrete barrier in Redux. I've created a split-screen video showing the original Metro: Last Light on the left and Metro: Last Light Redux on the right; both games were run at 1080p with maximum quality settings and Advanced PhysX disabled. (Note that for video recording I limited the frame rate to 30 FPS, so disregard the performance shown in that clip.)
Other than the aforementioned changes in lighting color for the smoke, it's difficult to say how much the graphics have improved versus simply being different from the initial release. I've benchmarked Metro: Last Light hundreds of times over the past year (perhaps even thousands), but I have to admit that I haven't actually taken the time to play the game that much, so many of the more subtle changes might go unnoticed.
The list of updates notes that there are graphical upgrades, lighting enhancements, and improvements to the gameplay and gunplay, and Redux also includes all of the DLC released for the original game. There have been some updates to certain maps/areas as well, all the weapons that were added via DLC are integrated into the game, and there are some minor UI tweaks (e.g. you can check your watch and inventory as in the original Metro 2033). Finally, there are new achievements/trophies along with two new modes in Redux: Spartan and Survival. Spartan is basically the way the original Last Light worked (more run-and-gun gameplay, more ammo, not as "hard") while Survival mode is more like the original Metro 2033 (less ammo and health, more difficult enemies). From what I can tell, though, having more (or less) ammo in either mode doesn't really change things too much.
But what about performance – is Metro: Last Light Redux any faster (or slower) at rendering its updated graphics compared to the original? To answer that, I've got a rather different set of hardware than what Ryan uses for our GPU reviews, as all of the hardware has been purchased at retail over the past year or so. For now I'm going to focus on single GPU performance, and while I do have a moderate collection of both AMD and NVIDIA GPUs, for the time being my hardware is slanted more towards high-end offerings than lower-tier parts. On the laptop side, we'd also like to thank MSI for letting us use three of their latest notebooks: the GT70 Dominator Pro with GTX 880M, the GS60 Ghost Pro 3K with GTX 870M, and the GE60 Apache Pro with GTX 860M. Here's the short list of hardware that I've used for testing:
| Gaming Benchmarks Test System | |
|---|---|
| CPU | Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3), overclocked to 4.1GHz |
| Motherboard | Gigabyte G1.Sniper M5 (Z87) |
| Memory | 2x8GB Corsair Vengeance Pro DDR3-1866 CL9 |
| GPUs | Gigabyte Radeon HD 6970; Sapphire Radeon R9 280; Sapphire Radeon R9 280X; Gigabyte Radeon R9 290X; EVGA GeForce GTX 770; EVGA GeForce GTX 780; Zotac GeForce GTX 970; Laptops: GeForce GTX 880M (MSI GT70 Dominator Pro), GeForce GTX 870M (MSI GS60 Ghost Pro 3K), GeForce GTX 860M (MSI GE60 Apache Pro) |
| Storage | Corsair Neutron GTX 480GB |
| Power Supply | Rosewill Capstone 1000M |
| Case | Corsair Obsidian 350D |
| Operating System | Windows 7 64-bit |
The obvious omission here is the new GeForce GTX 980, though we're also missing GTX 780 Ti, R9 290, not to mention all of the mainstream GPUs like the GTX 750/750 Ti, the whole AMD R7 series, etc. The good news is that the laptops at least give us some idea of what to expect from such cards – the GTX 860M for instance is clocked very similarly to the GTX 750 Ti, and GTX 870M is similar to the OEM GTX 760 192-bit. Again, we'll work on improving the selection of cards tested and try to cover a broader range in the future, but for now let's see how performance differs between the two releases of Metro: Last Light.
We've tested at 1080p with maximum quality (Very High Quality) and also ran a second test at 1080p with High Quality and SSAA disabled. In both cases we're testing without Advanced PhysX. While PhysX can make a noticeable difference at times (the Batman games being a prime example), I can't say I've noticed anything but lower frame rates from the feature in the Last Light benchmark – it drops performance by about 10-15% on NVIDIA cards, and minimum frame rates in particular can be very poor. Advanced PhysX also seems to cause stability issues with some NVIDIA cards (see below). Our settings then are essentially "Ultra" quality and "High" quality; here's what performance looks like for the two releases on our selected hardware:
So this is where things get interesting. At our maximum quality settings, performance with Metro: Last Light Redux is lower across almost the entire range of hardware. The R9 280 and MSI GE60 are the two exceptions, where performance basically stays the same; everywhere else we see anywhere from a 2% to an 11% drop. When we drop the quality settings a notch and disable SSAA, on the other hand, Redux performance is only slightly lower (essentially the same) in one instance, the HD 6970; all of the newer GPUs are anywhere from 10% to 19% faster. That could mean optimizations have been made for modern GPUs that simply don't carry over when SSAA is in use.
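To make those percentages concrete, here's a minimal sketch of the underlying arithmetic; the frame rates used in the example are hypothetical placeholders rather than our measured results.

```python
# Minimal sketch of the relative deltas quoted above; the FPS values are
# hypothetical placeholders, not measured results from our test systems.

def percent_change(original_fps: float, redux_fps: float) -> float:
    """Return Redux performance relative to the original release, in percent."""
    return (redux_fps - original_fps) / original_fps * 100.0

# e.g. a card averaging 62 FPS in the original vs. 55 FPS in Redux at Ultra (SSAA):
print(f"Ultra + SSAA:  {percent_change(62.0, 55.0):+.1f}%")   # roughly -11%

# e.g. the same card at High without SSAA, 95 FPS original vs. 110 FPS Redux:
print(f"High, no SSAA: {percent_change(95.0, 110.0):+.1f}%")  # roughly +16%
```

The same calculation applies to the SSAA observation later on: roughly doubling the frame rate by disabling SSAA shows up as a near-100% gain.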
As far as AMD vs. NVIDIA goes, similar to what we saw in our recent GTX 970 review, NVIDIA's new "budget friendly high-end GPU" basically offers performance on par with AMD's top-of-the-line R9 290X at a much lower price. The GTX 970 also tends to deliver roughly the same level of performance as the GTX 780, with the 780 now being cleared out at lower prices. The GTX 770 meanwhile offers roughly the same performance as the R9 280X, though in this case the AMD GPU has the lower price; of course the GTX 770 is being phased out in favor of the GTX 970 as well.
One other item worth mentioning is that I noticed my Zotac GTX 970 was a bit flaky with Redux, particularly at even higher settings (e.g. 2560x1440 maximum, or 1080p high quality with Advanced PhysX). I was initially running at the card's out-of-box settings (which include a mild 26MHz bump over the reference base GPU clock), and I thought perhaps some components were getting too hot. It turns out the real culprit is Advanced PhysX, which tends to crash Redux every few minutes on the GTX 970.
I haven't tested PhysX extensively, but some additional testing on the GTX 780 also showed crashes with PhysX enabled (though it takes about twice as long as the GTX 970 to crash to the desktop, roughly ten minutes instead of five). Either Metro: Last Light Redux has some poorly implemented PhysX code, or NVIDIA needs to tweak their drivers for Redux to achieve stability at certain settings with Advanced PhysX enabled; possibly both. This is definitely a fringe case, however, so it's not likely to affect a lot of users either way.
Overall, the Redux release of Metro: Last Light won't be any more (or less) playable on most systems than the original game. Metro 2033 Redux, of course, received a much greater overhaul in terms of graphics and gameplay, and as a result its system requirements are now higher than the original game's, likely on the same level as Last Light Redux. In other words, if you're looking for the poster child of why gamers might want SLI or CrossFire builds, the Metro Redux games are right up there with other punishing titles like the Crysis series, at least if you want to crank up every quality setting. SSAA is as usual a major hit to performance, so turning it off can boost frame rates by almost 100% at the cost of having jaggies.
And on a final note, there's a huge onslaught of games coming, and we're hoping to test many of them in a format similar to this. Your feedback is welcome as always, and if you have any requests for games that are already available or coming soon that you'd like to see benchmarked, let us know. Also let us know if you'd like to see additional settings tested; I confined the results reported to 1080p at High and Ultra quality, but we could certainly run other settings. Since these are all single GPU configurations, 2560x1440 with Redux proves to be too much in most cases, unless we drop SSAA; the laptops meanwhile might benefit from 1920x1080 and medium quality settings, though that's a bit too light on the faster desktop GPUs. Anyway, let us know what you'd like to see.
29 Comments
OrphanageExplosion - Thursday, October 2, 2014 - link
It's a bit of a curious test really. The Metro Last Light benchmark is really not indicative of performance during actual gameplay. Although built around assets from the game's most demanding level, it's a GPU and CPU stress-test using conditions that aren't actually present in the game as you see them in that sequence. It's a benchmark, nothing more.

I feel a better comparison would have been Metro 2033, which actually sees radical differences compared to the original, whereas Last Light has less noticeable enhancements. I guess at least the article demonstrates that unless you want the DLC, you don't get a hugely improved experience with Redux over buying the original? But that just makes the case for a 2033 performance comparison that much more apparent.
JarredWalton - Thursday, October 2, 2014 - link
In general I don't plan on doing many of these where the game is an update to an existing game -- it would normally just be a question of "how well does game XYZ run on various GPUs?" Metro Redux was a bit of special case, and since the original Last Light is still a bear to run I figured a short look at how much things have changed (if at all) would be good.

You're right that the built-in benchmark isn't necessarily indicative of the actual game, but there are scenes that can be very demanding and so it's basically a worst-case test. If you can run the benchmark and get decent frame rates, you can safely play the rest of the game. It's also not all that CPU-intensive, or at least not so much that going from stock clocks to 4.1GHz on the i7-4770K makes much of a difference on single GPUs at anything beyond the lower settings.
MrSpadge - Thursday, October 2, 2014 - link
Regarding the Zotac GTX 970: is it a factory-overclocked one? If raising the fan speed helps a bit, it seems like the clocks have been pushed too high for that chip. Does underclocking solve the stability issues? If that's the case it's a problem of Zotac and their binning, not nVidia. If the card runs at stock clocks it's nVidia's problem.

TheJian - Thursday, October 2, 2014 - link
Considering less than 3% run above 1920x1200, NV has time to fix this (rather you have time to RMA your card...LOL), and it may just be Zotac's fan that is a problem here, or your particular SINGLE sample of a single vendor's cards. Considering further that most of that 3% have more than one card, this comment is even more pointless.

http://www.anandtech.com/show/8568/the-geforce-gtx...
Why were there no problems in basically the same game (and all the others) tested previously in Anandtech's 970 review? Seems silly to call NV's whole 970 product line into question (made by many vendors) when you get ONE sample that doesn't do the job at a particular res, but your own 970 review shows nothing even OC'ed in any game up to 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx...
If it's right up there with Crysis 3, why doesn't this game crap out in Anandtech's 970 review at 1600p or 4K? Or any other game at these resolutions? Oh, right, AMD slant...
But that's anandtech for you (AMD portal and all) ;) Just saying...I mean, when you have a single card sample of AMD with an issue do you call the whole AMD line a problem? You don't even suggest such things. You say you had a problem with your particular card sample (at worst), which is exactly what you should have done here, with a tidbit saying but it probably doesn't affect other cards since so many review sites had ZERO issues at 1600P or even 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx...
Why doesn't it crap out when OC'ed to 1.45ghz at 1600p or 4K, and I mean why doesn't 970 do this at ANY of the TON of websites who tested it? Either your particular card has problems, or perhaps you're just lying because at this point there is no other way to make AMD's xmas get saved? Either way, many websites NOT having the issue would seem to suggest YOU have a problem, not NV's whole 970's card list from many vendors. Your suggestion is complete BS, but no surprise with that AMD slant this site has had since, oh 660ti review or so. :(
Such comments are silly, when evidence from everywhere at all resolutions say 970 is fine even up to ~1.5ghz. How can you question the whole 970 line knowing NO other site (AFAIK) had any problems with a SINGLE game, worse you call the whole 900 series into question...LOL.
http://in.ign.com/nvidia-gtx-980/64915/review/nvid...
A single sample of 980 worked fine in the same game, so I guess based on one sample (using your logic) we now know ALL 980's have ZERO problems in this game even up to 4K right?...ROFL. See how stupid that is? Worse, I could go on and say since they had no problems here, probably 970 has no issues either.
Both your example and mine are stupid, but the tons of sites and a huge number of games tested across all those sites allows me to confidently say, you sir are an idiot or this is just yet another example of this site showing the old Anandtech AMD slant (I don't think you're an idiot, so draw your own conclusions). I can't really see how it could be anything besides these two options.
In the face of all other websites showing 980/970 fine in all resolutions and tons of games, can you please explain how a SINGLE card having an issue means 970's at least, and possibly all 980's could need to have Nvidia tweak their drivers? ROFL.
Left off one more option though I guess...You could just be lying ;) In any case, your assumption is ridiculous. It only took 3 comments for you to be called out on this (MrSpadge already explained what should have been obvious to a long time reviewer on a site like Anandtech). But I digress...
willis936 - Thursday, October 2, 2014 - link
So two systems running the same software environment and the same SKU would expect different results between samples? If their temperatures and clocks are the same? This is a joke right? Drivers will affect data. Other hardware will affect data. Specific samples will always perform identically non-overclocked. Always.

Also, baseless accusations of falsifying data are disgusting.
JarredWalton - Thursday, October 2, 2014 - link
Holy cow! Let me start by saying that the paragraph causing the most controversy was (in my mind) rather innocuous. Seriously, saying "the game is crashing at times with Advanced PhysX enabled" was not the main point of this post! Anyway, I'm going to confine all of my responses in regards to the comments on stability/drivers to this single post. And for the record, yes, I deleted several of my earlier comments in order to clean things up (there's no edit function for me either, but I can at least delete my comments and post them again in edited form). I repeated myself a few times, so I've wiped those four initial responses and I'm going to try to cover all my bases in this single comment, which is in response to the first 10 or so comments (with a heavy slant towards TheJian).

First, while the Zotac is technically not a stock card (GPU clock is 1076 and RAM is 7010, compared to 1050/7000 for true "stock"), I don't think that's the problem. It runs very quiet, but it's a bit too quiet as it's getting relatively hot. I don't know if it's the GPU core or some other element, but where the card initially tended to crash after 5-10 minutes of heavy load in some games, a small bump in the fan speed ramp did the trick for fixing that problem.
The exception is MLL Redux with Advanced PhysX enabled, where it's still consistently crashing on the second pass of the benchmark, sometimes even the first pass. It could be the card, it could be the game, or it could be the drivers – or perhaps a little bit of each. However, where all the other settings and games I've tried now run fine on the GTX 970, Redux with PhysX is unstable. This was not the case with other NVIDIA GPUs in my limited testing of PhysX, but the GTX 970 can't even make it through two passes of the benchmark. (I run three passes and use the best result of the second or third pass for the reported results, if you're wondering).
That makes me think it's the drivers, and I've pinged NVIDIA on it so we'll see, but as I noted the Advanced PhysX really doesn't seem to do anything useful so it's not a huge issue. I'm currently doing additional testing, and where Advanced PhysX crashed on the second run at high and very high settings, it managed to make it through three full passes before crashing on the fourth loop when quality was dropped to Medium. Advanced PhysX looks like the main culprit, but it takes longer to manifest at lower quality settings – and we still don't know if the root issue is with the game or the (PhysX) drivers.
Continuing with the above, I swapped GPUs and went back to the GTX 780 to run some tests with Advanced PhysX doing 8 passes of the benchmark. At 1080p High and Very High + SSAA, the benchmark completed all eight passes without crashing to the desktop. At 2560x1440 however, it crashed to desktop on the fourth pass (51 seconds into the benchmark). So instability with Advanced PhysX is certainly present on other NVIDIA GPUs, but it appears to be less of a problem than it is on the GTX 970. (And note: I'm only talking about Redux with Advanced PhysX here – at least initial, limited testing of Batman: Arkham Origins didn't encounter any problems.)
Keep in mind, we're talking about a brand new GPU on a brand (remastered) new game, which is sort of the point of this article -- how does a recent release perform? In this case, Redux is unstable on my Zotac GTX 970, but only at certain settings (basically higher quality settings). I had to bump up the fan speed to address this, and now the only remaining problem is PhysX. The other NVIDIA GPUs didn't encounter this problem in my normal testing, at least not in three passes of the benchmark, but then they've all been around for 6+ months. Again, this is the reason to test new releases on a broader set of hardware in articles like this. What does a hardware reviewer find in terms of performance and stability? In this case, performance is about what you'd expect from a "new" Metro game, and stability issues occurred when enabling Advanced PhysX, particularly on the GTX 970.
On a broader note, this does not affect my opinion of the GTX 970 as a good card. Heck, I went out and bought the Zotac because it was a very compelling card – I wanted one for my own use! So yeah, TheJian, I love AMD so much and I hate NVIDIA so badly that I spent $329 on a Zotac GTX 970. [Rolls eyes] Okay, part of the reason for getting the card was to also run tests like this, and the only reason I bought the Zotac instead of a different GTX 970 was because the others were all out of stock. With my fan speed tweaks, it's running at <70C under full load and still generates far less noise than, say, the GTX 780 – and the R9 290X is in a completely different category when it comes to noise (in a bad way).
Finally, I've gone back and edited/clarified the text a bit just to make sure everyone knows what I'm saying when I discuss potential issues with drivers and the GTX 970 – basically, a more concise version of this comment. If you've got any remaining concerns/comments, let me know.
--Jarred Walton
JarredWalton - Friday, October 3, 2014 - link
Addendum: So I think PhysX is actually the only problem with the Zotac 970. I went back to retest some more, and while I thought the game had crashed at 2560x1440 without PhysX, I can't get that to happen now. Of course I've reinstalled the drivers today so that may have had something to do with things as well. I still like to run my cards a bit cooler, and the fans on the 970 are quiet enough that I don't mind ramping up the RPMs a bit more, but YMMV.

Carfax83 - Friday, October 3, 2014 - link
Hi Jarred, yeah it's definitely a bug that affects Maxwell cards only. Running PhysX in CPU mode fixes it. Metro Last Light Redux uses PhysX 3.3 which runs way faster on the CPU than previous editions, so turning it on doesn't have any sort of performance hit that I could gather when running on the CPU.

JarredWalton - Friday, October 3, 2014 - link
Is there any benefit to even turning Advanced PhysX on in this game? Maybe it's not visible in the benchmark scene, as I can't tell any difference between having it on or off.

Carfax83 - Friday, October 3, 2014 - link
Yeah, I would say there's a benefit. You get a lot more particles, debris, destruction, smoke and fog effects, plus some cloth as well. Some of the effects aren't as interactive as they used to be in the original games, but that's because PhysX 3.x is geared more towards the CPU than the GPU. It doesn't really matter though, as the overall effect is still solid in terms of how it adds to the atmosphere of the game.

I'm still in shock at how well PhysX 3.3 runs on the CPU, because the 2.x versions all ran horribly on it. It scales perfectly across my overclocked 3930K and runs without a hitch! It's a sign of things to come with future PhysX titles to be sure.