The NVIDIA GeForce GTX 1650 Review, Feat. Zotac: Fighting Brute Force With Power Efficiency
by Ryan Smith & Nate Oh on May 3, 2019 10:15 AM EST
Wolfenstein II: The New Colossus (Vulkan)
id Software is popularly known for a few games involving shooting stuff until it dies, just with different 'stuff' for each one: Nazis, demons, or other players while scorning the laws of physics. Wolfenstein II is the latest of the first, the sequel in a modern reboot series developed by MachineGames and built on id Tech 6. While the tone is significantly less pulpy nowadays, the game is still a frenetic FPS at heart, succeeding DOOM as a modern Vulkan flagship title; unlike DOOM, which began life on OpenGL, it arrives as a pure Vulkan implementation.
Featuring a Nazi-occupied America of 1961, Wolfenstein II is lushly designed yet not oppressively intensive on the hardware, something that suits its pace of action, with firefights that erupt suddenly out of level design flush with alternate-history details.
The game has 5 total graphical presets: Mein leben!, Uber, Ultra, Medium, and Low. Staying consistent with previous 1080p testing, the highest quality preset, "Mein leben!", was used, in addition to Ultra and Medium. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the presets, neither GPU Culling nor Deferred Rendering was enabled.
To preface, an odd performance bug afflicted the two Maxwell cards (GTX 960 and GTX 950), where Medium Image Streaming (poolsize of 768) resulted in out-of-memory, sub-2 fps performance. Using the Medium preset with any other Image Streaming setting returned performance to normal, suggesting an issue with memory allocation; the behavior occurred on earlier drivers as well. It's not clear how much this is related to the sub-2 fps performance at the "Mein leben!" preset, which is already far too demanding for a 2GB framebuffer.
With the VRAM-hungry nature of Wolfenstein II, the GTX 1650's 4GB keeps it from suffering a premature death like all the other 2GB cards, or even the 3GB GTX 1060. Even at Medium settings, the bigger framebuffer turns the tables on the GTX 1060 3GB, which is faster in every other game in the suite, so the GTX 1650 takes the lead in both performance and the smoothness of the gameplay experience.
The VRAM-centric approach does uncover more oddities; here the GTX 1650 overcomes the RX 570 4GB at higher settings, but the RX 570 8GB remains undeterred.
126 Comments
Marlin1975 - Friday, May 3, 2019 - link
Not a bad card, but it is a bad price.
drexnx - Friday, May 3, 2019 - link
yep, but if you look at the die size, you can see that they're kinda stuck - a huge generational die size increase vs GP107, and even the RX 570/580 are only 232mm2 compared to 200mm2.
I can see how AMD can happily sell 570s for the same price, since that design has long been paid for vs. Turing and the MFG costs shouldn't be much higher
Karmena - Tuesday, May 7, 2019 - link
Check the prices of the RX 570; they cost $120 on Newegg. And you can get one under $150.
tarqsharq - Tuesday, May 7, 2019 - link
And the RX 570s come with The Division 2 and World War Z right now.
You can get the ASRock version with 8GB VRAM for only $139!
0ldman79 - Sunday, May 19, 2019 - link
Problem is, on an OEM box you'll have to upgrade the PSU as well.
Dealing with normies for customers, the good ones will understand, but most of them wouldn't have bought a crappy OEM box in the first place. Most normies will buy the 1650 alone.
AMD needs 570ish performance without the need for auxiliary power.
Yojimbo - Friday, May 3, 2019 - link
Depending on the amount of gaming done, it probably saves over 50 dollars in electricity costs over a two-year period compared to the RX 570. Of course, the 570 is a bit faster on average.
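As a back-of-the-envelope sketch of how an estimate like this could be put together (the power gap, daily gaming hours, and electricity rate below are illustrative assumptions, not measured or quoted figures):

# Rough two-year electricity cost difference between an RX 570 and a GTX 1650
# under gaming load. All inputs are assumptions for illustration only.
power_delta_w = 150 - 75   # assumed board power gap under load, in watts
hours_per_day = 3          # assumed daily gaming time
rate_per_kwh = 0.25        # assumed electricity price, $/kWh
years = 2

energy_kwh = power_delta_w / 1000 * hours_per_day * 365 * years
cost_delta = energy_kwh * rate_per_kwh
print(f"Extra energy: {energy_kwh:.0f} kWh")                 # ~164 kWh
print(f"Extra cost over {years} years: ${cost_delta:.2f}")   # ~$41

With these particular assumptions the difference comes to roughly $40 over two years; more gaming hours or pricier electricity pushes it past the $50 mark, while lighter use or cheaper power pulls it well below.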
JoeyJoJo123 - Friday, May 3, 2019 - link
Nobody in their right mind that's specifically on the market for an aftermarket GPU (a buying decision that comes about BECAUSE they're dissatisfied with the current framerate or performance of their existing, or lack of, a GPU) is making their primary purchasing decision on power savings alone. In other words, people aren't saying "Man, my ForkNight performance is good, but my power bills are too high! In order to remedy the exorbitant cost of my power bill, I'm going to go out and purchase a $150 GPU (which is more than 1 month of my power bill alone), even if it offers the same performance of my current GPU, just to save money on my power bill!"
Someone might make that their primary purchasing decision for a power supply, because outside of being able to supply a given wattage for the system, the only thing that matters is its efficiency, and yes, over the long term higher efficiency PSUs are better built, last longer, and provide a justifiable hidden cost savings.
Lower power for the same performance at a similar enough price can be a tie-breaker between two competing options, but that's not the case here for the 1650. It has essentially outpriced itself from competing viably in the lower budget GPU market.
Yojimbo - Friday, May 3, 2019 - link
I don't know what you consider being in a right mind to be, but anyone making a cost-sensitive buying decision without considering total cost of ownership is not making that decision correctly. The electricity is not free unless one has some special arrangement. It will be paid for, and it will reduce one's wealth and ability to make other purchases.
logamaniac - Friday, May 3, 2019 - link
So I assume you measure the efficiency of the AC unit in your car and how it relates to your gas mileage over the duration of ownership as well, since you're so worried about every calculation in making that buying decision?
serpretetsky - Friday, May 3, 2019 - link
It doesn't really change the argument if he does or does not take into account his AC unit in his car. Electricity is not free. You can ignore the price of electricity if you want, but your decision to ignore it or not does not change the total cost of ownership. (I'm not defending the electricity calculations above, I haven't verified them)