The NVIDIA GeForce GTX 780 Ti Review
by Ryan Smith on November 7, 2013 9:01 AM EST

Meet The GeForce GTX 780 Ti
When it comes to the physical design and functionality of the GTX 780 Ti, to no surprise NVIDIA is sticking with what works. The design of the GTX Titan and its associated cooler have proven themselves twice over now between the GTX Titan and the GTX 780, so with only the slightest of changes this is what NVIDIA is going with for GTX 780 Ti, too. Consequently there’s very little new material to cover here, but we’ll quickly hit the high points before recapping the general design of what has now become the GTX 780 series.
The biggest change here is that GTX 780 Ti is the first NVIDIA launch product to feature the new B1 revision of their GK110 GPU. B1 has already been shipping for a couple of months now, so GTX 780 Ti isn’t the first card to get this new GPU. However, while GTX Titan and GTX 780 products currently contain a mix of the old and new revisions as NVIDIA completes the change-over, GTX 780 Ti will be B1 (and only B1) right out the door.
As for what’s new for B1, NVIDIA is telling us that it’s a fairly tame revision of GK110. NVIDIA hasn’t made any significant changes to the GPU; rather they’ve merely gone in and fixed some errata that were in the earlier revision of GK110, and in the meantime tightened up the design and leakage just a bit to nudge power usage down, the latter of which is helpful for countering the greater power draw from lighting up the 15th and final SMX. Otherwise B1 doesn’t have any feature changes or significant changes in its power characteristics relative to the previous revision, so it should be a fairly small footnote compared to GTX 780.
The other notable change coming with GTX 780 Ti is that NVIDIA has slightly adjusted the default temperature throttle point, increasing it from 80C to 83C. The difference in cooling efficiency itself will be trivial, but since NVIDIA is using the exact same fan curve on the GTX 780 Ti as they did the GTX 780, the higher temperature throttle effectively increases the card’s equilibrium point, and therefore the average fan speed under load. Or put another way, by letting it get a bit warmer the GTX 780 Ti will ramp up its fan a bit more and throttle a bit less, which should help offset the card’s increased power consumption while also keeping thermal throttling minimized.
GeForce GTX 780 Series Temperature Targets
| GTX 780 Ti Temp Target | GTX 780 Temp Target | GTX Titan Temp Target |
| 83C                    | 80C                | 80C                   |
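To illustrate how a fixed fan curve combined with a higher temperature target shifts the card's equilibrium point, here is a minimal toy heat-balance sketch. All of the numbers in it (fan curve endpoints, thermal coefficient, clock floor) are invented for illustration and bear no relation to NVIDIA's actual tuning; it only demonstrates the qualitative behavior described above.

```python
# Toy model of a GPU Boost 2.0-style temperature target with a fixed fan curve.
# Every constant below is hypothetical; only the qualitative behavior matters.

def fan_speed(temp_c):
    """Fixed fan curve shared by both cards: ramps linearly
    from 30% at 60C to 85% at 95C (illustrative values)."""
    if temp_c <= 60:
        return 30.0
    if temp_c >= 95:
        return 85.0
    return 30.0 + (temp_c - 60.0) * (85.0 - 30.0) / (95.0 - 60.0)

def equilibrium(power_w, temp_target_c, ambient_c=25.0, k=0.06, steps=20000):
    """Iterate a crude heat balance until the card settles: heat removed
    scales with fan speed and the delta over ambient, while clocks are
    throttled whenever the temperature exceeds the target."""
    temp, clock = ambient_c, 1.0
    for _ in range(steps):
        if temp > temp_target_c:
            clock = max(0.5, clock - 0.001)   # over target: shed clocks
        else:
            clock = min(1.0, clock + 0.001)   # under target: recover clocks
        dissipated = k * fan_speed(temp) * (temp - ambient_c)
        temp += 0.01 * (power_w * clock - dissipated)
    return temp, fan_speed(temp), clock

for target in (80, 83):
    t, f, c = equilibrium(power_w=250, temp_target_c=target)
    print(f"{target}C target -> {t:.1f}C, fan {f:.0f}%, clock {c:.2f}x")
```

With the same fan curve, the 83C target settles at a higher fan speed and a higher sustained clock than the 80C target, which is exactly the trade-off the review describes: a bit warmer and a bit louder in exchange for less throttling.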
Moving on, since the design of the GTX 780 Ti is a near carbon copy of GTX 780, we’re essentially looking at GTX 780 with better specs and new trimmings. NVIDIA’s very effective (and still quite unique) metallic GTX Titan cooler is back, this time featuring black lettering and a black tinted window. As such GTX 780 Ti remains a 10.5” long card composed of a cast aluminum housing, a nickel-tipped heatsink, an aluminum baseplate, and a vapor chamber providing heat transfer between the GPU and the heatsink. The end result is that the GTX 780 Ti is a quiet card despite the fact that it’s a 250W blower design, while still maintaining the solid feel and eye-catching design that NVIDIA has opted for with this generation of cards.
Drilling down, the PCB is also a re-use from GTX 780. It’s the same GK110 GPU mounted on the same PCB with the same 6+2 phase power design. This is despite the fact that GTX 780 Ti features faster 7GHz memory, indicating that NVIDIA was able to hit their higher memory speed targets without making any obvious changes to the PCB or memory trace layouts. Meanwhile the reuse of the power delivery subsystem is a reflection of the fact that GTX 780 Ti has the same 250W TDP limit as GTX 780 and GTX Titan, though of the three cards GTX 780 Ti will have the least headroom to spare and will come closest to hitting that limit, due to the general uptick in power requirements from having 15 active SMXes. Finally, using the same PCB also means that GTX 780 Ti has the same 6pin + 8pin power requirement and the same display I/O configuration of 2x DL-DVI, 1x HDMI, 1x DisplayPort 1.2.
On a final note, NVIDIA won’t be allowing custom cards right off the bat – everything today will be a reference card – but with NVIDIA’s partners having already put together their custom GK110 designs for GTX 780, custom designs for GTX 780 Ti will come very quickly. Consequently, expect most (if not all of them) to be variants of their existing custom GTX 780 designs.
302 Comments
Wreckage - Thursday, November 7, 2013 - link
The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture. Kepler is still king even after being out for over a year.
trolledboat - Thursday, November 7, 2013 - link
Hey look, it's a comment from a permanently banned user at this website for trolling, done before someone could have even read the first page.

Back in reality, very nice card, but sorely overpriced for such a meagre gain over 780. It also is slower than the cheaper 290x in some cases.
Nvidia needs more price cuts right now. 780 and 780ti are both badly overpriced in the face of 290 and 290x
neils58 - Thursday, November 7, 2013 - link
I think Nvidia probably have the right strategy, G-Sync is around the corner and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going crossfire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single card solution that even with the brand premium and G-Sync monitors comes out less expensive than crossfire. 780Ti for 1440p gamers, 780 for 1920p gamers.

Kamus - Thursday, November 7, 2013 - link
I agree that G-Sync is a gamechanger, but just what do you mean AMD's only answer is crossfire? Mantle is right up there with g-sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.As a user, it kind of sucks, because I'd love to take advantage of both.
That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.
But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120hz TN panels. And not everybody is willing to trade their beautiful looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
Wreckage - Thursday, November 7, 2013 - link
Gsync will work with every game past and present. So far Mantle is only confirmed in one game. That's a huge difference.

Basstrip - Thursday, November 7, 2013 - link
TLDR: When considering Gsync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think next-gen consoles having AMD GPUs. Another plus side for NVidia is shadowplay and SHIELD though (but again, added costs if you consider SHIELD).

Gsync is not such a game changer as you have yet to see both a monitor with Gsync AND its pricing. The fact that I would have to upgrade my monitor and that the Gsync branding will add another few $$$ on the price tag is something you guys have to consider.
So to consider Gsync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those that are going to upgrade soon but for those that won't, Gsync is moot.
Mantle on its plus side will be used on consoles and pc (as both PS4 and Xbox One have AMD processors, developers of games will most probably be using it). You might not care about consoles but they are part of the gaming ecosystem and sadly, we pc users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days and said that development tends to have shifted towards consoles so the tuning was a bit more off for pc (paraphrasing slightly).
I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
Wreckage - Thursday, November 7, 2013 - link
Mantle will not be used on consoles. AMD already confirmed this.

althaz - Thursday, November 7, 2013 - link
Mantle is not used on consoles...because the consoles already have something very similar.

Kamus - Thursday, November 7, 2013 - link
You are right, consoles use their own API for GCN, guess what mantle is used for? *spoiler alert* GCN
EJS1980 - Thursday, November 7, 2013 - link
Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)