NVIDIA’s GeForce 700M Family: Full Details and Specs
by Jarred Walton on April 1, 2013 9:00 AM EST

Introducing the NVIDIA GeForce 700M Family
With spring now well under way and Intel’s Haswell launch pending, OEMs like to have “new” parts across the board, and so once more we’re getting a new series of chips from NVIDIA: the 700M parts. We’ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA’s very successful launch of mobile Kepler, and since that time the number of laptops shipping with NVIDIA dGPUs compared to AMD dGPUs appears to have shifted even further in NVIDIA’s favor.
Not surprisingly, with TSMC still on 28nm, NVIDIA isn’t launching a new architecture; instead they’ll be tweaking Kepler to keep it going through 2013. Today’s launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: everything in general gets a bit faster than the previous generation. To improve Kepler, NVIDIA is taking the existing architecture and making a few moderate tweaks, improving their drivers (improvements that will also apply to existing GPUs), and as usual continuing to proclaim the advantages of Optimus Technology.
Starting on the software side of things, we don’t really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms—Linux users may feel otherwise, naturally. On the bright side, projects like Bumblebee appear to be helping the situation, so it’s now at least possible to utilize both the dGPU and iGPU under Linux. As far as OEMs go, Optimus has matured to the point where I can’t immediately come up with any new laptop that has an NVIDIA GPU and doesn’t support Optimus; we’re now at the point where an NVIDIA-equipped laptop inherently implies Optimus support.
The second software aspect is NVIDIA’s GeForce Experience, a new tool that automatically configures supported games to the “best quality for your hardware” settings. You can see the full slide deck in the gallery at the end with a few additional details. This may not seem like a big deal for enthusiasts, but for your average Joe who doesn’t know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.) it’s a step towards making PCs more gamer friendly—more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta, with over 1.5 million downloads and counting, so it’s definitely something people are using.
Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same as what’s found in GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 is the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is getting a 10-15% performance improvement relative to GPU Boost 1.0.
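As a rough illustration of the concept, here’s a minimal Python sketch of a temperature-target boost loop; the clock values, temperature target, and control logic are illustrative assumptions for this class of mobile part and not NVIDIA’s actual GPU Boost 2.0 algorithm.

```python
# Illustrative sketch of a temperature-target boost loop (not NVIDIA's actual
# algorithm): raise the GPU clock while thermal/power headroom remains, back
# off when the temperature target or power limit is exceeded. All numbers
# below are hypothetical.

BASE_CLOCK_MHZ = 735   # guaranteed base clock
MAX_BOOST_MHZ = 950    # highest boost bin
STEP_MHZ = 13          # size of one boost bin
TEMP_TARGET_C = 80     # temperature target the controller tries to hold

def next_clock(current_mhz, gpu_temp_c, power_w, tdp_w):
    """Return the clock for the next control interval."""
    if gpu_temp_c > TEMP_TARGET_C or power_w > tdp_w:
        # Over the temperature target or power limit: drop one bin.
        return max(BASE_CLOCK_MHZ, current_mhz - STEP_MHZ)
    # Headroom available: climb one bin toward the maximum boost clock.
    return min(MAX_BOOST_MHZ, current_mhz + STEP_MHZ)

if __name__ == "__main__":
    clock = BASE_CLOCK_MHZ
    # Simulated samples of (temperature in C, power draw in W) over a few intervals.
    for temp, power in [(65, 28), (72, 31), (79, 33), (83, 34), (78, 32)]:
        clock = next_clock(clock, temp, power, tdp_w=35)
        print(f"temp={temp}C power={power}W -> clock={clock}MHz")
```

The general idea is the same as what GPU Boost 1.0 did, only with the clock steered toward a temperature target rather than a fixed power ceiling, which is where the finer-grained control comes from.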
Moving to the hardware side of things, there’s really only one new chip in the lineup. GK104 will continue as the highest performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn’t announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come for everything in the GT family, with some of the chips apparently continuing to use GK107 while a couple of options will utilize a new GK208 part.
While NVIDIA won’t confirm which parts use GK208, the latest drivers do refer to that part number, so we know it exists. GK208 looks to be largely the same as GK107, and we’re not sure if there are any real differences other than the fact that GK208 will be available with a 64-bit memory interface. Given the similarity, it may serve as a 128-bit part as well. Basically, GK107 was never available in a 64-bit configuration, and GK208 remedies that (which effectively positions it as a lower-end chip relative to GK107).
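To put the 64-bit versus 128-bit distinction in perspective, here’s a quick back-of-the-envelope bandwidth calculation; the memory data rates below are illustrative examples of typical DDR3 and GDDR5 speeds for this class of part, not confirmed 700M specifications.

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# The data rates below are illustrative examples, not official 700M specs.

def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Peak theoretical bandwidth in GB/s for a given bus width and data rate."""
    return (bus_width_bits / 8) * data_rate_gtps

configs = [
    ("64-bit DDR3-1800",     64, 1.8),
    ("128-bit DDR3-1800",   128, 1.8),
    ("128-bit GDDR5 4 GT/s", 128, 4.0),
]

for name, width, rate in configs:
    print(f"{name:22s} -> {bandwidth_gbps(width, rate):5.1f} GB/s")
```

Halving the bus width halves the peak bandwidth at a given memory speed, which is why a 64-bit configuration naturally slots in below the 128-bit GK107-based parts.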
91 Comments
Torrijos - Monday, April 1, 2013 - link
Hope they'll carry on giving Mac users drivers quickly.
Jorgisven - Monday, April 1, 2013 - link
Having spoken with nVidia technical engineers as part of my job, nVidia does not handle drivers for OSX. They "advise", but don't do any of the actual driver writing. Apple does that in-house. Boot Camp Windows, however, follows the same driver update path as everyone else using Windows.
Boland - Tuesday, April 2, 2013 - link
nVidia's job descriptions page says otherwise. They're actually looking at expanding their mac driver team. http://www.nvidia.com/page/job_descriptions.html
cpupro - Tuesday, April 2, 2013 - link
Yeah right, nVidia is giving specs of their GPUs to Apple developers so they can write GeForce drivers for OSX. nVidia is not crazy enough to share their knowledge with the competition, because to write drivers you need to know how the GPU works internally.
kasakka - Thursday, April 4, 2013 - link
To my understanding it used to be Apple who wrote the drivers, but Nvidia has possibly taken the reins back. There have been some Nvidia driver update releases that are newer than what is found in Apple's updates.
TerdFerguson - Monday, April 1, 2013 - link
I'll never, ever, buy another laptop with a discrete GPU. The extra heat and power drain, together with the inflated prices and dishonest marketing, just aren't worth the modest performance increase on a machine that will never really provide the same level of gaming performance that even a dirt cheap desktop machine will. If a pair of 680M cards in SLI performs worse than a single 660 Ti, then it's just plain dishonest for NVidia to keep branding them thusly. I don't see onboard graphics overtaking desktop video boards any time soon, but for laptops the time is near and it can't come soon enough.
geniekid - Monday, April 1, 2013 - link
There are a number of laptops that let you switch between discrete and integrated graphics on demand, so you can save power when you're on the go and still have that extra power when you're plugged in. As for value versus desktops, yes, there's a premium for mobility, and the value of that mobility depends greatly on your lifestyle and job conditions.
Flunk - Monday, April 1, 2013 - link
You have a point when it comes to high-end "gaming" laptops that weigh 20+ pounds, cost a fortune, and perform poorly. But there is a place for mid-range discrete GPUs in smaller systems that allow you to play games at moderate settings when you're on the go. I think the best option would be a small laptop that connects to an external GPU, but it appears that the industry disagrees with me.
nehs89 - Monday, April 1, 2013 - link
I totally agree with you.... all laptops in general and also Win8 tablets should connect to an external GPU....that would be the solution to many problems.... you want to play heavy duty games, just plug in the external GPU, and if you want or need portability then use the ultrabook alone....I have also read that with the current technology this is not possible.
KitsuneKnight - Monday, April 1, 2013 - link
Sony shipped a laptop that supported a (low end) external dGPU. Another company showed a generic enclosure that could be used to connect a GPU to a computer via Thunderbolt (I'm not sure if it ever actually shipped, though). It certainly is possible, even if there's currently no link that could provide enough bandwidth to let a top-of-the-line GPU run full tilt. I would think nVidia and/or Intel would want to push that market more, but it doesn't seem like anyone really cares, unfortunately. It would be nice to be able to 'upgrade' a laptop's GPU without having to replace the entire thing.
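For a rough sense of the bandwidth gap the commenter mentions, here's a quick back-of-the-envelope comparison; it treats first-generation Thunderbolt as roughly a PCIe 2.0 x4 tunnel (the usual approximation) and uses commonly quoted link rates, so the figures are assumptions rather than measured throughput.

```python
# Rough comparison of an external-GPU link vs. an internal desktop PCIe slot.
# Rates are the commonly quoted raw figures for 2013-era links; real-world
# throughput is lower once protocol overhead is accounted for.

def pcie_gbytes_per_s(lanes, gtps_per_lane, efficiency):
    """Approximate usable PCIe bandwidth in GB/s (encoding overhead folded into efficiency)."""
    return lanes * gtps_per_lane * efficiency / 8

links = {
    # PCIe 2.0: 5 GT/s per lane, 8b/10b encoding (~80% efficiency)
    "Thunderbolt (as PCIe 2.0 x4 tunnel)": pcie_gbytes_per_s(4, 5.0, 0.8),
    "Desktop PCIe 2.0 x16":                pcie_gbytes_per_s(16, 5.0, 0.8),
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding (~98% efficiency)
    "Desktop PCIe 3.0 x16":                pcie_gbytes_per_s(16, 8.0, 0.985),
}

for name, gbps in links.items():
    print(f"{name:36s} ~{gbps:4.1f} GB/s")
```

On those numbers an eGPU link delivers roughly 2 GB/s versus around 8-16 GB/s for an internal x16 slot, which is why a top-of-the-line card over Thunderbolt can't run full tilt even though mid-range parts remain usable.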