The NVIDIA SHIELD Tablet Review
by Joshua Ho on July 29, 2014 9:00 AM EST

CPU Performance
As the first Tegra K1 device, the SHIELD Tablet is especially interesting. For those unfamiliar with the Tegra K1, NVIDIA integrated four ARM Cortex A15 (r3 revision) cores along with a fifth A15 r3 companion core. While on the surface this CPU configuration is largely similar to the Tegra 4's, there are some substantial differences. On the process tech side, the move to 28HPm adds SiGe sources and drains for the PMOS transistors, which dramatically improves drive current and makes it possible to bump the CPU cores up to 2.2GHz. The new revision of the Cortex A15 also brings better power management, which should help power efficiency (and thus battery life). The result is that peak CPU voltage drops from 1.4V in Tegra 4 to 1.2V in Tegra K1, even as peak clocks rise.
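The voltage drop alone accounts for a sizable chunk of the efficiency gain. As a rough back-of-the-envelope sketch (our illustration, not NVIDIA's figures), dynamic CMOS switching power scales approximately with C·V²·f, so only the 1.4V and 1.2V peak voltages from above are used here; the capacitance and clock terms are left as free parameters:

```python
# Back-of-the-envelope: dynamic CMOS power scales roughly as C * V^2 * f.
# Only the peak-voltage figures quoted in the text (1.4 V -> 1.2 V) are
# plugged in; capacitance is assumed constant and clocks are left at 1:1.
def relative_dynamic_power(v_new: float, v_old: float,
                           f_new: float = 1.0, f_old: float = 1.0) -> float:
    """Relative dynamic power, assuming constant switched capacitance."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Voltage alone: (1.2 / 1.4)^2 ~ 0.73, i.e. roughly 27% less switching
# power at the same clock, before the K1's higher frequency is counted.
ratio = relative_dynamic_power(1.2, 1.4)
```

Any frequency increase eats back into that margin linearly, which is why the voltage reduction matters so much for hitting 2.2GHz at sane power levels.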
Of course, the truly interesting aspect is the benchmarks, as those will really show the differences between Tegra 4 and Tegra K1. They will also help establish how Tegra K1 fares against the competition.
Despite clock speeds largely similar to the Snapdragon 800's, the Tegra K1 is generally a step above in performance. Outside of Apple's A7 and the x86 SoCs, NVIDIA is solidly ahead of the competition. Of course, a gaming tablet has a strong need for GPU performance, so we'll look at that next.
174 Comments
cknobman - Tuesday, July 29, 2014 - link
Thought Nvidia had a real killer here. Until I saw how crappy the screen is. On a tablet having such crappy color reproduction is just not going to cut it.
ams23 - Tuesday, July 29, 2014 - link
Overall the Shield tablet display is not bad but not great. The black levels, contrast ratio, and saturation accuracy are quite a bit better on Shield tablet compared to iPad Mini Retina. The max brightness and white point accuracy are slightly better on Shield tablet compared to iPad Mini Retina. The grayscale and GMB accuracy are quite a bit worse, however, and are the two areas that need some work.

rodolfcarver - Friday, October 3, 2014 - link
I agree that it's not bad, but the truth is that most games will be just as good on some of the top tablets (http://www.consumertop.com/best-tablets/ ), and they will also be better for all other tasks. Therefore I don't see the point of the Nvidia Shield.

willis936 - Tuesday, July 29, 2014 - link
You must have skipped the cpu and gpu benchmarks...

ddriver - Tuesday, July 29, 2014 - link
Color accuracy is pretty much irrelevant for gaming.

B3an - Tuesday, July 29, 2014 - link
Well yeah, if you're a moron.

zodiacsoulmate - Tuesday, July 29, 2014 - link
that's mean... also you are wrong, color accuracy is so irrelevant in gaming...

inighthawki - Tuesday, July 29, 2014 - link
Games already use low resolution color palettes. Textures almost never have more than 8 bits per channel (and are often compressed beyond that), and lighting calculations and sampling error are already going to produce generally "wrong" colors with respect to the real world. You're absolutely fooling yourself if you believe you will see a noticeable difference between this and a more accurate display while gaming.

mkozakewich - Tuesday, July 29, 2014 - link
"Games" use an incredibly varied set of graphical abilities. Maybe first-person shooters are different, and a lot of hyper-realistic AAA games in general; but there are plenty of games that are bright or cel-shaded, and those look a lot better on a screen with rich colours. You can't just say a display is good or bad. The reason they give us all these specs is so that we can make our own choices. Someone who plays games with muted or washed-out colours can decide that it's fine, and that this works for them based on the tradeoffs it makes.
inighthawki - Tuesday, July 29, 2014 - link
I agree there are cases, typically indie games, where this is true, but this is an incredibly small subset of the game market, and also generally not the target audience of such a device. The shield seems to be targeted more at heavy gamers, especially those who wish to stream games from a high end PC in another room. These are the people who typically have many AAA titles, where the graphics are so complex, and the lighting estimation and texture compression stray far enough from realistic values, that even with perfect color reproduction by the display the game could easily show a high error from the "real world" value anyway.
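The "8 bits per channel" point raised in the thread can be made concrete with a quick sketch (ours, not any commenter's): an 8-bit channel has only 256 code values, so the worst-case rounding error when storing a continuous intensity is half a code step, about 0.2% of full scale, before any display inaccuracy enters the picture.

```python
# Illustrative sketch: quantization error of an 8-bit color channel.
# A channel stores intensities as integers 0..255, so the best any
# round-trip can do is land on the nearest 1/255 step.
def quantize_8bit(x: float) -> int:
    """Round a [0, 1] intensity to the nearest 8-bit code value."""
    return round(x * 255)

def dequantize(code: int) -> float:
    """Map an 8-bit code value back to a [0, 1] intensity."""
    return code / 255

# Sweep [0, 1] in 0.001 steps; the worst round-trip error approaches
# half a step (0.5/255 ~ 0.00196 of full scale).
max_err = max(abs(x / 1000 - dequantize(quantize_8bit(x / 1000)))
              for x in range(1001))
```

That half-step floor applies to the content itself, which is the commenters' underlying argument about how much display-side color accuracy can matter for game assets.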