ASUS Eee Pad Transformer Prime & NVIDIA Tegra 3 Review
by Anand Lal Shimpi on December 1, 2011 1:00 AM EST

The Display: Perfect
The original Transformer had a display that performed similarly to the iPad, but was far more reflective thanks to a fairly large gap between the outer glass and the LCD panel underneath. I excused the first generation Eee Pad in the display department because it was good enough and $100 cheaper than the competing Apple solution. The Prime reaches price parity with the iPad 2, and as a result it must meet a higher standard. ASUS doesn't disappoint - the Eee Pad Transformer Prime has the best display I've seen on a tablet to date.
The resolution is a Honeycomb-standard 1280 x 800. The 16:10 panel measures 10.1 inches diagonally, giving it a very similar surface area to the iPad 2's 9.7-inch 4:3 display. The increase in resolution more than makes up for the larger screen, however: ASUS delivers roughly 149 pixels per inch compared to the iPad 2's now quite-dated ~132 PPI.
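For anyone who wants to check the math, pixel density is simply the diagonal pixel count divided by the diagonal size:

```latex
\mathrm{PPI} = \frac{\sqrt{w^2 + h^2}}{d_{\mathrm{inches}}},\qquad
\frac{\sqrt{1280^2 + 800^2}}{10.1} \approx 149,\qquad
\frac{\sqrt{1024^2 + 768^2}}{9.7} \approx 132
```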
It's not all about pixel density here: the Transformer Prime has better white and black levels than anything else in its class. It also sets a new benchmark for contrast ratio at nearly 1200:1. The huge gap between the outermost glass and the IPS LCD panel has been reduced significantly, in turn reducing glare.
ASUS also has a Super IPS+ mode that drives the display to a class-leading 683 nits. The Super IPS+ mode obviously draws more power, but ASUS recommends it if you're trying to use your tablet outdoors. In our review of the PlayBook we found that 600 nits was really the cutoff for usability in sunny conditions, and ASUS easily exceeds that. It's also worth pointing out that Super IPS+ raises black levels along with white levels, so the resulting contrast ratio remains the same.
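Contrast ratio is just the ratio of white to black luminance, so if Super IPS+ scales both by roughly the same factor the ratio doesn't move. The black level below is implied by the figures above rather than a separate measurement:

```latex
\mathrm{CR} = \frac{L_{\mathrm{white}}}{L_{\mathrm{black}}},\qquad
\frac{k\,L_{\mathrm{white}}}{k\,L_{\mathrm{black}}} = \frac{L_{\mathrm{white}}}{L_{\mathrm{black}}},\qquad
\text{e.g. } \frac{683~\mathrm{nits}}{1200} \approx 0.57~\mathrm{nits\ (black)}
```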
Original TF (left) vs. Super IPS+ enabled on the TF Prime (right)
iPad 2 (left) vs. Super IPS+ enabled on the TF Prime (right)
Viewing angles are absolutely awesome. Yes, this is the same ASUS that let us down with the UX panels, but it definitely got the panel right when it came to the Transformer Prime. Fingerprints are still going to be evident on the display, but they don't seem to be as bad as on the original Transformer, and they do wipe off easily. This time around ASUS bundles a microfiber cloth to aid in keeping your Transformer looking fresh.
ASUS, Apple and the rest of the tablet world are in hot pursuit of even higher resolution panels; the problem is that yields on these small 1080p and 2048x1536 panels just aren't high enough yet. The Android crowd will have to wait, although Apple is apparently pushing very hard (and trying to buy up a lot of inventory) to deliver a "retina display" equipped iPad 2+/3 by Q2 next year. I'm hearing Q3/Q4 for everyone else, and at this point it's still not a guarantee that Apple will be able to meet its aggressive targets either.
204 Comments
abcgum091 - Thursday, December 1, 2011 - link
After seeing the performance benchmarks, it's safe to say that the iPad 2 is an efficiency marvel. I don't believe I will be buying a tablet until Windows 8 is out.

ltcommanderdata - Thursday, December 1, 2011 - link
I'm guessing the browser and most other apps are not well optimized for quad cores. The question is: will developers actually bother focusing on quad cores? Samsung is going with a fast dual core A15 in its next Exynos. The upcoming TI OMAP 4470 is a high clock speed dual core A9, and OMAP5 seems to be a high clock speed dual core A15. If everyone else standardizes on fast dual cores, Tegra 3 and its quad cores may well be a check box feature that doesn't see much use, putting it at a disadvantage.

Wiggy McShades - Thursday, December 1, 2011 - link
If the developer is writing something in Java (most likely native code applications too), it would be more work for them to ensure they are using at most 2 threads than to just create as many threads as needed. The number of threads a Java application can create and use is not limited to the number of cores on the CPU. If you create 4 threads and there are 2 cores, then the 4 threads will be split between the two cores. The 2 threads per core will take turns executing, with the thread that has the highest priority getting more execution time than the other. All non-real-time operating systems are constantly pausing threads to let another run; that's how multitasking existed before we had dual core CPUs. The easiest way to write an application that takes advantage of multiple threads is to split it up into pieces that can run independently of each other, with the number of pieces depending on the type of application it is. Essentially, if a developer is going to write a threaded application, the number of threads he uses will be determined by what the application is meant to do rather than by the cores he believes will be available. The question to ask is what kind of application could realistically use more than 2 threads, and can that application be used on a tablet.
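A minimal Java sketch of that point - the task names here are hypothetical, purely to show the thread count following the work rather than the core count, with the OS scheduler time-slicing the threads across whatever cores are present:

```java
// Hypothetical sketch: thread count driven by the work, not the core count.
public class PageLoadDemo {
    public static void main(String[] args) throws InterruptedException {
        String[] jobs = { "UI", "HTML parsing", "image decoding", "plugin/script" };
        Thread[] threads = new Thread[jobs.length];

        for (int i = 0; i < jobs.length; i++) {
            final String job = jobs[i];
            threads[i] = new Thread(() ->
                System.out.println(job + " running on " + Thread.currentThread().getName()));
            threads[i].start();          // 4 threads start even on a dual core;
        }                                // the OS scheduler time-slices them.

        for (Thread t : threads) {
            t.join();                    // wait for all pieces to finish
        }

        // The core count only matters if you deliberately size a pool by it.
        System.out.println("Cores reported: " + Runtime.getRuntime().availableProcessors());
    }
}
```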
Operaa - Monday, January 16, 2012 - link
Making a responsive UI today most certainly requires you to use threads, so this shouldn't be a big problem. I'd say 2 threads per application is the absolute minimum. For example, talking about browsing the web, I would imagine it useful to handle the UI in one thread, load the page in another, load pictures in a third and run Flash in a fourth (or more), etc.

UpSpin - Thursday, December 1, 2011 - link
ARM introduced big.LITTLE which only makes sense in Quad or more core systems. NVIDIA is the only company with a Quad core right now because they already integrated this big.LITTLE idea. Without such a companion core, a quad core consumes too much power.
So I think Samsung released an A15 dual core because it's easier and they are able to release an A15 SoC earlier. They'll work on a quad core or six or eight core, but then they have to use the big.LITTLE idea, which probably takes a few more months of testing.
And as we all know, time is money.
metafor - Thursday, December 1, 2011 - link
/boggle

big.Little can work with any configuration and works just as well. Even in quad-core, individual cores can be turned off. The companion core is there because even at the lowest throttled level, a full core will still produce a lot of leakage current. A core made with lower-leakage (but slower) transistors can solve this.
Also, big.Little involves using different CPU architectures. For example, an A15 along with an A7.
nVidia's solution is the first step, but it only uses A9's for all of the cores.
UpSpin - Friday, December 2, 2011 - link
I haven't said anything different. I just added that Samsung wants to be one of the first to release an A15 SoC. To speed things up they released a dual core only, because there the advantage of a companion core isn't that big and the leakage current is 'ok'. It just makes the dual core more expensive (additional transistors needed, without such a huge advantage). But if you want to build a quad core, you must, just as Nvidia did, add such a companion core, else the leakage current is too high. But integrating the big.LITTLE idea probably takes additional time, thus they wouldn't be the first to produce an A15 based SoC.
So to be one of the first, they chose the easiest design, a dual core A15. After a few months of additional R&D they will release a quad core with big.LITTLE, and probably a dual core, six core and eight core with big.LITTLE, too.
hob196 - Friday, December 2, 2011 - link
You said:"ARM introduced big.LITTLE which only makes sense in Quad or more core systems"
big.LITTLE would apply to single core systems if the A7 and A15 pairing was considered one core.
UpSpin - Friday, December 2, 2011 - link
Power consumption-wise it makes sense to pair an A7 with a single or dual core already. Cost-wise it doesn't really make sense.
I really doubt that we will see a single core A15 SoC with a companion core. A dual core, maybe, but not at the beginning.
GnillGnoll - Friday, December 2, 2011 - link
It doesn't matter how many "big" cores there are; big.LITTLE is for those situations where turning on even a single "big" core is a relatively large power draw. A quad core with three cores power gated has no more leakage than a single core chip.