The Best of CES 2012
by Jarred Walton on January 17, 2012 6:15 PM EST
Looking Forward to WUXGA and QXGA Tablets
In a similar vein to the 4K displays, it looks like many tablets are getting a serious resolution bump in the next few months. Given how regularly I complain about the state of laptop displays (I can count the number of good laptop LCDs we saw in the last year on one hand), it gives me hope to see tablets pushing for higher quality, higher resolution panels. Amazingly enough, ASUS has announced that the Eee Pad Transformer Prime will receive a 1920x1200 update in Q2 of this year (and for the record, they're not the only ones planning on using such a panel). Rumors suggest that the iPad 3 will go one step further and offer a QXGA (2048x1536) panel, sticking with the 4:3 aspect ratio of previous iPads (though of course Apple hasn't officially announced anything yet), and there's even talk of some QSXGA (2560x2048) and/or WQXGA (2560x1600) tablets shipping later this year.
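As a quick point of reference, here's the pixel density math for the panels under discussion (a rough sketch; the 10.1" and 9.7" diagonals are my assumptions based on the products involved, not confirmed specs):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Diagonal sizes are assumptions based on the products discussed above.
panels = [
    ("1280x800 WXGA, 10.1\" (today's tablets)", ppi(1280, 800, 10.1)),
    ("1920x1200 WUXGA, 10.1\" (TF700T)",        ppi(1920, 1200, 10.1)),
    ("2048x1536 QXGA, 9.7\" (iPad 3 rumor)",    ppi(2048, 1536, 9.7)),
]
for name, density in panels:
    print(f"{name}: ~{density:.0f} PPI")
# ~149 PPI -> ~224 PPI -> ~264 PPI
```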
I had the chance to play with the upcoming Transformer Prime TF700T, and I loved the increased resolution. Surprisingly, the Tegra 3 chipset appeared able to handle WUXGA quite well, though I didn't get a chance to test any games. Gaming at WUXGA is going to really stress current SoC GPUs, however, at least if you want decent quality settings. Many desktop users (even those with high-end cards like the GTX 570/HD 6970) run at 1920x1200, albeit with significantly higher quality textures and geometry than we see in tablet games. Even so, pushing ~2MP on a tablet at decent frame rates will very likely need more memory bandwidth and faster GPUs; I expect many games will render at a lower resolution and simply scale the image to the screen size. Outside of gaming, however, higher resolutions can be very useful. Browsing the web at 1280x720 is doable, but 720x1280 in portrait mode is not so much; 1080x1920, on the other hand, is wide enough that 1024-width websites won't require zooming out. Plus, text and images in general will look sharper.
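To put some numbers on the GPU load (a back-of-the-envelope sketch, not measured data; the render scale chosen is purely illustrative):

```python
# Fill rate and memory bandwidth scale roughly with rendered pixels.
def megapixels(w, h):
    return w * h / 1e6

wxga  = megapixels(1280, 800)    # typical current tablet resolution
wuxga = megapixels(1920, 1200)   # the TF700T's panel

print(f"1280x800:  {wxga:.2f} MP")
print(f"1920x1200: {wuxga:.2f} MP ({wuxga / wxga:.2f}x the pixels)")

# The workaround mentioned above: render at a lower resolution and let
# the hardware scale the output to the native panel size.
scale = 1280 / 1920  # illustrative choice of render resolution
print(f"Rendering at {scale:.0%} scale means only {scale**2:.0%} "
      f"of the native per-frame pixel work")
```

That 2.25x jump in pixels is why I expect upscaling rather than native-resolution rendering in most demanding games, at least on this generation of SoCs.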
What really irks me is that all of this comes in a 10.1” IPS package, exactly what I’ve been asking for in laptops for the past several years. What’s more, the price point for these is in the <$600 range, and we’re still getting 16:10 aspect ratio panels instead of being forced into 16:9. I asked several manufacturers, "How is it we're getting 16:10 aspect ratio tablets with IPS WUXGA displays, and you still can't put anything better than a low quality 1366x768 TN panel into your laptops?" Naturally, they blamed the display manufacturers and consumers for not being willing to buy better quality laptops.
There's certainly some truth to that, but it's also a matter of supply and demand; if ASUS, for instance, were to order a million ~13.3" 1920x1200 IPS laptop displays, I'm sure they could get prices down to <$1000 for a quality laptop. Naturally, they're worried that the laptops wouldn't sell well enough and they'd get stuck with a bunch of "too expensive" inventory. With all the $500 Best Buy laptops floating around they may be right, but I wish I could convince more people to stop settling for low quality displays in their laptops. That brings me to the final entry in my top three devices/technologies that impressed me at CES.
78 Comments
cheinonen - Tuesday, January 17, 2012 - link
I was covering home theater and video, and only got to spend two days on the show floor, but Sony's CrystalLED prototype was just amazing. Very bright, 180 degree viewing angles with no color or contrast shifts, near infinite contrast ratios, and perfect motion with no blurring or other motion artifacts. I can only hope that Sony decides to release it at an affordable cost, as it's just amazing to see.
The OLED sets might have been almost as good, but the off-angles were not as good, and the demo content was not good for getting an idea of the quality compared to Sony. Of course they might ship this year and we have no idea when/if the Sony will be released. The 8K panel from Sharp was also just a proof-of-concept design, but amazingly detailed to the point that you can stick your head next to it and see no pixels. The contrast and angles were not nearly as good as the CrystalLED, though.
Nothing in Blu-ray really amazed me, as the only truly new feature I saw was Sony offering 4K upconversion on their new player for their 4K projector, but I'd need a 4K projector to be able to evaluate that anyway. Overall it was the new panel technologies that really stood out to me.
AnnihilatorX - Wednesday, January 18, 2012 - link
A ZDNet article said something different regarding the CrystalLED: "Reports from the show floor came away impressed, if not awed. Engadget said the sample set on view failed to show off the speedy refresh rates, and our sister site CNET found that OLED TVs provided a bit more 'wow.' CNET also posted a short video examining Sony's Crystal LED Display in more detail that you can watch here."
cheinonen - Wednesday, January 18, 2012 - link
Which is fine. The OLEDs might be better, but the way the demo was set up on the floor I just couldn't get a good feel for it, and the color shift on the LG model was a bit annoying since the Sony LED set had absolutely zero shift. I believe Samsung had a demo unit set up in a private room that some journalists managed to see, though I did not, so that might have had better material or a better environment and led to a better response than I had. The other AV writers I talked to during and after the show came away a bit split on the two, though we all want one of them in our living rooms.
Unfortunately, no video that anyone took will do justice to the motion on the CrystalLED, since you'll be watching it on a conventional display. I imagine it might never come out, but we can all hope Sony finds a way to produce it, since the results were amazing.
B3an - Wednesday, January 18, 2012 - link
What's the difference between OLED and Crystal LED? Is Crystal LED just Sony's marketing BS for OLED? They both seem extremely similar.
The Samsung TV at the show had a "Super OLED" display though. Super OLED sets don't use a color filter, resulting in pictures with deeper contrast and finer detail. So it should have been better.
therealnickdanger - Wednesday, January 18, 2012 - link
Realistically, CLED will likely never see the light of day. Sony stated that it was a tech demo and that they have no current plans to produce it. Considering each pixel is composed of three LEDs (RGB) on a chip, the display would be cost-prohibitive to build and sell in any mass market. Sony can't "choose" to release it at an affordable cost unless they find a way to make cheaper LEDs and find cheaper ways to connect them all.
Even if you could buy a single LED for $0.01 (one cent USD, which you can't), you would need over six million of them. I'll do the math for you: $60,000 for just one display. And that's only for 1080p; 4K will be mainstream before this tech will. LEDs have been in mass use for decades in all manner of electronics, and prices aren't anywhere close to cheap enough for this tech to work.
This is where OLED comes in as a realistic alternative, although as I understand it, they still need to work on image retention.
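For what it's worth, therealnickdanger's numbers hold up; here's the arithmetic spelled out (the $0.01/LED price is his hypothetical floor, not a real quote):

```python
# One LED per subpixel, three subpixels (RGB) per pixel.
width, height = 1920, 1080
leds_per_pixel = 3
cost_per_led = 0.01  # USD; the commenter's hypothetical floor price

total_leds = width * height * leds_per_pixel
print(f"LEDs for a 1080p CrystalLED panel: {total_leds:,}")   # 6,220,800
print(f"LED cost alone: ${total_leds * cost_per_led:,.0f}")   # $62,208

# A 4K panel would quadruple that:
print(f"LEDs for 3840x2160: {3840 * 2160 * leds_per_pixel:,}")  # 24,883,200
```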
demonbug - Tuesday, January 17, 2012 - link
I've seen a lot of discussion of 4K displays following this year's CES, and invariably brief mention is made of the limited source material available. So: what 4K sources ARE available today? What are the demos running off of? What kind of processing power would it take to play, say, a 4K video stream encoded the same way as a Blu-ray (I'm assuming the 40 Mbit max for 2K video would roughly translate to 160 Mbit for 4K)?
Basically, beyond getting the displays into production, what needs to happen before 4K becomes a wider reality? Have we seen some significant improvement in compression technology in the last 5 years that would make 4K satellite broadcasts possible without sacrificing a huge number of channels?
4K sounds great, and on the one hand it is just the next logical increment after 2K HD. However, it seems we are still just barely managing the bitrates required by 2K HD in terms of storage, transmission, and playback; how close are we realistically to making the 4x jump in all of these to make 4K useful?
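demonbug's 160 Mbit estimate follows directly from the pixel counts, assuming bits-per-pixel and codec efficiency stay constant (a rough sketch; 3840x2160 is assumed for "4K"):

```python
# If bits-per-pixel stays constant, bitrate scales with pixel count.
hd_pixels  = 1920 * 1080
uhd_pixels = 3840 * 2160  # assuming "4K" means quad full HD here

scale = uhd_pixels / hd_pixels
hd_max_mbps = 40          # rough Blu-ray video bitrate ceiling
print(f"Pixel scale factor: {scale:.0f}x")                    # 4x
print(f"Implied 4K bitrate: {hd_max_mbps * scale:.0f} Mbps")  # 160 Mbps
```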
JarredWalton - Tuesday, January 17, 2012 - link
I think 4K will largely be for home theater buffs initially, with Blu-ray players that upconvert to 4K. Then we'll get something post-Blu-ray that will have new DRM and support higher bitrates. Of course, an average bitrate of 50Mbps could still fit a two-hour movie on a 50GB Blu-ray, so maybe it will use the same disc medium but with new standards? Don't know, but we'll see.
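A quick sanity check on that 50Mbps figure (treating it as the total stream bitrate):

```python
# Two hours at an average of 50Mbps, versus a 50GB dual-layer disc.
bitrate_bps = 50e6
runtime_s = 2 * 60 * 60

total_gb = bitrate_bps * runtime_s / 8 / 1e9
print(f"Two-hour movie at 50Mbps: {total_gb:.0f} GB")  # 45 GB
# 45GB fits on a dual-layer 50GB Blu-ray, as noted above.
```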
hechacker1 - Tuesday, January 17, 2012 - link
Doesn't Blu-ray scale with layers? AFAIK, they've demonstrated versions with 10 or more layers. So we'd just need updated drives to read them.
Fanfoot - Tuesday, January 17, 2012 - link
From doing a bit of Googling, it looks like 100GB is the likely requirement for 4K movies, which means 4 layers rather than 2. Apparently most Blu-ray disc players can only read 2 layers, so they would have to be upgraded. I suspect the bit rates would blow them up even if they did support the BDXL format...
chizow - Wednesday, January 18, 2012 - link
@Jarred, why wait for home theater/movie buffs to catch up when PC gaming could take full advantage of this tech today?
We just need 4K/2K to be supported over a single connector, or for both IHVs to implement their professional single-resolution-over-multiple-displays solutions on desktop parts, like the Quadro version described here:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
"professional level multi-display technology called "NVIDIA Scalable Visualization Solutions" that will allow multiple monitors to function as a single display to the OS and "just work" with any application."