Video Decode

One of the stones we've thrown at NVIDIA is the lack of high profile H.264 decode support. Tegra 2 can decode main profile H.264 at up to 20Mbps, but throw any high profile 1080p content at the chip and it can't do it. This is a problem because a lot of video content out there today is high profile, high bitrate 1080p H.264. Today, even on Tegra 2, you'll have to transcode a lot of your 1080p video content to get it to play on the phone.

With Kal-El, that could change.

NVIDIA's video decoder gets an upgrade in Kal-El to support H.264 at 40Mbps sustained (60Mbps peak) at a resolution of 2560 x 1440. This meets the bandwidth requirements for full Blu-ray disc playback. NVIDIA didn't just make the claim, however; it showed us a 50Mbps 1440p H.264 stream decoded and output to two screens simultaneously: a 2560 x 1600 30" desktop PC monitor and a 1366 x 768 tablet display.

Did I mention that this is 12-day-old A0 silicon?

Kal-El also supports stereoscopic 3D video playback, although it's unclear to me what the SoC's capabilities are for 3D capture.

I asked NVIDIA if other parts of the SoC have changed, particularly the ISP as we've seen in both the Optimus 2X and Atrix 4G articles that camera quality is pretty poor on the initial Tegra 2 phones. NVIDIA stated that both ISP performance and quality will go up in Kal-El although we don't know any more than that. NVIDIA did insist that its own development Tegra 2 platforms have good still capture quality, so what we've seen from LG and Motorola may just be limited to those implementations.


Comments

  • theagentsmith - Wednesday, February 16, 2011 - link

    The Mobile World Congress is happening right now in the nice city of Barcelona... almost every company in the mobile electronics sector is showing off new products; that's why you see only news about smartphones!
  • R3MF - Wednesday, February 16, 2011 - link

    nvidia, you have not lost the magic!
  • Dribble - Wednesday, February 16, 2011 - link

    @40nm the power draw would be too high for a phone so I don't suppose there's much point having this processor in one until 28nm arrives.

    However, for the new tablet market you have larger batteries, so you can target them with a higher power draw SoC (it's still going to be much, much smaller than any x86 chip, and I expect the big screen will still be sucking most of the power).

    Impressive that they got it working first time; it puts a lot of pressure on competitors who are still struggling to catch up with Tegra 2, let alone compete with this.
  • SOC_speculation - Wednesday, February 16, 2011 - link

    Very cool chip, lots of great technology. But it will not be successful in the market.
    A 1080p high profile decode onto a tablet's SXGA display can easily jump into the 1.2GB/s range of memory traffic. If you drive it over HDMI to a TV and then run a small game or even a nice 3D game on the tablet's main screen, you can easily get into the 1.7 to 2GB/s range.

    Why is this important? A 533MHz LPDDR2 channel has a max theoretical bandwidth of 4.3GB/s. Sounds like enough, right? Well, as you increase the frequency of DDR, your _actual_ bandwidth drops due to latency issues. In addition, across workloads, the actual bandwidth you can get from any DDR interface is between 40 and 60% of the theoretical max.

    So that means the single channel will get between 2.5GB/s (60%) and 1.72GB/s (40%). Trust me: ask anyone who designs SoCs, they will confirm the 40 to 60% bandwidth number.
    So the part will be restricted to use cases that current single core/single channel chips can already handle.

    So this huge chip with 4 cores, 1440p capability and probably 150MT/s 3D has an Achilles' heel the size of Manhattan. Don't believe what NVIDIA is saying (that dual channel isn't required). They know it's required but for some reason couldn't get it into this chip.

  • overzealot - Monday, February 21, 2011 - link

    Actually, as memory frequency increases, both bandwidth and latency improve.
  • araczynski - Wednesday, February 16, 2011 - link

    so if i know that what i'm about to buy is outdated by a factor of two or five not even a year later, i'm not very likely to bother buying at all.
  • kenyee - Wednesday, February 16, 2011 - link

    Crazy how fast stuff is progressing. At least this might justify the crazy price of a Moto Xoom tablet.... :-)
  • OBLAMA2009 - Wednesday, February 16, 2011 - link

    It makes a lot of sense to differentiate phones from tablets by giving tablets much faster CPUs, higher resolutions and longer battery life. Otherwise, why get a tablet if you have a cell phone?
  • yvizel - Wednesday, February 16, 2011 - link

    " NVIDIA also expects Kal-El to be somewhere in the realm of the performance of a Core 2 Duo processor (more on this later)."

    I don't think that you referred to this statement anywhere in the article.

    Can you elaborate?
  • Quindor - Wednesday, February 16, 2011 - link

    Seems to me NVidia might be pulling a Qualcomm, meaning they are going with what they have and are trying to stretch it out longer and wider before giving us the complete redesign/refresh. You can see this quite clearly at the MWC right now.

    Not a bad strategy as far as I can tell right now. The only threat I see is that Qualcomm is actually scheduled to release their new core design around the time NVIDIA will be releasing Kal-El.

    So who's going to win that bet? ;) More IPC vs. raw GHz/cores. Quite a reversed world too if you ask me, because Qualcomm was never big on IPC and went for the 1GHz hype.

    Hopefully NVidia doesn't make the same mistakes as in the GPU market, building such revolutionary designs that they actually design "sideways" from the market, making their GPUs fantastic in certain areas which might not take off at all.

    Mind you, I'm an NVidia fan... but it won't be the first time NVidia releases a revolutionary architecture, which isn't as efficient as they thought it would be. ;)
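SOC_speculation's bandwidth arithmetic above can be sanity-checked with a quick sketch. The 32-bit channel width and double-data-rate transfer size are my assumptions, not stated in the comment; with them, a 533MHz LPDDR2 channel works out to roughly the 4.3GB/s theoretical figure quoted, and applying the claimed 40-60% efficiency window reproduces the effective range:

```python
# Sanity check of the single-channel LPDDR2 bandwidth figures in the comment.
# Assumed (not stated in the comment): a 32-bit channel, double data rate,
# so 533 MHz clock -> 1066 MT/s at 4 bytes per transfer.

def peak_bandwidth_gbs(clock_mhz: float = 533, bus_bits: int = 32) -> float:
    """Theoretical peak bandwidth of a DDR channel in GB/s."""
    transfers_per_sec = clock_mhz * 1e6 * 2   # double data rate
    return transfers_per_sec * (bus_bits / 8) / 1e9

peak = peak_bandwidth_gbs()                   # ~4.26 GB/s, near the quoted ~4.3GB/s
low, high = (round(peak * eff, 2) for eff in (0.40, 0.60))

print(f"theoretical peak : {peak:.2f} GB/s")
print(f"40-60% effective : {low} - {high} GB/s")
```

At the pessimistic 40% end the effective bandwidth (~1.7GB/s) sits right at the 1.7 to 2GB/s workload estimate the commenter gives, which is the crux of the argument that a single channel could be a bottleneck.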
