Power Consumption

There's a lot of uncertainty around whether Kepler is suitable for ultra low power operation, especially given that we've only seen it in PCs with TDPs far above what tablets and smartphones can handle. NVIDIA hoped to put those concerns to rest with a quick GLBenchmark 2.7 demo at SIGGRAPH. The demo pitted an iPad 4 against a Logan development platform, with Logan's Kepler GPU clocked low enough to equal the performance of the iPad 4. The low clock speed does put Kepler at an advantage, as it can run at a lower voltage as well, so the comparison is definitely one you'd expect NVIDIA to win.
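
To make that advantage concrete, here's a rough sketch of why a downclocked, undervolted GPU looks so good in a perf/W comparison. The scaling relationship (dynamic power roughly proportional to V^2 * f) is standard, but the specific frequency and voltage ratios below are illustrative assumptions on my part, not NVIDIA-provided figures.

# Rough illustration (assumed numbers, not NVIDIA data): dynamic power scales
# roughly as P ~ C * V^2 * f, so cutting clocks enough to also lower voltage
# reduces power much faster than it reduces performance.

def relative_dynamic_power(f_ratio, v_ratio):
    """Power relative to the full-speed operating point, assuming P ~ V^2 * f."""
    return (v_ratio ** 2) * f_ratio

# Hypothetical operating point: ~1/5 of peak clock, allowing ~25% lower voltage.
f_ratio = 1 / 5
v_ratio = 0.75
print("~{:.0f}% of peak dynamic power at {:.0f}% of peak performance".format(
    relative_dynamic_power(f_ratio, v_ratio) * 100, f_ratio * 100))
# -> roughly 11% of peak dynamic power for 20% of peak performance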

Unlike Tegra 3, Logan includes a single voltage rail that feeds just the GPU. NVIDIA instrumented this voltage rail and measured power consumption while running the offscreen 1080p T-Rex HD test in GLB2.7. Isolating GPU power alone, NVIDIA measured around 900mW for Logan's Kepler implementation running at iPad 4 performance levels (potentially as little as 1/5 of Logan's peak performance). NVIDIA also attempted to find and isolate the GPU power rail going into Apple's A6X (using a similar approach to what we documented here), and came up with an average GPU power value of around 2.6W. 
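
The measurement itself is conceptually simple. Assuming the usual rail-instrumentation approach (log voltage and current on the GPU rail while the benchmark loops, then average the instantaneous power), the math looks something like the sketch below; the sample values are invented purely to mirror the ~0.9W result and are not NVIDIA's logs.

# Minimal sketch of averaging power from a logged GPU rail (hypothetical samples,
# not NVIDIA's data): instantaneous power is volts * amps, averaged over the run.

samples = [  # (volts, amps) pairs captured while T-Rex HD loops offscreen
    (0.82, 1.05),
    (0.82, 1.12),
    (0.81, 1.10),
    (0.82, 1.08),
]

avg_power_w = sum(v * i for v, i in samples) / len(samples)
print("Average GPU rail power: {:.2f} W".format(avg_power_w))  # ~0.89 W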

I won't focus too much on the GPU power comparison as I don't know what else (if anything) Apple hangs off of its GPU power rail, but the most important takeaway here is that Kepler seems capable of scaling down to below 1W. In reality NVIDIA wouldn't ship Logan with a < 1W Kepler implementation, so we'll likely see higher performance (and power consumption) in shipping devices. If these numbers are believable, you could see roughly 2x the performance of an iPad 4 in a Logan based smartphone, and 4 - 5x the performance of an iPad 4 in a Logan tablet - in as little as 12 months from now if NVIDIA can ship this thing on time.
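
As a sanity check on those multiples, here's a back-of-the-envelope extrapolation from the ~0.9W data point using the same crude power-scaling idea. The voltage-creep factor and the phone/tablet GPU power budgets are my assumptions, so this only shows that 2x and 4 - 5x figures are arithmetically plausible, not what shipping Logan devices will actually do.

# Back-of-the-envelope extrapolation (illustrative assumptions only): start from
# ~0.9 W at iPad 4-level performance and estimate GPU power at higher performance
# targets, assuming perf scales with clock and voltage creeps up ~5% for each
# additional "iPad 4 unit" of performance.

BASE_POWER_W = 0.9   # measured point: iPad 4-level performance
V_CREEP = 0.05       # assumed fractional voltage increase per 1x of extra perf

def estimated_power(perf_multiple):
    """Estimated GPU power at perf_multiple times iPad 4 performance."""
    v_ratio = 1.0 + V_CREEP * (perf_multiple - 1.0)
    return BASE_POWER_W * perf_multiple * v_ratio ** 2

for perf in (2.0, 4.0, 5.0):
    print("~{:.0f}x iPad 4 performance -> roughly {:.1f} W of GPU power".format(
        perf, estimated_power(perf)))
# -> ~2x fits a ~2 W smartphone GPU budget; 4x and 5x land around 4.8 W and 6.5 W,
#    which is tablet territory.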

If NVIDIA's A6X power comparison is truly apples-to-apples, then it would be a huge testament to the power efficiency of NVIDIA's mobile Kepler architecture. Given the recent announcement of NVIDIA's willingness to license Kepler IP to any company that wants it, this demo seems very well planned.

NVIDIA did some work to make Kepler suitable for low power, but it's my understanding that the underlying architecture isn't vastly different from what we have in notebooks and desktops today. Mobile Kepler retains all of the graphics features of its bigger counterparts, although I'm guessing things like FP64 CUDA cores are gone.

Final Words

For the past couple of years we've been talking about a point in the future when it'll be possible to start playing console-class games (Xbox 360/PS3) on mobile devices. We're almost there. The move to Kepler with Logan is a big deal for NVIDIA. It finally modernizes NVIDIA's ultra mobile GPU, bringing graphics API parity to everything from smartphones to high-end desktop PCs. This is a huge step for game developers looking to target multiple platforms. It's also a big deal for mobile OS vendors and device makers looking to capitalize on gaming as a way of encouraging future smartphone and tablet upgrades. As smartphone and tablet upgrade cycles slow down, pushing high-end gaming to customers will become a more attractive option for device makers.

Logan is expected to ship in the first half of 2014. With early silicon back now, I think 10 - 12 months from now is a reasonable estimate. There is the unavoidable fact that we haven't even seen Tegra 4 devices on the market yet and NVIDIA is already talking about Logan. Everything I've heard points to Tegra 4 being on the schedule for a bunch of device wins, but delays on NVIDIA's part forced it to be designed out. Other than drumming up IP licensing business, I wonder if that's another reason why we're seeing a very public demo of Logan now - to show the health of early silicon. There's also a concern about process node. Logan will likely ship at 28nm next year, just before the transition to 20nm. If NVIDIA is late with Logan, we could have another Tegra 3 situation where NVIDIA is shipping on an older process technology.

Regardless of process tech, however, Kepler's power story in ultra mobile seems great. I really didn't believe the GLBenchmark data when I first saw it. I showed it to Ryan Smith, our Senior GPU Editor, and even he didn't believe it. If NVIDIA is indeed able to get iPad 4 levels of graphics performance at less than 1W (and presumably much more performance in the 2.5 - 5W range), it looks like Kepler will do extremely well in mobile.

Whatever NVIDIA's reasons for showing off Logan now, the result is something that I'm very excited about. A mobile SoC with NVIDIA's latest GPU architecture is exactly what we've been waiting for. 

Comments

  • Yojimbo - Saturday, July 27, 2013 - link

    I have to disagree. I am not sure about your analysis that anything that comes out later is "by definition behind." Rogue is not promising 400 GFLOPS, is it? And it's expected to be out in Q4 2013, isn't it? What we know for sure is that Logan leapfrogs Rogue in terms of API compliance. Suppose Logan devices come out one year later than Rogue devices, and Logan performs upwards of 50% better than Rogue. Would that not be considered being "caught up?" And I never even said that with Logan, Nvidia would be "caught up." I said that Nvidia has a history of catching up. So my claim is that the advantage PowerVR holds over Nvidia immediately after Rogue and Logan, taking into account adjustments made for when the product is first made available to the market, will be less than the advantage that PowerVR has held over Nvidia for the duration of the previous generation, with similar adjustments made. I further claim that there is a good chance that eventually (0, 1, 2 generations?) that gap will be negative. My argument here has nothing to do with Logan being better than Rogue. It is a refutation of kukurachee's dismissal of Nvidia's claims for Logan's performance, which he based on the GPUs that Nvidia had on their previous generations' SoCs and the amount of hype/interest they tried to generate for those SoCs.
  • michael2k - Sunday, July 28, 2013 - link

    Yes, actually, Rogue is promising 400 GFLOPS. It promises OpenGL ES 3*/2/1.1, OpenGL 3.x/4.x, and full WHQL-compliant DirectX9.3/10, with certain family members extending their capabilities to DirectX11.1 functionality.

    For Logan to be upwards of 50% better than Rogue it would have to be a 600 GFLOPS chip, since PowerVR Series 6 is expected to hit 400 GFLOPS. What would not be caught up is if Logan was a 400 GFLOPS chip released next year. You see, Rogue is intended to hit 1 TF, possibly in full-powered tablet form factors, so for Logan to truly best Rogue it would need to hit 1.5 TF in a tablet form factor.

    I don't believe Logan is specced that high.
  • Yojimbo - Sunday, July 28, 2013 - link

    The information I found on Rogue listed 250 GFLOPs. Where did you find 400 GFLOPs?
  • michael2k - Monday, July 29, 2013 - link

    http://www.imgtec.com/powervr/sgx_series6.asp

    PowerVR Series6 GPUs can deliver 20x or more of the performance of current generation GPU cores targeting comparable markets. => iPad currently is about 70GF, so a Series 6 implementation would be 1.4 TF

    PowerVR Series6 GPU cores are designed to offer computing performance exceeding 100GFLOPS (gigaFLOPS) and reaching the TFLOPS (teraFLOPS) range => 100 GF is the bottom of the expected range

    My point holds though that the Series 6 is designed to go over 1TF in performance, which is more than enough to match NVIDIA's Logan for the foreseeable future.
  • TheJian - Monday, August 5, 2013 - link

    You do realize IMG.L exists in phones because they couldn't cut it vs. AMD/NV in discrete GPUs, right? They have been relegated to cheapo devices by AMD/NV and are broke compared to NV. Apple apparently doesn't give them much profit. They had to borrow money just to buy MIPS for ~$100M (borrowing $20M of it), meanwhile NV buys companies like Icera for $330M cash, and on July 29th bought the Portland Group (the compiler teams at NV just got better), I'm assuming with CASH again as they have $3.75B in the bank:
    http://blogs.nvidia.com/blog/2013/07/29/portland/
    I'm guessing this hurts OpenCL some also, while boosting NV in HPC even more. PGI owns 10% of compilers, while Intel owns ~25%. So NV just gained a huge leg up here with this deal.

    Back to SoCs: none of the competing SoCs have had to be GOOD at GAMES until now, and not even really yet, as Unreal Engine 3 games are only just arriving and being announced. How do you think this plays out vs. a team with 20 years of gaming development work on drivers & with the devs? Every game dev knows AMD/NV hardware inside out for games. That can't be said of any SoC maker. All of those devs have created games on hardware that is about to come to SoCs next year (no new work needed; they already know Kepler). We are leaving the era of Minecraft & other crap and entering Unreal 3/Unreal 4 on SoCs. Good luck to the competition. If AMD can survive long enough to get their SoC out they may be a huge player eventually also, but currently I'd bet on NV doing some major damage with T5/T6 etc. (if not T4/T4i). T4 is just to hold people off until the real war starts next year as games are just gearing up on Android, and T4i will finally get them into phones in greater numbers (adding more devices in the wild based on a very good GPU set).

    That being said, T4 is already roughly equal to all others. Only S800 looks to beat its GPU (I don't count Apple; sales are dropping and its GPU doesn't go into anything but Apple hardware anyway). I don't see Rogue making any speeches about perf, just features, and I don't see them running Unreal 4 demos :) I don't hear anything from Mali either. S800 appears to have a good GPU (Adreno 330) but it remains to be seen if games will fully optimize for it. I see no TegraZone-type optimizations so far on Adreno. There is no AdrenoZone (whatever, you get the point). All SoC makers are about to be seriously upset by gaming becoming #1 on them. Before, they needed a good 2D GUI and not much more to run stuff like Minecraft or Tetris-type junk. We'll see how they all fare in stuff like Hawken. I'd bet on NV/AMD, with NV obviously in the driver's seat here, with cash and no debt helping them and already on T5 by the time AMD ships A1 or whatever they call it.

    SoC vendors had better get their gaming chops up to snuff in a hurry or the NV GPU train will run them over in the next 3 years. NV GPUs will be optimized for again and again on the desktop, and that tech will creep into SoCs a year or two later, over and over. Games made for PCs will start to creep to Android on the very same hardware they were built for on the desktop a few years earlier. Devs will start to make advanced games on HTML5 (HTML6 etc. eventually), OpenGL, WebGL, OpenCL, Java etc. to ease portability, making DirectX less needed. At that point Intel/MS are both in trouble (we're already heading there). If you aim a game at DirectX you have a much harder time porting it anywhere else. That same game made on OpenGL ports easily (same with HTML5 etc.) to all devices.

    In a few short years we'll be looking at a 500W normal PC-type box with Denver or whatever in it (Cortex-A57 I'd guess) with an NV discrete card for graphics. With another 3 years of games under Android's belt it should make for a pretty good home PC with no WINTEL at all in it. Google just needs to get a home office package out that does most of what Office does by that time, and then there is no need for Windows/Office, right (something aimed more at home users)? They need to court Adobe now to port its suite to Android before Denver or Boulder launches (and anyone else aiming at desktops), or come up with their own content creation suite I guess. A lot of content comes from Adobe's suite. You need this on Android, along with an office package, to convert many home users and work PCs. I see horrible stock prices for Intel/MS in less than 5 years if Google courts Adobe and packages a nice office product for home users. They can give away the OS/Office until MS/Intel bleed to death. All they need is games, Adobe, and NV/AMD GPUs (choose either or both) in a tower with any SoC vendor. They will make money on ads, not the hardware or software MS/Intel need to make profits on. Margins will tank for these two (witness MS's drop on RT already, and Intel's profits tanking), and Android will come to the front as a decent alternative to WINTEL. It's kind of funny that Google/(insert SoC vendor name here) are about to turn MS into Netscape. MS gave IE away until Netscape bled to death; the same is about to happen to them…ROFL. At the very least Wintel won't be the same in under 5 years. While Intel now has Android, it would be unwise of Google to help Intel push them around (like Samsung does to some degree). Much better to make an SoC vendor the next Intel, but much weaker. Google can push an SoC vendor around, not a Samsung or Intel (not as easily anyway).

    I'll remind you that PowerVR has already been killed once by NV (and ATI at the time). That's why they are in phones, not PCs :) Don't forget that. I don't see this playing out differently this time either. Phones/tablets are accidentally moving into NV/AMD territory (gaming) now, and with huge volumes, making it a no-brainer for these two to compete here. It's like the entire market is coming to AMD/NV's doorstep like never before. Like I said, good luck to Qcom/Imagination/ARM/Samsung trying to get good at gaming overnight. There are no Qcom "Gaming Evolved" or "The Way It's Meant to Be Played" games yet. NV/AMD have had 20 years of doing it, and devs have the same amount of experience with their hardware (which with die shrinks will just slide into phones/tablets shortly with fewer cores, but it's the same tech!). If Qcom/Samsung (or even Apple) don't buy AMD soon they are stupid. It is the best defensive move they can make vs. the coming NV gaming juggernaut, and if that happens I'll likely sell my NV stock shortly after...LOL. Apple should do this to keep Android on NV from becoming the dominant gaming platform. MS could buy them also, but I don't think Intel can get away with it quite yet (though they should be able to, since ARM is about to become their top CPU competitor). When an ARM SoC (or whatever) gets into a 500W PC-like box, Intel can make that move to buy AMD for GPUs. Until then I don't think they can without the FTC saying "umm, what?". I'm also not sure Samsung can legally buy AMD (security reasons for a non-American company having that tech?).

    Anyway, it's not just about perf, it's also about optimizations (such as NV buying PGI for even MORE CUDA prowess in dev tools), i.e. CUDA just got a boost. More prep work for Denver/Boulder is afoot :) Just one more piece of the puzzle so to speak (the software stack). For Google, it doesn't matter who wins the SoC race, as long as it pushes Android/Chrome as the future Wintel replacement for the dominant gaming platform and eventually as a PC replacement for at least some work (assuming they court Adobe and maybe a few others for content creation on Android/Chrome or whatever). I'm guessing Denver/Boulder (and their competition) are pipelined to run at 3.5-4GHz in 70-100W envelopes. It would make no sense to compete with Intel's Haswell/Broadwell etc. at 8W. A desktop chip needs to operate at ~70W, just like AMD/Intel's, to make the race even. You have a PSU, heatsink/fans and a HUGE box in a PC; it would be stupid not to crank power up to use them all. A Cortex-A57 chip at 4GHz should be very interesting with, say, a Volta discrete card in the box :) A free OS/office suite on top of, say, a $100 4GHz A57 chip should be easy to sell. An SoC for notebooks, and a stripped GPU-less version for desktops with discrete cards, sounds great. I like the future.

    My 2c. Heck call it a buck....whatever...LOL.
  • ancientarcher - Wednesday, August 7, 2013 - link

    $10.0 even!!
    A very good explanation of your theory. It's not a mainstream view yet, and Intel (and AnandTech) keep pushing how great their latest and greatest chip is that you can get for only $500. This monopoly is going to crash and burn. MS, as you said, tried to go the ARM way (the system just won't support monopoly profits for two monopolies, maybe one but not two) but didn't execute well.

    Denver won't necessarily be ported to the PC form factor, but who knows. It will surely change the landscape in the smartphone/tablet segments. I also think IMG, Mali and Adreno can learn to compete in games. Yes, they would not have had 20 years, but they do have a lot of momentum and the sheer number of devices running these platforms will push developers to code for them.
  • Mondozai - Saturday, September 7, 2013 - link

    Who says they are measuring it against the iPad 4 (which is about 80 GFLOPS, btw)?

    Their wording is so vague that they can choose a midrange model and say that it's more representative of the GPU performance of most GPUs (at the time that statement was made), which would be correct.

    You're making some pretty wild guesses.
  • MySchizoBuddy - Saturday, August 10, 2013 - link

    Mainboard chipsets?
    They exited that market, so you only have one example of Nvidia operating like a locomotive.
  • beck2050 - Monday, August 12, 2013 - link

    Agreed. It looks like Logan will finally put Nvidia on the map in the mobile world big time, and after that it should get very interesting.
  • CharonPDX - Wednesday, July 24, 2013 - link

    Even if the power consumption is double what they claim, and the performance half - it's *STILL* massively impressive.

    The big question is: Do Intel and PowerVR have similar tricks up their sleeve?
