Nvidia’s Tegra K1 might draw a shocking amount of power

Nvidia unveiled its new Tegra K1 system-on-a-chip to a lot of buzz at the 2014 Consumer Electronics Show, capturing headlines with its 192 shader cores and some impressive demos, but one big question remained largely unanswered: what is the real power draw of the K1?

Nvidia claimed the K1's power numbers are on par with the Apple A7, but experts remained skeptical. If you, too, had doubts about the power efficiency of the Tegra K1, the latest revelations will only fuel them further. SemiAccurate looked back at earlier demonstrations of the new chipset and spotted some interesting details in Nvidia's CES demo: namely, that the gigantic system demoed at the show is not just large in size, but also actively cooled.

This fact alone is enough of a red flag, but SemiAccurate also took a look at the power brick of the Tegra K1 demo box, and its rating suggests a truly shocking power ceiling. Rated at 12 volts and 5 amps, the brick can deliver a peak of 60 watts to the entire demo box! Truth be told, the system is unlikely to actually reach that peak - it was probably running at around 35 to 40 watts, according to SemiAccurate's rough estimates. That figure covers the whole system, including all the other on-board components (and the fan), but the Tegra K1 is obviously drawing a sizable share of it. It's also no coincidence that it's the only component with a heatsink on it.
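For context, the 60-watt figure is just Ohm's-law arithmetic on the brick's label. Here is a minimal sketch of that math, assuming (as SemiAccurate does) that the 12 V / 5 A rating is the maximum the brick can supply, not what the board actually draws:

```python
# Back-of-the-envelope math behind the figures above. The brick's label is a
# rated ceiling, not a measurement; the 35-40 W range is SemiAccurate's estimate.
brick_voltage_v = 12.0  # rated output voltage of the demo box's power brick
brick_current_a = 5.0   # rated maximum output current

peak_watts = brick_voltage_v * brick_current_a  # P = V * I -> 60 W ceiling
print(f"Brick's rated ceiling: {peak_watts:.0f} W")

# Rough estimate of what the whole system (SoC, board, fan) actually draws
estimated_draw_w = (35.0, 40.0)
print(f"Estimated real-world draw: {estimated_draw_w[0]:.0f}-{estimated_draw_w[1]:.0f} W")
```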

This suggested power consumption of the Tegra K1 is definitely shocking - after all, we have seen perfectly regular-sized, 7-inch Nvidia Tegra K1 reference tablets running without any major overheating issues, so we know that at least that much is possible. We don't know whether the clock speeds of the Tegra K1 developer box examined by SemiAccurate differed from those of the K1 reference tablets, but still, with all this new information, we'd take a slightly more conservative stance on the Tegra K1.

Nvidia itself has said that it'll get the new chip into sub-2-watt territory, but - if all these new numbers prove true - the company still has some big challenges to overcome. Passively cooled mobile devices like smartphones and tablets draw just a couple of watts (even the largest and bulkiest tablets have a power budget of around 4 watts at most), so the aforementioned numbers look suspicious to say the least. It is also possible that the Tegra K1 destined for devices like tablets will run at much lower clock speeds than the developer boxes on display.
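To put those power envelopes side by side, here is an illustrative comparison using only the numbers quoted above (none of these are measured values):

```python
# Power envelopes quoted in this article, in watts (illustrative, not measured).
budgets_w = {
    "Nvidia's claimed K1 target": 2,             # "sub-2-watt territory"
    "Bulkiest passively cooled tablets": 4,      # rough ceiling cited above
    "K1 demo box, estimated (whole system)": 40, # upper end of SemiAccurate's estimate
}

tablet_ceiling = budgets_w["Bulkiest passively cooled tablets"]
for label, watts in budgets_w.items():
    print(f"{label}: ~{watts} W ({watts / tablet_ceiling:.1f}x a tablet's budget)")
```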

Right below, you can see images of the Tegra K1 dev board and its power brick. What do you make of all these numbers?


source: SemiAccurate


29 Comments

1. kaikuheadhunterz

Posts: 1157; Member since: Jul 18, 2013

Which Tegra K1?

13. hung2900

Posts: 966; Member since: Mar 02, 2012

A cheating SoC from nVIDIA, like its predecessor Tegra 4. I mentioned how sh!tty nVIDIA was with the unreasonable scores of the T4 and K1, and now it's proven

15. renz4

Posts: 319; Member since: Aug 10, 2013

and how is the T4 cheating?

16. vincelongman

Posts: 5720; Member since: Feb 10, 2013

I think he is referring to the Nvidia Shield, which is cooled by a fan; he's suggesting that's cheating, as the S800/A7 in tablets/phones are passively cooled (no fan).

19. hung2900

Posts: 966; Member since: Mar 02, 2012

Not only about the fan. If you guys remember, the initial benchmarks of the T4 were outstanding, but absolutely far from the real benchmarks of the commercial devices. NVIDIA seriously manipulated core clocks, with a huge cooling system, in order to achieve higher scores with the reference device - something that can't happen in reality.

21. renz4

Posts: 319; Member since: Aug 10, 2013

which one are you referring to?

28. fireblade

Posts: 717; Member since: Dec 27, 2013

what cheat? it's clocked at 3 GHz.

2. galanoth

Posts: 428; Member since: Nov 26, 2011

Tegra K1 chromebook?

4. _Bone_

Posts: 2155; Member since: Oct 29, 2012

Maybe, but it'd still draw several times more power than an Exynos 5 or SD801 one...

3. rd_nest

Posts: 1656; Member since: Jun 06, 2010

24. Finalflash

Posts: 4063; Member since: Jul 23, 2013

This article is old in general - it dates back to CES - but PA just got links to it from commenters in the article you just linked.

5. eisenbricher

Posts: 973; Member since: Aug 09, 2012

Hey... judging from the power brick's rating isn't a good idea at all. Sure, the brick is rated to provide a max of 60 watts, but does the K1 really use that much? Real figures from a multimeter would be nice.

11. Dr.Phil

Posts: 2431; Member since: Feb 14, 2011

Not to mention that the article somehow thinks that nVidia is way off from having the K1 in tablet form. Did they forget about this?: http://www.phonearena.com/news/Nvidia-Tegra-K1-Reference-Tablet-hands-on_id51209 I doubt that tablet has a 60-watt power supply going into it. It also proved that you can have the K1 perform the way nVidia said it would in tablet form. I guess that's why they call it "semi-accurate"...

22. darkskoliro

Posts: 1092; Member since: May 07, 2012

I don't think you can even call it semi-accurate. More like semi-retarded judgement

6. Federated

Posts: 263; Member since: Mar 06, 2010

Yep, it's definitely fast, but not suitable for a small device like a tablet or smartphone.

20. renz4

Posts: 319; Member since: Aug 10, 2013

if the K1 can't be fitted into a tablet, then nvidia must have been lying when they showed the K1 reference tablet at CES last January

27. Federated

Posts: 263; Member since: Mar 06, 2010

It's not about fitting it inside. It will fit in. LOL The problem is overheating and power consumption.

7. Chris_Bakke

Posts: 246; Member since: Jan 23, 2013

Maybe Nvidia will introduce the world's first external chipset, so you can have it actively cooled outside of your device.

9. Federated

Posts: 263; Member since: Mar 06, 2010

LOL. Or an external CAR battery in your backpack (to get enough juice for a day), plus ice packs and small cooling fans.

18. eisenbricher

Posts: 973; Member since: Aug 09, 2012

LOL nice idea ;)

8. darkskoliro

Posts: 1092; Member since: May 07, 2012

Already saw this post somewhere else. Just wanna say that I don't think Nvidia would say 2W and then give you a mobile SoC that drains 45W. First of all, no company could even put that in there (because it's completely ridiculous), and second of all, if you actually read the Anand article on it, it explains how the K1 GPU was derived from a previous desktop-generation GPU with a power draw of < 2W. The A15 r3p3 has better power efficiency than the previous-gen A15 r2p1, so what's to say it would draw even more power than a current A15 right now? Given my points, would you not agree that SemiAccurate's estimate of 35-40W is just ridiculous?

10. renz4

Posts: 319; Member since: Aug 10, 2013

that is semiaccurate. whenever Charlie D of SemiAccurate is talking about nvidia, it's best to take what he says with a grain of salt. sometimes he is right and sometimes he is wrong, but everyone in the tech industry knows how much hate he has towards nvidia

12. ArtSim98

Posts: 3535; Member since: Dec 21, 2012

Lol, Nvidia is too used to making PC GPUs. I'm sure, though, that we will see something this powerful in smartphones in the near future. Maybe as soon as next year. We will see where technology takes us!

14. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

Epic fail. That 60-watt power draw is comparable to a grinding mixer, lol.

23. livyatan

Posts: 867; Member since: Jun 19, 2013

"after all we have seen the perfectly regularly sized, 7-inch Nvidia Tegra K1 reference tablets running without any huge overheating issues, so we know that at least that is possible" Full stop. The mentioned fact alone renders this whole article pointless. Those same tablets were running Serious Sam game and the game devs can attest that the chip has in fact impressed them a lot. Nvidia Shield also has active cooling, so?

31. Amir1 unregistered

no wonder no big manufacturer adopts nvidia. their chips are not mature enough to be considered for flagships. see you in 2016, nvidia.

32. brrunopt

Posts: 742; Member since: Aug 15, 2013

Unbelievable how so many fall for this pile of crap disguised as an article

33. ocilfa

Posts: 334; Member since: Aug 03, 2012

I doubt it actually draws that much. On a side note, the power efficiency of Maxwell gets me hyped for the next version of the K1.

34. sarge77

Posts: 202; Member since: Mar 14, 2013

The only good thing I've seen so far is that the GS5 supports 128GB. Hopefully devices with this chip will be able to support more storage, because it's a shame to have console-quality games when you can only download 3 or 4. Why not have it support more? Isn't it like a mobile PC, or whatever you guys call it? Probably too much to ask, but they tease us with the specs and give us no storage space.

