
Nvidia’s Tegra K1 might draw a shocking amount of power

Posted by Victor H.

Nvidia unveiled its new Tegra K1 system-on-a-chip with a lot of buzz at the 2014 Consumer Electronics Show, capturing headlines with its 192 shader cores and some impressive demos, but one big question remained largely unanswered: what is the real power draw of the K1?

Nvidia claimed the K1's power figures are on par with the Apple A7's, but experts remained skeptical. If you, too, had doubts about the power efficiency of the Tegra K1, these latest revelations will only fuel them further. SemiAccurate looked back at earlier demonstrations of the new chipset and spotted some interesting details in Nvidia's CES demo - namely, that the gigantic system demoed at CES is not just large, but also actively cooled.

This disturbing fact alone is enough of a red flag, but SemiAccurate also took a close look at the power brick for the Tegra K1 demo box, and its rating suggests a truly shocking power draw. At 12 volts and 5 amps, the brick can deliver a peak of 60 watts to the entire demo box. Truth be told, the system is unlikely to actually reach that peak - it was probably running at around 35 to 40 watts, according to SemiAccurate's rough estimates. That figure covers the whole system, including all the other on-board components (and the fan), but the Tegra K1 is obviously drawing a sizable share of it. It's also no coincidence that it's the only component with a heatsink on it.
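For the curious, here is a quick back-of-the-envelope sketch of the arithmetic above. It is our own illustration: the 12 V / 5 A rating and the 35-40 W estimate come from SemiAccurate, and the rest is just P = V × I.

```python
# Back-of-the-envelope check of the power figures discussed above.
# The brick rating (12 V, 5 A) and the 35-40 W estimate come from
# SemiAccurate; everything else is just P = V * I.

BRICK_VOLTAGE_V = 12.0  # power brick output voltage
BRICK_CURRENT_A = 5.0   # power brick rated current

# A supply rating is a ceiling on what the box can draw, not a measurement.
peak_power_w = BRICK_VOLTAGE_V * BRICK_CURRENT_A
print(f"Brick rated peak: {peak_power_w:.0f} W")  # 60 W

# SemiAccurate's rough estimate of the whole box's actual draw
for draw_w in (35.0, 40.0):
    share = draw_w / peak_power_w
    print(f"Estimated draw of {draw_w:.0f} W is {share:.0%} of the rated peak")
```

The point of the sketch is that the 60 W figure is a rating, not a measurement; the 35-40 W estimate is the number that actually matters.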

The reported power consumption of the Tegra K1 is definitely surprising - after all, we have seen regular-sized, 7-inch Nvidia Tegra K1 reference tablets running without any major overheating issues, so we know that form factor is at least possible. We don't know whether the Tegra K1 developer box examined by SemiAccurate and the K1 reference tablets ran at different clock speeds, but with all this new information, we'd take a slightly more conservative stance on the Tegra K1.

Nvidia itself has said that it will get the new chip into sub-2-watt territory, but - if all these new numbers prove true - the company still has some big challenges to overcome. Passively cooled mobile devices like smartphones and tablets draw only a couple of watts (even the largest and bulkiest tablets are limited to around 4 watts at most), so the huge numbers above look suspicious, to say the least. It is also possible that the Tegra K1 in devices like tablets will run at much lower clock speeds than in the developer boxes on display.
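To illustrate how much room downclocking could buy, here is a rough sketch built on a textbook assumption that is ours, not Nvidia's or SemiAccurate's: dynamic power scales roughly with frequency times voltage squared, and voltage scales roughly with frequency, giving P ∝ f³. Real silicon has leakage, voltage floors, and binning, so treat this strictly as an illustration.

```python
# Rough cube-law sketch (our assumption, not from the article):
# dynamic power ~ frequency * voltage^2, and voltage ~ frequency,
# so P ~ f^3. Real chips have leakage and voltage floors.

P_DEV_BOX_W = 35.0   # low end of SemiAccurate's whole-box estimate
CHIP_BUDGET_W = 2.0  # Nvidia's stated sub-2 W target for the chip

# Fraction of the demo clock that would fit the 2 W budget under
# the cube law. Note that P_DEV_BOX_W covers the whole box, not
# just the chip, so the real feasible clock would be higher.
f_ratio = (CHIP_BUDGET_W / P_DEV_BOX_W) ** (1 / 3)
print(f"~{f_ratio:.0%} of the demo clock")  # roughly 39%
```

Under those assumptions, a tablet-bound K1 running at well under half the demo clock could land in Nvidia's stated power envelope, which is consistent with the closing guess above.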

Below, you can see images of the Tegra K1 dev board and its power brick. What do you make of all these numbers?


source: SemiAccurate

29 Comments




posted on 06 Mar 2014, 02:46

1. kaikuheadhunterz (Posts: 711; Member since: 18 Jul 2013)


Which Tegra K1?

posted on 06 Mar 2014, 03:10 2

13. hung2900 (Posts: 793; Member since: 02 Mar 2012)


A cheating SoC from nVIDIA, just like its predecessor, the Tegra 4.
I mentioned how sh!tty nVIDIA was with the unreasonable T4 and K1 scores, and now it's proven.

posted on 06 Mar 2014, 03:27 6

15. renz4 (Posts: 227; Member since: 10 Aug 2013)


And how is the T4 cheating?

posted on 06 Mar 2014, 03:54 3

16. vincelongman (Posts: 1141; Member since: 10 Feb 2013)


I think he is referring to the Nvidia Shield, which is cooled by a fan; he's suggesting that's cheating, since the S800/A7 in tablets/phones are passively cooled (no fan).

posted on 06 Mar 2014, 04:15 1

19. hung2900 (Posts: 793; Member since: 02 Mar 2012)


It's not only about the fan. If you guys remember, the initial T4 benchmarks were outstanding, but absolutely far from the real benchmarks of the commercial devices. NVIDIA seriously manipulated core clocks, using a huge cooling system, in order to achieve higher scores with the reference device - something that cannot happen in reality.

posted on 06 Mar 2014, 04:19 1

21. renz4 (Posts: 227; Member since: 10 Aug 2013)


Which one are you referring to?

posted on 06 Mar 2014, 05:34

28. fireblade (Posts: 694; Member since: 27 Dec 2013)


What cheat? It's clocked at 3 GHz.

posted on 06 Mar 2014, 02:46 3

2. galanoth (Posts: 317; Member since: 26 Nov 2011)


Tegra K1 Chromebook?

posted on 06 Mar 2014, 02:55

4. _Bone_ (Posts: 2125; Member since: 29 Oct 2012)


Maybe, but it'd still draw several times more power than an Exynos 5 or SD801 one...

posted on 06 Mar 2014, 02:54 1

3. rd_nest (Posts: 768; Member since: 06 Jun 2010)


And I mentioned the same issue in a previous article:

http://www.phonearena.com/news/64-bit-dual-core-Nvidia-Denver-chipset-smashes-the-competition-on-AnTuTu_id53510

posted on 06 Mar 2014, 05:01

24. Finalflash (Posts: 1720; Member since: 23 Jul 2013)


This story is old in general - it dates back to CES - but PA only just got links to it, from commenters in the article you just linked.

posted on 06 Mar 2014, 02:56 4

5. eisenbricher (Posts: 971; Member since: 09 Aug 2012)


Hey... judging from the power brick's rating isn't a good idea at all. Sure, the brick is rated to provide a max of 60 watts, but does the K1 really use that much?

Real figures from a multimeter would be nice.

posted on 06 Mar 2014, 03:02 9

11. Dr.Phil (Posts: 905; Member since: 14 Feb 2011)


Not to mention that the article somehow thinks that nVidia is way off from having the K1 in tablet form. Did they forget about this?:

http://www.phonearena.com/news/Nvidia-Tegra-K1-Reference-Tablet-hands-on_id51209

I doubt that this tablet has a 60-watt power supply going into it. It also proved that you can have the K1 perform the way nVidia said it would in tablet form.

I guess that's why they call it "semi-accurate"...

posted on 06 Mar 2014, 04:34 3

22. darkskoliro (Posts: 969; Member since: 07 May 2012)


I don't think you can even call it semi-accurate. More like semi-retarded judgement.

posted on 06 Mar 2014, 02:57

6. Federated (Posts: 227; Member since: 06 Mar 2010)


Yep, it's definitely fast, but not suitable inside a small device like a tablet or smartphone.

posted on 06 Mar 2014, 04:17 1

20. renz4 (Posts: 227; Member since: 10 Aug 2013)


If the K1 can't fit into a tablet, then Nvidia must have been lying when they showed the K1 reference tablet at CES last January.

posted on 06 Mar 2014, 05:29

27. Federated (Posts: 227; Member since: 06 Mar 2010)


It's not about fitting it inside - it will fit. LOL. The problem is overheating and power consumption.

posted on 06 Mar 2014, 02:58 4

7. Chris_Bakke (Posts: 202; Member since: 23 Jan 2013)


Maybe Nvidia will introduce the world's first external chipset, so you can have it actively cooled outside of your device.

posted on 06 Mar 2014, 03:00 1

9. Federated (Posts: 227; Member since: 06 Mar 2010)


LOL. Or an external CAR battery in your backpack (to get enough juice for a day), plus ice packs and small cooling fans.

posted on 06 Mar 2014, 04:08

18. eisenbricher (Posts: 971; Member since: 09 Aug 2012)


LOL nice idea ;)

posted on 06 Mar 2014, 03:00 5

8. darkskoliro (Posts: 969; Member since: 07 May 2012)


Already saw this post somewhere else. Just wanna say that I don't think Nvidia would promise 2 W and then give you a mobile SoC that drains 45 W. First of all, no company could even put that in there (because it's completely ridiculous), and second of all, if you actually read the AnandTech article on it, it explains how the K1 GPU was derived from a previous-generation desktop GPU with a power draw of < 2 W. The A15 r3p3 has better power efficiency than the previous-gen A15 r2p1, so what's to say it would draw even more power than a current A15 right now? Given my points, would you not agree that SemiAccurate's estimate of 35-40 W is just ridiculous?

posted on 06 Mar 2014, 03:01 4

10. renz4 (Posts: 227; Member since: 10 Aug 2013)


That's SemiAccurate for you. Whenever Charlie D of SemiAccurate is talking about Nvidia, it's best to take what he says with a grain of salt. Sometimes he is right and sometimes he is wrong, but everyone in the tech industry knows how much hate he has towards Nvidia.

posted on 06 Mar 2014, 03:08

12. ArtSim98 (limited) (Posts: 2738; Member since: 21 Dec 2012)


Lol, Nvidia is too used to making PC GPUs. I'm sure, though, that we will see something this powerful in smartphones in the near future. Maybe as soon as next year. We will see where technology takes us!

posted on 06 Mar 2014, 03:18

14. StraightEdgeNexus (Posts: 3197; Member since: 14 Feb 2014)


Epic fail. That 60-watt power draw is comparable to a grinding mixer, lol.

posted on 06 Mar 2014, 04:50

23. livyatan (Posts: 691; Member since: 19 Jun 2013)


"after all we have seen the perfectly regularly sized, 7-inch Nvidia Tegra K1 reference tablets running without any huge overheating issues, so we know that at least that is possible"
Full stop.
The mentioned fact alone renders this whole article pointless.
Those same tablets were running Serious Sam game and the game devs can attest that the chip has in fact impressed them a lot.

Nvidia Shield also has active cooling, so?

posted on 06 Mar 2014, 06:18

31. Amir1 (Posts: 262; Member since: 20 Aug 2013)


No wonder no big manufacturer adopts Nvidia. Their chips are not mature enough to be considered for flagships. See you in 2016, Nvidia.

posted on 06 Mar 2014, 11:18 1

32. brrunopt (Posts: 503; Member since: 15 Aug 2013)


Unbelievable how so many fall for this pile of crap disguised as an article.

posted on 06 Mar 2014, 12:22 1

33. ocilfa (Posts: 333; Member since: 03 Aug 2012)


I doubt it actually draws that much. On a side note, the power efficiency of Maxwell gets me hyped for the next version of the K1.

posted on 07 Mar 2014, 10:41

34. sarge77 (Posts: 202; Member since: 14 Mar 2013)


The only good thing I've seen so far is that the GS5 supports 128 GB. As for this chip, hopefully these devices will be able to support more storage, because it's a shame to have console-quality games when you can only download 3 or 4. Why not have it support more - isn't it like a mobile PC, or whatever you guys call it? Probably too much to ask, but they tease us with the specs and no storage space.
