Nvidia’s Tegra K1 might draw a shocking amount of power
Nvidia claimed the K1's benchmark numbers are on par with Apple's A7, but experts remained skeptical. If you, too, had doubts about the Tegra K1's power efficiency, these latest revelations will only fuel them. SemiAccurate looked back at earlier demonstrations of the new chipset and spotted some interesting details in Nvidia's CES demo: namely, that the system demoed at CES is not just large in size, but also actively cooled.
That fact alone is a red flag, but SemiAccurate also took a close look at the power brick for the Tegra K1 demo box, and its rating suggests a truly shocking power draw. At 12 volts and 5 amps, we arrive at a peak power draw of 60 watts for the entire demo box! Truth be told, the actual power draw of the system is unlikely to reach that peak: it was probably running at around 35 to 40 watts, according to SemiAccurate's rough estimates. That is the power draw for the whole system, including all other on-board components (and the fan), but the Tegra K1 is obviously drawing a sizable share of it. It's also no coincidence that it's the only component with a heatsink on it.
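For reference, the 60-watt figure is just the brick's rated maximum, computed as voltage times current. A minimal sketch of that arithmetic (the 12 V / 5 A rating is from the brick's label, the 35-40 W range is SemiAccurate's estimate; variable names are ours):

```python
# Power brick label values for the Tegra K1 demo box
voltage_v = 12.0   # rated output voltage
current_a = 5.0    # rated maximum output current

# P = V * I gives the most the brick can deliver, not what the box actually draws
peak_w = voltage_v * current_a  # 60 W rated maximum

# SemiAccurate's rough estimate of actual whole-system draw (SoC + board + fan)
estimated_system_w = (35.0, 40.0)

print(f"Rated peak: {peak_w:.0f} W")
print(f"Estimated system draw: {estimated_system_w[0]:.0f}-{estimated_system_w[1]:.0f} W")
```

Note that the brick's rating only bounds the draw from above; actually measuring the current into the box would be needed to pin down the real figure.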
This reported power consumption is definitely shocking, but it doesn't tell the whole story: after all, we have seen perfectly regular-sized, 7-inch Nvidia Tegra K1 reference tablets running without any major overheating issues, so we know that at least that is possible. We don't know whether the Tegra K1 developer box examined by SemiAccurate was clocked differently from the K1 reference tablets, but with all this new information, we'd still take a slightly more conservative stance on the Tegra K1.
[Image 1: Notice how the demo box is actively cooled]
[Image 2: A snap of the power brick for the Tegra K1 demo box]
13. hung2900 (Posts: 769; Member since: 02 Mar 2012)
A cheating SoC from nVIDIA, like its predecessor the Tegra 4.
I mentioned before how sh!tty nVIDIA was with the unreasonable scores of the T4 and K1, and now it's proven.
16. vincelongman (Posts: 951; Member since: 10 Feb 2013)
I think he is referring to the Nvidia Shield, which is cooled by a fan, which he's suggesting is cheating as the S800/A7 in tablets/phone are passively cooled (no fan).
19. hung2900 (Posts: 769; Member since: 02 Mar 2012)
It's not only about the fan. If you guys remember, the initial benchmarks of the T4 were outstanding but absolutely far from the real benchmarks of the commercial devices. NVIDIA seriously manipulated the core clocks, with a huge cooling system, in order to achieve higher scores on the reference device, which cannot happen in reality.
4. _Bone_ (Posts: 2104; Member since: 29 Oct 2012)
Maybe, but it'd still draw several times more power than an Exynos 5 or SD801 one...
3. rd_nest (Posts: 684; Member since: 06 Jun 2010)
And I mentioned the same issue in a previous article:
24. Finalflash (Posts: 1455; Member since: 23 Jul 2013)
This article is old in general, dating back to CES, but PA only just got links to it from commenters in the article you linked.
5. eisenbricher (Posts: 970; Member since: 09 Aug 2012)
Hey... judging from the power brick's rating isn't a good idea at all. Sure, the brick is rated to provide a max of 60 watts, but does the K1 really use that much?
Real figures from a multimeter would be nice.
11. Dr.Phil (Posts: 864; Member since: 14 Feb 2011)
Not to mention that the article somehow thinks that nVidia is way off from having the K1 in tablet form. Did they forget about this?:
I doubt that this tablet has a 60 watt power supply going into it. It also did prove that you can have the K1 perform the way nVidia said it would in a tablet form.
I guess that's why they call it "semi-accurate"...
22. darkskoliro (Posts: 935; Member since: 07 May 2012)
I dont think you can even call it semi-accurate. More like semi-retarded judgement
6. Federated (Posts: 152; Member since: 06 Mar 2010)
Yep, it's definitely fast, but not suitable for a small device like a tablet or smartphone.
20. renz4 (Posts: 202; Member since: 10 Aug 2013)
If the K1 can't be fitted into a tablet, then nvidia must have been lying when they showed the K1 reference tablet at CES last January.
27. Federated (Posts: 152; Member since: 06 Mar 2010)
It's not about fitting it inside. It will fit in. LOL The problem is overheating and power consumption.
7. Chris_Bakke (Posts: 199; Member since: 23 Jan 2013)
Maybe Nvidia will introduce the world's first external chipset, so you can have it actively cooled outside of your device.
9. Federated (Posts: 152; Member since: 06 Mar 2010)
LOL. Or external CAR battery in your back pack (to get enough juice for a day) and Ice packs and small cooling fans.
8. darkskoliro (Posts: 935; Member since: 07 May 2012)
Already saw this post somewhere else. Just want to say that I don't think Nvidia would claim 2 W and then ship a mobile SoC that drains 45 W. First of all, no company could even put that in a mobile device (because it's completely ridiculous), and second, if you actually read the AnandTech article on it, it explains how the K1 GPU was derived from a previous desktop-generation GPU with a power draw of < 2 W. The A15 r3p3 has better power efficiency than the previous-gen A15 r2p1, so what's to say it would draw even more power than a current A15? Given these points, wouldn't you agree that SemiAccurate's estimate of 35-40 W is just ridiculous?
10. renz4 (Posts: 202; Member since: 10 Aug 2013)
That's SemiAccurate. Whenever Charlie D of SemiAccurate is talking about nvidia, it's best to take what he says with a grain of salt. Sometimes he's right and sometimes he's wrong, but everyone in the tech industry knows how much hate he has towards nvidia.
12. ArtSim98 (Posts: 2267; Member since: 21 Dec 2012)
Lol, Nvidia is too used to making PC GPUs. I'm sure though that we will see something this powerful in smartphones in the near future. Maybe next year already. We will see where technology takes us!
14. StraightEdgeNexus (Posts: 2680; Member since: 14 Feb 2014)
Epic fail. That 60 watt power draw is comparable to a grinding mixer lol.
23. livyatan (Posts: 657; Member since: 19 Jun 2013)
"after all we have seen the perfectly regularly sized, 7-inch Nvidia Tegra K1 reference tablets running without any huge overheating issues, so we know that at least that is possible"
The mentioned fact alone renders this whole article pointless.
Those same tablets were running Serious Sam game and the game devs can attest that the chip has in fact impressed them a lot.
Nvidia Shield also has active cooling, so?
31. Amir1 (Posts: 243; Member since: 20 Aug 2013)
No wonder no big manufacturer adopts nvidia. Their chips are not mature enough to be considered for flagships. See you in 2016, nvidia.
32. brrunopt (Posts: 396; Member since: 15 Aug 2013)
Unbelievable how so many fall for this pile of crap disguised as an article.
33. ocilfa (Posts: 329; Member since: 03 Aug 2012)
I doubt it actually draws that much. On a side note, the power efficiency of Maxwell gets me hyped for the next version of the K1.
34. sarge77 (Posts: 202; Member since: 14 Mar 2013)
The only good thing I've seen so far is that the GS5 supports 128GB with this chip. Hopefully these devices will be able to support more storage, because it's a shame to have console-quality games when you can only download 3 or 4. Why not have it support more? It's basically a mobile PC, or whatever you guys call it. Probably too much to ask, but they tease us with the specs and no storage space.