Silicon warriors: Snapdragon 801 vs NVIDIA Tegra K1

Posted by Daniel P.

Qualcomm used the MWC expo to announce its new Snapdragon 801 family of processors, which powers spring flagships like the Galaxy S5 and the Xperia Z2. We already compared it to the previous Snapdragon 800 family, and now it's time for a cage match with another candidate for glory this year - NVIDIA's new Tegra K1 mobile processor.

Tegra K1 is expected to land in Android devices before we dive deep into the summer, and it shapes up to be a monster chipset, with a 64-bit version on the way, too. We aren't even talking about the quad-core Cortex A15 processor here (plus a low-power companion core), as the beauty lies in the 192-core Kepler GPU. Yep, you read that right - desktop-grade Kepler graphics are coming in mobile form, whether you need that much pixel-pushing for Android games or not. You'll certainly need it for 4K video rendering, or shenanigans like the real-time facial expression overlay you can see in our K1 hands-on video below.

The Adreno 330 GPU in Snapdragon 801, however, runs at 578 MHz, while the Kepler GPU in Tegra K1 is clocked at 950 MHz, so there might be issues with power consumption and heat. NVIDIA claims that, while K1 outperforms both the Xbox 360 and the PS3, it consumes much less energy. The power consumption might still need work when it comes to mobile devices, though, as the reference kit was reportedly running at a whopping 35 to 40 watts. NVIDIA says it will be able to shoehorn the K1 into the sub-2W category under non-peak loads, which should put it on fairly equal footing with Snapdragon 801.
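
For a rough sense of what those core counts and clocks imply, here is a back-of-envelope sketch in Python. It is purely our own illustration, not vendor data: it converts the published Kepler figures into a theoretical peak throughput, assuming each of the 192 cores can retire one fused multiply-add (two floating-point operations) per cycle, and shows how that peak would scale at a few hypothetical, phone-friendlier clocks.

```python
# Back-of-envelope peak throughput for Tegra K1's Kepler GPU.
# Assumption (not from the article): each CUDA core retires one
# fused multiply-add per cycle, i.e. 2 FLOPs/cycle.

CORES = 192               # Kepler CUDA cores in Tegra K1
CLOCK_GHZ = 0.950         # reference clock from the spec table below
FLOPS_PER_CORE_CYCLE = 2  # 1 FMA = 2 FLOPs (assumed)

def peak_gflops(cores, clock_ghz, flops_per_cycle=FLOPS_PER_CORE_CYCLE):
    """Theoretical peak single-precision GFLOPS."""
    return cores * clock_ghz * flops_per_cycle

print(f"At 950 MHz: {peak_gflops(CORES, CLOCK_GHZ):.0f} GFLOPS peak")

# Peak scales linearly with clock, so a hypothetical lower retail clock
# (the actual shipping frequency is unknown) cuts it proportionally:
for mhz in (700, 600, 500):
    print(f"At {mhz} MHz: {peak_gflops(CORES, mhz / 1000):.0f} GFLOPS peak")
```

These are theoretical ceilings only; sustained numbers in a shipping phone will depend entirely on the thermal and power limits discussed above.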

| SoC | Version | Build process | CPU | GPU | OpenGL | eMMC | Memory interface | AnTuTu score |
|-----|---------|---------------|-----|-----|--------|------|------------------|--------------|
| Tegra K1 (Lenovo ThinkVision 28) | 32-bit | 28nm | Quad-core 2.3 GHz ARM Cortex A15 | 950 MHz 192-core Kepler | OpenGL 4.4 | v. 4.5 | 64-bit LPDDR3 | 43617 |
| Tegra K1 64-bit reference device | 64-bit | 28nm | Dual-core 2.5 GHz Project Denver | 950 MHz 192-core Kepler | OpenGL 4.4 | v. 4.5 | 2x64-bit LPDDR3 | 43851 |
| Snapdragon 801 MSM8974-AC (Galaxy S5) | 32-bit | 28nm | Quad-core 2.45 GHz Krait 400 | 578 MHz Adreno 330 | OpenGL ES 3.0 | v. 5.0 | 2x32-bit LPDDR3-1866 | 36469 |


There aren't many exhaustive benchmarks on K1 and Snapdragon 801, simply because they are quite new chips, with only a few devices carrying them. Still, the following chart should give you a pretty good overview of what to expect from Tegra K1 in comparison with Snapdragon 801, as well as some other current or upcoming chipsets.

AnTuTu scores of NVIDIA Tegra K1 reference platforms, compared with popular chipsets


Overall, as you can see from the benchmark chart above, the 32-bit and 64-bit versions of Tegra K1 are on equal footing when it comes to raw performance. Granted, they beat Snapdragon 801 by about 20% here, but let's not forget that the measurements have been done on K1 reference platforms, like the one you see in the video below, while Qualcomm's chipset is in actual smartphones at the moment.
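
The "about 20%" lead can be read straight off the AnTuTu column in the table above; here is the simple arithmetic as a small, illustrative snippet:

```python
# AnTuTu scores taken from the comparison table above.
scores = {
    "Tegra K1 32-bit (Lenovo ThinkVision 28)": 43617,
    "Tegra K1 64-bit reference device": 43851,
    "Snapdragon 801 MSM8974-AC (Galaxy S5)": 36469,
}

baseline = scores["Snapdragon 801 MSM8974-AC (Galaxy S5)"]
for name, score in scores.items():
    lead = (score - baseline) / baseline * 100
    print(f"{name}: {score} ({lead:+.1f}% vs Snapdragon 801)")
```

Both K1 variants land at roughly +20% over the Galaxy S5's score, which is where the figure in the paragraph above comes from.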

When NVIDIA shoehorns Tegra K1 into a smaller power envelope, suitable for phones and tablets, its general performance could very well even out with this year's Snapdragon crop. Granted, the Kepler GPU is likely to give it an advantage in the graphics department, but the Snapdragon SoC offers plenty of added value, like an integrated multiband LTE modem, which makes it the preferable choice for manufacturers. In short, we'd have to measure the performance of actual phones and tablets with Tegra K1 before we declare it a winner, even with a 192-core GPU, and even in comparison with Snapdragon 801, let alone the upcoming 805.
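
To get a feel for why reaching that sub-2W envelope takes more than simply lowering the clock, here is a crude, purely illustrative sketch built on the standard dynamic-power relation (power scales roughly with voltage squared times frequency). The 35-40 W figure was reported for the whole reference kit, not the SoC alone, so none of the resulting clock numbers are predictions about retail K1 silicon.

```python
# Purely illustrative: how much the K1's GPU clock (and voltage) would
# have to drop to move from the reference kit's reported power draw to
# a phone-class budget. Dynamic power scales roughly with V^2 * f.
# None of these numbers are measured K1 data.

P_REFERENCE_W = 37.5    # midpoint of the reported 35-40 W (whole kit)
P_TARGET_W = 2.0        # the sub-2 W envelope NVIDIA is talking about
F_REFERENCE_MHZ = 950   # GPU clock of the reference platforms

ratio = P_TARGET_W / P_REFERENCE_W
print(f"Power must shrink to {ratio:.1%} of the reference figure")

# Lowering frequency alone (voltage held constant): power scales with f.
print(f"Frequency scaling alone: ~{F_REFERENCE_MHZ * ratio:.0f} MHz")

# Lowering voltage roughly in step with frequency (V proportional to f):
# power then scales with f^3, so a far smaller clock cut suffices.
print(f"Frequency + voltage scaling: ~{F_REFERENCE_MHZ * ratio ** (1/3):.0f} MHz")
```

Either way, the takeaway is the same: the 950 MHz reference clock is unlikely to survive the move into phones unchanged, which is exactly why real-device benchmarks are needed before crowning a winner.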

53 Comments




posted on 21 Mar 2014, 10:30 5

1. CX3NT3_713 (Posts: 1676; Member since: 18 Apr 2011)


"silicon warriors" ( • Y •) ha

posted on 21 Mar 2014, 12:59 1

25. Ashoaib (Posts: 1432; Member since: 15 Nov 2013)


Good to have competition in processing power... as a result we will benefit from it... I hope for competition in better battery technology too

posted on 21 Mar 2014, 13:49

31. Zenzui (Posts: 71; Member since: 13 Feb 2012)


Well said!! All the benefits go to consumers from this healthy competition

posted on 22 Mar 2014, 05:40

44. garz_pa (Posts: 153; Member since: 03 Nov 2011)


Nice one!

posted on 21 Mar 2014, 10:34 2

2. GABIDEN (Posts: 26; Member since: 06 Mar 2013)


Sorry nvidia, Snapdragon wins

posted on 21 Mar 2014, 11:43

17. Finalflash (Posts: 1588; Member since: 23 Jul 2013)


Yea, tablet vs smartphone, and still barely winning.

posted on 21 Mar 2014, 10:38 7

3. ajac09 (Posts: 1362; Member since: 30 Sep 2009)


K1 would be nice if they could get some hardware partners..

posted on 21 Mar 2014, 10:38

4. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


Snapdragon wins. Qualcomm rocks Nvidia sucks

posted on 21 Mar 2014, 10:55 16

6. mr.techdude (Posts: 542; Member since: 19 Nov 2012)


qualcomm rocks in the mobile industry but no doubt nvidia is the king of graphic cards in the computer segment

posted on 21 Mar 2014, 11:31 2

14. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


I'm all about AMD Radeon GPUs. I hate nvidia. I had one of the mid-end GTX cards but it overheated horribly while playing GTA San Andreas (Seriously?). I also experienced framerate drops in certain games. Can you expect such performance from the supposed king of PC GPUs that costs $200? My Radeon cost less and delivers equivalent performance with no issues. Buying an nvidia product was my first and last mistake in PC hardware.

posted on 21 Mar 2014, 11:41 1

16. ajac09 (Posts: 1362; Member since: 30 Sep 2009)


I love AMD processors and I love the performance of the APU in my laptop, but Nvidia is king of graphics by far. Still, AMD, especially the APU, is doing well

posted on 21 Mar 2014, 11:58 2

18. mr.techdude (Posts: 542; Member since: 19 Nov 2012)


I agree with you 100%, AMD is awesome but I haven't used any of their cards yet. However, speaking about the GTX series, which one did you have? I'm using an Asus G750JH and it has a GTX 780M GPU. The cooling system on this laptop is ridiculously good - I'll play Assassin's Creed Black Flag for like 2-3 hours straight with no overheating. Sure, the graphics card heats up and that's normal, but it depends on how good your unit is at taking that heat out.

posted on 21 Mar 2014, 12:16 1

22. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


I don't know the exact model, but I was using it in a desktop with a massive fan. Anyway, I'm not gonna change to nvidia or upgrade my desktop. Currently playing on a PlayStation 3.

posted on 21 Mar 2014, 13:09 2

26. Deaconclgi (Posts: 231; Member since: 03 Nov 2012)


I had a Gigabyte R9 290x that I bought in December 2013 and had to get 2 RMAs due to the cards being unstable and crashing windows constantly. I even bought a new 800 watt power supply thinking that would fix it.

I finally went back to NVIDIA (previous cards were a 560 Ti and a 9600GT) in February and bought an EVGA 780 SC ACX and have not had any issues. I will most likely NEVER buy a Gigabyte card again, and I probably will not buy another AMD GPU when they are newly released.

I will however, continue to buy laptops with AMD APUs as they provide the best price to performance and I am hoping to see powerful AMD APUs in Windows tablets in the near future.

The R9 290x experience was horribly frustrating! I reinstalled windows 8, windows 7, swept the drivers, used beta drivers, used Mantle drivers, missed gaming for my entire 2 week Christmas vacation, missed Battlefield 4 bonuses because I couldn't play.

It wasn't the end of the world but it was a horrible experience considering it was a $600+ GPU and I never had ANY issues with my NVIDIA GPUs.

posted on 21 Mar 2014, 13:22 3

27. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


$600 lol i would never spend it on gpu i would rather buy a playstation 4 and few titles.

posted on 21 Mar 2014, 13:35 5

28. Deaconclgi (Posts: 231; Member since: 03 Nov 2012)


That is a great option as well. I don't play games often and when I do, it is on a PC. I used to be a serious gamer but that was back when I had a Dreamcast, Xbox, Playstation 2, Gamecube, PC, two TVs and it all connected to a 300 watt stereo in my bedroom.

A wife, 2 daughters, a busy career and many years later and I haven't bought a console for myself since the original Xbox.

Now, I basically play Battlefield 4 (used to play 3) and a host of other games such as Far Cry 2, Crysis 2, Assassins Creed Black Flag (free with my 780 SC ACX). I don't even keep up with games anymore beyond the Battlefield series.

My computer is a super powerful machine that is underused, but hey, when I do use it, it can handle any game that I buy... even if I only play one game about 4 times a month.

I did buy a Wii for my daughters a few years ago...they never play it....I messed up and bought Sims 3 for PC for them and they never looked back...sigh.

posted on 22 Mar 2014, 10:36

45. cezarepc (Posts: 560; Member since: 23 Nov 2012)


cool story bro :)

posted on 24 Jul 2014, 11:55

51. indianmelon (Posts: 2; Member since: 24 Jul 2014)


Joke's on you. The PS3 uses an nVidia GPU.

posted on 21 Mar 2014, 17:47 2

38. vincelongman (Posts: 1045; Member since: 10 Feb 2013)


It depends on which cooler your card has
For example, the Nvidia and AMD reference coolers aren't as good as the non-reference coolers, especially AMD's for their high-end cards; they're extremely loud and hot.

Personally, I chose an EVGA GTX 760 with EVGA's ACX cooler, because it was only $250 and came with AC4, SP: Blacklist, Batman: Arkham Origins and Metro: LL.
At the time the R9 270X was $230 and came with BF4.
So I got the 760 as it performs better and was better value because of the 4 free games

posted on 21 Mar 2014, 21:52

41. renz4 (Posts: 219; Member since: 10 Aug 2013)


hahaha

posted on 21 Mar 2014, 23:48

42. SuperMaoriBro (Posts: 268; Member since: 23 Jun 2012)


Qualcom rocks, nvidia rocks. Straightedgenexus sucks

posted on 22 Mar 2014, 05:16 1

43. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


Don't get personal here. I had all kinds of horrible experiences with nvidia. The overheating and framerate-dropping GPU, then the GPU of my crappy Lenovo laptop got screwed up for no reason.

posted on 21 Mar 2014, 10:42 1

5. Duketytz (Posts: 400; Member since: 28 Nov 2013)


Where's our beloved 805? Even if nvidia is better, finding a tablet/smartphone with their chips is kind of rare.

posted on 21 Mar 2014, 11:11 2

8. true1984 (Posts: 586; Member since: 23 May 2012)


asus always uses nvidia chips in their high end tablets

posted on 21 Mar 2014, 10:59 5

7. JMartin22 (Posts: 759; Member since: 30 Apr 2013)


At the end of the day, Qualcomm is the winner because it's better optimized, has greater features and is backed with more developer support.

posted on 21 Mar 2014, 11:19 4

9. true1984 (Posts: 586; Member since: 23 May 2012)


their dual core is running at 3GHz. maybe if you clocked an 800 at the same speed you might get the same results

posted on 21 Mar 2014, 11:21

12. brrunopt (Posts: 471; Member since: 15 Aug 2013)


What matters is power consumption vs. performance, not GHz vs. performance

posted on 21 Mar 2014, 15:09

32. true1984 (Posts: 586; Member since: 23 May 2012)


I was just speaking about the higher score it got; the power consumption is already pretty bad with nvidia chips

posted on 21 Mar 2014, 16:35

36. brrunopt (Posts: 471; Member since: 15 Aug 2013)


higher score ~ performance

but you are saying that a 3GHz S800 would achieve the same results, which makes no sense since it most likely wouldn't be a comparable product

posted on 21 Mar 2014, 18:23

40. true1984 (Posts: 586; Member since: 23 May 2012)


might. same as underclocking the k1 to 2.3 or 2.5

posted on 21 Mar 2014, 11:19 1

10. grahaman27 (Posts: 347; Member since: 05 Apr 2013)


K1 is better (even though I don't trust leaked benchmarks), but it doesn't have the integrated modem. For all you snapdragon fans, that's the only reason nvidia hasn't picked up steam.

posted on 21 Mar 2014, 12:02

19. mr.techdude (Posts: 542; Member since: 19 Nov 2012)


I'm no snap fan, but I find snapdragon to be way more efficient in battery consumption in contrast to the more powerful K1; that's why snap is recommended in mobile handsets, because nvidia is not that experienced in the mobile segment.

posted on 21 Mar 2014, 12:20 1

23. StraightEdgeNexus (Posts: 3074; Member since: 14 Feb 2014)


Crapxynos is the worst by far. Snapdragons are amazing, specifically the Krait-powered ones.

posted on 21 Mar 2014, 11:20

11. brrunopt (Posts: 471; Member since: 15 Aug 2013)


" as the reference kit was reportedly running at the whopping 35 to 40 watts. "

seriously ? reportedly ? i
s was a stupid comment from some idiot.. IT'S IMPOSSIBLE its drawing 35-40W , it would need a generous fan to keep that cooled down. Do you see any fan in there ? i don't...

posted on 21 Mar 2014, 13:43

29. TylerGrunter (Posts: 880; Member since: 16 Feb 2012)


The kit referred to is not the one in the video, and it actually had a fan.
The issue at hand is that the AnTuTu scores could have been obtained there, not on the tablet, so till real devices are out we'll never be sure how good the Tegra K1 is.

posted on 21 Mar 2014, 15:09 1

33. brrunopt (Posts: 471; Member since: 15 Aug 2013)


this is the kit they mention
http://i-cdn.phonearena.com/images/articles/113072-image/Notice-how-the-demo-box-is-actively-cooled.jpg

where do you see a fan ?

posted on 21 Mar 2014, 15:13 1

34. yowanvista (Posts: 299; Member since: 20 Sep 2011)


It does, however, feature a heatsink that keeps the temps low.

posted on 21 Mar 2014, 16:45 1

37. brrunopt (Posts: 471; Member since: 15 Aug 2013)


Yes, but in no way is it dissipating 35-40W...

the Asus Chromebox has a 15W Haswell SoC (initially Asus claimed it was fanless) but in reality has a small fan,

so yes, it's impossible to be using 35-40W with no fan...

posted on 21 Mar 2014, 17:53

39. vincelongman (Posts: 1045; Member since: 10 Feb 2013)


It's hard to compare the K1 to the 805 until Nvidia officially releases the K1 and there are phones/tablets with it.

But you are right that it's impossible to passively cool a 35-40W chip like that.
I have a W8.1 hybrid with an Intel i5-4210Y, which is about 12W, and it still has a little fan.

posted on 21 Mar 2014, 11:21 1

13. AfterShock (Posts: 2629; Member since: 02 Nov 2012)


So this comes with an extra long power cord to feed it?
I don't see how they'll make it power efficient and keep that GPU where an end user wants it, at full speed.
So no, I don't see this in phones, but in large tablets that have car batteries for a supply.

posted on 21 Mar 2014, 11:36

15. yowanvista (Posts: 299; Member since: 20 Sep 2011)


Reference units are one thing, but I really doubt that the GPU will be clocked at 950MHz on retail devices. It's just not going to work considering that the TDP will be rather high and the temperatures would very likely hit 50-60 degrees under load under such circumstances. The production variants will very likely have the GPU downclocked to a much lower dynamic frequency.

posted on 21 Mar 2014, 12:04

20. mr.techdude (Posts: 542; Member since: 19 Nov 2012)


Let alone battery swelling *cough SAM cough cough SUNG*

posted on 21 Mar 2014, 12:14

21. mr2009 (Posts: 25; Member since: 26 Mar 2013)


no matter how good nvidia is, i'll choose qualcomm, as I learned my lesson with the Atrix 4G. never again..

posted on 21 Mar 2014, 12:29

24. jove39 (Posts: 1262; Member since: 18 Oct 2011)


950MHz 192-core Kepler????

seriously, 950MHz? let's tone it down to a feasible frequency for a phone/tablet - the 500MHz range - and then compare again.

posted on 21 Mar 2014, 13:46 1

30. TylerGrunter (Posts: 880; Member since: 16 Feb 2012)


I understand what you mean, but Tegra 4 had a boost frequency of 672 MHz, so I would expect the K1 to be clocked north of 700 MHz in real devices.
No chance for a 500 MHz one.

posted on 21 Mar 2014, 15:14 1

35. brrunopt (Posts: 471; Member since: 15 Aug 2013)


since when is there a max MHz for a tablet to be feasible? they are based on different architectures..

posted on 23 Mar 2014, 21:14

46. imkyle (Posts: 980; Member since: 18 Nov 2010)


My Tegra 4 Note 7 is a beast. K1 is going to be epic.

posted on 24 Mar 2014, 12:06

47. linaresx (Posts: 82; Member since: 13 Jun 2013)


The new standard in definition is Pores/In².

posted on 25 Mar 2014, 14:35

48. Sadiq (Posts: 6; Member since: 17 Jan 2014)


K1 loves to drain batteries :p

posted on 02 Apr 2014, 19:15

49. renz4 (Posts: 219; Member since: 10 Aug 2013)


it depends on how the manufacturer wants to implement the SoC into their product. but nvidia demonstrated that, at the same performance level as the iPhone 5s and Xperia Z Ultra (representing Snapdragon 800 devices), TK1 will consume less power to get the same level of performance.

posted on 17 Jun 2014, 16:47

50. SabiXXX (Posts: 1; Member since: 17 Jun 2014)


I have an Asus TF701T with Tegra 4 and an ASUS G74SX with an Nvidia 560M 3GB. Nvidia is great for my laptop. It's a 3-year-old laptop and I play Watch Dogs on all max, only antialiasing on 4x instead of 8x. Really hard to make it overheat, unlike my tablet. The tablet runs great but heats up fast on Dead Trigger 2. Battery lasts long when I do everyday tasks, however it drains fast when gaming. Tegra 4 runs everything on max for 10 mins before it overheats, then it goes on but lags are present from time to time and it's hot. Right now it's bang for the buck.

posted on 24 Jul 2014, 11:57

52. indianmelon (Posts: 2; Member since: 24 Jul 2014)


this reference tablet looks really similar to the shield tablet they just put out.

posted on 05 Aug 2014, 01:53

53. KMH520 (Posts: 1; Member since: 05 Aug 2014)


"In short, we'd have to measure the performance of actual phones and tablets with Tegra K1, before we declare it a winner, even with a 192-core GPU, and even in comparison with Snapdragon 801, let alone the upcoming 805."

Well, now that the Nvidia Shield Tablet is out, the performance has been measured and the K1 trounces the Snapdragon 805 in every benchmark. You can now declare it a winner.
