Silicon warriors: Snapdragon 801 vs NVIDIA Tegra K1

Qualcomm used the MWC expo to announce its new Snapdragon 801 family of processors, which power spring flagships like the Galaxy S5 and the Xperia Z2. We already compared it to the previous Snapdragon 800 family, and now it's time for a cage match with another contender for glory this year - Nvidia's new Tegra K1 mobile processor.

Tegra K1 is expected to land in Android devices before we dive deep into the summer, and it shapes up to be a monster chipset, with a 64-bit version on the way, too. We aren't even talking about the quad-core Cortex A15 processor here (plus a low-power companion core), as the beauty lies in the 192-core Kepler GPU. Yep, you read that right: desktop-grade Kepler graphics are coming in mobile form, whether you need that much pixel-pushing for Android games or not. You'll certainly need it for 4K video rendering, or for shenanigans like the real-time facial expression overlay you can see in our K1 hands-on video below.

The Adreno 330 GPU in Snapdragon 801, however, runs at 578 MHz, while the Kepler GPU in Tegra K1 is clocked at 950 MHz, so there might be issues with power consumption and heat. Nvidia claims that, while the K1 outperforms both the Xbox 360 and the PS3, it consumes much less energy. The power consumption might still need work when it comes to mobile devices, though, as the reference kit was reportedly running at a whopping 35 to 40 watts. Nvidia says it will be able to shoehorn the K1 into the sub-2W category under non-peak loads, which should put it on a fairly equal footing with Snapdragon 801.
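To put those numbers side by side, here is a minimal back-of-the-envelope sketch; the clock speeds and wattage figures are simply the ones quoted above, and the sub-2 W target is Nvidia's own claim rather than anything measured:

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
adreno_330_mhz = 578   # Snapdragon 801 GPU clock
kepler_k1_mhz = 950    # Tegra K1 GPU clock

clock_ratio = kepler_k1_mhz / adreno_330_mhz
print(f"K1's GPU is clocked about {(clock_ratio - 1) * 100:.0f}% higher")  # ~64%

# Nvidia's reference kit reportedly drew 35-40 W; the stated mobile target is sub-2 W.
dev_kit_watts_low, dev_kit_watts_high = 35, 40
mobile_target_watts = 2
print(f"The dev kit draws roughly {dev_kit_watts_low / mobile_target_watts:.0f}x to "
      f"{dev_kit_watts_high / mobile_target_watts:.0f}x the mobile power target")
```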

SoC | Version | Build process | CPU | GPU | OpenGL | eMMC | Memory interface | AnTuTu score
Tegra K1 (Lenovo ThinkVision 28) | 32-bit | 28nm | Quad-core 2.3 GHz ARM Cortex A15 | 950 MHz 192-core Kepler | OpenGL 4.4 | v. 4.5 | 64-bit LPDDR3 | 43617
Tegra K1 64-bit reference device | 64-bit | 28nm | Dual-core 2.5 GHz Project Denver | 950 MHz 192-core Kepler | OpenGL 4.4 | v. 4.5 | 2x64-bit LPDDR3 | 43851
Snapdragon 801 MSM8974-AC (Galaxy S5) | 32-bit | 28nm | Quad-core 2.45 GHz Krait 400 | 578 MHz Adreno 330 | OpenGL 3.0 | v. 5.0 | 2x32-bit LPDDR3-1866 | 36469


There aren't many exhaustive benchmarks of the K1 and Snapdragon 801, simply because they are quite new chips, with only a few devices carrying them. Still, the following chart should give you a pretty good overview of what to expect from Tegra K1 in comparison with Snapdragon 801, as well as some other current or upcoming chipsets.


Overall, as you can see from the benchmark chart above, the 32-bit and 64-bit versions of Tegra K1 are on equal footing when it comes to raw performance. Granted, they beat Snapdragon 801 by about 20% here, but let's not forget that the measurements were done on K1 reference platforms, like the one you see in the video below, while Qualcomm's chipset is already shipping in actual smartphones.
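That roughly 20% figure follows directly from the AnTuTu scores in the table above; a quick sketch of the arithmetic, assuming nothing beyond those three numbers:

```python
# AnTuTu scores taken straight from the comparison table above.
scores = {
    "Tegra K1 32-bit (Lenovo ThinkVision 28)": 43617,
    "Tegra K1 64-bit reference device": 43851,
    "Snapdragon 801 MSM8974-AC (Galaxy S5)": 36469,
}

baseline = scores["Snapdragon 801 MSM8974-AC (Galaxy S5)"]
for name, score in scores.items():
    lead = (score / baseline - 1) * 100
    print(f"{name}: {score} ({lead:+.1f}% vs Snapdragon 801)")
# Both K1 variants land roughly 20% ahead of the Snapdragon 801 in the Galaxy S5.
```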

When NVIDIA shoehorns Tegra K1 into a smaller power envelope, suitable for phones and tablets, its general performance could very well even out with this year's Snapdragon crop. Granted, the Kepler GPU is likely to give it an advantage in the graphics department, but the Snapdragon SoC offers plenty of value of its own, like an integrated multiband LTE modem, which makes it the preferable choice for manufacturers. In short, we'd have to measure the performance of actual phones and tablets with Tegra K1 before we declare it a winner, even with a 192-core GPU, and even in comparison with Snapdragon 801, let alone the upcoming 805.

53 Comments

1. CX3NT3_713

Posts: 2363; Member since: Apr 18, 2011

"silicon warriors" ( • Y •) ha

25. Ashoaib

Posts: 3309; Member since: Nov 15, 2013

Good to have competition in processing power... as a result we will benefit from it... I hope for competition in better battery technology too

31. Zenzui

Posts: 114; Member since: Feb 13, 2012

Well said!! All the benefits of this healthy competition go to the consumer

44. garz_pa

Posts: 154; Member since: Nov 03, 2011

Nice one!

2. GABIDEN

Posts: 26; Member since: Mar 06, 2013

Sorry nvidia, Snapdragon wins

17. Finalflash

Posts: 4063; Member since: Jul 23, 2013

Yea, tablet vs smartphone, and still barely winning.

3. ajac09

Posts: 1482; Member since: Sep 30, 2009

K1 would be nice if they could get some hardware partners..

4. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

Snapdragon wins. Qualcomm rocks Nvidia sucks

6. mr.techdude

Posts: 571; Member since: Nov 19, 2012

qualcomm rocks in the mobile industry but no doubt nvidia is the king of graphic cards in the computer segment

14. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

I'm all about AMD Radeon GPUs. I hate nvidia. I had one of the mid-end GTX cards but it overheated horribly while playing GTA San Andreas (seriously?). I also experienced some framerate drops playing certain games. Can you expect such performance from the leading PC GPU maker, for a card that costs $200? My Radeon cost less, gives equivalent performance, and has no issues. Buying an nvidia product was my first and last mistake in PC hardware.

16. ajac09

Posts: 1482; Member since: Sep 30, 2009

I love AMD processors and I love the performance of the APU in my laptop, but Nvidia is king of graphics by far. AMD, especially the APU, is doing well though

18. mr.techdude

Posts: 571; Member since: Nov 19, 2012

I agree with you 100%. AMD is awesome, but I haven't used any of their cards yet. Speaking of the GTX series, which one did you have? I'm using an Asus G750JH and it has a GTX 780M GPU. The cooling system on this laptop is ridiculously good - I'll play Assassin's Creed Black Flag for like 2.3 hours straight with no overheating. Sure, the graphics card heats up and that's normal, but it depends on how good your unit is at getting that heat out.

22. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

I don't know the exact model, but I was using it in a desktop with a massive fan. Anyway, I'm not gonna change to nvidia or upgrade my desktop. Currently playing on a PlayStation 3.

26. Deaconclgi

Posts: 405; Member since: Nov 03, 2012

I had a Gigabyte R9 290x that I bought in December 2013 and had to get 2 RMAs due to the cards being unstable and crashing Windows constantly. I even bought a new 800 watt power supply thinking that would fix it. I finally went back to NVIDIA (previous cards were a 560 Ti and a 9600GT) in February and bought an EVGA 780 SC ACX and have not had any issues. I will most likely NEVER buy a Gigabyte card again and I probably will not buy another AMD GPU when they are newly released. I will, however, continue to buy laptops with AMD APUs, as they provide the best price to performance, and I am hoping to see powerful AMD APUs in Windows tablets in the near future. The R9 290x experience was horribly frustrating! I reinstalled Windows 8 and Windows 7, swept the drivers, used beta drivers, used Mantle drivers, missed gaming for my entire 2-week Christmas vacation, and missed Battlefield 4 bonuses because I couldn't play. It wasn't the end of the world, but it was a horrible experience considering it was a $600+ GPU, and I never had ANY issues with my NVIDIA GPUs.

27. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

$600, lol. I would never spend that on a GPU; I would rather buy a PlayStation 4 and a few titles.

28. Deaconclgi

Posts: 405; Member since: Nov 03, 2012

That is a great option as well. I don't play games often, and when I do, it is on a PC. I used to be a serious gamer, but that was back when I had a Dreamcast, Xbox, PlayStation 2, GameCube, PC, two TVs, and it all connected to a 300 watt stereo in my bedroom. A wife, 2 daughters, a busy career and many years later, and I haven't bought a console for myself since the original Xbox. Now I basically play Battlefield 4 (used to play 3) and a host of other games such as Far Cry 2, Crysis 2, and Assassin's Creed Black Flag (free with my 780 SC ACX). I don't even keep up with games anymore beyond the Battlefield series. My computer is a super powerful machine that is underused, but hey, when I do use it, it can handle any game that I buy... even if I only play one game about 4 times a month. I did buy a Wii for my daughters a few years ago... they never play it... I messed up and bought Sims 3 for PC for them and they never looked back... sigh.

45. cezarepc

Posts: 718; Member since: Nov 23, 2012

cool story bro :)

51. indianmelon

Posts: 2; Member since: Jul 24, 2014

Joke's on you. The PS3 uses an nVidia GPU.

38. vincelongman

Posts: 5762; Member since: Feb 10, 2013

It depends on which cooler your card has. For example, the Nvidia and AMD reference coolers aren't as good as the non-reference coolers, especially AMD's on their high-end cards; they're extremely loud and hot. Personally, I chose an EVGA GTX 760 with EVGA's ACX cooler, because it was only $250 and came with AC4, SC: Blacklist, Batman: Arkham Origins and Metro: LL. At the time the R9 270X was $230 and came with BF4. So I got the 760, as it performs better and was better value because of the 4 free games.

41. renz4

Posts: 319; Member since: Aug 10, 2013

hahaha

42. SuperMaoriBro

Posts: 533; Member since: Jun 23, 2012

Qualcomm rocks, nvidia rocks. Straightedgenexus sucks

43. StraightEdgeNexus

Posts: 3689; Member since: Feb 14, 2014

Don't get personal here. I had all kinds of horrible experiences with nvidia. The overheating and framerate-dropping GPU, then the GPU of my crappy Lenovo laptop got screwed up for no reason.

5. Duketytz

Posts: 534; Member since: Nov 28, 2013

Where's our beloved 805? Even if nvidia is better, finding a tablet/smartphone with their chips is kind of rare.

8. true1984

Posts: 870; Member since: May 23, 2012

asus always uses nvidia chips in their high end tablets

7. JMartin22

Posts: 2392; Member since: Apr 30, 2013

At the end of the day, Qualcomm is the winner because it's better optimized, has greater features and is backed with more developer support.

9. true1984

Posts: 870; Member since: May 23, 2012

Their dual-core is running at 3 GHz. Maybe if you clocked an 800 at the same speed you might get the same results

12. brrunopt

Posts: 742; Member since: Aug 15, 2013

What matters is power consumption vs. performance, not GHz vs. performance

32. true1984

Posts: 870; Member since: May 23, 2012

I was just speaking about the higher score it got; the power consumption is already pretty bad with nvidia chips

36. brrunopt

Posts: 742; Member since: Aug 15, 2013

Higher score ~ performance, but you are saying that a 3 GHz S800 would achieve the same results, which makes no sense since it most likely wouldn't be a comparable product

40. true1984

Posts: 870; Member since: May 23, 2012

Might. Same as underclocking the K1 to 2.3 or 2.5
