Silicon warriors: Snapdragon 801 vs NVIDIA Tegra K1
|Chipset|Build process|CPU|GPU|OpenGL|eMMC|Memory interface|AnTuTu score|
|---|---|---|---|---|---|---|---|
|Tegra K1 32-bit (Lenovo ThinkVision 28)|28nm|Quad-core 2.3 GHz ARM Cortex A15|950MHz|OpenGL 4.4|v. 4.5|64-bit LPDDR3|43617|
|Tegra K1 64-bit reference device|28nm|Dual-core 2.5 GHz Project Denver|950MHz|OpenGL 4.4|v. 4.5|2x64-bit LPDDR3|43851|
|Snapdragon 801 MSM8974-AC|28nm|Quad-core 2.45 GHz Krait 400|578MHz|OpenGL ES 3.0|v. 5.0|2x32-bit LPDDR3-1866|36469|
AnTuTu scores of NVIDIA Tegra K1 reference platforms, compared with popular chipsets
25. Ashoaib (Posts: 1221; Member since: 15 Nov 2013)
Good to have competition in processing power... as a result, we will benefit from it... I hope for competition in better battery technology too
31. Zenzui (Posts: 68; Member since: 13 Feb 2012)
Well said!! All the benefits of this healthy competition go to consumers
17. Finalflash (Posts: 1471; Member since: 23 Jul 2013)
Yea, tablet vs smartphone, and still barely winning.
3. ajac09 (Posts: 1346; Member since: 30 Sep 2009)
K1, if they could get some hardware partners, would be nice..
4. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
Snapdragon wins. Qualcomm rocks Nvidia sucks
6. mr.techdude (Posts: 536; Member since: 19 Nov 2012)
Qualcomm rocks in the mobile industry, but no doubt Nvidia is the king of graphics cards in the computer segment
14. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
I'm all about AMD Radeon GPUs. I hate Nvidia. I had one of their mid-range GTX cards, but it overheated horribly while playing GTA San Andreas (seriously?). I also experienced framerate drops in certain games. Can you expect that from the leading maker of PC GPUs, on a card that costs $200? My Radeon cost less, with equivalent performance and no issues. Buying an Nvidia product was my first and last mistake in PC hardware.
16. ajac09 (Posts: 1346; Member since: 30 Sep 2009)
I love AMD processors and I love the performance of the APU in my laptop, but Nvidia is king of graphics by far. Still, AMD, especially the APU, is doing well
18. mr.techdude (Posts: 536; Member since: 19 Nov 2012)
I agree with you 100%. AMD is awesome, but I haven't used any of their cards yet. Speaking of the GTX series, though, which one did you have? I'm using an Asus G750JH and it has a GTX 780M GPU. The cooling system on this laptop is ridiculously good; I'll play Assassin's Creed Black Flag for like 2.3 hours straight with no overheating. Sure, the graphics card heats up, and that's normal, but it depends on how good your unit is at getting that heat out.
22. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
I don't know the exact model, but I was using it in a desktop with a massive fan. Anyway, I'm not gonna change to Nvidia or upgrade my desktop. Currently playing on a PlayStation 3.
26. Deaconclgi (Posts: 211; Member since: 03 Nov 2012)
I had a Gigabyte R9 290X that I bought in December 2013 and had to get 2 RMAs because the cards were unstable and constantly crashing Windows. I even bought a new 800-watt power supply thinking that would fix it.
I finally went back to NVIDIA (previous cards were a 560 Ti and a 9600GT) in February and bought an EVGA 780 SC ACX, and I have not had any issues. I will most likely NEVER buy a Gigabyte card again, and I probably will not buy another AMD GPU when they are newly released.
I will, however, continue to buy laptops with AMD APUs, as they provide the best price-to-performance, and I am hoping to see powerful AMD APUs in Windows tablets in the near future.
The R9 290X experience was horribly frustrating! I reinstalled Windows 8 and Windows 7, swept the drivers, used beta drivers, used Mantle drivers, missed gaming for my entire 2-week Christmas vacation, and missed Battlefield 4 bonuses because I couldn't play.
It wasn't the end of the world, but it was a horrible experience considering it was a $600+ GPU, and I never had ANY issues with my NVIDIA GPUs.
27. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
$600? Lol, I would never spend that on a GPU; I would rather buy a PlayStation 4 and a few titles.
28. Deaconclgi (Posts: 211; Member since: 03 Nov 2012)
That is a great option as well. I don't play games often and when I do, it is on a PC. I used to be a serious gamer but that was back when I had a Dreamcast, Xbox, Playstation 2, Gamecube, PC, two TVs and it all connected to a 300 watt stereo in my bedroom.
A wife, 2 daughters, a busy career and many years later and I haven't bought a console for myself since the original Xbox.
Now, I basically play Battlefield 4 (used to play 3) and a host of other games such as Far Cry 2, Crysis 2, Assassins Creed Black Flag (free with my 780 SC ACX). I don't even keep up with games anymore beyond the Battlefield series.
My computer is a super powerful machine that is underused, but hey, when I do use it, it can handle any game that I buy... even if I only play one game about 4 times a month.
I did buy a Wii for my daughters a few years ago...they never play it....I messed up and bought Sims 3 for PC for them and they never looked back...sigh.
51. indianmelon (Posts: 2; Member since: 24 Jul 2014)
Joke's on you. The PS3 uses an nVidia GPU.
38. vincelongman (Posts: 957; Member since: 10 Feb 2013)
It depends on which cooler your card has.
For example, the Nvidia and AMD reference coolers aren't as good as the non-reference coolers, especially AMD's on their high-end cards; they're extremely loud and hot.
Personally, I chose an EVGA GTX 760 with EVGA's ACX cooler, because it was only $250 and came with AC4, SP: Blacklist, Batman: Arkham Origins and Metro: LL.
At the time, the R9 270X was $230 and came with BF4.
So I got the 760, as it performs better and was better value because of the 4 free games.
42. SuperMaoriBro (Posts: 258; Member since: 23 Jun 2012)
Qualcom rocks, nvidia rocks. Straightedgenexus sucks
43. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
Don't get personal here. I had all kinds of horrible experiences with Nvidia: the overheating, framerate-dropping GPU, and then the GPU of my crappy Lenovo laptop got screwed up for no reason.
5. Duketytz (Posts: 333; Member since: 28 Nov 2013)
Where's our beloved 805? Even if Nvidia is better, tablets/smartphones with their chips are kind of rare.
8. true1984 (Posts: 582; Member since: 23 May 2012)
Asus always uses Nvidia chips in its high-end tablets
7. JMartin22 (Posts: 713; Member since: 30 Apr 2013)
At the end of the day, Qualcomm is the winner because it's better optimized, has more features, and is backed by more developer support.
9. true1984 (Posts: 582; Member since: 23 May 2012)
Their dual-core is running at 3GHz. Maybe if you clocked an 800 at the same speed you might get the same results.
12. brrunopt (Posts: 405; Member since: 15 Aug 2013)
What matters is performance per watt, not performance per GHz.
32. true1984 (Posts: 582; Member since: 23 May 2012)
I was just talking about the higher score it got; power consumption is already pretty bad with Nvidia chips
36. brrunopt (Posts: 405; Member since: 15 Aug 2013)
Higher score ~ performance.
But you are saying that a 3GHz S800 would achieve the same results, which makes no sense, since it most likely wouldn't be a comparable product.
40. true1984 (Posts: 582; Member since: 23 May 2012)
It might. Same as underclocking the K1 to 2.3 or 2.5.
10. grahaman27 (Posts: 345; Member since: 05 Apr 2013)
The K1 is better (even though I don't trust leaked benchmarks), but it doesn't have an integrated modem. For all you Snapdragon fans, that's the only reason Nvidia hasn't picked up steam.
19. mr.techdude (Posts: 536; Member since: 19 Nov 2012)
I'm no Snapdragon fan, but I find Snapdragon way more efficient in battery consumption than the more powerful K1. That's why Snapdragon is recommended in mobile handsets; Nvidia is not that experienced on the mobile side.
23. StraightEdgeNexus (Posts: 2706; Member since: 14 Feb 2014)
Crapxynos is the worst by far. Snapdragons are amazing, specifically the Krait-powered ones.
11. brrunopt (Posts: 405; Member since: 15 Aug 2013)
" as the reference kit was reportedly running at the whopping 35 to 40 watts. "
Seriously? Reportedly? It was a stupid comment from some idiot... IT'S IMPOSSIBLE it's drawing 35-40W; it would need a generous fan to keep that cooled down. Do you see any fan in there? I don't...
29. TylerGrunter (Posts: 866; Member since: 16 Feb 2012)
The kit referred to is not the one in the video, and it actually had a fan.
The issue at hand is that the AnTuTu scores could have been run there, not on the tablet, so until real devices are out we'll never be sure how good the Tegra K1 is.
33. brrunopt (Posts: 405; Member since: 15 Aug 2013)
This is the kit they mention.
Where do you see a fan?
34. yowanvista (Posts: 288; Member since: 20 Sep 2011)
It does, however, feature a heatsink that keeps the temps low.
37. brrunopt (Posts: 405; Member since: 15 Aug 2013)
Yes, but in no way is it dissipating 35-40W...
The Asus Chromebox has a 15W Haswell SoC (initially Asus claimed it was fanless), but in reality it has a small fan.
So yes, it's impossible to be drawing 35-40W with no fan...
39. vincelongman (Posts: 957; Member since: 10 Feb 2013)
It's hard to compare the K1 to the 805 until Nvidia officially releases the K1 and there are phones/tablets with it.
But you're right that it's impossible to passively cool a 35-40W CPU like that.
I have a W8.1 hybrid with an Intel i5-4210Y, which is about 12W, and it still has a little fan.
13. AfterShock (Posts: 2214; Member since: 02 Nov 2012)
So this comes with an extra-long power cord to feed it?
I don't see how they'll make it power efficient and keep that GPU where an end user wants it, at full speed.
So no, I don't see this in phones, but in large tablets that have car batteries for a supply.
15. yowanvista (Posts: 288; Member since: 20 Sep 2011)
Reference units are one thing, but I really doubt the GPU will be clocked at 950MHz on retail devices. It's just not going to work, considering that the TDP would be rather high and the temperatures would very likely hit 50-60 degrees under load in such circumstances. The production variants will very likely have the GPU downclocked to a much lower dynamic frequency.
20. mr.techdude (Posts: 536; Member since: 19 Nov 2012)
Let alone battery swelling *cough SAM cough cough SUNG*
21. mr2009 (Posts: 24; Member since: 26 Mar 2013)
No matter how good Nvidia is, I'll choose Qualcomm, as I learned my lesson with the Atrix 4G. Never again..
24. jove39 (Posts: 1252; Member since: 18 Oct 2011)
950MHz 192-core Kepler????
Seriously, 950MHz? Let's tone it down to a feasible phone/tablet frequency, the 500MHz range... and then compare again.
30. TylerGrunter (Posts: 866; Member since: 16 Feb 2012)
I understand what you mean, but Tegra 4 had a boost frequency of 672 MHz, so I would expect the K1 to be clocked north of 700 MHz in real devices.
No chance of a 500 MHz one.
35. brrunopt (Posts: 405; Member since: 15 Aug 2013)
Since when is there a max MHz for a tablet to be feasible? They are based on different architectures..
46. imkyle (Posts: 970; Member since: 18 Nov 2010)
My Tegra 4 Note 7 is a beast. K1 is going to be epic.
47. linaresx (Posts: 78; Member since: 13 Jun 2013)
The new standard in definition is Pores/In².
49. renz4 (Posts: 207; Member since: 10 Aug 2013)
It depends on how the manufacturer wants to implement the SoC in their product. But Nvidia demonstrated that, at the same performance level as the iPhone 5s and Xperia Z Ultra (representing Snapdragon 800 devices), the TK1 will consume less power.
50. SabiXXX (Posts: 1; Member since: 17 Jun 2014)
I have an Asus TF701T with Tegra 4 and an Asus G74SX with an Nvidia 560M 3GB. Nvidia is great for my laptop: it's 3 years old and I play Watch Dogs on all max, only antialiasing at 4x instead of 8x. It's really hard to make it overheat, unlike my tablet, which runs great but heats up fast on Dead Trigger 2. The battery lasts long when I do everyday tasks, but drains fast when gaming. Tegra 4 runs everything on max for 10 minutes before it overheats; then it keeps going, but lags show up from time to time and it's hot. Right now it's bang for the buck.
52. indianmelon (Posts: 2; Member since: 24 Jul 2014)
This reference tablet looks really similar to the Shield Tablet they just put out.