
Nvidia Tegra K1 smokes the iPad Air and Snapdragon 800 in graphics benchmarks

Posted by Daniel P.


Nvidia dropped a bombshell announcement with its new Tegra K1 chipset last week, as it ushered in the era of Cortex-A15 processors and Kepler graphics for mobile. The 192-core GPU is certainly its main selling point, and now we have the benchmarks to gauge exactly how much better it will be than anything else currently on the market.

As you can see in the GFXBench score table below, not only does it score 2.5x better in graphics performance than the current cream of the Qualcomm crop, the Snapdragon 800, but it is also more than twice as fast as the blazing A7 chip in the new iPad Air. In addition, it scores a tad higher than Intel's new HD 4400 integrated graphics that come with the Haswell processor line, which is an amazing feat in itself, given that the K1 will soon find itself in slim and light mobile devices.
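For reference, the multiples quoted above are simply ratios of the offscreen frame rates in the chart. A quick sketch, where the fps values are placeholders chosen to reproduce the quoted multiples, not the actual GFXBench scores:

```python
# Illustrative sketch: the "2.5x" and "more than twice as fast" figures
# are just ratios of offscreen benchmark scores.
# NOTE: these fps values are placeholders, NOT the real GFXBench results.
scores_fps = {
    "Nvidia Tegra K1": 60.0,      # placeholder
    "Snapdragon 800": 24.0,       # placeholder
    "Apple A7 (iPad Air)": 27.0,  # placeholder
}

k1 = scores_fps["Nvidia Tegra K1"]
for chip, fps in scores_fps.items():
    if chip != "Nvidia Tegra K1":
        # e.g. 60 / 24 = 2.5x, 60 / 27 ≈ 2.2x
        print(f"K1 scores {k1 / fps:.1f}x the {chip}")
```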

source: Wccftech

50 Comments
posted on 13 Jan 2014, 02:09 6

1. tech2 (Posts: 2053; Member since: 26 Oct 2012)


Any word on when it's hitting the conveyor belt, or when the earliest device is launching?

Don't let us down in the battery department this time, Nvidia! There is a reason why Exynos uses the big.LITTLE architecture with its A15s: they're too power hungry.

posted on 13 Jan 2014, 04:47 1

19. brrunopt (Posts: 506; Member since: 15 Aug 2013)


That's why Nvidia uses the 4+1 configuration...

posted on 13 Jan 2014, 05:30 1

22. Shubham412302 (Posts: 298; Member since: 09 Nov 2011)


The Adreno 420 is no match for the K1, as it's only said to be 40% faster than the Adreno 330, while the K1 is claimed to be many times more powerful than the A7 running Unreal Engine 4 technology.

posted on 13 Jan 2014, 09:28 1

35. PhenomFaz (Posts: 1070; Member since: 26 Sep 2012)


Yup, battery life is gonna be crucial... but in the meantime I've got one word for Nvidia...
DAMN! :)

posted on 13 Jan 2014, 02:11 16

2. darkskoliro (Posts: 970; Member since: 07 May 2012)


What the djsjaoka, I just bought an ultrabook with the HD 4400 and I'm getting outperformed by a mobile device? -.-

posted on 13 Jan 2014, 02:17 14

3. hurrycanger (Posts: 1118; Member since: 01 Dec 2013)


I know right xD
I have a desktop with hd2000. Guess how awesome I feel...

posted on 13 Jan 2014, 03:23

13. hipnotika (Posts: 298; Member since: 06 Mar 2013)


Because Intel HD graphics is the worst GPU on the planet.

posted on 13 Jan 2014, 04:50 1

20. brrunopt (Posts: 506; Member since: 15 Aug 2013)


So you haven't seen the HD 5200, which performs better than the GT740M.

@darkskoliro
The HD 4400 is not a powerful GPU; it wasn't designed to be a powerful GPU, just powerful enough for normal use...

posted on 13 Jan 2014, 05:36 1

23. yowanvista (Posts: 304; Member since: 20 Sep 2011)


Any dedicated Kepler card beats Intel's integrated graphics; the Iris Pro 5200 relies on system RAM for video memory, which is way slower than the GDDR3/5 found on any modern card. In theory the GT740M outperforms the 5200, but both will run games at less than 10fps maxed out.

posted on 13 Jan 2014, 06:46

29. brrunopt (Posts: 506; Member since: 15 Aug 2013)


1st - having dedicated memory doesn't mean it will have better performance than any other card with shared memory.

2nd - the HD 5200 has 128MB of eDRAM, which is faster than GDDR5 afaik, and only high-end cards (GTX ...) have GDDR5; the GT740M has DDR3.

The HD 5200 performs between the GT740M and GT750M, that's a fact...

posted on 14 Jan 2014, 06:28 1

43. joaolx (Posts: 353; Member since: 16 Aug 2011)


http://www.game-debate.com/gpu/index.php?gid=1783&gid2=1247&compare=iris-pro-graphics-5200-mobile-vs-geforce-gt-740m

If it has dedicated memory, it can actually mean it performs better. Since Intel HD uses system RAM, the CPU can't use the full memory and bandwidth, because the GPU is also using it. Now the interesting thing is that the HD 5200 has 128MB of eDRAM, which is basically nothing, but it allows it to perform equally to or better than the GT640M. The GT740M is equal to the DDR3 model of the GT650M. Both two years old.
But let's not forget that the HD 5200 is Iris Pro and can only be found on high-end CPUs. I have a high-end quad-core i7 and I actually don't have it, because it's reserved for the very highest end due to the eDRAM. Without the eDRAM it's basically an HD 5100.

posted on 14 Jan 2014, 14:04

50. brrunopt (Posts: 506; Member since: 15 Aug 2013)


You can compare specifications all you want; what matters is the results. And the results don't lie: on average the HD 5200 not only performs better than the GT640M but also above the GT740M (in some games above the GT750M).
And the HD 5100 (no dedicated memory) performs at the level of a GT710.
http://www.notebookcheck.net/Computer-Games-on-Laptop-Graphic-Cards.13849.0.html

There are laptops with an i7 QM + HD 5200 for around the same price as with an i7 QM + GT740M / GT750M.

posted on 14 Jan 2014, 06:31

44. joaolx (Posts: 353; Member since: 16 Aug 2011)


The main effect of an integrated GPU is that it won't allow the CPU to use the whole RAM and its bandwidth, decreasing its maximum performance in games. The Intel HD 5200 uses 128MB of eDRAM, but that's still not enough to beat it.

posted on 14 Jan 2014, 06:48

49. darkskoliro (Posts: 970; Member since: 07 May 2012)


I'm actually satisfied with the HD 4400 :) It runs my Dota 2 smoothly on medium graphics, which was better than I expected hahaha.

posted on 13 Jan 2014, 02:19 4

4. JMartin22 (Posts: 1031; Member since: 30 Apr 2013)


The fact is that it's ARMv7 architecture, which is outdated. Cortex-A15 is yesteryear's technology. The ARMv8-based Cortex-A53/A57 will be much more power efficient and powerful, respectively. The 192 graphics cores are marketing bells and whistles. The fact is, for all of its power, it's still outdated.

posted on 13 Jan 2014, 02:40 5

8. Retro-touch (Posts: 257; Member since: 24 Oct 2011)


The SoC will come in two versions, one version with a quad-core (4+1) Cortex-A15, and one that leverages two of NVIDIA’s own 64-bit ARMv8 Denver CPUs

posted on 13 Jan 2014, 02:44 11

9. yowanvista (Posts: 304; Member since: 20 Sep 2011)


Cortex-A53 and higher are primarily aimed at high-performance systems like servers. In fact the Cortex-A53 has the same pipeline length as the Cortex-A7, and the real-time version of ARMv8 (ARMv8-R) is still 32-bit on any Cortex-A5x core. ARMv7 is far from being outdated, nor is the revised Cortex-A15 (r3p3) found in the K1. The architectural changes brought by AArch64 are only incremental additions and will in no way produce a significantly more power efficient core design. The Cortex-A5x series was designed with performance in mind.

Please enlighten me, how are those 192 shaders specifically outdated? You do realize that the Tegra K1 has 1 SMX comprising 192 shaders? Think of it as a downscaled version of a Kepler-based GPU like the GTX 780, which has 12 SMX. How is that 'outdated'?

posted on 13 Jan 2014, 02:57 2

11. Reality_Check (Posts: 269; Member since: 15 Aug 2013)


Some people like to think they know everything. Ignorance.. SMH

posted on 13 Jan 2014, 06:45

28. AppleHateBoy (unregistered)


IMHO, it is outdated. The GTX 680 launched way back in March 2012. The Kepler architecture is almost 2 years old now, and 2 years is a long time at the pace the mobile industry is growing.

Fortunately, it's at least newer than the PowerVR Series 6 GPU powering the Apple A7, which was announced in Feb 2011.

posted on 13 Jan 2014, 07:14 2

33. brrunopt (Posts: 506; Member since: 15 Aug 2013)


Exactly, 2 years old and still more powerful than anything else...

Tegra 6 (Parker) is supposed to come with Maxwell; it will be another big jump in performance.

posted on 13 Jan 2014, 21:57

41. yowanvista (Posts: 304; Member since: 20 Sep 2011)


Kepler isn't geared towards mobile implementations, it's a desktop-class architecture. Nvidia just shrank it down in the K1, but it remains top-notch in performance. The latest desktop cards still use the refreshed Kepler, so no, it's far from being outdated.

posted on 13 Jan 2014, 12:18

37. Extradite (banned) (Posts: 316; Member since: 30 Dec 2013)


Pathetic OEMs are still not providing us consumers with bigger 3500mAh batteries, so a device could run A15 CPUs with the equivalent of a 3000mAh battery. Those A15s are powerful CPUs.

posted on 14 Jan 2014, 06:35

45. joaolx (Posts: 353; Member since: 16 Aug 2011)


Nvidia has a good history of marketing, but by the time an actual tablet/smartphone comes out with the new Tegra, it is already outdated.

posted on 13 Jan 2014, 02:24 6

5. livyatan (Posts: 692; Member since: 19 Jun 2013)


Actually the Adreno 330 is faster than the PowerVR G6430 in the iPad Air.
You can see it in the other offscreen benchmarks, as well as in the GFLOPS numbers - 130 vs 115.
And I'm getting 26fps in T-Rex on my Note 3.

As for the K1, it is leagues above with 340+ GFLOPS.
I said long ago that it would embarrass Intel, and I got laughed at.

And now, there you have it!
It outperforms the HD 4400 at one third the TDP!
What a revolutionary achievement... and other GPU solutions from ARM vendors are going to do the same - the newly announced high-end PowerVR Series6XT, and the Mali-T760.
Both are over 320 GFLOPS at a mere 2W TDP.
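For what it's worth, the GFLOPS figures being traded in this thread follow from the usual peak-throughput formula: shaders × 2 FLOPs per clock (one fused multiply-add) × clock speed. A sketch, assuming a roughly 950MHz GPU clock for the K1 (final clocks were not confirmed at the time):

```python
def peak_gflops(shaders: int, clock_ghz: float) -> float:
    # Each shader can issue one fused multiply-add (2 FLOPs) per clock.
    return shaders * 2 * clock_ghz

# Tegra K1: 1 Kepler SMX = 192 shaders; ~0.95 GHz is an assumed clock.
k1 = peak_gflops(192, 0.95)
print(f"Tegra K1 peak: ~{k1:.0f} GFLOPS")  # ~365 GFLOPS, in line with "340+"
```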

posted on 13 Jan 2014, 04:54

21. brrunopt (Posts: 506; Member since: 15 Aug 2013)


Most of the TDP is for the CPU, which is a lot more powerful.

posted on 13 Jan 2014, 06:47

30. AppleHateBoy (unregistered)


The 2 W TDP is for the GPU. The TDP of the SoC is in the 4-5 W range.

posted on 13 Jan 2014, 07:10

32. brrunopt (Posts: 506; Member since: 15 Aug 2013)


I'm talking about the TDP of the Intel SoC; most of it is for the CPU (i5-4200U), not the GPU (HD 4400).

posted on 14 Jan 2014, 06:39

46. joaolx (Posts: 353; Member since: 16 Aug 2011)


Knowing the disadvantages of integrated GPUs, and knowing that the HD 4400 is an incredibly low-end GPU, it doesn't surprise me at all. Not to mention that Intel HD graphics weren't designed for gaming.

posted on 13 Jan 2014, 02:34 2

6. yowanvista (Posts: 304; Member since: 20 Sep 2011)


"192-core GPU" - actually it's 192 unified shaders. Shaders ≠ cores.

posted on 13 Jan 2014, 06:25 1

26. renz4 (Posts: 228; Member since: 10 Aug 2013)


And Nvidia has been calling their stream processors (SPs) "CUDA cores" for years (since Fermi, I think).

posted on 13 Jan 2014, 02:34 4

7. aayupanday (Posts: 516; Member since: 28 Jun 2012)


The K1's gonna let you play next-gen console quality graphics games while at the same time making an omelette on your device's back.

posted on 13 Jan 2014, 04:26

16. FeRoZ (Posts: 33; Member since: 04 Sep 2013)


:-)) thnx 4 da laugh ....

posted on 13 Jan 2014, 12:22

38. Extradite (banned) (Posts: 316; Member since: 30 Dec 2013)


He surely is a weak fragile guy, that he's shting the pants cause it will get hot. Be a man, not a pussicat...

posted on 13 Jan 2014, 04:33

17. danishnigz (Posts: 5; Member since: 31 Aug 2013)


HAHAHAH GOOD ONE

posted on 13 Jan 2014, 15:57

40. timepilot84 (Posts: 73; Member since: 16 Aug 2012)


At 5W, I wouldn't be eating any eggs you cook on it. Not unless you like E. coli.

posted on 13 Jan 2014, 02:48

10. galanoth (Posts: 319; Member since: 26 Nov 2011)


That's a reference tablet though.
Clock speed could be on ultra high with a heatsink for all we know.

posted on 13 Jan 2014, 04:14

15. MrPhilo (Posts: 47; Member since: 25 Feb 2013)


The reference device is a 7" tablet, pretty much a Tegra Note 7 but with 4GB of RAM, a 1080p screen and the Tegra K1 (A15 version).

posted on 13 Jan 2014, 02:59

12. Lyngdoh (Posts: 224; Member since: 06 Sep 2012)


Were these benchmarks run at a standard resolution? Otherwise this is not a fair comparison.

posted on 13 Jan 2014, 03:40 3

14. Berzerk000 (Posts: 4020; Member since: 26 Jun 2011)


Yes, 1080p Offscreen. The device resolution is not a factor.

posted on 13 Jan 2014, 04:35

18. Diazene (Posts: 123; Member since: 01 May 2013)


what about the relative die size or power consumption?

posted on 13 Jan 2014, 05:40

24. joaolx (Posts: 353; Member since: 16 Aug 2011)


I thought Nvidia said it would be 3x faster than the A7. Guess not, but still impressive

posted on 13 Jan 2014, 06:24

25. SPIDERMAN4 (Posts: 25; Member since: 13 Apr 2011)


what about the snapdragon 805?

posted on 13 Jan 2014, 06:27

27. renz4 (Posts: 228; Member since: 10 Aug 2013)


I think you can do the math from the numbers above. Qualcomm claims the Adreno 420 will be 40% faster than the current Adreno 330.
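Doing that math with the 26fps T-Rex figure quoted for the Note 3 earlier in the thread (a rough projection from Qualcomm's claim, not a measurement):

```python
adreno_330_trex_fps = 26.0  # Note 3 T-Rex figure quoted above
claimed_uplift = 1.40       # Qualcomm's "40% faster" claim for the Adreno 420

projected_420_fps = adreno_330_trex_fps * claimed_uplift
print(f"Projected Adreno 420 T-Rex: ~{projected_420_fps:.1f} fps")  # ~36.4 fps
```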

posted on 13 Jan 2014, 07:02 3

31. TylerGrunter (Posts: 911; Member since: 16 Feb 2012)


That graph comes from notebookcheck.net. I'm not blaming Daniel P. - the source page he refers to doesn't give the original one. But it would be nice if the real source were added:
http://www.notebookcheck.net/NVIDIA-Tegra-K1-SoC.108310.0.html
And by the way: that's not a real benchmark, but what Nvidia claims it can do. I would wait for the real implementations before starting too much hype.

posted on 14 Jan 2014, 06:42

47. joaolx (Posts: 353; Member since: 16 Aug 2011)


That's Nvidia right there. When it comes out, just like all the other Tegras, it will be outperformed by Qualcomm and all the others...

posted on 13 Jan 2014, 09:14

34. theoak (Posts: 320; Member since: 16 Nov 2011)


Clock speed needs to be considered too. If Nvidia is running 1GHz faster, it is kind of hard to compare.

posted on 13 Jan 2014, 09:35

36. npaladin2000 (Posts: 134; Member since: 06 Nov 2011)


Did they include on-board WiFi and LTE radios? If not, Qualcomm is still going to own the US phone market.

posted on 13 Jan 2014, 13:43

39. downphoenix (Posts: 2329; Member since: 19 Jun 2010)


Fantastic specs, but they mean squat if they end up like the Tegra 4 - not in anything except some cheap Chinese crap.

posted on 13 Jan 2014, 23:57

42. fteoOpty64 (Posts: 8; Member since: 19 Mar 2013)


Agree that NV has to DELIVER this time! They got burned on the T4 and T4i and cannot make that mistake again, or else no one is ever going to trust them in mobile. At least they had the trust of MS, HP and Toshiba for T4 design wins.
For the K1, NV has to quickly capture the super-tablet and super-monitor markets by mid-year. It appears they *can* do it this time. We shall see.

PS: An NV Shield console with the K1 will be a hoot! Let's hope they put a 6.5-inch FHD screen on it, and a camera on the Shield to see the player (i.e. face tracking).

posted on 14 Jan 2014, 06:45

48. joaolx (Posts: 353; Member since: 16 Aug 2011)


They got burned with the Tegra 2/3/4/4i, not just the last two. They claim they have the best SoC, only to be outperformed by the time any smartphone or tablet comes out with their SoC.
