First NVIDIA Tegra 4 benchmarks are in, lags behind the iPad's A6X SoC
0. phoneArena, posted on 15 Jan 2013, 08:26
Apparently, someone with access to a Tegra 4 evaluation board decided to take it for a spin and see what NVIDIA's new mobile chip was capable of. And what better way of doing that than running a few benchmarks aiming for a new high score? Well, whoever tested the Tegra 4 didn't quite get there, despite posting some pretty promising results on the GLBenchmark database...
This is a discussion thread for a news article. To read the whole article, click here
1. ajac09 (Posts: 1211; Member since: 30 Sep 2009)
Benchmarks are NOT law. If benchmarks mattered that much, more companies would use more methods to manipulate them, like they used to.
15. RaKithAPeiRiZ (Posts: 1223; Member since: 29 Dec 2011)
I think the problem is Android 4.2.1; there is something wrong with the code. Even the Tegra 3 Nexus 7 has stuttering issues. They should try to run it on Jelly Bean and rescore.
19. SuperAndroidEvo (Posts: 3515; Member since: 15 Apr 2011)
Android 4.2.1 IS Jelly Bean. Did you mean running it on Android 4.1.2 which is Jelly Bean also?
26. hung2900 (Posts: 713; Member since: 02 Mar 2012)
The fact is that Tegra 4's graphics benchmark is even worse than the Exynos 5250 in the Nexus 10, which is also running Android 4.2.1.
27. Dr.Phil (Posts: 818; Member since: 14 Feb 2011)
I haven't really read up on the Tegra 4 chipset, but did they fix the memory bandwidth issue that plagued the Tegra 3? Perhaps that has to do with some of the lagging performance. Or it could be simple optimization problems.
35. AppleHateBoy (Posts: 444; Member since: 29 Feb 2012)
It's optimization. And perhaps the drivers are not ready yet. Once it's all done, I expect the performance to at least match that of the Nexus 10.
BTW, I must say ARM did an incredible job with the Mali-T604. Only 4 shader cores, and it beats the hell out of the Adreno 320 and the A6!
34. AppleHateBoy (Posts: 444; Member since: 29 Feb 2012)
Tegra 4's benchmark performance is even worse than the Nexus 4's, which only has an Adreno 320.
20. MeoCao (unregistered)
Where benchmarks are concerned, Android will always be in a disadvantaged position against iOS. iOS is a very light OS, and the CPU does not have to work hard at all.
25. MrJerry (Posts: 378; Member since: 05 Oct 2012)
You are right
But I am willing to accept those disadvantages over Apple's too-simple and boring iOS, which doesn't allow me to multitask or have freedom.
It's like doing one job at 95% vs. doing many jobs at 90%.
33. MeoCao (unregistered)
Multitasking is one part of the issue; resolution is a big part too. Top Android phones have far more pixels to drive than iPhones, and that of course affects GPU performance.
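The pixel-count gap the comment mentions is easy to quantify. A minimal sketch, assuming a 1080p Android flagship of that era versus the iPhone 5's 1136x640 display as example resolutions:

```python
# Rough sketch: how many more pixels a 1080p Android flagship must drive
# compared to the iPhone 5's 1136x640 display (example resolutions).
android_pixels = 1920 * 1080   # e.g. a 2013 1080p flagship
iphone5_pixels = 1136 * 640    # iPhone 5

ratio = android_pixels / iphone5_pixels
print(f"Android drives {ratio:.1f}x more pixels")  # ~2.9x
```

At the same frame rate, the Android GPU has to fill roughly three times as many pixels, which is why per-device benchmark scores and raw GPU power are not the same thing.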
39. krysto (Posts: 3; Member since: 18 Jan 2013)
Tell that to people who have gotten iOS 5 on an iPhone 3GS or iOS 6 on an iPhone 4.
There's not much difference at this point between the performance. Maybe 5% at most, but definitely not 30% or more, which is what people here seem to be implying.
36. suneeboy (Posts: 149; Member since: 02 Oct 2012)
If it beat the apple chip you would be beating the drum about how great the benchmark score is. You can't have it both ways.
2. wendygarett (unregistered)
Hmm, still anxiously waiting for the upcoming Tegra 4 devices :)
IMO, a simple dual-core Snapdragon S4 already runs perfectly smoothly on Android, while the A5 chip on iOS... But competition is always good :)
3. kshell1 (Posts: 1140; Member since: 05 Oct 2011)
i still think 72 GPU cores is a bit of a Roid rage.
6. p0rkguy (Posts: 677; Member since: 23 Nov 2010)
What do you expect from a company that specializes in GPUs?
They can't do anything else other than pump up graphics. It's a bit sad because until new consoles come out, improving graphics will be pointless and that's just a maybe.
10. picka_vi_materina (Posts: 156; Member since: 21 Nov 2011)
That word "cores" is misunderstood at a level over 9000. They're not actual GPU cores; they're CUDA cores, like on desktop GPUs. A single GTX 680 has well over 1,000 CUDA cores, whereas a GTX 690 is a dual-GPU card.
11. darac (Posts: 2156; Member since: 17 Oct 2011)
...because you say so?
I guess you can count, at least:
13. cretinick (Posts: 147; Member since: 25 Jan 2011)
They are 72 CUDA cores.
CUDA cores don't have much processing power per unit. They are basically designed to execute many concurrent threads at the same time, but each thread executes more slowly than on a traditional processor. In the end the throughputs are equivalent.
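That "many slow lanes vs. few fast cores" tradeoff can be shown with a toy throughput model. The figures below are made up purely for illustration, not real specs of any chip:

```python
# Toy throughput model (made-up numbers, purely illustrative):
# aggregate throughput = lanes * ops per lane per cycle * clock rate.
def throughput(num_lanes, ops_per_lane_per_cycle, clock_hz):
    return num_lanes * ops_per_lane_per_cycle * clock_hz

# 72 simple shader lanes at a hypothetical mobile-GPU clock...
gpu = throughput(72, 1, 500e6)
# ...versus 4 fast CPU cores issuing hypothetical 4-wide SIMD ops at 1.9 GHz.
cpu = throughput(4, 4, 1.9e9)

print(gpu, cpu)  # same order of magnitude despite very different core counts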
16. kshell1 (Posts: 1140; Member since: 05 Oct 2011)
I'm just saying NVIDIA isn't exactly doing great with mobile processing, constantly being outdone. The new Snapdragon 800 and Exynos Octa will be much more powerful.
18. yowanvista (Posts: 275; Member since: 20 Sep 2011)
Those are actually CUDA cores, totally different from actual 'cores'.
4. PhoneCritic (Posts: 331; Member since: 05 Oct 2011)
Let's wait for the score of the final, shipping product rather than an evaluation board. PA, please note that readers are not initially aware of the fact that this is a pre-production board, and the title of the article is at best very misleading.
14. Commentator (Posts: 2178; Member since: 16 Aug 2011)
Maybe if they used "pre-production" instead of "first" in the title, but I have a hard time believing anyone will actually be misled by this title (I'd say it is at WORST misleading, not at best.)
5. TheLibertine (Posts: 484; Member since: 15 Jan 2012)
Tegra 4 never aimed to be as powerful as the A6 is: that thing is a graphics monster designed for very optimized products, while the Tegra 4 is a chip that will be used universally on mid- to high-end devices of all kinds.
9. wendygarett (unregistered)
The reason why the A6 chip is fast is not because the chip is powerful; it's because of the optimization of iOS...
24. faisolbauuz (Posts: 121; Member since: 05 Jan 2013)
No, the A6 chip is fast because of the GPU. The iPad 4 uses the SGX554MP4, which no other GPU has beaten so far. If you want to beat the iPad 4, just use the PowerVR Series 6 (Rogue). On the Geekbench score, I'm sure the Tegra 4 will beat the A6X SoC.
38. krysto (Posts: 3; Member since: 18 Jan 2013)
God so much misinformation in these comments.
ALL OF YOU ARE WRONG.
The A6X's GPU is TWICE if not THREE TIMES larger than the Tegra 4's GPU.
The whole SoC is 120mm2 (the A5X used to be 160mm2), compared to Tegra 3 and Tegra 4's 80mm2 die size. But that's for the whole chip: the GPU and all the other components. So all that extra 40mm2 is completely dedicated to EXTRA GPU cores, which means it could be twice as large, if not more, than Tegra 4's GPU.
That still doesn't make me happy about the fact that Tegra 4 doesn't support OpenGL ES 3.0 and OpenCL (neither does the A6X), but I just hate seeing so much false information, which everyone then takes for granted.
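Re-running that comment's own die-size arithmetic as a sketch; the figures are the ones claimed above, not verified against official die measurements:

```python
# Die-area arithmetic using the figures claimed in the comment
# (120 mm^2 for the A6X SoC, 80 mm^2 for Tegra 4; not independently verified).
a6x_die_mm2 = 120
tegra4_die_mm2 = 80

extra_mm2 = a6x_die_mm2 - tegra4_die_mm2
print(extra_mm2)  # 40 mm^2 of extra silicon the comment attributes to GPU cores
```

Note the leap in the argument: the extra 40mm2 only all goes to the GPU if every other block on the two SoCs were the same size, which the comment assumes rather than shows.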
7. darac (Posts: 2156; Member since: 17 Oct 2011)
"The leaked GLBenchmark results are apparently from a much older silicon revision running no where near final GPU clocks."
17. Pedro0x (Posts: 270; Member since: 19 Oct 2012)
Exactly, I wanted to post this too. NVIDIA has worked on their GeForce ULP; we have to hope that power efficiency is good too. What really bugs me is that it is not a unified architecture; it will be like the Mali-400, which is also non-unified.
8. OpTiMuS_BlAcK (Posts: 411; Member since: 04 May 2012)
This is why I really like PowerVR graphics; I just don't like the fact that the more powerful ones are on iDevices and not on Androids. But we'll see if the Tegra 4 can beat the SGX once it's actually shipping in official devices.
12. howardbarrett (Posts: 45; Member since: 11 Jan 2013)
Not surprised. Even the upcoming Samsung Exynos 5 Octa uses a PowerVR 5 GPU.
21. remixfa (Posts: 13902; Member since: 19 Dec 2008)
Guys, this is demo hardware, which is like alpha-stage pre-release. Don't get your panties in a bunch. Scores tend to skyrocket by commercial launch due to better drivers, issue fixes, etc. There isn't a single chip that ran exactly the same at this point in the game as at launch; they always improve. So chill with the fanboy junk.
22. cripton805 (Posts: 907; Member since: 18 Mar 2012)
How many devices run at 100% all of the time?
23. papss (Posts: 3376; Member since: 03 Sep 2012)
Let's not take anything away from the iPad 4's speed. I can tell you first-hand that it's a beast. Regardless of your feelings towards Apple, give credit where it's due.
28. jove39 (Posts: 1205; Member since: 18 Oct 2011)
Misleading title... it should be "A6X's SGX554MP4 beats Tegra 4's GeForce ULP GPU", and I am not surprised by this result.
29. speckledapple (Posts: 877; Member since: 29 Sep 2011)
Ah the wonderful world of benchmarks.
30. BattleBrat (Posts: 884; Member since: 26 Oct 2011)
Dont Care, I'm getting the shield and a tablet with a Tegra 4 , and I'm going to love both.
31. taz89 (Posts: 1943; Member since: 03 May 2011)
Tegra chips are always behind... when announced they are made to look like the best, but compared to the competition they are poor. The same happened with the Tegra, Tegra 2, and Tegra 3, and the same will happen with the Tegra 4. I have a Nexus 7, which has a quad-core Tegra 3, and I thought it would be the perfect tablet for gaming because Tegra is supposedly a gaming chip, lol. Pretty much every 3D-intensive game runs laggy and with inconsistent frame rates, whereas on my quad-core Exynos S3 the same games run perfectly. The games are Most Wanted, N.O.V.A. 3, Wild Blood, Horn (Horn does not work on the S3 as it's Tegra-only), Dead Trigger, and others. To say I am disappointed with Tegra is an understatement... can't believe a quad core can be so bad.
32. jsdechavez (Posts: 676; Member since: 20 Jul 2012)
Guys, there's nothing to get excited about. No need to switch. haha
37. krysto (Posts: 3; Member since: 18 Jan 2013)
This article is all over the place. It mentions resolution, then CPU speed, then the GPU benchmarks. And what people will take from this is that "even though it has a 1.8 GHz processor... it's weaker than the A6X".
Those two have NOTHING to do with each other: CPU speed and these GPU benchmarks.
Tegra 4 is much faster than the A6X in CPU performance, which is what most apps use, and slower in GPU performance, which apps use only slightly; mainly just games take advantage of it.