Tech war: Nvidia Tegra X1 takes on Snapdragon 810 with raw GPU power


Last Sunday night, Nvidia CEO Jen-Hsun Huang stormed the CES stage to announce the Tegra X1, a new “256-core” chipset (SoC) for future cars and mobile devices that is allegedly so powerful, it has “more power than a supercomputer the size of a suburban family home from 15 years ago”. With this event, Nvidia accomplished two things. First, it broke out a beastly processor that's quite ahead of its time. Second, it went way, way overboard with the marketing speech. The Tegra X1 was the kind of shock-and-awe announcement that begged to be dissected, analyzed, and clarified. Which is what we did! This article digs deep into Nvidia's claims and, hopefully, answers all your questions about the X1 by comparing it directly to the Qualcomm Snapdragon 810 - the most powerful mobile processor coming out in the first half of 2015.


Central processing units (CPU)


The Tegra X1 marks a return to stock ARM Cortex-based CPU core designs. Previously, Nvidia used Denver cores in the 64-bit Tegra K1 (found in the Nexus 9 tablet), while Qualcomm used Krait cores in its previous-gen Snapdragon processors. Denver and Krait are custom CPU cores that implement the ARM instruction set rather than off-the-shelf Cortex designs, but now both vendors return to straight ARM CPU cores as the foundation of their flagships, possibly to save time and effort in bringing them to market. In both the Tegra X1 and Snapdragon 810, we're talking about 64-bit ARM Cortex-A57 (performance-oriented) and ARM Cortex-A53 (energy-efficient) cores, arranged in two clusters of four. Unlike octa-core chips that operate the clusters separately depending on the processor load, the new chipsets make use of ARM's heterogeneous multi-processing method, so that all eight cores can be used simultaneously.

Both chips are made on a 20nm process, which means their components are smaller and more energy-efficient than before. We don't know yet what frequency the Tegra X1's CPU is clocked at, but the Snapdragon 810 can raise the needle to 2.7GHz. Both run their LPDDR4 memory at a 1.6GHz clock, with peak memory bandwidth reaching up to 25.6GB/s.
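For a rough sense of where that 25.6GB/s figure comes from, here's a back-of-the-envelope sketch. It assumes a 64-bit LPDDR4 interface transferring data on both clock edges (i.e. 3200MT/s from the 1.6GHz clock) - a plausible configuration, but our assumption rather than something either vendor has spelled out.

```python
# Back-of-the-envelope LPDDR4 peak bandwidth estimate.
# Assumed (not vendor-confirmed): 64-bit bus, double data rate.
clock_hz = 1.6e9            # 1.6 GHz memory clock
transfers_per_clock = 2     # DDR: data moves on both clock edges -> 3200 MT/s
bus_width_bytes = 64 // 8   # 64-bit interface = 8 bytes per transfer

peak_bandwidth_gbs = clock_hz * transfers_per_clock * bus_width_bytes / 1e9
print(f"Peak memory bandwidth: {peak_bandwidth_gbs:.1f} GB/s")  # ~25.6 GB/s
```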

So, technically, it's almost one and the same CPU, huh? No, it's not. The similarities end here. There's a multitude of differences in optimization techniques, turbo and throttling management, and built-in technologies that set the X1 and 810 apart. Most importantly, their graphics units are totally different.


Graphics processing units (GPU)


The Tegra X1 continues Nvidia's strategy of scaling down its general-purpose GPU microarchitecture to a mobile format. It took two years for the Kepler microarchitecture, which debuted in the company's GeForce 600 series of desktop cards, to reach Nvidia's mobile chips. With the Tegra X1, it took the company just one year to bring its Maxwell microarchitecture to a mobile GPU - cheers for progress! Speaking of progress, the Tegra X1's GPU escalates to a whopping 256 shader cores, up from the no-less-impressive 192 shader cores found in its predecessor.

Now, how about a brief explanation of the whole “256-core processor” brouhaha Nvidia has been raising? See, although a SoC (system-on-chip) melds together a central processing unit (CPU), a graphics unit (GPU), a modem, and other components (for example, signal and motion processors) on a single chip, those units exist independently on it and are built differently. The CPU consists of the eight ARM cores we mentioned above, and they are optimized for serial processing - that is, these cores work through whatever tasks get thrown at them one by one, in sequential order. The graphics unit, however, has a parallel architecture - it consists of many smaller, more arithmetically efficient cores designed to handle multiple tasks simultaneously. So, in the Tegra X1's case, we end up with an octa-core CPU and a 256-core GPU. Anyone telling you the Tegra X1 is a 256-core CPU is either misinformed, or a snake oil salesperson.
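To make the serial-versus-parallel distinction concrete, here's a toy Python sketch. The explicit loop stands in for a CPU core working through elements one at a time, while the single NumPy call stands in for a GPU spreading the same arithmetic across many small cores at once - it's an analogy for the two programming models, not actual Tegra or Adreno code.

```python
import numpy as np

pixels = np.random.rand(1_000_000)   # pretend this is per-pixel shading work

def brighten_serial(data, factor=1.2):
    """CPU-style: handle one element after another, in sequential order."""
    out = np.empty_like(data)
    for i in range(len(data)):
        out[i] = data[i] * factor
    return out

def brighten_parallel(data, factor=1.2):
    """GPU-style: one operation over the whole array, which the hardware
    can spread across many simple arithmetic units simultaneously."""
    return data * factor

assert np.allclose(brighten_serial(pixels), brighten_parallel(pixels))
```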

Impressed? Wait until you see what Qualcomm did with the Adreno 430 GPU that bosses the Snapdragon 810's graphics department. This beast has 288 shader cores - more than the Tegra X1! You might have missed that, because unlike Nvidia, Qualcomm simply didn't market the number to you. But really, it's not only the quantity of cores that matters here, it's how hard they're being pushed. The Tegra X1's GPU clock speed reaches a whirring 1GHz, while the Adreno 430 purrs at 600MHz. That factor alone is enough to let the X1 claim the raw GPU power crown, but reality is not that simple.
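As a crude illustration of the cores-times-clock argument, here's a peak-throughput estimate. It assumes each shader core can retire one fused multiply-add (two floating-point operations) per cycle - a common simplification, and our assumption rather than an official figure from either vendor.

```python
def peak_gflops(shader_cores, clock_ghz, ops_per_core_per_clock=2):
    """Naive peak throughput: cores x clock x ops per clock (FMA = 2 FLOPs)."""
    return shader_cores * clock_ghz * ops_per_core_per_clock

print("Tegra X1 GPU:", peak_gflops(256, 1.0), "GFLOPS")  # ~512 GFLOPS
print("Adreno 430:  ", peak_gflops(288, 0.6), "GFLOPS")  # ~345.6 GFLOPS
```

On paper, the higher clock more than makes up for the X1's slightly lower core count - which is exactly why the caveats below matter.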

See, Nvidia designed the X1 for car computers first. It has all that power so it can calculate distances and maneuvers on the fly while processing video signals from up to six cameras simultaneously. Of course, the chipset can play 3D games like it invented them, but its full potential can be harnessed only in the setting of a modern vehicle. If the chip ended up in a tablet or smartphone, it would have to be throttled down to clock frequencies that ensure it won't be an overheating battery drain. Unless liquid cooling and double-capacity battery packs become reality overnight, the X1 in a smartphone or tablet won't be mind-blowingly faster than today's premium processors.


Connectivity and multimedia technologies


Now that we've examined the hardware thoroughly, let's see the additional features these SoCs enable. Smartphones and tablets are multimedia and connectivity powerhouses first and foremost, so there's a heavy emphasis on fast internet and high-resolution video. Thus, the Tegra X1 is ready to power 4K displays or feed 4K@60fps content to an external display over HDMI 2.0. It supports H.265 and H.264 encoding and decoding, while JPEG de/compression boasts a 5x speed boost. eMMC 5.1 storage support is also included. Connectivity-wise, the X1 hasn't been properly detailed yet, but we can assume it will have the Bluetooth, Wi-Fi, NFC, GPS, and 2G/3G/4G bases covered.

Feature-wise, the Snapdragon 810 is a well-rounded SoC as well, although it doesn't give off that “ahead of its time” vibe. Its eMMC support is limited to v5.0, while the HDMI 1.4 support and the less powerful graphics unit mean a maximum of 4K@30fps video streaming and processing. Yet, the 810 boasts plenty of tricks of its own, and its connectivity is thoroughly future-proof – in addition to the bases, it already has LTE Cat 9 networks covered, and that means up to 450Mbps download speeds, glad you asked!
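A quick calculation shows why the HDMI version matters for those 4K frame rates. The sketch below assumes plain 8-bit-per-channel RGB (24 bits per pixel) and ignores blanking intervals and link overhead, so it understates the real requirement - yet even this raw figure pushes 4K@60fps past what HDMI 1.4's roughly 10Gbps link can carry, while fitting within HDMI 2.0's 18Gbps.

```python
def raw_video_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed pixel data rate, ignoring blanking and protocol overhead."""
    return width * height * fps * bits_per_pixel / 1e9

print("4K @ 30 fps:", round(raw_video_gbps(3840, 2160, 30), 1), "Gbps")  # ~6.0
print("4K @ 60 fps:", round(raw_video_gbps(3840, 2160, 60), 1), "Gbps")  # ~11.9
```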

Is the Tegra X1 really more powerful than a supercomputer?


Nvidia advertised the X1 as the first mobile chip that's more powerful than a supercomputer - specifically, one packing “more power than a supercomputer the size of a suburban family home from 15 years ago”, in Jen-Hsun Huang's words.


Is it really so?


The short answer is “not quite.” The long, scientific answer is that the supercomputer story is great for marketing, but computer tech website KitGuru's super-computations show the reality is quite different: supercomputer performance is traditionally measured in double-precision (FP64) operations, whereas the Tegra X1's headline teraflop is reached at half precision (FP16), so the two figures aren't directly comparable.



That's the cold mathematical proof Nvidia's marketing probably didn't want you to see. However, credit has to be given where it's due – at FP16 operations, the Tegra X1 nails a peak 1024 GFLOPS, a full teraflop of processing power. This probably fueled the “supercomputer” angle of the announcement, and it does make for an incredibly powerful processor, hands down.
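For the curious, that teraflop falls straight out of the architecture's arithmetic: the X1's 256 Maxwell FP32 ALUs can each pack two FP16 operations per cycle, a fused multiply-add counts as two floating-point operations, and the GPU runs at the 1GHz clock cited above. Here's the multiplication spelled out - the FP16 packing and FMA accounting follow Nvidia's public description of the chip, so treat this as a sanity check rather than new information.

```python
shader_cores = 256       # FP32 ALUs in the Tegra X1's Maxwell GPU
fp16_per_alu = 2         # each FP32 ALU can pack two FP16 operations per cycle
flops_per_fma = 2        # a fused multiply-add counts as two FLOPs
clock_ghz = 1.0          # GPU clock cited for the X1

peak_fp16_gflops = shader_cores * fp16_per_alu * flops_per_fma * clock_ghz
print(peak_fp16_gflops, "GFLOPS at FP16")  # 1024.0 -> a full teraflop
```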

Conclusion


The Nvidia Tegra X1 is the most advanced SoC for mobile applications by far. Yet, the company itself made clear that the kind of power it enables is aimed not at today's smartphones and tablets, but at self-driving cars. Outside the automotive industry, we expect to see a limited number of mobile devices use the X1, in an underclocked form at best. Thus, Nvidia's beast isn't a threat to Qualcomm's hegemony over the mobile chipset industry. The Snapdragon 810 remains the processor you'll be seeing in most flagship smartphones and tablets for the majority of 2015, and Nvidia is okay with that, because it's after a different market with the X1.


65 Comments

1. BobbyBuster

Posts: 854; Member since: Jan 13, 2015

I don't trust a single word from nVidia.

3. vincelongman

Posts: 5718; Member since: Feb 10, 2013

Same for everyone else, including Qualcomm, Apple, Samsung... Don't trust benchmarks either. Always wait for third-party reviews of real-world performance.

4. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

You can't really blame nVidia. They are good at making GPUs, not CPUs. Look at some of their GPUs out there for PC. They are just absolutely amazing.

7. BobbyBuster

Posts: 854; Member since: Jan 13, 2015

I'm using a GeForce 750 Ti. Nothing against it, but their Tegra line has been sucking without exception, and nVidia has always been lying about the performance.

25. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

Their Tegra line is a joke. They need to stick with PC where they know what's going on.

28. bur60

Posts: 981; Member since: Jul 07, 2014

A little bit of competition for Qualcomm's tyranny is good tho

29. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

That's true. Can't argue with that. Waiting for Nvidia to stuff the GTX Titan in a mobile phone XD Now, that's some real GPU processing power. But your phone might not make it through the boot animation.

35. Deaconclgi

Posts: 405; Member since: Nov 03, 2012

Which Tegra chip was in a device that you used that gave you the experience to come to the conclusion that the Tegra line is a joke?

43. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

Which newer Tegra chip have you seen in a phone? Sorry, I love their GPU, but whether it makes a good SoC is a whole different story. I honestly have no need for a tablet due to my work and lifestyle. I would love to see them push the K1 into a phone, so I can enjoy their GPU goodness. Until then, their Tegra line is a joke to me.

52. AfterShock

Posts: 4147; Member since: Nov 02, 2012

So because it's not in a phone and you've no need for a tablet, it's a joke? Wow, wonder what all them tablets and phones that don't quite make it on any benchmark are, junk jokes?

34. BattleBrat

Posts: 1476; Member since: Oct 26, 2011

Their Shield devices are amazing. In fact, the Shield Tablet is one of the best Android tablets you can buy right now. I mean, what can you do with all that video power on a tablet other than game? And no one pushes mobile gaming harder than Nvidia.

9. hohoho

Posts: 65; Member since: Dec 12, 2014

Nvidia's Tegra line is on its last legs; the showing at CES was just embarrassing. They're not making a profit from their chips, and no one wants their hardware in the auto industry. They have one partner, Audi, who is only offering their system in certain models as an option starting in 2016. They're getting no volume there whatsoever. Qualcomm has 3 partners for their car computers already, including Cadillac and Honda, who plan to make it a standard safety feature just like backup cams and blind spot warning systems are becoming now. Despite how massive of a win this is for Qualcomm, it was barely even mentioned at their presentation, because Qualcomm isn't a one-trick pony. They're dominating handset sales, they're increasing tablet market share, they're involved in the health care industry, they're revolutionizing LED lighting. Nvidia gave an hour-long presentation that was like watching a very sick horse slowly drag itself to the edge of a cliff and plunge off of it.

30. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

Nvidia needs to back away from the mobile chip sector. Focus on their graphics cards and work on making them cheaper. That GTX Titan isn't cheap for an average Joe like me...

36. Deaconclgi

Posts: 405; Member since: Nov 03, 2012

The GTX Titan isn't made for an average Joe. There are average cards made for average performance to match an average Joe's GPU budget. There are GPUs for just about every budget and performance bracket, as with just about everything else in life - cars, homes, clothes, food, and anything else, including smartphones, that may have a premium, expensive high-tier option. Don't hold that against Nvidia. Hold that against yourself. I cannot afford the highest-end Nvidia cards, so I got the Nvidia GPU that I could afford; that's my problem and not Nvidia's.

44. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

The GTX Titan shall be for everyone :P Just saying. My rig has enough slots for 4 of them... but damn rent and bills... Otherwise, I make enough money to buy 2 in a month. However, I am saving up for a GTX 970 because it's more suitable for my usage. I do want to eventually push 100+ FPS onto a 40" TV.

33. BattleBrat

Posts: 1476; Member since: Oct 26, 2011

Nvidia is the only one doing anything for us mobile gamers. Qualcomm has done nothing to help us, yet you praise them. As for the Tegra processors, they run just fine. Look at the Shield devices, they run beautifully! It's just that most people bought some cheap-ass Tegra-based devices that the manufacturer didn't bother optimizing, and that turned people off to them. My Sony Xperia Tablet S cost almost as much as an iPad; it has a Tegra 3 and it runs fantastic.

45. jaytai0106

Posts: 1888; Member since: Mar 30, 2011

I know Nvidia is pushing the mobile gaming envelope, but right now the K1 is only for tablets, and I have no need for one. When they put that K1 or K1X in a phone, then I'll buy it. I couldn't care less that its CPU side is slower than the SD810. I just want to be able to enjoy good mobile gaming with amazing graphics.

49. renz4

Posts: 319; Member since: Aug 10, 2013

As it is, Nvidia no longer has much interest in the smartphone market. It was clear when they did not integrate their Icera modem into the K1. But with the K1 they opened a new kind of market for their chips, and most of it will involve Nvidia's main expertise: GPGPU. We might see fewer and fewer Nvidia chips in mobile devices, but if Nvidia thinks that is the way to go, they will pursue it. There is no need to keep fighting a losing battle.

46. renz4

Posts: 319; Member since: Aug 10, 2013

You think Audi is the only one using Tegra? So Qualcomm already has 3 design wins and you call that massive? If Qualcomm really is beating Nvidia, then why are they not part of the Open Auto Alliance? Nvidia has been in automotive since the early days of Tegra (it predates Tegra even coming to mobile consumer devices). True, Qualcomm dominates the smartphone space, but in automotive there is no way they beat Nvidia right now. In fact, they're interested in automotive most likely because Nvidia was successful in that segment. http://www.nvidia.com/object/automotive-partner-innovation.html

53. AfterShock

Posts: 4147; Member since: Nov 02, 2012

Last legs on 32, yup, all of them are, and none beat the K1 to the degree you make it sound, other than being mainstream? Nvidia's CEO, omg, a bore, but that and the market that they targeted are not. BTW, what QC tablet rivals an Nvidia one? Oh right, none of the current ones.

57. archit6371

Posts: 15; Member since: Nov 06, 2014

renz4 threw the s**t back in your face. Haha. You seem to be pretty ill-informed. See comment no. 49.

39. alex3run

Posts: 715; Member since: May 18, 2014

Lol... Phonearena, where's the proof that Adreno 430 has 288 ALUs? I doubt that it has more than 192 cores.

56. renz4

Posts: 319; Member since: Aug 10, 2013

I tried to look it up, and from what I found (on Wikipedia), the Adreno 430 will have that many.

2. teenaxta

Posts: 42; Member since: Sep 10, 2014

In short, the X1 is the only mobile chip which has 1 teraflop of compute power... number of cores and clock speed don't matter... benchmarks and compute power matter, and in that aspect the X1 kills everything!

12. Iodine

Posts: 1493; Member since: Jun 19, 2014

Power consumption matters as well. And a lot.

17. vincelongman

Posts: 5718; Member since: Feb 10, 2013

AnandTech had a sample X1 with a 5W TDP. A 5W TDP is the same TDP as in iPads and Android tablets. They said that under the same workload as the A8X, the X1 used 1.6 W, while the A8X used 2.5 W. So power consumption and thermal output should be fine for tablets. Probably still too high for phones (unless underclocked), but the A8X is probably too high for phones too. http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

21. Iodine

Posts: 1493; Member since: Jun 19, 2014

Those are Nvidia numbers, far from reality. They were saying the same about the Tegra K1, which was supposedly 3x as fast and 50% more efficient but failed to compete with the A7, not even mentioning the A8X, in terms of efficiency. At 5W you will not get anywhere near 100% perf from Tegra, and the A8X supposedly pulls just 4W.

37. RebelwithoutaClue unregistered

Those are AnandTech numbers, not Nvidia's.

54. AfterShock

Posts: 4147; Member since: Nov 02, 2012

28 vs 20 nm, and it's neck and neck on power use.

64. metrana

Posts: 4; Member since: Nov 17, 2015

The A8X pulls 2x the watts of the X1 and only provides 1/2 the performance. Hah, please look into things before you make yourself look like a fool.
