NVIDIA Tegra 2, Samsung Exynos, and Qualcomm Snapdragon the 3rd: the dual-core chipsets and beyond
While the first three were somewhat expected, the last one is a tectonic shift, and we can't stress enough how important it is that Windows' vast driver support and familiar environment will be marrying power-efficient ARM silicon with its tiny footprint. It could either be a game-changer, if Microsoft plays it right, or sink Redmond into playing second fiddle to the cool kids on the block, Apple and Google.
The new wave of multi-core mobile silicon promises 1080p Full HD video recording and playback (first sample here) from camera resolutions up to 18MP, plus speed boosters like support for DDR2 and DDR3 memory, and console-level graphics. Well, hello, instant webpage load times, 3D user interfaces, 3D video capture and playback, simultaneous output to three or more screens, and other eye candy. All on the same battery charge, or less.
Let's get one persistent myth about dual-core chipsets out of the way right now. More cores do not automatically mean sucking more juice out of the battery - if the software is optimized and power management is done correctly, you needn't lose much sleep over it. These two or more 1GHz+ cores are built on a 45nm process, compared to the 65nm process of, say, the first-generation 1GHz Snapdragon currently found in Windows Phone 7 devices. Shrinking the transistors means that more of them can fit in the same space, pumping out higher performance from the same footprint. And since smaller transistors consume less energy, they can also be run at higher frequencies without your phone starting to drip on your feet from overheating.
So you can choose between more transistors, hence more power on the same space, or a similar amount of transistors with a smaller footprint, which leaves space for other stuff, like better graphics, or extra connectivity chips. Not to mention you can fit more of them on the same standard silicon wafer now, which makes the chips cheaper to produce, if we don't count the R&D expenses. And these advantages are present in the single-core chipsets too, produced with the 45nm process, the best representatives of which are Hummingbird on the Samsung Galaxy S, the A4 on the iPhone 4 and the iPad, and the second generation Snapdragons, like those in the upcoming HTC Inspire 4G, or the T-Mobile G2.
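The density argument above is simple geometry - a transistor's footprint scales roughly with the square of the feature size. Here's a back-of-the-envelope sketch (a first-order approximation only; real process shrinks never scale this perfectly):

```python
# First-order approximation: a transistor's footprint scales with the
# square of the process feature size, so a 65nm -> 45nm shrink roughly
# doubles how many transistors fit in the same silicon area.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate transistor-density multiplier after a process shrink."""
    return (old_nm / new_nm) ** 2

print(round(density_gain(65, 45), 2))  # -> 2.09
```

So the 45nm parts get roughly twice the transistor budget of their 65nm predecessors in the same footprint - exactly the trade-off described above: spend it on more cores and graphics, or shrink the die and pocket the savings.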
So as far as efficiency goes, dual-core has yet to prove itself, but still, let's see who will be the major players in the upcoming multi-core tsunami. What the mobile chipset industry is going through now is where desktop CPUs were a few years ago - the GHz war subsided due to overheating, in favor of multiple cores and more efficient designs.
NVIDIA Tegra 2
Well played, NVIDIA, well played. The company's dual-core silicon might not be the fastest or the most efficient, but it beat everybody to the punch at a time when cell phone manufacturers were desperately looking for differentiating factors in the sea of Android handsets. And so it was chosen by Google to become the reference platform for Android 3.0 Honeycomb with the Motorola XOOM tablet. More about it in our recap of the events here.
Tegra 2 is currently the only dual-core chipset you can find in commercially available phones and tablets. It pairs two 1GHz ARM Cortex-A9 cores with a low-power version of NVIDIA's GeForce GPU. The energy-saving case is illustrated by NVIDIA with a napkin calculation: two cores doing the same demanding task use 40% less energy than a single core, since they don't stress and sweat as much. When a lone CPU is 100% loaded, it runs at the maximum 1GHz clock speed, heats up, and operates at the full 1.1V, while two CPUs distribute the load between themselves and run at 550MHz each, thus staying cooler and needing just 0.8V, resulting in the said 40% gain. What if we manage to max out both cores, though? Then the task will simply be finished in a shorter time.
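NVIDIA's napkin math checks out against the standard CMOS rule of thumb that dynamic power scales with frequency times voltage squared (P ∝ f x V² - our assumption here, as NVIDIA hasn't published its exact formula):

```python
# Dynamic CPU power scales roughly with frequency times voltage squared.
# Compare one core at 1GHz/1.1V against two cores at 550MHz/0.8V each,
# using the figures NVIDIA quotes.
def relative_power(cores: int, freq_mhz: float, volts: float) -> float:
    return cores * freq_mhz * volts ** 2  # arbitrary units

single = relative_power(1, 1000, 1.1)  # ~1210
dual = relative_power(2, 550, 0.8)     # ~704
savings = 1 - dual / single
print(f"{savings:.0%}")  # -> 42%, close to NVIDIA's quoted 40%
```

Both setups deliver roughly the same total throughput (2 x 550MHz versus 1 x 1GHz), so the energy gap comes almost entirely from running at the lower voltage - which is why the voltage term is squared in the rule of thumb.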
This talk about power efficiency comes from a company that makes quite power-hungry graphics cards, mind you. We also saw promises in the Tegra 2 white paper of 140 hours of audio playback on a charge, while the other dual-core Cortex-A9 chipsets list 120 hours. The twist is that they measure a charge from a standard 1000mAh battery, while NVIDIA measures with a bigger 2000mAh one. If audio playback is any indicator, Tegra 2 could be up to 40% less efficient than the other dual-core competitors - which are yet to make a cameo appearance in a gadget, though.
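The apples-to-oranges battery claims are easy to normalize - just divide the quoted playback hours by the battery capacity each vendor used (a rough comparison that ignores everything else in the test setups):

```python
# Hours of audio playback per mAh of battery, using each vendor's own
# quoted test battery (2000mAh for NVIDIA, 1000mAh for the competition).
tegra2_h_per_mah = 140 / 2000      # 0.07 h/mAh
other_a9_h_per_mah = 120 / 1000    # 0.12 h/mAh
deficit = 1 - tegra2_h_per_mah / other_a9_h_per_mah
print(f"{deficit:.0%}")  # -> 42% fewer hours per mAh for Tegra 2
```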
Maybe that is why the Motorola ATRIX 4G has a huge 1930mAh battery. It promises 9 hours of talk time, which is great for an Android phone, but the single-core Motorola DROID 2 manages even more, and with a 1400mAh battery, so there will be more efficient dual-core chipset designs than Tegra 2 out there, that's for sure.
We'll see what the true values are when production-ready handsets hit the shelves. Other than that, we don't doubt NVIDIA when it says it achieves over 2x faster page loading times and 60 to 100 percent better frame rates in 3D games. The impressive Quadrant scores that the Tegra 2 winter crop is achieving attest to that. The Motorola ATRIX 4G scores 2600+, the LG Optimus 2X makes 2391, and the Motorola DROID BIONIC 2284 right out of the box - the highest scores we've seen on this generic benchmark, and the test doesn't even take the second core into account.
Moreover, NVIDIA says Tegra 3 is almost finished, taking us to quad-core level and CUDA CPU/GPU workload sharing. Work has already started on Tegra 4 as well, but even the 3rd generation won't see the light of day until probably late this year. Tegra 3 is to be announced at the MWC 2011 expo in Barcelona.
Samsung Exynos - the Galaxy Monster
Samsung's dual-core chipset started life under the codename Orion (what's with the cosmic names, Sammy?). It has since been renamed Exynos and demoed on a tablet prototype, and it is now powering the Samsung Galaxy S II.
The Hummingbird generation is the leader in graphics performance, as can be seen from benchmark comparisons.
The Hummingbird chipset in the Galaxy S and the Apple A4 share a lot of similarities. After all, Samsung used to provide the silicon for the previous generations of the iPhone. The biggest difference is the graphics chip. Apple's A4 keeps the PowerVR SGX535 found in the iPhone 3GS, but clocks it higher thanks to the 45nm manufacturing process, increasing performance enough to deal with the Retina Display resolution. Hummingbird, however, upgrades to the PowerVR SGX540, which is way faster than the 535, and still uses acceleration technology developed by Intrinsity, a company which Apple bought last year.
Perhaps that acquisition is the reason why Samsung went with something other than PowerVR for its own dual-core chipset. In February last year it announced that it would use not only ARM's CPU architecture, but ARM's graphics chip designs as well, called Mali. Accounts differ on how fast the quad-core Mali-400 GPU in Orion is. Samsung said in the press release that it provides 5 times the performance of its previous-generation chipset, i.e. Hummingbird, which would peg it at a peak of 450 million triangles per second. For the sake of comparison, that's almost as much as the current-gen Xbox 360 processes. Guesstimates set Exynos's GPU speed anywhere between 120 and 450 million triangles per second.
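The top of that range is just arithmetic on Samsung's two public numbers - the SGX540's quoted 90 million triangles per second peak and the claimed 5x uplift (the lower guesstimates assume the GPU ships clocked well below its peak):

```python
# Samsung's claim: Mali-400 in Orion/Exynos does 5x the previous generation.
hummingbird_peak_mtps = 90  # quoted SGX540 peak, millions of triangles/s
mali400_peak_mtps = 5 * hummingbird_peak_mtps
print(mali400_peak_mtps)  # -> 450, the top of the guesstimate range
```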
It all might have to do with the frequency these GPUs are clocked at. In mobile devices they are usually underclocked so as to keep battery consumption at bay. Sammy claimed a peak performance of 90 million triangles per second for the SGX540 in the Hummingbird, but it's unclear whether that is achievable at its 200MHz clock rate there. It is still the undisputed king of smartphone graphics, though, and there aren't any games or other software that choke it up yet, so the "millions of triangles" race is a bit pointless for now. Have a look at the concept video for the next version of Mali:
Exynos also supports DDR3 memory, whereas Tegra 2 supports DDR2. These are still desktop-grade memory types, but the difference in performance might be significant. Also, since ARM provides both the CPU and the GPU designs, going the Mali way might bring Samsung just as significant cost and power savings.
Hopefully we will know pretty soon what Exynos is exactly capable of, when we get our paws on the gear in Samsung’s booth at the MWC expo next month. In the meantime, Sammy is reportedly a heavy buyer of Tegra 2 chipsets. This could mean that while the future in front of Samsung’s Exynos may be bright, it might not be near.
1. Seylan (unregistered)
So once we buy the latest smartphone with the best CPU/GPU available, a NEWER CPU thats even better, will be available just a few months later....
I certainly don't want to feel outdated, so why don't they give us ARM A-15 based platforms and dual-core Intel platforms straight away!?
2. calamazoo (unregistered)
it actually takes about a year before a new generation reaches actual phones, so you are relatively safe :-) plus there is still not much need for dual-core except full HD video, which you can do with one core too, like the Samsung Infuse for AT&T. It will be the second half of 2011, and mainly the holidays, when dual-core will be all the rage...
3. Hello-dirt (Posts: 100; Member since: 02 May 2010)
I like the info in this article, very informative and virtually free of bias. Thanks.
What I am still waiting for is a phone that harnesses a chip similar to the 3-core version (I forgot the brand) that operates an efficient single core for typical phone use but kicks in the other high-power cores when it plugs into a laptop-like base, thus becoming the touchpad for the laptop. The laptop-like base can be the place for all the connections, storage (SSD or HDD), and extra battery to allow for all-day operation. In the end, one will have a lighter phone with a lightweight 12-14" laptop base that is easy to carry. Why have a powerful phone, a powerful tablet, and a powerful laptop? Combine the 1st and 3rd and get rid of those silly tablets!
4. TheEclectic (Posts: 77; Member since: 23 Sep 2009)
Is the extra RAM the reason the Atrix outperformed the Bionic in that generic benchmark? It sure would be nice if they gave the Bionic 1GB of RAM instead of the 500MB reported.
5. RVM (unregistered)
no word about ST-Ericsson U8500 :x
6. Hybr1dz (Posts: 7; Member since: 24 Sep 2010)
A little miffed that the Atrix has twice the memory of the Bionic, and the benchmark seems to reflect this as both phones are nearly identical in all other hardware specs. I like Atrix's form factor much better than the Bionic too but it's still not enough for me to switch from VZW to ATT.
Also didn't know the iPhone 3GS produced higher frame rates compared to the iPhone 4. Is this because the iPhone 4's native resolution it was tested on is higher than the 3GS?
7. TtheDude (unregistered)
Gotta appreciate the new chipsets hitting the market.....but it's kinda funny that Moto is putting dual core into a Froyo device. The benefits of dual core chip sets will not be realized with Android until 2.5 Icecream. Considering the Bionic is launching with Froyo and will only be upgradable to Gingerbread it's nice to see a manufacturer wasting the $$$ on a marketing ploy. I would at least wait until Gingerbread hits and then hope you get the OTA (over the Air) upgrade sooner rather than later.
8. cheetah2k (Posts: 696; Member since: 16 Jan 2011)
Just as the PC race and the multi core CPU race evolved between AMD and Intel, another race has started in the handheld arena.
This year is going to be simply brilliant as we get closer to having desktop power in the palm of our hands.
With this in mind, I reckon holding out till 2H of 2011 before jumping onto the dual core bandwagon to make sure you get the best bang for your buck indeed!
9. paxttel (unregistered)
great article thanks :)
@RVM +1 :(
10. steven999 (unregistered)
For the person saying he does not want to feel outdated: you just have Status Anxiety (look it up if you don't know what it means). You can get professional help for it.
11. Teko (unregistered)
Please, update with ZiiLABS ZMS-20 / ZMS-40.