Samsung Galaxy S III specs review
Recently it turned out that it’s not one of the usual suspects (Qualcomm, NVIDIA or Texas Instruments) that sits in the most dual-core smartphones, but actually Samsung’s ARM-based processors. That’s because Samsung not only makes the homebrew Exynos silicon for its own phones, but also supplies the Ax-branded chippery in Apple’s mobile devices.
The Galaxy S III’s everyday grind will be pushed by either the quad-core Exynos 4412 or the dual-core Qualcomm Snapdragon S4. Both are built on the brand new 28/32nm process generation for mobile chipsets, which means a significant leap in performance per watt compared to the 40/45nm Cortex-A9 chips we’ve had until now.
Why didn’t Samsung use the 2GHz dual-core Cortex-A15 Exynos 5250? Because it didn’t need to. That chip is made for tablets, as Samsung kindly demonstrated to us at CES this year, driving a 2560x1600 pixel screen with its ARM Mali-T604 GPU. There’s no point putting it in a phone, where nothing above the current 1280x720 pixels makes sense.
Samsung, for that matter, overclocked the Mali-400 GPU to 400MHz+, which the 32nm chip easily allows, and now the Galaxy S III beats everything in the graphics benchies save for the quad-core PowerVR GPU in the new iPad. That one, however, has to drive the “Resolutionary” 9.7” screen, and will most likely be scaled down for the new iPhone, since, again, it makes no sense there.
The earliest we’ll see Cortex-A15 chips in phones or even tablets is the holidays, when NVIDIA may try to be first just for bragging rights again, while mass adoption of A15 chips in our mobile devices won’t happen until early next year at the earliest.
The Exynos 4412 also sports some A15 features, like internal 128-bit instruction support, and has a proprietary ISP, like HTC's ImageChip, for all those camera and motion-sensing shenanigans of the Galaxy S III.
Both Exynos 4412 and Snapdragon S4 are now at the top of the Android benchmarking game, so your Galaxy S III won’t feel underpowered, no matter which one ends up in it. If your carrier has an LTE network up and running, you are likely to end up with Qualcomm’s MSM8960 chipset, which has the newest 28nm LTE radio integrated in it.
Samsung said it has dedicated engineering teams working on homemade LTE chips to embed alongside Exynos, but it’s just not there yet. It’s hard to beat Qualcomm in this game at the moment, as the last quarterly report vividly illustrated, when it became clear the company has amassed a $26 billion cash pile selling wireless chips, licenses and know-how in that realm.
Qualcomm is the best there is when it comes to integrating every band under the sun into a chip the size of your fingernail, then testing, fine-tuning and producing it in the most compact and power-efficient form available. The TSMC foundry can’t even supply enough of those 28nm chips for Qualcomm and its customers, let alone for someone else producing an LTE chip with the same capabilities, footprint and power envelope this year. It’s going to take Samsung longer to get there, and the difference in component cost between the Exynos and the S4 is likely just a few bucks, which doesn’t warrant using anything with an inferior LTE radio.
We can argue until we’re blue in the face about the perceived advantages of quad-core over dual-core in parallel workloads like browsing, gaming or video editing, but in reality apps have to be specifically optimized for that many cores, and perhaps the only advantage of the Exynos will show up in browsing. The Galaxy S III browser, however, seems so optimized and tweaked that the handset blows every other Android handset out of the water in Java benchmarks with nearly double the scores, as you can see in the slideshow below, courtesy of AnandTech, and that can hardly be achieved by the two extra cores of the Exynos alone.
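To see why extra cores don’t automatically help, here’s a minimal, hypothetical Java sketch (the class and method names are ours, not from any shipping app): a simple sum only benefits from a second or fourth core if the work is explicitly split across threads, which is exactly the kind of optimization most apps never do.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.LongStream;

// Hypothetical example: fanning a sum out across however many cores the
// chip exposes. Unmodified single-threaded code would run this loop on
// one core and gain nothing from a quad-core Exynos over a dual-core S4.
public class MultiCoreSum {
    static long parallelSum(long[] data) throws Exception {
        // 2 on a Snapdragon S4, 4 on an Exynos 4412
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        int chunk = (data.length + cores - 1) / cores;

        // Split the array into one chunk per core and sum each in parallel
        List<Future<Long>> parts = new ArrayList<>();
        for (int i = 0; i < data.length; i += chunk) {
            final int from = i;
            final int to = Math.min(i + chunk, data.length);
            parts.add(pool.submit(() -> {
                long s = 0;
                for (int j = from; j < to; j++) s += data[j];
                return s;
            }));
        }

        // Combine the partial sums
        long total = 0;
        for (Future<Long> f : parts) total += f.get();
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        long[] data = LongStream.rangeClosed(1, 1_000_000).toArray();
        System.out.println(parallelSum(data)); // 500000500000
    }
}
```

The point of the sketch is the explicit splitting: without that boilerplate, the operating system can schedule threads across cores, but a single-threaded workload simply has no threads to spread around.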
We’ll know more when we test an AT&T, Verizon or Sprint Galaxy S III against the international version, but even if there is a measurable difference, its magnitude will more than likely be insufficient to warrant the incessant whining about how the US will be stuck with the “inferior” dual-core. In everyday usage nobody but spec geeks will be able to tell the difference, and no company got rich and famous catering to that crowd. As a friend eloquently put it, “a few more points on Quadrant don’t get you laid”, so let’s move on and be happy that, no matter what, the Galaxy S III will come with the best chips the market can currently provide.
As for power consumption and battery life with one chip and network versus the others, that’s a whole different story, one that can only be told once all versions are released and tested.