
Qualcomm unveils its fastest 2.5 GHz Snapdragon 805 'Ultra HD' chipset with Adreno 420 graphics

Posted by Daniel P.


We knew something was brewing over at the mobile chipset juggernaut Qualcomm, and the just-announced Snapdragon 805 SoC didn't disappoint. If, judging from the name, you thought like us that this is a small upgrade to the current Snapdragon 800 king, you'll be pleasantly surprised.

First off, the chip is all about so-called 4K, or Ultra HD, video. It includes four brand new Krait 450 cores, able to run at 2.5 GHz, and a fierce Adreno 420 graphics processor that can encode and decode 4K video in real time without a hitch. A natural side effect of this blazing GPU will be even smoother gaming performance, too.

The next big improvement is in the dual camera image signal processors (ISPs), which for the first time allow gigapixel-per-second throughput for better image quality and faster shot-to-shot times, and include gyro integration for the first system-level optical image stabilization support. The sensor support doesn't stop there: this hardware-level integration covers all the other sensors too, handling low-power sensor tasks that until recently were the realm of dedicated processors on custom-made SoCs like Apple's A7 or Motorola's X8.

If that is not enough for you, the Snapdragon 805 also gets packaged with the most advanced connectivity out there, like a 4th-gen multiband Gobi LTE platform for up to 150 Mbps LTE-A download speeds, plus 802.11ac Wi-Fi and low-power Bluetooth 4.0 radios. Here Qualcomm again reminds us that these fiery connectivity options are bundled with the new chipset to enable 4K video streaming with the lowest power consumption on an integrated chipset.

The Snapdragon 805 thus opens the door for much more detailed video, games and interfaces, not to mention uninterrupted 4K video streaming, so you can go broke on your data plan even faster. We kid, but given the rumors about WQHD panels on phones like the Galaxy S5 or the Xplay 3S, as well as 4K tablets prepped for next year, Qualcomm's new chip edition probably doesn't revolve around Ultra HD graphics support by accident.
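The data-plan joke is easy to put in numbers. A rough sketch, where both the ~20 Mbps 4K stream bitrate and the 2 GB monthly cap are illustrative assumptions, not Qualcomm figures:

```python
# How fast 4K streaming could eat a data plan (assumed figures).
STREAM_MBPS = 20   # assumed 4K stream bitrate, megabits per second
GB_PLAN = 2        # assumed monthly data cap, gigabytes

bytes_per_second = STREAM_MBPS * 1_000_000 / 8
seconds_to_burn_plan = GB_PLAN * 1_000_000_000 / bytes_per_second
minutes = seconds_to_burn_plan / 60
print(f"A {GB_PLAN} GB plan lasts about {minutes:.0f} minutes at {STREAM_MBPS} Mbps")
```

At those assumed numbers the whole plan is gone in roughly a quarter of an hour of streaming.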

The Snapdragon 805 is currently sampling with phone and tablet makers, and will enter retail devices by June 2014, says Qualcomm, so we should see the first handsets with it announced at the MWC expo, or even hinted at during CES in January.


73 Comments
posted on 20 Nov 2013, 08:04 20

1. Shatter (Posts: 2036; Member since: 29 May 2013)


So it only took Qualcomm 2 months to dethrone Apple?

posted on 20 Nov 2013, 08:09 6

2. ardent1 (Posts: 2000; Member since: 16 Apr 2011)


It hits retail units before June 2014 or about 7 months from now (with prototype units by Jan 2014).

Or, just a couple of months before Apple releases the A8 chip.

posted on 20 Nov 2013, 11:59 1

37. TylerGrunter (Posts: 1533; Member since: 16 Feb 2012)


And that's assuming this chip can dethrone the A7.
I'm sure it will in the GPU department (the Adreno 330 is already at the same level, and the Adreno 420 will be 40% more powerful).
But in the CPU part... I expect the A7 chip to rule for a while.

posted on 20 Nov 2013, 12:28 1

44. Pedro0x (Posts: 271; Member since: 19 Oct 2012)


The Adreno 330 is not as fast as the A7, at least not in phones; in Qualcomm's reference tablet, yes, but that one has a bigger allowed TDP, so that is no surprise. If the Adreno 420 only has 40% better performance, then it will perform worse in the real world, because the new phones will have a QHD resolution, which is about 77% more pixels, and that leads to worse performance.
By the way, I really hope these new Snapdragon SoCs are ARMv8 architecture; if not, Qualcomm will be screwed.
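The ~77% figure in the comment above checks out; a quick sketch of the pixel math, assuming QHD here means 2560x1440:

```python
# Pixel-count comparison: QHD vs 1080p.
qhd = 2560 * 1440   # assumed QHD/WQHD phone resolution
fhd = 1920 * 1080   # 1080p (Full HD)
extra = qhd / fhd - 1
print(f"QHD has {extra:.0%} more pixels than 1080p")
```

The exact ratio is 16/9, i.e. about 77.8% more pixels for the GPU to fill per frame.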

posted on 20 Nov 2013, 13:55

49. levizx (Posts: 7; Member since: 05 Nov 2013)


Who told you we all want QHD screens crammed into 5"? We don't even need 2.5K screens. 1080p will be in the majority of top-tier phones for years to come. Unless we grew magic eyes or hands.

posted on 20 Nov 2013, 17:27 4

59. Berzerk000 (Posts: 4275; Member since: 26 Jun 2011)


The Adreno 330 has been consistently proven to be on par with the PowerVR G6430 in the A7.

GFXBench T-Rex HD (Offscreen):

iPhone 5S - 25.5 fps
Nexus 5 - 22.9 fps
LG G2 - 22 fps
Galaxy S4 (LTE-A) - 24 fps
Galaxy Note 3 (LTE) - 26.2 fps

The difference between them is minimal. Also, I don't think this will be Qualcomm's fastest SoC of 2014. Cortex-A53 and A57 are here, and I doubt Qualcomm will pass up on making a brand new Krait architecture based off of them, it will probably be available in late 2014 with a new Adreno.

posted on 20 Nov 2013, 22:49

68. vincelongman (Posts: 4570; Member since: 10 Feb 2013)


The A7 has the fastest single core performance, but the S800 is the fastest overall CPU as CPU benchmarks such as Geekbench, Linpack and 3DMark Unlimited Physics show.
http://browser.primatelabs.com/android-benchmarks
http://browser.primatelabs.com/ios-benchmarks
http://www.anandtech.com/show/7335/the-iphone-5s-review/7

posted on 20 Nov 2013, 08:22

3. SAYED-EJAZ (Posts: 225; Member since: 10 Oct 2013)


This is a small upgrade, I guess; it will get old soon once Samsung's 'blockbuster' Exynos comes out, which is nearing its final stage. Hoping for something huge; hopefully it won't be like the Exynos 5.

posted on 20 Nov 2013, 09:07 1

14. jove39 (Posts: 1888; Member since: 18 Oct 2011)


By the time the S805 comes to market in mid-2014... Apple will have the A7X or A8 ready... to regain the lost crown.

posted on 20 Nov 2013, 09:21 2

23. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)


Do you think 1.3 GHz is the maximum the A7 can be clocked at?

Apple just SET the clock at 1.3 GHz in their products, because they found it to be the most balanced clock rate performance/power-consumption wise.

It's a pity to see Qualcomm trying to squeeze the last bit of performance out of a legacy architecture.

It might also be a wise move, though, considering no 64-bit Android is in sight, other than Sammy, who tries to look on par with Apple just on paper with their 64-bit Exynos.

posted on 20 Nov 2013, 09:52 7

30. Reality_Check (Posts: 277; Member since: 15 Aug 2013)


Maybe clocking it at 1.5GHz would have turned the A7 to a ticking time bomb?

posted on 20 Nov 2013, 10:07 4

31. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)


Why don't you test it with your a$$?

I'm sure the A7 @ 1.5 GHz won't do any harm, while the SD805 @ 2.5 GHz will leave some permanent brand name on it, literally.

posted on 20 Nov 2013, 12:20 3

41. Reality_Check (Posts: 277; Member since: 15 Aug 2013)


So now we're stooping to new lows, huh? Next time at least 'act' mature instead of behaving like a kid :)

posted on 20 Nov 2013, 13:59 2

50. livyatan (Posts: 867; Member since: 19 Jun 2013)


And you pretend to be all-knowing?
Yet you're clearly showing pathetic ignorance here.

Here's a little clue for you... the Snapdragon 800 clocks at 2.3 GHz, draws less power than the 1.9 GHz Snapdragon 600, and is actually comparable to the Apple A7 in power consumption.

posted on 20 Nov 2013, 21:07

65. medicci37 (Posts: 1287; Member since: 19 Nov 2011)


@ JakeLee Lmao

posted on 20 Nov 2013, 12:23 2

42. rihel_95 (Posts: 301; Member since: 21 Mar 2012)


4K, OK. But what about storage? We need massive storage for 4K videos!
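The storage concern is easy to quantify; a minimal sketch, assuming a ~48 Mbps 4K recording bitrate (typical for early 4K phone cameras, not a confirmed Snapdragon 805 figure):

```python
# Rough storage cost of 4K recording at an assumed bitrate.
BITRATE_MBPS = 48                        # assumed 4K recording bitrate
mb_per_minute = BITRATE_MBPS / 8 * 60    # megabytes written per minute
minutes_on_16gb = 16_000 / mb_per_minute # minutes that fit in ~16 GB
print(f"~{mb_per_minute:.0f} MB/min, ~{minutes_on_16gb:.0f} min on 16 GB")
```

At that assumed bitrate, a 16 GB phone with nothing else on it holds well under an hour of footage.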

posted on 20 Nov 2013, 13:07 2

47. Phonecall01 (unregistered)


Hum... What would storage have to do with the processor performance again?

posted on 20 Nov 2013, 14:31 2

52. bigstrudel (Posts: 518; Member since: 20 Aug 2012)


He's just showing how useless the feature is.

posted on 20 Nov 2013, 19:47

64. InspectorGadget80 (unregistered)


Don't u mean NVIDIA?

posted on 20 Nov 2013, 08:27 2

4. Jommick (Posts: 221; Member since: 10 Sep 2013)


But is it 64-bit or...?

posted on 20 Nov 2013, 08:28 2

5. SAYED-EJAZ (Posts: 225; Member since: 10 Oct 2013)


It is not 64-bit.

posted on 20 Nov 2013, 08:40 1

7. SAYED-EJAZ (Posts: 225; Member since: 10 Oct 2013)


That's why I think this will get old soon.
Whether 64-bit is useful or not, once 64-bit is out people will start saying 32-bit is outdated, just like 720p became outdated when 1080p came.
Sad but true.
Only when comparing 1080p with 720p do we realize how much better it is; otherwise I am happy with my 720p S3.

posted on 20 Nov 2013, 08:52 3

11. Hyperreal (Posts: 258; Member since: 08 Oct 2013)


We will see. Everyone was excited about the so-called octa-core Exynos, but now we know it was a failure and the SD800 was the better choice. Anyway, between 720p and 1080p there is a small, small difference, and I am happy to have it on my HTC One. But more than that is nonsense for me. We can't see the difference with more PPI.

posted on 20 Nov 2013, 09:12 2

18. SAYED-EJAZ (Posts: 225; Member since: 10 Oct 2013)


I think your eyes are having some problems.
There is a 'HUGE' difference between my S3 and S4 displays.
I don't know about HTC.

posted on 20 Nov 2013, 09:20 6

22. jove39 (Posts: 1888; Member since: 18 Oct 2011)


In Samsung-made AMOLED displays each pixel has 2 subpixels; in LCD displays each pixel has 3 subpixels. The total subpixel count in the One X (2,764,800) is higher than the S3's (1,843,200), even though both are 720p. That's why you see a 'HUGE' difference between 720p and 1080p AMOLED displays, while Hyperreal finds it hard to see much difference between 720p and 1080p LCD displays.
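The subpixel counts quoted above can be reproduced directly; both panels are assumed to be 1280x720:

```python
# Subpixel counts for two 720p panels with different subpixel layouts.
w, h = 1280, 720
pentile_subpixels = w * h * 2   # PenTile AMOLED: 2 subpixels per pixel
rgb_subpixels = w * h * 3       # RGB-stripe LCD: 3 subpixels per pixel
print(pentile_subpixels, rgb_subpixels)
```

Same nominal resolution, but the RGB LCD renders 50% more subpixels, which is the gap the comment is pointing at.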

posted on 20 Nov 2013, 09:24 1

25. Hyperreal (Posts: 258; Member since: 08 Oct 2013)


correct.

posted on 20 Nov 2013, 09:23 1

24. Hyperreal (Posts: 258; Member since: 08 Oct 2013)


Yeah, maybe between the S3 and S4, since the S3 has PenTile. But the HTC One X+ which I owned had a better display, and the difference was not HUGE at all; but still, there was one.

posted on 20 Nov 2013, 14:33

53. bigstrudel (Posts: 518; Member since: 20 Aug 2012)


You're right, there's a big difference. The S3 had way under 300 ppi when you take into consideration that every third pixel is shared. It was a subpar screen when it was released.

posted on 20 Nov 2013, 09:15

19. jove39 (Posts: 1888; Member since: 18 Oct 2011)


Not sure why OEMs didn't consider a 1600x900 resolution before moving to 1080p... 900p would give enough detail, and 1080p packs about 44% more pixels than 900p... it would be less work for the GPU and less power hungry.
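For reference, the raw pixel counts behind that comparison; a quick sketch assuming 1600x900 vs 1920x1080:

```python
# Pixel-count comparison: 900p vs 1080p.
p900 = 1600 * 900     # hypothetical 900p panel
p1080 = 1920 * 1080   # 1080p panel
more = p1080 / p900 - 1    # how many more pixels 1080p pushes
fewer = 1 - p900 / p1080   # how many fewer pixels 900p pushes
print(f"1080p has {more:.0%} more pixels; 900p has {fewer:.0%} fewer")
```

So 1080p renders exactly 44% more pixels, or equivalently 900p is about a 31% lighter load per frame.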

posted on 20 Nov 2013, 09:25

26. JakeLee (banned) (Posts: 1021; Member since: 02 Nov 2013)


1080p movies scaled down to 900p would look horrible IN ADDITION to consuming MORE POWER due to the downscaling process.
