
64-bit, dual-core Nvidia K1 Denver chipset smashes the competition on AnTuTu

Posted by Chris P.

Back during CES 2014 in Las Vegas, Nvidia came out with an important announcement: the company is now laser-focused on mobile chipsets. The silicon slinger demo'd two new processors, both under the Nvidia K1 moniker, but they differ greatly in terms of their underlying architecture. The 32-bit, quad-core K1, which has already been benchmarked, is based on four Cortex-A15 cores from ARM. In comparison, the second version uses two in-house Denver 'super-cores', and we're now talking a 64-bit architecture.

As you may have gleaned from the title, we now have our first benchmark of the chip, even though it's supposedly not due until the second half of 2014. The AnTuTu team, always on the lookout for unreleased goodies, caught an instance of the chip taking a stroll through its territory and instantly sounded the alarm. The score, at nearly 44,000, is pretty amazing, and overshadows even the most potent chips currently on the market. Moreover, Nvidia appears to have worked some kind of RAM magic, as the score in that particular test is through the roof.

Now, before you jump to too many conclusions, there are two disclaimers to keep in mind. For starters, Nvidia said that its Denver platform will peak at 2.5GHz, whereas the benchmarked chip is shown running at 3GHz, which definitely helps with the final score. Secondly, as great as these results are, this type of performance won't necessarily carry over to actual, real products, as there are numerous variables that manufacturers have to tweak, not least of which is power consumption. How well Nvidia has managed that oh-so-important aspect with Denver remains unclear.

Before we go, we've gotta say that this situation reeks of irony. It was Nvidia, with its Tegra 3 chip, that first started the quad-core craze in mobile chips, a craze that eventually led to things like octa-core processors. It's funny to see that very same Nvidia now taking a step back and considering a different, less core-crazed approach.


64 Comments




posted on 05 Mar 2014, 07:57 2

1. ArtSim98 (Posts: 2182; Member since: 21 Dec 2012)


Snapdragon 801 and 805 have almost no difference? I wonder if Qualcomm is still going to release another chip after the 805 this year?

posted on 05 Mar 2014, 08:11 2

6. itsdeepak4u2000 (Posts: 2293; Member since: 03 Nov 2012)


But Nvidia should move fast with these chips, because Qualcomm is driving most of the major phones.

posted on 05 Mar 2014, 08:23 2

8. rd_nest (Posts: 683; Member since: 06 Jun 2010)


Before everyone jumps the gun, check the whopping power it consumes:

http://semiaccurate.com/2014/03/05/nvidias-tegra-k1-draws-shocking-number-watts/

posted on 05 Mar 2014, 09:24 3

15. Chris.P (Posts: 247; Member since: 27 Jun 2013)


That's the K1 quad, not Denver. But yeah, power consumption is obviously a pretty major concern.

posted on 05 Mar 2014, 09:51 3

19. jove39 (Posts: 1241; Member since: 18 Oct 2011)


No wonder Cyclone cores are clocked at 1.3-1.4GHz.

Active cooling in the demo unit is probably there to sustain max frequencies for long periods of time. The K1 may have a higher-than-average power draw, but it's way overblown in the referenced article.

Also, with mobile SoCs, all manufacturers hide the actual sustainable speed and publish only the boost/turbo speed of the CPU.

The S800 in the Nexus 5 could sustain only 1.2GHz (against a published max of 2.26GHz) after a little over 3 minutes of continuous use.
Detailed article here: http://arstechnica.com/gadgets/2013/11/when-benchmarks-arent-enough-cpu-performance-in-the-nexus-5/
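The boost-versus-sustained gap can be sketched with a toy calculation. The constants below mirror the figures in the comment (2.26GHz boost, 1.2GHz sustained, roughly 3 minutes before throttling), but the scoring model is a made-up assumption for illustration, not how AnTuTu or the Nexus 5 actually behaves:

```python
# Toy model: benchmark "work done" is proportional to clock cycles completed.
# A short run fits inside the boost window; a long run is mostly throttled.

def work_done(boost_ghz, sustained_ghz, boost_secs, run_secs):
    """Billions of cycles completed during a run that boosts, then throttles."""
    boosted = min(run_secs, boost_secs)            # time spent at boost clock
    throttled = max(0.0, run_secs - boost_secs)    # time spent at sustained clock
    return boost_ghz * boosted + sustained_ghz * throttled

# Average effective clock seen by a 2-minute run vs a 20-minute run,
# using the comment's hypothetical 2.26GHz boost / 1.2GHz sustained / 180s window.
short_avg = work_done(2.26, 1.2, boost_secs=180, run_secs=120) / 120
long_avg = work_done(2.26, 1.2, boost_secs=180, run_secs=1200) / 1200

print(round(short_avg, 2), round(long_avg, 2))
```

Under these assumptions, a quick benchmark sees the full boost clock, while a sustained workload averages well under 1.4GHz, which is why a single headline score says little about real-world performance.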

posted on 05 Mar 2014, 10:00

22. Finalflash (Posts: 1420; Member since: 23 Jul 2013)


So be it, but Nvidia has a way of skewing the numbers, especially about power consumption. They release these tech demonstrations with insane power consumption and pit them against actual mobile chips in a mobile environment. That is the main reason almost no Tegra 4 mobile devices actually exist (maybe 1 or 2). Tegra 3 promised to deliver the same thing, but that was the first time anyone fell for that game, and never again. No K1 design wins are being seen either, largely because of the power problem, and now with Denver I will assume the same problem will rear its head (otherwise they would have released power numbers).

posted on 05 Mar 2014, 10:31 2

27. jove39 (Posts: 1241; Member since: 18 Oct 2011)


Power consumption in Tegra CPU cores is not Nvidia's flaw... it's an issue with ARM CPU cores. Here is an article that explains this (from 2011):

http://www.extremetech.com/computing/99721-how-snapdragon-s4-and-tegra-3-manage-arm-cores-differently

Here is an extract from another article that suggests the benefit of moving to a custom core: http://www.extremetech.com/gaming/173975-nvidias-tegra-k1-soc-kepler-gpu-paired-with-64-bit-denver-cpu-coming-in-2014

"move to Denver CPUs will not include the 4-plus-1 core configuration. Nvidia originally implemented the fifth low-power core in Tegra 3 as an alternative to dynamic clock speed adjustments, which are not supported by ARM’s Cortex designs. Building the custom Denver CPU probably allowed it to better manage power consumption without resorting to extra cores. This is one of the benefits of licensing the ARM instruction set to create a custom core rather than simply using the Cortex reference designs. It’s the same thing Qualcomm has been doing with Krait and Scorpion CPUs in its Snapdragon SoCs for years."

posted on 05 Mar 2014, 10:39 2

28. renz4 (Posts: 202; Member since: 10 Aug 2013)


The lack of Tegra 4-based phones is understandable because of the lack of an integrated modem, but in tablets they are doing fine. In fact, there are more tablets based on Tegra 4 than on Snapdragon 800, so saying only one or two devices use Tegra 4 is plain wrong.

posted on 05 Mar 2014, 10:23 1

26. lallolu (Posts: 226; Member since: 18 Sep 2012)


I do not trust nvidia chips at all in terms of power consumption and heat generation.

posted on 05 Mar 2014, 11:18 2

32. true1984 (Posts: 582; Member since: 23 May 2012)


i dont trust them due to their history with support

posted on 05 Mar 2014, 11:14 2

31. renz4 (Posts: 202; Member since: 10 Aug 2013)


If the source is SemiAccurate, then take it with a grain of salt whenever Charlie D. is talking about Nvidia.

posted on 05 Mar 2014, 11:26 3

33. jove39 (Posts: 1241; Member since: 18 Oct 2011)


As the name suggests, SemiAccurate :)

posted on 05 Mar 2014, 13:13

40. Finalflash (Posts: 1420; Member since: 23 Jul 2013)


But he has been more accurate with most of the stuff that came out that others had no idea about. I have followed his stuff for a while and although he is very anti-Nvidia at times, he isn't exactly wrong. For that matter Brian Klug from Anandtech also referenced him for his review of K1 so in this case I am more willing to take his word on the matter.

posted on 05 Mar 2014, 14:21 1

51. renz4 (Posts: 202; Member since: 10 Aug 2013)


At times? It's more like at all times. But of course he can't totally trash Nvidia when the results or facts clearly favor Nvidia's side. Sometimes he is right and sometimes he is wrong, but even when he knows something good about Nvidia, he will try to spin it to look negative. For example, he knew GK104 was good and competitive with AMD's Tahiti, but then he started speculating to make GK104 look bad.

posted on 05 Mar 2014, 16:45

55. Finalflash (Posts: 1420; Member since: 23 Jul 2013)


Doesn't matter though, because we have the same caveats being thrown out by AnandTech too, and they're pretty reliable most of the time. Nvidia have pulled this stunt before and I wouldn't put it past them to try it again. I'll believe it when they put it in a mobile phone or tablet, because until then they are just numbers.

posted on 05 Mar 2014, 13:54

41. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


that would be based on a PROTOTYPE!! once again, "semiaccurate.com" lives up to its name.

posted on 05 Mar 2014, 14:27

53. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


Take a look and actually read that article: he is referencing the power supply to the prototype... saying "60 watts!!!"

As anyone with an electrical background knows, the power supply rating does not indicate the actual power draw, especially for a prototype. This article is just garbage, unequivocally trashing Nvidia.

posted on 05 Mar 2014, 09:07 2

12. akki20892 (Posts: 3234; Member since: 04 Feb 2013)


The Snapdragon 610 & 615 are 64-bit; I hope they launch a 64-bit 800 series soon.

posted on 05 Mar 2014, 09:22 1

14. ArtSim98 (Posts: 2182; Member since: 21 Dec 2012)


Yeah. But it would be just weird if they released the first phones with 805 this summer and then we already would get 64 bit ones in the fall. Well, maybe we will get the high end 64 bits from Qualcomm just next year. TBH I wouldn't mind that since 64 bit still has a long way to go.

posted on 05 Mar 2014, 07:58 2

2. draconic1991 (Posts: 117; Member since: 27 Apr 2012)


Can't wait for the newest chip wars in the second half of the year...

posted on 05 Mar 2014, 08:03 5

3. anirudhshirsat97 (Posts: 378; Member since: 24 May 2011)


Nvidia always shows that it can produce powerful chipsets. The problem is we don't see them in phones. I personally would like to own a high-end Tegra phone if they give good battery life.

posted on 05 Mar 2014, 08:07 1

4. draconic1991 (Posts: 117; Member since: 27 Apr 2012)


I would love them too but there is one other problem other than battery life....like in tegra 3, apps need optimisation for tegra chips...hated that....i dont know if they have solved that issue...but if they have yet to rectify it, i hope they do it fast

posted on 05 Mar 2014, 10:07 2

24. Finalflash (Posts: 1420; Member since: 23 Jul 2013)


It wasn't that they needed optimizations, it's that they could have additional features not available on other SoCs. So that wasn't a problem, it was a feature mostly. The problem with Tegra 3 was its power consumption, which forced the chip to be underclocked (and it was pretty weak compared to the competition). Same problem again, where they can't make a chip for mobile to save their lives, and this time no manufacturer is falling for the lies they won them over with for Tegra 3.

posted on 05 Mar 2014, 12:05

35. draconic1991 (Posts: 117; Member since: 27 Apr 2012)


I don't know, I think I heard many reviewers and users say that some apps needed optimizations to run on the One X, but I might be wrong.

posted on 05 Mar 2014, 08:11 13

5. _Bone_ (Posts: 2103; Member since: 29 Oct 2012)


Remember the Tegra 4 blowing away the competition last year? It is available in exactly ZERO smartphones in the Western world. Snapdragon is going to dominate, because beyond being fast, their SoCs are more reliable, more efficient, and run cooler.

posted on 05 Mar 2014, 08:55 4

9. BattleBrat (Posts: 1038; Member since: 26 Oct 2011)


Reliable? I have 3 Tegra devices and they all work fine. Efficient? My Tegra 3 tablet runs for days. Nvidia is not caving in to the pressures of the eastern markets and releasing an octa-core product. Instead they're going the dual-core route, which apparently makes more sense for a mobile device. Go to AnandTech, they wrote something about it.

posted on 05 Mar 2014, 10:10 2

25. Finalflash (Posts: 1420; Member since: 23 Jul 2013)


AnandTech also noted the power problem that Nvidia keeps hiding, even going so far as to reference the SemiAccurate articles that questioned it. Nvidia chips are horribly inefficient, largely because they literally took a hammer to their desktop lines and mashed them together with a few other needed SoC components. They are trying to hit two birds with one stone and hitting neither.

posted on 05 Mar 2014, 14:03 1

44. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


That's simply untrue. Here are some excerpts from AnandTech:

"In the case of Tegra K1’s A15s, the main improvements here have to do with increasing power efficiency. With r3p0 (which r3p3 inherits) ARM added more fine grained clock gating, which should directly impact power efficiency."

"[Nvidia] is the first (excluding Apple/Intel) to come to the realization that four cores may not be the best use of die area in pursuit of good performance per watt in a phone/tablet design."

"The data is presented in NVIDIA’s usual way where we’re not looking at peak performance but rather how Tegra K1 behaves when normalized to the performance of Apple’s A7 or Qualcomm’s Snapdragon 800. In both cases NVIDIA is claiming the ability to deliver equal performance at substantially better power efficiency."

posted on 05 Mar 2014, 13:57

43. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


Nvidia has gone on record saying they are not focusing on American phones, because in order to compete in America you have to have CDMA support, which Nvidia does not. Besides that, these chips are purposed for more than just phones: tablets, superphones, laptops, cars, TVs, microconsoles, set-top boxes, micro servers, etc.

Trust me, Nvidia is breathing fire on the competition.

posted on 05 Mar 2014, 08:18 3

7. silencer271 (Posts: 149; Member since: 05 Apr 2013)


Who cares about Nvidia anymore? They make a chip and advertise it and then... you only see it in 3 products.

posted on 05 Mar 2014, 08:58 4

11. BattleBrat (Posts: 1038; Member since: 26 Oct 2011)


I do, Nvidia has done more for mobile gaming than any other chipmaker, what would you need all that processing power for in a mobile device, if not for games? And who has done more for games in this field than Nvidia?

posted on 05 Mar 2014, 12:48 1

38. vincelongman (Posts: 936; Member since: 10 Feb 2013)


Agreed, Nvidia GPUs are currently by far the best, especially since mining has inflated AMD prices.
I can't wait for Maxwell!

posted on 05 Mar 2014, 10:43

29. renz4 (Posts: 202; Member since: 10 Aug 2013)


There are plenty of devices using the Tegra 4 chip. It's just that you might have heard less about them, because most of them don't make big news.

posted on 05 Mar 2014, 11:54

34. InspectorGadget80 (Posts: 6116; Member since: 26 Mar 2011)


And why care bout 64bit chips?

posted on 05 Mar 2014, 14:24

52. renz4 (Posts: 202; Member since: 10 Aug 2013)


marketing cares.

posted on 05 Mar 2014, 19:49

56. galanoth07 (Posts: 44; Member since: 23 Jan 2014)


I do. Nvidia makes good products. Maybe you know nothing about Nvidia

posted on 05 Mar 2014, 08:56

10. loveHEAVYMETAL (unregistered)


Hope this dual-core Denver version of the K1 chipset can run much cooler compared to the old Tegra, due to having fewer cores.
But isn't 3GHz normal for an ARMv8 A57 CPU? (The dual-core K1 CPU is based on the A57, am I wrong?)

posted on 05 Mar 2014, 09:52 1

20. livyatan (Posts: 643; Member since: 19 Jun 2013)


Denver core is not related to A57 at all.
It is derived from Nvidia's pseudo x86 core project.

posted on 06 Mar 2014, 06:13

63. loveHEAVYMETAL (unregistered)


o....ok thanks for your corrections :D

posted on 05 Mar 2014, 10:45 1

30. renz4 (Posts: 202; Member since: 10 Aug 2013)


it is custom built much like qualcomm Krait.

posted on 06 Mar 2014, 06:15

64. loveHEAVYMETAL (unregistered)


thanks for your correction too :D

posted on 05 Mar 2014, 09:22 2

13. rkoforever90 (Posts: 55; Member since: 03 Dec 2011)


Not to forget the lack of updates due to poor driver support.

posted on 05 Mar 2014, 09:36

16. Ishmeet (Posts: 111; Member since: 16 Sep 2013)


True that. Instead of launching products and advertising them, they should first learn to support older chipsets too.
Had they been giving proper driver support, we would have seen more mobile devices running Tegra.

posted on 05 Mar 2014, 14:05

45. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


I'm not so sure that trend will continue.

http://news.cnet.com/8301-1035_3-57618221-94/torvalds-gives-nvidia-software-thumbs-up-not-middle-finger/

posted on 05 Mar 2014, 09:46 1

17. StraightEdgeNexus (Posts: 2592; Member since: 14 Feb 2014)


44,000? That isn't exactly groundbreaking. The current Snapdragon 800/801 crop maxes out at around 36,000. The Snapdragon 805 will easily match the Tegra K1. Looks like the Tegra GPU will be the ace up the sleeve.

posted on 05 Mar 2014, 10:00

21. livyatan (Posts: 643; Member since: 19 Jun 2013)


Actually, the Snapdragon 805 is shown on that chart, with a supposed score of under 38,000. Considering that my Snapdragon 800 gets over 36,000 without any mods, I dismiss this as not credible.

Also, PA?
The quad-core Tegra K1 is shown to have an even higher score than the Denver version!

posted on 05 Mar 2014, 14:08

46. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


With half the cores? Seems reasonable to me.

A score of over 43,000 seems like a solid improvement, especially when they are more concerned about their GPU scores (which are through the roof!)

posted on 05 Mar 2014, 09:49

18. livyatan (Posts: 643; Member since: 19 Jun 2013)


44,000?
Hmm, call me not impressed... considering the GPU, I would actually expect much more.

posted on 05 Mar 2014, 14:10

47. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


You are right, look at the 3D chart: it shows a tiny improvement over Tegra 4, when we know in fact that the improvement is on the order of multiples.

posted on 05 Mar 2014, 10:05 1

23. mas39 (Posts: 40; Member since: 17 Jan 2013)


I agree, 44,000 is nothing; the S805 will do that, and the S810 to come in the second half of 2014 will blow it away. Also, Tegra 3 was awful; I had it in my LG 4X HD, a very poor chip indeed.

posted on 05 Mar 2014, 14:12

48. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


Tegra 3 was awful, but the Tegra K1 is a beast.

I don't get the fanboyism for Snapdragon, I mean, their year-over-year improvements are laughable. They are riding the lazy river right now; I don't know why you would root for the monopoly leader that isn't even the best.

posted on 05 Mar 2014, 12:11

36. Federated (Posts: 146; Member since: 06 Mar 2010)


I'm excited for the Qualcomm Snapdragon 805 in the Fall. Sony Xperia Z3 for me!

posted on 05 Mar 2014, 12:18

37. Federated (Posts: 146; Member since: 06 Mar 2010)


In my opinion, when it hits an actual real-world device it will be 7,000-8,000 points lower than that. It will be clocked slower and they're going to tweak power consumption. Otherwise the device with this SoC will only last 4 hours max on a normal use. lol

posted on 05 Mar 2014, 14:12

49. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


in my opinion it will pass 1,000,000,000

posted on 05 Mar 2014, 12:50

39. 1701nino (Posts: 227; Member since: 07 Dec 2010)


I think this year is going to be great for Nvidia, not earth-shattering but great. Last year they didn't have a dedicated 4G LTE modem included in their SoC; now they have one. If the driver support and power consumption are OK, then all of the big players will consider putting Nvidia's chips in their phones and tablets. The more competition the merrier.

posted on 05 Mar 2014, 13:55

42. JMartin22 (Posts: 712; Member since: 30 Apr 2013)


Screw the benchmarks, these chips are close to being obsolete. Cortex-A15 is not nearly as powerful or as efficient as Cortex-A57 running on ARMv8 instruction.

posted on 05 Mar 2014, 14:14

50. grahaman27 (Posts: 345; Member since: 05 Apr 2013)


tip: the "denver" chip uses the ARMv8 instruction set.

posted on 05 Mar 2014, 14:39

54. networkdood (Posts: 6244; Member since: 31 Mar 2010)


Beats my 35k score on my Nexus 5....

posted on 05 Mar 2014, 19:59

57. galanoth07 (Posts: 44; Member since: 23 Jan 2014)


Nice one Nvidia. I hope this chip will be more efficient and will not run hot. Nvidia makes the best PC GPUs but I don't know about their mobile chips.

posted on 05 Mar 2014, 20:38 1

58. Edmund (Posts: 654; Member since: 13 Jul 2012)


Oh great, now we have butt-hurt Qualcomm fanboys to add to the growing list of tech idiots.
