64-bit, dual-core Nvidia K1 Denver chipset smashes the competition on AnTuTu

Back during CES 2014 in Las Vegas, Nvidia came out with an important announcement: the company is now laser-focused on mobile chipsets. The silicon slinger demoed two new processors, both under the Tegra K1 moniker, but they differ greatly in terms of their underlying architecture. The 32-bit, quad-core K1, which has already been benchmarked, is based on four Cortex-A15 cores from ARM. In comparison, Nvidia is using two in-house Denver 'super-cores' in the second version, and there we're talking a 64-bit architecture.

As you gleaned from the title, we now have our first benchmark of the chip, even though it's supposedly not due until the second half of 2014. The AnTuTu team, always on the lookout for unreleased goodies, caught an instance of the chip taking a stroll through its territory and instantly sounded the alarm. The score, at nearly 44,000, is pretty amazing, overshadowing even the most potent chips currently on the market. Moreover, Nvidia appears to have worked some kind of RAM magic, as the score in that particular test is through the roof.

Now, before you jump to too many conclusions, there are two disclaimers to keep in mind. For starters, Nvidia has said that its Denver platform will peak at 2.5GHz, whereas the benchmarked chip is shown running at 3GHz, which definitely helps with the final score. Secondly, as great as these results are, this type of performance won't necessarily carry over to actual, shipping products, as there are numerous variables that manufacturers have to tweak, not least of which is power consumption. How well Nvidia has managed that oh-so-important aspect with Denver remains unclear.

Before we go, we've gotta say that this situation reeks of irony. It was Nvidia, with its Tegra 3 chip, that first started the quad-core craze in mobile silicon, one that eventually led to things like octa-core processors. It's funny to see that very same Nvidia now taking a step back and considering a different, less core-crazed approach.



1. ArtSim98

Posts: 3535; Member since: Dec 21, 2012

Snapdragon 801 and 805 have almost no difference? I wonder if Qualcomm is still going to release another chip after the 805 this year?

6. itsdeepak4u2000

Posts: 3718; Member since: Nov 03, 2012

But Nvidia should move fast with these chips, because Qualcomm is driving most of the major phones.

8. rd_nest

Posts: 1656; Member since: Jun 06, 2010

Before everyone jumps the gun, check the whopping power it consumes: http://semiaccurate.com/2014/03/05/nvidias-tegra-k1-draws-shocking-number-watts/

15. Chris.P

Posts: 567; Member since: Jun 27, 2013

That's the K1 quad, not Denver. But yeah, power consumption is obviously a pretty major concern.

19. jove39

Posts: 2147; Member since: Oct 18, 2011

No wonder Cyclone cores are clocked at 1.3-1.4GHz. The active cooling in the demo unit is maybe there to sustain max frequencies for long periods of time. The K1 may have a higher-than-average power draw, but it's way overblown in the referred article. Also, in mobile SoCs, all manufacturers hide the actual sustainable speed and publish only the boost/turbo speed of the CPU. The S800 in the Nexus 5 could sustain only 1.2GHz (against a published max of 2.26GHz) after continuous use of a little over 3 minutes. Detailed article link here - http://arstechnica.com/gadgets/2013/11/when-benchmarks-arent-enough-cpu-performance-in-the-nexus-5/

22. Finalflash

Posts: 4063; Member since: Jul 23, 2013

So be it, but Nvidia has a way of skewing the numbers, especially about power consumption. They release these tech demonstrations with insane power consumption against actual mobile chips in a mobile environment. That is the main reason almost no Tegra 4 mobile devices actually exist (maybe 1 or 2). Tegra 3 promised to deliver the same thing, but that was the first time anyone fell for that game, and never again. No K1 design wins are being seen either, largely because of the power problem, and now with Denver I will assume the same problem will rear its head (otherwise they would have released power numbers).

27. jove39

Posts: 2147; Member since: Oct 18, 2011

Power consumption in Tegra CPU cores is not Nvidia's flaw... It's an issue with ARM CPU cores. Here is an article that explains this (from 2011) - http://www.extremetech.com/computing/99721-how-snapdragon-s4-and-tegra-3-manage-arm-cores-differently Here is an extract from another article that suggests the benefit of moving to a custom core - http://www.extremetech.com/gaming/173975-nvidias-tegra-k1-soc-kepler-gpu-paired-with-64-bit-denver-cpu-coming-in-2014 "move to Denver CPUs will not include the 4-plus-1 core configuration. Nvidia originally implemented the fifth low-power core in Tegra 3 as an alternative to dynamic clock speed adjustments, which are not supported by ARM’s Cortex designs. Building the custom Denver CPU probably allowed it to better manage power consumption without resorting to extra cores. This is one of the benefits of licensing the ARM instruction set to create a custom core rather than simply using the Cortex reference designs. It’s the same thing Qualcomm has been doing with Krait and Scorpion CPUs in its Snapdragon SoCs for years."

28. renz4

Posts: 319; Member since: Aug 10, 2013

The lack of Tegra 4-based phones is understandable because of the lack of an integrated modem, but in tablets they are doing fine. In fact, there are more tablets based on Tegra 4 than on Snapdragon 800, so saying only one or two devices use Tegra 4 is plain wrong.

26. lallolu

Posts: 733; Member since: Sep 18, 2012

I do not trust nvidia chips at all in terms of power consumption and heat generation.

32. true1984

Posts: 869; Member since: May 23, 2012

i dont trust them due to their history with support

31. renz4

Posts: 319; Member since: Aug 10, 2013

If the source is SemiAccurate, then take it with a grain of salt whenever Charlie D. is talking about Nvidia.

33. jove39

Posts: 2147; Member since: Oct 18, 2011

As the name suggests, semiaccurate :)

40. Finalflash

Posts: 4063; Member since: Jul 23, 2013

But he has been accurate about most of the stuff that came out that others had no idea about. I have followed his stuff for a while, and although he is very anti-Nvidia at times, he isn't exactly wrong. For that matter, Brian Klug from AnandTech also referenced him in his review of the K1, so in this case I am more willing to take his word on the matter.

51. renz4

Posts: 319; Member since: Aug 10, 2013

At times? It is more like at all times. But of course he can't totally trash Nvidia when the facts clearly favor Nvidia's side. Sometimes he is right and sometimes he is wrong, but even if he knows something good about Nvidia, he will try to spin it to look negative. For example, he knew the GK104 was good and competitive with AMD's Tahiti, but then he started to speculate to make the GK104 look bad.

55. Finalflash

Posts: 4063; Member since: Jul 23, 2013

Doesn't matter though, because we have the same caveats being thrown out by AnandTech too, and they're pretty reliable most of the time. Nvidia has pulled this stunt before and I wouldn't put it past them to try it again. I'll believe it when they put it in a mobile phone or tablet, because until then these are just numbers.

41. grahaman27

Posts: 364; Member since: Apr 05, 2013

That would be based on a PROTOTYPE!! Once again, "semiaccurate.com" lives up to its name.

53. grahaman27

Posts: 364; Member since: Apr 05, 2013

Take a look and actually read that article - he is referencing the power supply to the prototype... saying "60 watts!!!" As anyone with an electrical background knows, the power supply rating does not indicate the power draw, especially for a prototype. This article is just garbage, unequivocally trashing Nvidia.

12. akki20892

Posts: 3902; Member since: Feb 04, 2013

The Snapdragon 610 & 615 are 64-bit. I hope they launch the 800 series with 64-bit soon.

14. ArtSim98

Posts: 3535; Member since: Dec 21, 2012

Yeah. But it would be just weird if they released the first phones with the 805 this summer and then we already got 64-bit ones in the fall. Well, maybe we will get the high-end 64-bit chips from Qualcomm next year. TBH I wouldn't mind that, since 64-bit still has a long way to go.

2. draconic1991

Posts: 200; Member since: Apr 27, 2012

Can't wait for the newest chip wars in the second half of the year...

3. anirudhshirsat97

Posts: 408; Member since: May 24, 2011

Nvidia always shows that it can produce powerful chipsets. The problem is we don't see them in phones. I personally would like to own a high-end Tegra phone if they give good battery life.

4. draconic1991

Posts: 200; Member since: Apr 27, 2012

I would love them too, but there is one other problem besides battery life... like with Tegra 3, apps needed optimisation for Tegra chips... hated that... I don't know if they have solved that issue, but if they have yet to rectify it, I hope they do it fast.

24. Finalflash

Posts: 4063; Member since: Jul 23, 2013

It wasn't that they needed optimizations, it's that they could have additional features not available on other SoCs. So that wasn't a problem, it was mostly a feature. The problem with Tegra 3 was its power consumption, which forced the chip to be underclocked (and it was pretty weak compared to the competition). Same problem again, where they can't make a chip for mobile to save their lives, and this time no manufacturer is falling for the lies they won them over with for Tegra 3.

35. draconic1991

Posts: 200; Member since: Apr 27, 2012

I don't know. I think I heard many reviewers and users say that some apps needed optimizations to run on the One X, but I might be wrong.

5. _Bone_

Posts: 2155; Member since: Oct 29, 2012

Remember the Tegra 4 blowing away the competition last year? It is available in exactly ZERO smartphones in the Western world. Snapdragon is going to dominate because, more than being fast, their SoCs are more reliable, more efficient, and run cooler.

9. BattleBrat

Posts: 1476; Member since: Oct 26, 2011

Reliable? I have 3 Tegra devices, and they all work fine. Efficient? My Tegra 3 tablet runs for days. Nvidia is not caving in to the pressures of the eastern markets and releasing an octa-core product. Instead they're going the dual-core route, which apparently makes more sense for a mobile device. Go to AnandTech, they wrote something about it.

25. Finalflash

Posts: 4063; Member since: Jul 23, 2013

AnandTech also noted the power problem that Nvidia keeps hiding, even going so far as to reference the semiaccurate.com articles that questioned it. Nvidia's chips are horribly inefficient, largely because they literally took a hammer to their desktop lines and mashed them together with a few other needed SoC components. They are trying to hit two birds with one stone and hitting neither.

44. grahaman27

Posts: 364; Member since: Apr 05, 2013

That's simply untrue. Here are some excerpts from AnandTech: "In the case of Tegra K1’s A15s, the main improvements here have to do with increasing power efficiency. With r3p0 (which r3p3 inherits) ARM added more fine grained clock gating, which should directly impact power efficiency." "[Nvidia] is the first (excluding Apple/Intel) to come to the realization that four cores may not be the best use of die area in pursuit of good performance per watt in a phone/tablet design." "The data is presented in NVIDIA’s usual way where we’re not looking at peak performance but rather how Tegra K1 behaves when normalized to the performance of Apple’s A7 or Qualcomm’s Snapdragon 800. In both cases NVIDIA is claiming the ability to deliver equal performance at substantially better power efficiency."

43. grahaman27

Posts: 364; Member since: Apr 05, 2013

Nvidia has gone on record saying they are not focusing on American phones, because in order to compete in America you have to have CDMA support, which Nvidia does not. Besides that, these chips are purposed for more than just phones: tablets, superphones, laptops, cars, TVs, microconsoles, set-top boxes, micro servers, etc. Trust me, Nvidia is breathing fire on the competition.

7. silencer271

Posts: 254; Member since: Apr 05, 2013

Who cares about Nvidia anymore? They make a chip and advertise it, and then... you only see it in 3 products.
