64-bit, dual-core Nvidia K1 Denver chipset smashes the competition on AnTuTu
Nvidia K1 Denver benchmarked
1. ArtSim98 (limited) (Posts: 2738; Member since: 21 Dec 2012)
Snapdragon 801 and 805 have almost no difference? I wonder if Qualcomm is still going to release another chip after the 805 this year?
6. itsdeepak4u2000 (Posts: 2606; Member since: 03 Nov 2012)
But Nvidia should move fast with these chips, because Qualcomm is driving most of the major phones.
8. rd_nest (Posts: 768; Member since: 06 Jun 2010)
Before everyone jumps the gun, check the whopping power it consumes:
15. Chris.P (Posts: 270; Member since: 27 Jun 2013)
That's the K1 quad, not Denver. But yeah, power consumption is obviously a pretty major concern.
19. jove39 (Posts: 1278; Member since: 18 Oct 2011)
No wonder Cyclone cores are clocked at 1.3-1.4 GHz.
Active cooling in the demo unit may be there to sustain max frequencies for long periods of time. The K1 may have a higher than average power draw, but it's way overblown in the referred article.
Also, with mobile SoCs, all manufacturers hide the actual sustainable speed and publish only the boost/turbo speed of the CPU.
The S800 in the Nexus 5 could sustain only 1.2 GHz (against a published max of 2.26 GHz) after continuous use of a little over 3 minutes.
Detailed article here: http://arstechnica.com/gadgets
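A quick back-of-the-envelope sketch of that throttling gap, using the clock figures cited in this comment (real sustained clocks vary by device and workload):

```python
# Sustained vs. published peak clock for the Snapdragon 800 in the Nexus 5,
# using the numbers cited above: 2.26 GHz advertised peak, 1.2 GHz sustained.
peak_ghz = 2.26
sustained_ghz = 1.2

fraction = sustained_ghz / peak_ghz
print(f"Sustained clock is {fraction:.0%} of the published peak")
# Roughly half the advertised speed after a few minutes of continuous load.
```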
22. Finalflash (Posts: 1720; Member since: 23 Jul 2013)
So be it, but Nvidia has a way of skewing the numbers, especially about power consumption. They release these tech demonstrations with insane power consumption up against actual mobile chips in a mobile environment. That is the main reason almost no Tegra 4 mobile devices actually exist (maybe 1 or 2). Tegra 3 was promised to deliver the same thing, but that was the first time anyone fell for that game, and never again. No K1 design wins are being seen either, largely because of the power problem, and now with Denver I will assume the same problem will rear its head again (otherwise they would have released power numbers).
27. jove39 (Posts: 1278; Member since: 18 Oct 2011)
Power consumption in Tegra CPU cores is not Nvidia's flaw... it's an issue with ARM CPU cores. Here is an article that explains this (from 2011) -
Here is an extract from another article that suggests the benefit of moving to a custom core -http://www.extremetech.com/gam
"move to Denver CPUs will not include the 4-plus-1 core configuration. Nvidia originally implemented the fifth low-power core in Tegra 3 as an alternative to dynamic clock speed adjustments, which are not supported by ARM’s Cortex designs. Building the custom Denver CPU probably allowed it to better manage power consumption without resorting to extra cores. This is one of the benefits of licensing the ARM instruction set to create a custom core rather than simply using the Cortex reference designs. It’s the same thing Qualcomm has been doing with Krait and Scorpion CPUs in its Snapdragon SoCs for years."
28. renz4 (Posts: 227; Member since: 10 Aug 2013)
The lack of Tegra 4 based phones is understandable because of the lack of an integrated modem, but in tablets they are doing fine. In fact, there are more tablets based on Tegra 4 than on Snapdragon 800, so saying only one or two devices use Tegra 4 is plain wrong.
26. lallolu (Posts: 230; Member since: 18 Sep 2012)
I do not trust nvidia chips at all in terms of power consumption and heat generation.
32. true1984 (Posts: 588; Member since: 23 May 2012)
I don't trust them due to their history with support.
31. renz4 (Posts: 227; Member since: 10 Aug 2013)
If the source is SemiAccurate, then take it with a grain of salt whenever Charlie D. is talking about Nvidia.
40. Finalflash (Posts: 1720; Member since: 23 Jul 2013)
But he has been accurate with most of the stuff that came out that others had no idea about. I have followed his stuff for a while, and although he is very anti-Nvidia at times, he isn't exactly wrong. For that matter, Brian Klug from AnandTech also referenced him in his review of the K1, so in this case I am more willing to take his word on the matter.
51. renz4 (Posts: 227; Member since: 10 Aug 2013)
At times? It is more like at all times. But of course he can't totally trash Nvidia when the results or facts clearly favor Nvidia's side. Sometimes he is right and sometimes he is wrong, but even if he knows something good about Nvidia, he will try to spin it to look negative. For example, he knows the GK104 is good and competitive with AMD's Tahiti, but then he starts speculating to make the GK104 look bad.
55. Finalflash (Posts: 1720; Member since: 23 Jul 2013)
Doesn't matter though, because we have the same caveats being thrown out by AnandTech too, and they're pretty reliable most of the time. Nvidia has pulled this stunt before, and I wouldn't put it past them to try it again. I'll believe it when they put it in a mobile phone or tablet, because until then these are just numbers.
41. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
That would be based on a PROTOTYPE!! Once again, "semiaccurate.com" lives up to its name.
53. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
Take a look and actually read that article: he is referencing the power supply to the prototype, screaming "60 watts!!!"
As anyone with an electrical background knows, the power supply rating does not indicate the power draw, especially for a prototype. This article is just garbage, unequivocally trashing Nvidia.
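The point about supply ratings can be made concrete with some arithmetic. The voltage and current figures below are hypothetical, purely to illustrate that a rating is a ceiling, not a measurement:

```python
# A bench supply's wattage rating is the maximum it can deliver,
# not what the attached board actually draws. Hypothetical figures:
supply_rating_w = 60.0   # the "60 watts" printed on the supply
measured_volts = 12.0    # hypothetical rail voltage at the board
measured_amps = 0.9      # hypothetical measured current under load

actual_draw_w = measured_volts * measured_amps   # P = V * I
headroom_w = supply_rating_w - actual_draw_w

print(f"Actual draw: {actual_draw_w:.1f} W; unused headroom: {headroom_w:.1f} W")
# Prototypes are deliberately over-supplied, so the supply's rating
# says almost nothing about the chip's real power consumption.
```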
12. akki20892 (Posts: 3434; Member since: 04 Feb 2013)
The Snapdragon 610 & 615 are 64-bit; I hope they launch the 800 series with 64-bit soon.
14. ArtSim98 (limited) (Posts: 2738; Member since: 21 Dec 2012)
Yeah. But it would be just weird if they released the first phones with the 805 this summer and then we already got 64-bit ones in the fall. Well, maybe we will get the high-end 64-bit chips from Qualcomm next year. TBH I wouldn't mind that, since 64-bit still has a long way to go.
2. draconic1991 (Posts: 130; Member since: 27 Apr 2012)
Can't wait for the newest chip wars in the second half of the year...
3. anirudhshirsat97 (Posts: 386; Member since: 24 May 2011)
Nvidia always shows that they can produce powerful chipsets. The problem is we don't see them in phones. I personally would like to own a high-end Tegra phone if they deliver good battery life.
4. draconic1991 (Posts: 130; Member since: 27 Apr 2012)
I would love them too, but there is one other problem besides battery life... like with Tegra 3, apps needed optimisation for Tegra chips... hated that... I don't know if they have solved that issue, but if they have yet to rectify it, I hope they do it fast.
24. Finalflash (Posts: 1720; Member since: 23 Jul 2013)
It wasn't that they needed optimizations; it's that they could have additional features not available on other SoCs. So that wasn't a problem, it was mostly a feature. The problem with Tegra 3 was its power consumption, which forced the chip to be underclocked (and it was pretty weak compared to the competition). Same problem again: they can't make a chip for mobile to save their lives, and this time no manufacturer is falling for the promises they won them over with for the Tegra 3.
35. draconic1991 (Posts: 130; Member since: 27 Apr 2012)
I don't know. I think I heard many reviewers and users say that some apps needed optimizations to run on the One X, but I might be wrong.
5. _Bone_ (Posts: 2125; Member since: 29 Oct 2012)
Remember the Tegra 4 blowing away the competition last year? It is available in exactly ZERO smartphones in the Western world. Snapdragon is going to dominate because, more than being fast, their SoCs are more reliable, more efficient, and run cooler.
9. BattleBrat (Posts: 1070; Member since: 26 Oct 2011)
Reliable? I have 3 Tegra devices, and they all work fine. Efficient? My Tegra 3 tablet runs for days. Nvidia is not caving in to the pressures of the Eastern markets and releasing an octa-core product. Instead they're going the dual-core route, which apparently makes more sense for a mobile device. Go to AnandTech; they wrote something about it.
25. Finalflash (Posts: 1720; Member since: 23 Jul 2013)
AnandTech also noted the power problem that Nvidia keeps hiding, even going so far as to reference the semiaccurate.com articles that questioned it. Nvidia chips are horribly inefficient, largely because they literally took a hammer to their desktop lines and mashed them together with a few other needed SoC components. They are trying to kill two birds with one stone and hitting neither.
44. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
That's simply untrue. Here are some excerpts from AnandTech:
"In the case of Tegra K1’s A15s, the main improvements here have to do with increasing power efficiency. With r3p0 (which r3p3 inherits) ARM added more fine grained clock gating, which should directly impact power efficiency."
"[Nvidia] is the first (excluding Apple/Intel) to come to the realization that four cores may not be the best use of die area in pursuit of good performance per watt in a phone/tablet design."
"The data is presented in NVIDIA’s usual way where we’re not looking at peak performance but rather how Tegra K1 behaves when normalized to the performance of Apple’s A7 or Qualcomm’s Snapdragon 800. In both cases NVIDIA is claiming the ability to deliver equal performance at substantially better power efficiency."
43. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
Nvidia has gone on record saying they are not focusing on American phones, because in order to compete in America you have to have CDMA support, which Nvidia does not. Besides that, these chips are purposed for more than just phones: tablets, superphones, laptops, cars, TVs, microconsoles, set-top boxes, micro servers, etc.
Trust me, Nvidia is breathing fire on the competition.
7. silencer271 (Posts: 165; Member since: 05 Apr 2013)
Who cares about Nvidia anymore? They make a chip and advertise it, and then... you only see it in 3 products.
11. BattleBrat (Posts: 1070; Member since: 26 Oct 2011)
I do. Nvidia has done more for mobile gaming than any other chipmaker. What would you need all that processing power for in a mobile device, if not for games? And who has done more for games in this field than Nvidia?
38. vincelongman (Posts: 1141; Member since: 10 Feb 2013)
Agreed, Nvidia GPUs are currently by far the best, especially since mining has inflated AMD prices.
I can't wait for Maxwell!
29. renz4 (Posts: 227; Member since: 10 Aug 2013)
There are plenty of devices using the Tegra 4 chip. It's just that you might have heard less about them because most of them don't make big news.
56. galanoth07 (Posts: 44; Member since: 23 Jan 2014)
I do. Nvidia makes good products. Maybe you know nothing about Nvidia
10. loveHEAVYMETAL (unregistered)
Hope this dual-core Denver version of the K1 chipset can run much cooler compared to the old Tegra, due to having fewer cores.
But 3 GHz was normal for an ARMv8 A57 CPU, wasn't it? (The dual-core K1 CPU is based on the A57, am I wrong?)
20. livyatan (Posts: 691; Member since: 19 Jun 2013)
The Denver core is not related to the A57 at all.
It is derived from Nvidia's pseudo-x86 core project.
63. loveHEAVYMETAL (unregistered)
o....ok thanks for your corrections :D
30. renz4 (Posts: 227; Member since: 10 Aug 2013)
It is custom-built, much like Qualcomm's Krait.
64. loveHEAVYMETAL (unregistered)
thanks for your correction too :D
13. rkoforever90 (Posts: 78; Member since: 03 Dec 2011)
Not to forget the lack of updates due to poor driver support.
16. Ishmeet (Posts: 111; Member since: 16 Sep 2013)
True that. Instead of launching products and advertising them, they should first learn to support older chipsets too.
Had they been giving proper driver support, we would have seen more mobile devices running Tegra.
45. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
I'm not so sure that trend will continue.
17. StraightEdgeNexus (Posts: 3197; Member since: 14 Feb 2014)
44,000 isn't exactly groundbreaking. The current Snapdragon 800/801 crop maxes out at 36,000. The Snapdragon 805 will easily match the Tegra K1. Looks like the Tegra GPU will be the ace up the sleeve.
21. livyatan (Posts: 691; Member since: 19 Jun 2013)
Actually, the Snapdragon 805 is shown on that chart,
with a supposed score of under 38,000.
Considering that my Snapdragon 800 gets over 36,000 without any mods, I dismiss this as not credible.
The quad-core Tegra K1 is shown to have an even higher score than the Denver version!
46. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
With half the cores? Seems reasonable to me.
A score of over 43,000 seems like a solid improvement to me, especially when they are more concerned about their GPU scores (which are through the roof!)
18. livyatan (Posts: 691; Member since: 19 Jun 2013)
Hmm, call me unimpressed... considering the GPU, I would actually expect much more.
47. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
You are right; look at the 3D chart. It shows a tiny improvement over Tegra 4, when we know in fact that the improvement is on the order of multiples.
23. mas39 (Posts: 60; Member since: 17 Jan 2013)
I agree, 44,000 is nothing; the S805 will do that, and the S810 to come in the second half of 2014 will blow it away. Also, Tegra 3 was awful; I had it in my LG 4X HD, a very poor chip indeed.
48. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
Tegra 3 was awful, but the Tegra K1 is a beast.
I don't get the fanboyism for Snapdragon. I mean, their year-over-year improvements are laughable. They are riding the lazy river right now; I don't know why you would root for the monopoly leader that isn't even the best.
36. Federated (Posts: 227; Member since: 06 Mar 2010)
I'm excited for the Qualcomm Snapdragon 805 in the Fall. Sony Xperia Z3 for me!
37. Federated (Posts: 227; Member since: 06 Mar 2010)
In my opinion, when it hits an actual real-world device it will be 7,000-8,000 points lower than that. It will be clocked slower, and they're going to tweak power consumption. Otherwise a device with this SoC will only last 4 hours max on normal use. lol
49. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
in my opinion it will pass 1,000,000,000
39. 1701nino (unregistered)
I think this year is going to be great for Nvidia; not earth-shattering, but great. Last year they didn't have a dedicated 4G LTE modem included in their SoC; now they do. If the driver support and power consumption are OK, then all of the big players will consider putting Nvidia's chips in their phones and tablets. The more competition the merrier.
42. JMartin22 (limited) (Posts: 973; Member since: 30 Apr 2013)
Screw the benchmarks, these chips are close to being obsolete. The Cortex-A15 is not nearly as powerful or as efficient as the Cortex-A57 running the ARMv8 instruction set.
50. grahaman27 (Posts: 347; Member since: 05 Apr 2013)
Tip: the "Denver" chip uses the ARMv8 instruction set.
57. galanoth07 (Posts: 44; Member since: 23 Jan 2014)
Nice one Nvidia. I hope this chip will be more efficient and will not run hot. Nvidia makes the best PC GPUs but I don't know about their mobile chips.
58. Edmund (Posts: 654; Member since: 13 Jul 2012)
Oh great, now we have butt-hurt Qualcomm fanboys to add to the growing list of tech idiots.