Intel Atom Z2580 AnTuTu score drops 20% after revision to site
The AnTuTu site has since been revised, and the benchmark score for the Intel Atom Z2580 is now 20% lower, with the RAM score itself cut in half. Scores for other processors remained the same. The 20% cut in the Z2580's score now puts the Samsung Exynos 5 Octa on top of the Intel silicon, based on benchmarking the Exynos 5 Octa-powered Samsung Galaxy S4 against the Intel Atom Z2580-powered Lenovo K900.
Things could get wild next month, when a major revision to the AnTuTu testing standards is scheduled.
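For a rough sense of the arithmetic, here is a toy sketch in Python of how halving one subscore moves a composite total. The subscores below are invented for illustration; AnTuTu does not publish this exact breakdown for the Z2580.

```python
# Toy model: AnTuTu-style totals are roughly the sum of per-category
# subscores, so halving one category shaves off that category's share.
# All numbers here are invented for the example.
def composite_score(subscores):
    """Sum the per-category subscores into a single total."""
    return sum(subscores.values())

before = {"cpu": 10000, "ram": 8000, "gpu": 5000, "io": 4000}
after = dict(before, ram=before["ram"] // 2)  # RAM subscore cut in half

drop = 1 - composite_score(after) / composite_score(before)
print(f"total drops by {drop:.0%}")  # 15% here, since RAM is a large share
```

The size of the overall drop depends entirely on how heavily the halved category is weighted in the total, which is exactly why a RAM-score revision can move the whole ranking.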
source: EETimes via SlashGear
19. ihatesmartphone (unregistered)
Don't care much about benchmark. lol XD
20. TheLolGuy (Posts: 469; Member since: 05 Mar 2013)
After stepping on AMD's toes with anti-competitive tactics to stay ahead with an inferior product, I no longer believe any numbers or promises they make or release.
They have a giant operation and roughly 15 times the R&D budget of AMD (or so I've read), so I don't doubt they'll bring something competitive. Just don't read too much into their claims.
21. ihatesmartphone (unregistered)
So true. They have long held a monopoly in the PC market (much like Microsoft; not many people use Linux or Mac OS, lol), and now they want to enter the mobile business. Good luck to them, because...
they are entering a very competitive market!! :D
2. TylerGrunter (Posts: 780; Member since: 16 Feb 2012)
The AnTuTu score for the Lenovo K900 was quite fishy, and I always said to wait for the results of some real CPU tests like Geekbench before crowning the Z2580. That just turned out to be true.
Kind of expected, wasn't it?
3. Edmund (Posts: 645; Member since: 13 Jul 2012)
The Exynos chip consumes more power; it also has 6 more cores.
5. iushnt (Posts: 461; Member since: 06 Feb 2013)
It's not 6 more cores, it's only 2 more cores when it comes to real-world operation. Even though it is said to consume more power, it is still the second-best battery performer in its category, after the SD600 version of the GS4. It's only slightly behind the GS4 SD600, but well ahead of the Lumia, Xperia Z, iPhone 5, and HTC One.
7. Shatter (Posts: 1763; Member since: 29 May 2013)
At max power consumption the Atom uses about half of what a quad-core A15 uses. Atom is the really crappy low-end chip that Intel makes. Wait until the other chip series get their power consumption low enough to use in phones.
Bay Trail will be significantly better than the A15, but as of right now it looks like it's for $200-$400 range tablets only.
13. livyatan (Posts: 559; Member since: 19 Jun 2013)
Can you be more wrong?
Atom Z2560 (1.6GHz): 6.7W TDP
A15 Exynos dual (1.7GHz): 9.1W TDP
Atom E3850 (Bay Trail-I): 1.91GHz quad-core, 10W TDP, 792MHz GPU, price around $150 (an estimate based on the $132 price of the quad-core 1.60GHz Bay Trail-M Celeron with a 7.5W TDP and 756MHz GPU)
Silvermont looks ridiculously overpriced and will be nowhere near competitive. The ARM-based SoCs will trash it in GPU performance, be roughly on par in CPU performance, and be far more affordable for OEMs.
14. hung2900 (Posts: 711; Member since: 02 Mar 2012)
The 8W TDP of the Exynos 5250 (as I remember from AnandTech, not 9.1W) applies only theoretically, when both the CPU and GPU run at maximum. However, this is never the case, due to the throttling system (which limits it to 4W), and no task needs that much power.
Don't confuse TDP with real battery usage, or the Exynos 5250 with the Exynos 5410.
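The distinction drawn here can be sketched as a toy model: TDP is a worst-case design figure, while a platform power cap (throttling) bounds what the SoC actually draws under sustained load. The 8W and 4W figures follow the comment above; the clamp itself is a simplification for illustration, not how any real power governor works.

```python
# Toy model of TDP vs. a throttling cap. Numbers taken from the comment
# above (8W theoretical TDP, ~4W sustained platform limit); the clamping
# logic is an illustrative simplification.
TDP_W = 8.0           # theoretical max: CPU and GPU both at full tilt
PLATFORM_CAP_W = 4.0  # sustained limit enforced by throttling

def sustained_draw(requested_w, cap_w=PLATFORM_CAP_W):
    """Clamp the requested power to the platform cap."""
    return min(requested_w, cap_w)

# Even a workload that would nominally hit TDP is held to the cap:
print(sustained_draw(TDP_W))  # 4.0
print(sustained_draw(3.0))    # 3.0 (light loads are unaffected)
```

The point being made: comparing TDP figures alone says little about battery life, because sustained draw is governed by the cap, not the worst-case rating.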
17. livyatan (Posts: 559; Member since: 19 Jun 2013)
Go to notebookcheck.net (best reviews on the net imo).
On the Nexus 10, they measured 9.2-9.4W.
Interestingly enough, for the Atom Z2760, with its paltry GPU, they got an even higher value: 9.5W.
16. livyatan (Posts: 559; Member since: 19 Jun 2013)
I cannot post links, so I'll just give a copy-paste breakdown.
It's official data from Intel.
Intel Bay Trail-I (Atom E-series)
-Atom E3810 – 1.46 GHz single-core CPU w/400 MHz GPU and 5W TDP
-Atom E3821 – 1.33 GHz dual-core CPU w/533 MHz GPU and 6W TDP
-Atom E3822 – 1.46 GHz dual-core CPU w/667 MHz GPU and 7W TDP
-Atom E3823 – 1.75 GHz dual-core CPU w/792 MHz GPU and 8W TDP
-Atom E3840 – 1.91 GHz quad-core CPU w/792 MHz GPU and 10W TDP
Intel Bay Trail-M (Celeron, Pentium for notebooks, convertibles)
-Celeron N2805 – 1.46 GHz dual-core CPU w/667 MHz GPU and 4.5W TDP
-Celeron N2810 – 2 GHz dual-core CPU w/756 MHz GPU and 7.5W TDP
-Celeron N2910 – 1.6 GHz quad-core CPU w/756 MHz GPU and 7.5W TDP
-Pentium N3510 – 2 GHz quad-core CPU w/750 MHz GPU and 7.5W TDP
Intel Bay Trail-D (Desktops)
-Celeron J1750 – 2.41 GHz dual-core CPU w/792 MHz GPU and 10W TDP
-Celeron J1850 – 2 GHz quad-core CPU w/792 MHz GPU and 10W TDP
-Pentium J2850 – 2.41 GHz quad-core CPU w/792 MHz GPU and 10W TDP
There's also unofficial mention of a supposed quad-core 2.4GHz(??) mobile reference chip, Bay Trail-T, with a 2.5W SDP and no TDP revealed, but it should be at least in the below-5W range to compete with the Snapdragon 800.
22. Shatter (Posts: 1763; Member since: 29 May 2013)
Bay Trail has a ton of variants. Some of them are geared towards battery life, others towards maximum performance.
25. TylerGrunter (Posts: 780; Member since: 16 Feb 2012)
That's what Intel and Intel fans have been saying for the last 3 years: wait until XXXX comes out and it will smoke the ARM chips. And for the last 3 years it never came true.
Will it ever be true? I'm really starting to wonder.
4. FlushGordon (unregistered)
There have been misleading reports about the K900's microSD card support; does it have one?
8. livyatan (Posts: 559; Member since: 19 Jun 2013)
Well, well... I kept saying that the memory scores for Atom in AnTuTu are very misleading and that the benchmark needs to be ignored. The fact is that Clover Trail+ is beaten by 50-150% in other benchmarks.
So yep, TOLD YOU SO, Intel fanboys!
BTW, the graphics tests on AnTuTu are worthless too: I noticed basically the same scores for the Galaxy S3 and Galaxy S4.
9. FlushGordon (unregistered)
I'd much rather trust 3DMark
Some flagship models tend to deflate or inflate scores (depending on the AnTuTu version).
The HTC DNA was a prime example; now we have the K900.
11. Pedro0x (Posts: 270; Member since: 19 Oct 2012)
AnTuTu is worthless; it is too easy to optimize for your phone. You can optimize a Nexus 4 so it gets 29,000 points (I think), which is more than the SGS4. And yes, Atoms do not have good memory scores, but Silvermont should change that.
23. Shatter (Posts: 1763; Member since: 29 May 2013)
Silvermont has DDR3 support; I think it also runs at a somewhat higher clock than the A15/Krait can manage.
10. kanagadeepan (Posts: 589; Member since: 24 Jan 2012)
Another piece of proof that benchmark scores have nothing to do with real-life performance, and hence purchase decisions should never be based on them, imho...
12. FlushGordon (unregistered)
Oh, come on!
By your reasoning, you wouldn't care whether the phone you're purchasing is stuffed with a crappy Adreno 203 or a PowerVR SGX544MP.
Benchmarks pretty much give you an idea (although not always a precise one) of how phones will perform in the real world.
24. Shatter (Posts: 1763; Member since: 29 May 2013)
Not really. The MediaTek turbo at 1.5GHz runs everything smoothly and can play every 3D game on the Play Store. After a certain point, the benefits of more power start declining.
More power is not bad at all, but it does drain the battery much faster. A phone with a MediaTek turbo can be expected to get around 2 days of battery life, at least with regular use. An A15, with regular use, will need a charge almost every day.
26. XiphiasGladius (Posts: 799; Member since: 21 Aug 2011)
I’m waiting for midrange phones (price wise) powered by that particular Mediatek chip to hit the mainstream market.
27. FrostNerdy (Posts: 1; Member since: 14 Jul 2013)
I don't care about the benchmark; it depends on the user. Just buy a GS4 and, at the same time, a Lenovo K900, then compare them! Check which is the best of the two; sometimes benchmarks are unreliable.