TI shows off its dual-core Cortex A15-based OMAP 5 beating a quad-core Cortex A9 device
2. ledbetterp3 (Posts: 467; Member since: 31 Aug 2011)
I believe these are pretty accurate results. Can't wait!
3. protozeloz (Posts: 5387; Member since: 16 Sep 2010)
I believe that's a Prime; just looking at the icons on the bar tells you... those are ASUS icons
4. remixfa (Posts: 14255; Member since: 19 Dec 2008)
Why does the A9 keep looking like it's pausing right before it hits 100%? That's why it fell so far behind. Is that true to life? Is that how the T3 performs? Anyone got an Asus Prime to know for sure?
Anyway, that makes me super excited. I've been waiting with bated breath to see what TI was doing with OMAP, and that is most definitely a step in the right direction. Now let's see some other tests! :)
Super excited... I uh... think I won't be able to stand up straight for a lil bit. :)
9. protozeloz (Posts: 5387; Member since: 16 Sep 2010)
I have my doubts too, but until I can put my hands on one we can't really tell.
5. darac (Posts: 2156; Member since: 17 Oct 2011)
Well, after all they said it's the greatest platform on Earth, haha.. Paired with next-generation PowerVR graphics, I think this is indeed the best mobile chip.
Can't even imagine what kind of a beast quad core A15 with SGX6xx GPU will be
6. pokharkarsaga (Posts: 366; Member since: 23 Feb 2012)
Texas Instruments makes the best processors. They're much better than Qualcomm's. TI has shown that you shouldn't just increase core counts; the architecture should also be changed or improved.
7. JDCohen722 (Posts: 39; Member since: 18 May 2011)
People need to understand one thing: the Tegra 3 chipset has Variable Symmetric Multiprocessing (vSMP), which means it can turn cores off when they aren't needed. And it is a known fact that the Tegra 3 CPU only uses two cores in browsing and keeps the other two idle. Obviously a dual-core Cortex A15 will beat a dual-core A9.
8. remixfa (Posts: 14255; Member since: 19 Dec 2008)
That's actually not completely true. When Nvidia showed off that feature, it showed it loading some heavy web pages, and all 4 cores were indeed firing at times during the loading part. After certain things were loaded and there was no more benefit to the 3rd and 4th cores, they shut down. They fire on and off depending on load.
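The behavior described here, cores coming online under load and shutting back down afterward, is essentially load-driven core hotplugging. A minimal sketch of the idea follows; the thresholds and the function name are hypothetical illustrations, not NVIDIA's actual governor logic:

```python
# Illustrative sketch of load-driven core hotplugging, loosely modeled on
# what a vSMP-style scheduler does. All thresholds are hypothetical.

def cores_to_enable(load_pct: float, max_cores: int = 4) -> int:
    """Decide how many cores should be online for a given total CPU load."""
    if load_pct < 20:        # light load: run on a single core
        return 1
    if load_pct < 50:        # moderate load (e.g. simple browsing)
        return 2
    if load_pct < 80:        # heavy page load: bring a third core online
        return 3
    return max_cores         # burst: all cores fire until load drops

# During a heavy page load the load spikes, so all four cores come online;
# once rendering settles, the extra cores are powered back down.
print(cores_to_enable(95))   # -> 4
print(cores_to_enable(15))   # -> 1
```

This matches what the demo reportedly showed: brief four-core bursts during loading, then a drop back to fewer cores once the benefit disappears.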
20. jbash (Posts: 342; Member since: 07 Feb 2011)
Even if it doesn't use all its cores, that chipset lost. An L is an L however you wanna spin it. Sounds like the Tegra 3 limits its own potential.
10. imkyle (Posts: 1048; Member since: 18 Nov 2010)
I think they should have used Google's Chrome Beta as that only supports HTML5.
12. taz89 (Posts: 2014; Member since: 03 May 2011)
That is why I hope Samsung uses a dual-core A15 Exynos in the upcoming S3 instead of a quad-core A9.
14. remixfa (Posts: 14255; Member since: 19 Dec 2008)
So far there are zero rumors of anything A9 other than the Exynos 4212 refresh, which is the same chip, just die-shrunk for better performance/battery... which is exactly what Qualcomm is doing with this new S4 chip at the moment. All rumors point to an A15-enabled SGS3... so let's hope that those rumors are right.
Samsung has pulled rabbits out of their electronic hats for the SGS1 and SGS2. I don't think they would let the SGS3 be anything less than amazing... well, that's what I hope anyway. :)
16. taz89 (Posts: 2014; Member since: 03 May 2011)
Either way, I'm sure the Samsung GS3 will probably have the best overall chip again if the Exynos is anything to go by.
17. Hunt3rj2 (Posts: 396; Member since: 11 Nov 2011)
Please stop with the misinformation. Snapdragon S4 isn't just a die shrink of the S3; it contains a new CPU architecture directly equivalent to the Cortex A15, and the GPU is highly overclocked with newer drivers and possibly some more processing power added in.
Snapdragon S4 should be very close to Samsung's Cortex A15 SoC in CPU, clock for clock. The GPU, I'm not so sure...
21. remixfa (Posts: 14255; Member since: 19 Dec 2008)
No matter how many times you repost, you are still wrong. Yes, there are some minor variances with the Krait, but THIS version of the Krait is an A9 equivalent (the S3 was an A8 equivalent). There will be another version of the Krait dropping around Q3/Q4 that will be A15 equivalent. The biggest difference from the S3 at this point is the inclusion of a stronger GPU and the power improvement that comes with the massive die shrink it went through.
This chip will be smoked by both the A15 exynos and A15 OMAP. It is a stopgap chip just like the T3 which will be quickly replaced by the A15 Grey chip later this year.
They wanted to be first, so they came with the A9 first to pull ahead of the pack slightly (and keep chip nuts like me from constantly ripping the horrible S3 chip) and will be dropping the A15-comparable chip later.
22. Hunt3rj2 (Posts: 396; Member since: 11 Nov 2011)
You're wrong. I don't know why you're so obsessed with trying to claim that it's a Cortex A9 equivalent in performance, but it's pretty obvious that clock for clock this massively outperforms any Cortex A9.
You're obsessed with trying to claim that it's a Crapdragon, and that's great, but the simple truth is that Qualcomm got their Cortex A15 equivalent out the door faster.
Yes, Qualcomm is coming into the game with a crap GPU once again attached to their Krait, but no, the CPU is brand new and it is directly comparable to Cortex A15, clock for clock. The next iterations will do nothing to the architecture of the CPU, because they've already finished it. From here on out, the only thing that will change is a better GPU and higher clock speeds, and possibly more CPU cores.
Your ignorance is showing through quite clearly. If there were an architectural change to the CPU, then no sane company would keep it under the same name. It would be called something other than Krait. Please bring logic to this discussion. I understand you're known for being knowledgeable, but that doesn't change the fact that Krait is Krait. If Krait were meant to compete with the Cortex A9, then single-core performance wouldn't be around 50% more than the Exynos GS2's.
Qualcomm was first to market with a new architecture that is going to destroy every non-Cortex A15 SoC. Deal with it.
Oh, and clock speeds may be lower, but seeing as how Qualcomm is first to 28nm, I'm hardly surprised that they clock conservatively.
23. remixfa (Posts: 14255; Member since: 19 Dec 2008)
You do realize that nearly every chip and design they are comparing it to in that Anandtech article is an A9, right? They are not comparing it to the well-known A15 designs. Ever stop to wonder why? lol
Is a V6 Camaro and a V8 Camaro still a Camaro? Then why exactly can't an A9 Krait and an A15 Krait be named the same? It's called "brand identity".
There is nothing wrong with the Krait. It's catching up to the current performance maxes set by the Exynos and a hair more. There is nothing wrong with that. I'm not dissing the Krait. I don't think it will be king of the crop, but I also don't think it's complete crap like the Crapdragon line. And yeah, Qualcomm needs to step it up in the GPU department.
13. iCandy (Posts: 46; Member since: 07 Dec 2011)
Even if performance were the same, I'm terribly curious about the power consumption delta between the 2x800MHz cores and the 4x1300MHz cores.
15. remixfa (Posts: 14255; Member since: 19 Dec 2008)
You are forgetting the fifth companion core that runs at 500MHz for idle. Also, the chips can clock themselves up and down depending on need; they are not stuck at 1300MHz. All things equal, the T3 should still be better on power draw by design. We will find out soon enough, though.
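The 4-PLUS-1 idea described here can be sketched roughly: below some load threshold, work migrates to the low-power companion core; above it, the main cores run and a DVFS-style governor picks a clock between the minimum and the 1300MHz maximum. The threshold, minimum clock, and function name below are illustrative assumptions, not NVIDIA's actual numbers:

```python
# Rough sketch of Tegra 3's 4-PLUS-1 behavior: a low-power companion core
# handles idle/light work, while the main cores scale their clock with
# demand. The 10% threshold and 340MHz floor are illustrative guesses.

COMPANION_MAX_MHZ = 500
MAIN_MIN_MHZ, MAIN_MAX_MHZ = 340, 1300

def pick_cluster_and_clock(load_pct: float) -> tuple[str, int]:
    if load_pct < 10:
        # Light load: main cores power-gate, companion core takes over.
        return ("companion", COMPANION_MAX_MHZ)
    # Main cores online; scale frequency linearly with load.
    span = MAIN_MAX_MHZ - MAIN_MIN_MHZ
    return ("main", MAIN_MIN_MHZ + int(span * load_pct / 100))

print(pick_cluster_and_clock(5))    # idle -> companion core at 500MHz
print(pick_cluster_and_clock(100))  # full load -> main cores at 1300MHz
```

The point of the sketch is the one made in the comment: the chip is only at 1300MHz when load demands it, which is why raw core count alone doesn't determine power draw.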
18. kanagadeepan (Posts: 765; Member since: 24 Jan 2012)
I don't mind waiting a few seconds for my browser to open a page... But what I care about most is how long I can use my phone without needing to look for a power plug...
If TI wins over Tegra 3 (said to put a low load on the battery using its companion core) in BATTERY life too, then NO further arguments, TI IS WINNER... If NOT, Tegra 3's FOUR-PLUS-ONE is OK for me...
19. imkyle (Posts: 1048; Member since: 18 Nov 2010)
Tegra 3 handles flash VERY well. I really don't mind waiting an extra second for pages to load.