Qualcomm insider claims that Apple's 64-bit A7 CPU caused panic among chipmakers
Quoted by HubSpot, the insider claims that the A7 hit Qualcomm “in the gut”:
“Not just us, but everyone, really. We were slack-jawed, and stunned, and unprepared. It’s not that big a performance difference right now, since most current software won’t benefit. But in Spinal Tap terms it’s like, 32 more, and now everyone wants it,” says the insider. “Apple kicked everybody in the balls with this. It’s being downplayed, but it set off panic in the industry.”
That's despite the knee-jerk reaction of one Qualcomm executive, who labeled the chip a marketing gimmick and was promptly reassigned after Qualcomm issued a statement refuting his remarks. More importantly, not long after the iPhone 5s debuted, both Qualcomm and Samsung announced 64-bit transition plans of their own. Be that as it may, chipmakers apparently didn't initially view the transition to a 64-bit architecture with much interest, and reportedly thought it wasn't worth the R&D spending at the time.
“The roadmap for 64-bit was nowhere close to Apple’s, since no one thought it was that essential,” says the insider. “The evolution was going to be steady. Sure, it’s neat, it’s the future, but it’s not really essential for conditions now.”
Whether or not you trust the alleged insider's story, there's no denying that the competition jumped on the bandwagon almost instantly, and that, more than anything, should signal just how close to home the above claims really are.
source: HubSpot via Apple Insider
1. Ninetysix (Posts: 1010; Member since: 08 Oct 2012)
But that fandroid guy said that Apple doesn't innovate anymore. I'm confused :(
2. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
A7 chip is the best chip on the market hands down.
40. brrunopt (Posts: 206; Member since: 15 Aug 2013)
best is a very relative term..
as far as raw power goes, it's not the best...
41. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
You sure about that? Shall I bring Geekbench into this?
Ninetysix, bust out the links, please.
42. brrunopt (Posts: 206; Member since: 15 Aug 2013)
Nexus 5 - ~3300
iphone 5s - ~ 2600
45. saurik (Posts: 85; Member since: 13 May 2013)
If you view the info for the Nexus 5 that scored ~3300 on Geekbench, you'd see it's clocked at 2.8GHz, whereas the retail version ships at 2.3GHz, implying the device used for the benchmark was overclocked.
Keeping in mind that the average consumer doesn't overclock his/her phone, the best way to find the "real" benchmark score would be to look at Geekbench's official Android comparison chart (http://browser.primatelabs.com/android-benchmarks), where the Nexus 5 scores 2560.
Do your homework before posting, kid.
57. brrunopt (Posts: 206; Member since: 15 Aug 2013)
missed that, my bad..
still, the Nexus 5 at 2.3GHz (more exactly 2265MHz) scores up to 3000, above the 5s
63. brrunopt (Posts: 206; Member since: 15 Aug 2013)
and even on those "official" results the 5s scores 2523 vs 2805 on the Note 3, 2701 on the Sony Xperia Z1, ...
"Do your homework before posting, kid "
49. Dr.Phil (Posts: 813; Member since: 14 Feb 2011)
While I agree with your assessment that the A7 chip is "the best chip on the market", I must also point out that the way you word your comments makes it seem like you are an arrogant jerk. It's very counterproductive to having a good conversation that allows insight and opinions on the matter. There are several times that I will voice a majority or dissenting opinion, but I do it in a manner that promotes dialogue with another person who may or may not agree with me.
Also, when talking about benchmarks you have to take things with a grain of salt, especially cross-platform benchmarks like Geekbench. There is no way of knowing how much the benchmark relies on software utilization. There seems to be a pretty significant difference in scores when the software of the device is updated. Now, I am not denying that, even throwing out software utilization, Apple's A7 chip would still come out on top. What I am saying is that you can't just quote benchmarks like Geekbench to support your claims.
56. JakeLee (Posts: 372; Member since: 02 Nov 2013)
Bare-metal benchmark routines don't rely much on the OS's system libraries and mostly consist of pure machine-level instructions. Therefore, it would be safe to say the benchmark results are rather accurate as long as the comparison stays within the ARM family.
But I absolutely agree with you on what he sounds like.
60. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
You're still at it? Seriously, it's getting really sad to see someone as petty as you.
67. Dr.Phil (Posts: 813; Member since: 14 Feb 2011)
Yes, but Geekbench has actually seen some pretty significant differences in device scores across software updates. You can look up the scores of the 5S before and after iOS 7; there was a big difference before they re-optimized it.
72. JakeLee (Posts: 372; Member since: 02 Nov 2013)
Are you THE Dr.Phil btw?
We also have an "Obama" here which is of course a pun, but in your case, I'm not so sure considering the polite way you've been talking.
59. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
... What? An arrogant jerk, for giving Apple credit? I usually agree with you, but here I won't. Counterproductive? All anyone has to say is, "No, I believe this _______ from ______ is the best," and you can go from there.
Geekbench is the most accurate way to measure the power of an SoC to date, whether it's on Android or iOS. Until something new comes up, Geekbench seems to be the most accurate. I have my opinion, and you have yours.
Agree to disagree.
68. Dr.Phil (Posts: 813; Member since: 14 Feb 2011)
I wasn't saying you were an arrogant jerk; I was saying that the way you word your comments makes it sound like you are one. I also wasn't saying that to make you feel bad about yourself, but rather to point out how you are coming across. I was trying to be helpful and hopefully save you from people who will see your comments as an attack and want to attack you directly back. I also said I agreed with your statement that the A7 chipset is one of the best chipsets you can find on the market right now.
Also, it's very debatable how accurate Geekbench is. As I noted in my previous comment, there was a significant difference between pre- and post-iOS 7 scores on the iPhone 5S. This suggests the benchmark relies more heavily on software optimization than we think.
66. applesucker (Posts: 62; Member since: 29 Oct 2013)
only when it runs on Android, because there the difference will be noticeable, unlike on iOS, which already has consistent performance
73. JakeLee (Posts: 372; Member since: 02 Nov 2013)
How about this?
Every new / updated app from Feb 1st on will be automatically a 64-bit one.
It's not hard to deduce that the new iPhone in 2014 will feature a 64-bit-only SoC omitting AArch32.
What does this mean? Increased design / space / cost / power efficiency, plus a design also feasible for servers without much change.
Can Qualcomm or Sammy mimic this? Not in five years.
79. grahaman27 (Posts: 160; Member since: 05 Apr 2013)
You're a moron. It does not mean that at all; apps must be updated simply to be compatible with iOS 7, it doesn't mean 64-bit support for everything. It's crazy that Apple needs developers to make two versions of one app just to fully support their new chip.... Welcome to 2006, Apple.
80. JakeLee (Posts: 372; Member since: 02 Nov 2013)
Ever heard of fat binary submission?
Use Xcode 5, target iOS 6 and above, and you automatically generate both 32- and 64-bit versions by default.
That's how Apple handled the transition on the Mac.
You don't have to prove your ignorance since you are a moron by default.
81. Victor.H (Posts: 369; Member since: 27 May 2011)
And I will be putting the last sentence in a frame, definitely.
3. _Bone_ (Posts: 1995; Member since: 29 Oct 2012)
Not real innovation; as with icons and rounded corners, 64-bit has been around, first on mobile via Huawei. But it's a logical evolution iOS is taking early, and it might pave the way for real innovation.
4. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
A7 is one hell of a beast. Give credit to where it's due.
10. _Bone_ (Posts: 1995; Member since: 29 Oct 2012)
Huawei's K3v2 is the first 64bit mobile SoC. Give credit where it's due.
13. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
Please explain to me where I said the A7 was the first 64-bit SoC. I'll wait.
Bone, I would have expected you to be smarter than to try and question me.
17. androiphone20 (Posts: 943; Member since: 10 Jul 2013)
New instruction set ARMv8, does that mean anything to you?
30. Finalflash (Posts: 875; Member since: 23 Jul 2013)
But even the article points out the real reason for the panic: "it's 32 more". Not the benefits, but the marketing gold with consumers who don't know what 64-bit really is. The processor is good, that isn't the contention; it's that 64-bit is a marketing gimmick at best, and that is how the industry saw it as well. Also, the instruction set doesn't do much either (as in, it didn't double the speed, more like 1-2%); it's the core components of the CPU die that made the difference, with the gains shown in benchmarking.
44. JakeLee (Posts: 372; Member since: 02 Nov 2013)
From your previous comments, I assume you are an app programmer proficient with Java.
There are so many things HLL programmers aren't aware of; Java programmers especially tend to know nothing about how CPUs work:
Download some AOSP code from codeaurora.org and look at bionic's math routines in particular. Even if both input and output are mostly 32-bit data, the intermediate results are very often 64-bit ones spanning two 32-bit registers. Just look how inefficient it is to deal with a single value spread over two registers.
Even more importantly, the 31 64-bit architectural registers on ARM64 can fill the 64-byte cache line completely during every iteration, thus absolutely maximizing performance/power efficiency, while this kind of luxury is simply not possible with the 14 32-bit registers on ARM32. (You are lucky if you can fill 8 bytes)
That boost in register capacity also lets even the world's most lackluster compilers, like GCC, generate halfway decent machine code - a hidden benefit.
The result? The A7 runs roughly 30% faster in 64-bit mode than in 32-bit mode, without any kind of special 64-bit treatment, WHATEVER you are doing.
And it's just the beginning.
For comparison :
You may put a $1000+ graphics card into your Windows machine that's 20 times as fast as the integrated Intel one; it won't improve your Office-like applications' UX one bit.
30% isn't much? That's roughly the performance difference between the most expensive i7 and the cheapest i5. Just look at their prices.
Do not even think of refuting my arguments with Windows x64, though. That would just highlight your ignorance. The x86, being CISC, can do arithmetic directly on RAM without first loading the data into its architectural registers. On CISC machines, therefore, the ISA serves just as the frontend while the backend does the vast majority of the job. (arithmetic on memory is taken over by the hidden physical registers)
Therefore, it hardly matters whether the frontend operates in 32- or 64-bit mode on x86 machines.
There is also this kind of "backend" on ARM, but it plays more of a supporting role due to the RISC nature.
That's the difference.
28. saurik (Posts: 85; Member since: 13 May 2013)
No it's not. Here are two sources confirming that the K3V2 is not a 64-bit SoC and that the A7 is the first 64-bit mobile SoC.
Also, a quick look at your beloved Android benchmarks reveals that the K3V2 is 32-bit (1.5GHz quad-core) and ranks well below the iPhone 5's dual-core A6, let alone the A7.
Give credit where it's due.
69. _Bone_ (Posts: 1995; Member since: 29 Oct 2012)
You've been owned you stupid idiot.
Stop claiming things you have no clue of.
76. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
Ironic since you claimed the A7 wasn't the first 64-bit processor when clearly it was lol.
You've been owned you stupid idiot.
58. Altair (Posts: 278; Member since: 02 Feb 2012)
Huawei is just a cheap copycat who didn't invent that. They just copied the technology behind it from others.
24. Droid_X_Doug (Posts: 5153; Member since: 22 Dec 2010)
Apple just moved to 64-bit sooner than the industry expected them to. Until apps are designed to work with 64-bits, the average end-user won't realize much benefit.
61. JakeLee (Posts: 372; Member since: 02 Nov 2013)
80% of top iOS apps already run in 64-bit.
And what do you mean by "designed to work with 64 bits"? There is no such thing as designing for a certain bit width.
I can very well imagine assembly-optimized apps benefiting from the increased number of NEON registers, though. That would be a jaw-dropping experience.
36. jos_031 (Posts: 28; Member since: 12 Jun 2012)
The A7 uses the ARM Cortex-A57 architecture designed by ARM Holdings. Even though 64-bit is not of much use yet, there is an improvement in instructions per clock cycle, which increases performance for a given clock speed. I think Qualcomm's Krait is based on the Cortex-A9; there have been two revisions since, the A15 and the A57. Qualcomm was not spending much money on new processor architecture; this is more about licensing fees to ARM than research. 64-bit's real significance will only come when apps require more than 4GB of RAM, but the number 64 makes people want it; the actual benefit comes from the per-clock performance increase. Did Apple innovate? Not much, but there is innovation in putting the right components into the processor and choosing the best communication protocols within it. Think of it as cooking: all the ingredients are available to everyone, but the right proportions make a tasty dish. The components of Apple's A7 come from other companies, but Apple made a tasty dish. One thing is certain: Apple was smart enough to invest in the A57 early, since business is all about advantage, even if it's only in people's minds...
38. JakeLee (Posts: 372; Member since: 02 Nov 2013)
The K3V2 is based on the Cortex-A9 of the ARMv7 architecture, a 32-bit one. Period.
11. protozeloz (Posts: 5284; Member since: 16 Sep 2010)
Man I go away a few days and this place turns sadder and sadder
15. SuperAndroidEvo (Posts: 3427; Member since: 15 Apr 2011)
C'mon protozeloz, lets keep the faith in PhoneArena.com. lol
20. protozeloz (Posts: 5284; Member since: 16 Sep 2010)
Peoplused to actually tried sounding smart before bringing the hate... Not anymore
22. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
The new wave of users since Spring have been... Not very educated. I'm only here because of the comment section lol.
26. protozeloz (Posts: 5284; Member since: 16 Sep 2010)
Actually, its the word -to- that got there somehow
35. Ashoaib (Posts: 338; Member since: 15 Nov 2013)
Yeah yeah... Apple is light years ahead of everyone... the company that can't manufacture anything on its own is light years ahead of all.. probably Apple was the first to send a man to the moon, but they are keeping it a secret...
43. jove39 (Posts: 1196; Member since: 18 Oct 2011)
The A7 is the best ARM SoC... I really like how Apple sticks to 2 cores that are almost as powerful as 4 Krait cores... Is it innovation? No... The A7 is not the first 64-bit SoC... ever heard of Atom Bay Trail cores?
47. JakeLee (Posts: 372; Member since: 02 Nov 2013)
It's amazing to see such ignorant comments from time to time.
You would need the Itanium for a comparison with ARMv8, not the Frankenstein zombie x86.
I never heard of an Itanium-based SoC, though.
And should I buy a Bay Trail tablet sometime, it had better run Windows rather than Android with TONS of compatibility issues.
62. JerryTime (Posts: 427; Member since: 09 Nov 2013)
Samsung made the chips, so no apple doesn't.
65. Ninetysix (Posts: 1010; Member since: 08 Oct 2012)
Ohh..I want to play this game!
TSMC made the chips, so no Qualcomm doesn't.
5. ArtSim98 (Posts: 823; Member since: 21 Dec 2012)
A7 is sure nice, but isn't really needed right now. We'll see how the 5S performs in the future when 64 bit is actually useful in phones.
8. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
As games and apps in the App Store get heavier with content, a chip needs to excel at making it butter smooth.
48. JakeLee (Posts: 372; Member since: 02 Nov 2013)
It's only Android that doesn't need a 64-bit SoC, since it can hardly benefit from one:
The Dalvik VM is a *32-bit* machine. It's set in stone.
You may get 64-bit Android, a 64-bit Zygote, and even a 64-bit VM port in the future; the emulated machine remains a 32-bit one.
And you won't see any meaningful number of 64-bit native apps for Android for a very long time after the launch of 64-bit Android, due to the high level of fragmentation. I'd guess about five years.
Look at the App Store. About 80% of the top 100 apps require iOS 6 or higher, indicating they run natively in 64-bit on the 5s or the 2013 iPads.
Benefits of 64-bit computing are already present on iOS, not a future thing.
It seems many people simply want to ignore this fact though.
Apple has a long history of successful transitions: 68000 -> PowerPC -> x86 -> x64.
The most recent one, ARM32 -> ARM64, seems to be the easiest and fastest so far, all thanks to its unique fat binary deployment system and iron grip over the App Store.
7. bigstrudel (Posts: 518; Member since: 20 Aug 2012)
Apple has been a year ahead of the curve with the release of both the A6 and the A7. The A6, core for core, still outperforms any Android chip on the market except the S800, and it only falls short of that because of a full 1GHz clock speed advantage for the 800. AND the A6 is at 32nm.
9. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
I wouldn't go as far as saying a year ahead, as the 805 and Bay Trail are around the corner. The A6 is snappy, yes, but think about how light iOS is compared to Android. The iPhone 5 is starting to stutter on iOS 7. Don't even get me started on the 4S and 4.
18. protozeloz (Posts: 5284; Member since: 16 Sep 2010)
I haven't touched my boss's 5 as much after the update, but when I did, it felt slower than the Xperia Z1 I own, and that phone is on 4.2. Also, dammit, having to go to the bottom to delete a contact? FUUUUU. Long press and menu FTW!!!!
21. PapaSmurf (Posts: 5823; Member since: 14 May 2012)
I want a 5S but the decrease of 1.7" is scaring me. :(