3GHz mobile chip barrier broken with a dual-core 28nm processor from TSMC

ARM has always quoted its Cortex-A9 and Cortex-A15 "Eagle" architectures as capable of up to 2.5GHz per core, and that's for the most powerful implementations.

Well, it seems that for the foundries that make the actual chips nothing is impossible - TSMC announced that it has produced a dual-core Cortex-A9 chip on its newfangled 28nm process that runs at an astonishing 3.1GHz.

It's hard to believe that mobile chips have come this far in the span of just a few years, buoyed by the revolution in smartphones and tablets. As the 28/32nm processes in the Snapdragon S4 and the new Exynos demonstrate, a die shrink brings many advantages, letting designers choose between frugal power consumption or, it turns out, revving clocks up past 3GHz.

The truth is in the middle, and most handset and tablet manufacturers spend thousands of man-hours balancing things out, but breaking the 3GHz barrier with an ARM-based mobile chip once again proves how flexible ARM's architecture is, something that undoubtedly keeps Intel execs on their toes.

source: Liliputing



20. sid07desai

Posts: 290; Member since: May 03, 2012

gentlemen, those days are not far away when smartphones will have cooling fans..

21. Non_Sequitur

Posts: 1111; Member since: Mar 16, 2012

My first thought when reading this article.

23. remixfa

Posts: 14605; Member since: Dec 19, 2008

nah, if it ever got that nuts, i think they would move to a Cell processor style before trying to jam a fan in there. it would never work.. fan or not, it's in your pocket most of the time, and it's going to melt down. Now who wants 200 degree silver paste all over their pants? not me. lololz ooh.. 6 chip Cell processors at 15nm. ooooooooh the power.. :)

4. shadowcell

Posts: 300; Member since: Mar 28, 2012

Looks like the gigahertz myth is transitioning properly onto ARM architectures.

7. remixfa

Posts: 14605; Member since: Dec 19, 2008

yup. yup it is. people never learn. remember when Pentium 4 chips used to get massive clocks? then they decided to actually make good chips, and clock speeds were cut in half with the Core 2 Duo while more than doubling or tripling the power. same thing will happen here. great design and high-quality manufacturing trump clock speed and even size any day of the week. so far TSMC has been using inferior manufacturing processes to artificially push size and MHz barriers. makes for great headlines but lousy chips.

9. PackMan

Posts: 277; Member since: Mar 09, 2012

What about power draw and heat dissipation? Managing these on mobile chips is going to be more of a challenge than just hitting higher clock speed.

11. shadowcell

Posts: 300; Member since: Mar 28, 2012

My hands melted getting 4ghz but I'm the happiest guy alive.

15. remixfa

Posts: 14605; Member since: Dec 19, 2008

with only passive cooling available for phones, that is a major issue. next up.. OMAP5, complete with liquid cooling.. lol. way to hit the mark on that one. completely slipped my mind.

3. phil2n

Posts: 519; Member since: Apr 30, 2012

tomorrow im gonna have turbo boost

2. SuperAndroidEvo

Posts: 4888; Member since: Apr 15, 2011

YES it's true. I can't wait to see this supposed HTC super phone. It's rumored to have a 1080p RGB matrix 5" HD screen, the NEW 28nm quad-core 2.5GHz Qualcomm S4 Krait with Adreno 320. If that is EVER realized then that will be my new phone, hands down. Qualcomm is going to come out with a quad-core 2.5GHz phone first. They are saying the Q3 or fall. Sh*t is getting very powerful. Laptops are going to be threatened very, very soon.

5. remixfa

Posts: 14605; Member since: Dec 19, 2008

I hope you're prepared for a letdown with those rumored specs. lol. 1080p screens are a useless, massive power hog. your eyes will never see the difference at such small screen sizes

10. SuperAndroidEvo

Posts: 4888; Member since: Apr 15, 2011

Hey with 28nm anything is possible ESPECIALLY if they can achieve 3.1GHz already! Let's see what happens down the road. To me the Samsung Galaxy S III is really not too BIG of a stand out. Something is clearly wrong with the Exynos chips if they STILL have to use Qualcomm for their phones to use LTE in the US. Qualcomm is set up to have another huge year because they will get a HUGE boost to their sales from all the US sales of the Samsung Galaxy S III. Especially now that Verizon is on board. I am just quietly waiting for this super phone from HTC & the Samsung Galaxy Note 2. From there I will make my choice. I am really not that into the Samsung Galaxy S III especially with its LTE problems. Both Samsung's Exynos & NVIDIA's Tegra just refuse to play well with LTE! I just don’t get it.

13. theruleslawyer

Posts: 108; Member since: Apr 23, 2012

With a 5" screen, someone with good eyes should be able to resolve more than the 720p we have now. 1080p is the next well-known step up. Something like 900p would be closer to the limits of human vision at 12". Either way, we don't have much room before we start bumping into pointless spec updates. But hey, that hasn't stopped them from endlessly increasing pixel counts on small-sensor cameras (especially phones). Why let physics get in the way of a good marketing campaign?
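The pixel-density arithmetic behind this comment is easy to check. Here is a minimal back-of-the-envelope sketch; the 5" diagonal, 16:9 panel shapes, and the 1-arcminute visual-acuity figure are illustrative assumptions, not specs from the article:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

def resolvable_ppi(distance_in, acuity_arcmin=1.0):
    """Highest pixel density the eye can resolve at a given viewing distance,
    assuming a visual acuity in arcminutes (1.0 is roughly 20/20 vision)."""
    pixel_size_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pixel_size_in

print(round(ppi(1280, 720, 5.0)))    # 720p on a 5" panel  -> 294 PPI
print(round(ppi(1920, 1080, 5.0)))   # 1080p on a 5" panel -> 441 PPI
print(round(resolvable_ppi(12)))     # 20/20 limit at 12"  -> 286 PPI
```

At 1-arcminute acuity, a 5" 720p panel (about 294 PPI) already sits at the 12-inch limit (about 286 PPI), so the "good eyes can resolve more than 720p" claim holds only for sharper-than-20/20 vision; halving the acuity figure to 0.5 arcminutes doubles the limit to roughly 573 PPI.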

14. remixfa

Posts: 14605; Member since: Dec 19, 2008

lol @ lawyer, very true about that last part. @SAE I'm not in a position to have to care about LTE, so it's not something I have to worry about. as a vzw customer I can understand your concern though, since there is a massive difference between Rev_A and LTE. it's just American/Japanese LTE that has the issue, not all LTE from what I've seen. I'd have to read up on why, as I'm fuzzy on that myself... unless someone has the knowledge to spread around. before you say the sgs3 isn't a power stand out, you must remember it is pre-release software, so it's likely to get a boost.. also with all that "nature" eye tracking and other stuff going on, that's going to be taking up available horsepower on those tests. it's also running at 1.4ghz vs the S4's 1.5ghz.. all that and it still won by a decent margin. expect those scores to skyrocket in the coming months.

19. SuperAndroidEvo

Posts: 4888; Member since: Apr 15, 2011

Yes, but you have to remember, the current S4 is only dual-core & it has the Adreno 225. The S4 Pro will still be dual-core but it will have the Adreno 320. Wait until the quad-core S4 comes out with the Adreno 320. Then we can have a serious comparison. Right now Samsung's quad-core is faster than any dual-core that is out at this very moment. It's not a fair comparison.

22. remixfa

Posts: 14605; Member since: Dec 19, 2008

if the S4 were a true A15, it would be the faster chip, or at least dead even. It's not.

24. SuperAndroidEvo

Posts: 4888; Member since: Apr 15, 2011

The S4 Krait is not an A-15 per se. It's in-between a Cortex A-9 & a Cortex A-15. It's very close to an A-15 though. So an S4 Krait is more powerful than a Cortex A-9. With that said, I really think the quad-core S4 Krait at 28nm with the Adreno 320 will kill the Exynos found in the Samsung Galaxy S III. An S5 should be more powerful than a Cortex A-15. Just had to throw that out there. lol (Total Speculation) Remember, this time next year Samsung's Galaxy S IV should be a quad-core Cortex A-15. So Qualcomm should have something really nice to counter it, just like the Krait clearly kills the Cortex A-9.

25. remixfa

Posts: 14605; Member since: Dec 19, 2008

Honestly, the S4 Krait is about as fast as the 4212 Exynos dual-core, clock for clock. The Exynos is still an A9.. a powerful A9, but an A9 nonetheless.

17. yt6nin

Posts: 100; Member since: Nov 16, 2011

I like that substantial upgrade in screen resolution, since screen resolutions have been following the trend of incrementing by 360... an increment of 180 sounds reasonable to me, but we'll have to see. The one thing that puzzled me when I read your idea of 900p was why the resolution went from 240p to 360p to 480p in 120 increments??? That's rather strange, and then from 480p to 720p is a 240 increment... 720p to 1080p is a 360 increment. The base of 120 is there though... So maybe it would be 840p if they're going to make resolution changes on a 120-increment base... But 900p, I know where you're going. I just hate the tech industry for having so many resolutions, especially the older golden-standard ones, which are so done in this generation. I mean, come on, who would want to watch videos in 240p, 360p, or 480p, which all look pixelated when enlarged??? I think 720p should be the standard for all future videos, and then we move on to 1080p...

Pointless specs, or unnecessary and superfluous specs = satisfaction of everyone's desires. Basically having the best of what is capable and possible to achieve. I mean, wouldn't you want the best of the best when it's quite affordable, or worth the price? Especially in a smartphone: a 5" 1080p Super AMOLED HD Plus display with an RGB matrix, a quad-core Exynos CPU clocked @ 1.8GHz, a battery lasting 2 days at the least with heavy use on an LTE network with several hours of screen-on time at auto brightness, 64GB of internal memory, a microSD slot, wireless charging, and ICS out of the box. Doesn't it just sound amazing to think about having an ultraphone with such specs? It would make my friends envious for sure, if only they were techy enough, lol.

The vast majority of people identify smartphones by the company, say Apple, Blackberry, HTC, Motorola, Samsung, etc... There is a limited number of people, the more techy people, who judge the best smartphones by looking in depth at the actual specs of the phone: the CPU, GPU, screen resolution, display type, and all of that stuff.

18. theruleslawyer

Posts: 108; Member since: Apr 23, 2012

I think the resolution sizes have more to do with buffers that fit in fixed amounts of memory and/or bandwidth limits than with any specific increase in the number of lines of resolution. I picked 900p since it's close to the right number, and I have a laptop that runs at that resolution already. Actual line counts may vary depending on whether they use 16:10, 16:9, or other aspect ratios. I agree that 1080p is likely chosen simply for the marketing aspect of "Full HD display!!11!!!!omgwtfbbq!!111!"
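The memory-budget point above is easy to quantify. A quick sketch, assuming an uncompressed 32-bit RGBA framebuffer (actual formats and buffer counts vary by device):

```python
def framebuffer_mib(width_px, height_px, bytes_per_pixel=4):
    """Size of one uncompressed frame in MiB (4 bytes/pixel = 32-bit RGBA)."""
    return width_px * height_px * bytes_per_pixel / (1024 * 1024)

# Per-frame cost at the resolutions discussed in the thread
for name, (w, h) in {"720p": (1280, 720),
                     "900p": (1600, 900),
                     "1080p": (1920, 1080)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.2f} MiB per frame")
```

One 720p frame is about 3.52 MiB versus 7.91 MiB for 1080p, and since phones double- or triple-buffer the display and GPU fill-rate scales with pixel count, the jump to 1080p more than doubles the per-frame memory and bandwidth cost.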

16. yt6nin

Posts: 100; Member since: Nov 16, 2011

I love criticizing people who view images and videos in resolutions lower than 720p or 1080p when such high resolutions are available. On computer monitors, the same image at the same screen size is easily differentiable between 720p and 1080p, but on the much smaller scale of a smartphone screen of about 5" max, it is quite a challenge to distinguish 720p from 1080p...

For now, 720p has been the new standard for true high-end smartphones, and 1080p will come forth as the standard for the next generation of smartphone screens within a year or so. Hopefully by then we get 1600p as more of a commodity in both quantity and price, as I much prefer a widescreen 1600p monitor to a regular 1080p one. Widescreen is so much better for computer monitors, but going back to the main point of the argument, I concur that 1080p is useless for now, except as a nice idea and spec to have in future phones. There will be a time for 1080p screens on phones, but now is just NOT the time.

Hopefully we can get longer-lasting Android phones with a removable battery, 2GB of RAM minimum, a quad-core CPU with LTE, and a 4.8" Super AMOLED HD Plus screen. The casing should be a material that can withstand falls and drops from pocket height, showing little to no dents and very few scratches or chips. Or maybe a plastic coating that can cover up scratches, like the material Nissan has on their cars and implemented in an iPhone protector product, if I'm not mistaken... So many criteria that must be checked off, so many people's wishes and desires...

8. Berzerk000

Posts: 4275; Member since: Jun 26, 2011

That will be HTC's Galaxy Note competitor, and it will completely mop the floor with the Note. The Note 2, however, I couldn't say. We'll see.

12. redrooster13

Posts: 110; Member since: Feb 20, 2012

lol I don't know where you heard those rumors, but I seriously doubt it. Maybe by this time next year.

1. medicci37

Posts: 1361; Member since: Nov 19, 2011

Now, that's news!!!
