NVIDIA executive says rule of thumb related to chip performance is dead

Created by Intel co-founder Gordon Moore in 1965, the rule of thumb known as Moore's Law originally called for the number of transistors inside an integrated circuit to double every year. In 1975, Moore revised the Law: the number of components inside an integrated circuit would double every year until 1980, and every two years after that. The steady improvements and innovations in the performance of mobile tech devices, including the Apple iPhone and Samsung Galaxy lines, have depended on this Law remaining in force.

According to CNET, NVIDIA co-founder and CEO Jensen Huang said today at CES that Moore's Law is dead. To be precise, the executive of the GPU maker said, "Moore's Law isn't possible anymore." Every time the number of transistors inside an IC doubled, the performance and battery life of chipsets improved. The most advanced chips currently being produced commercially by companies like Samsung and TSMC use the 7nm process.

During a Q&A session at the Las Vegas Convention Center today, Huang said that Moore's Law has slowed from a 10X improvement every five years to only a 2X improvement over ten years. The expense and complexity of continually cramming more transistors into a small area make it difficult to forecast regular doublings of chip performance.
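To put Huang's two figures on a common footing, they can be converted into equivalent annual growth rates. The numbers come from the article; the function name is just an illustration.

```python
def annual_growth(factor: float, years: float) -> float:
    """Annualized growth rate implied by a total improvement factor over a span of years."""
    return factor ** (1 / years) - 1

# The old pace Huang cites: 10x improvement every 5 years.
old_rate = annual_growth(10, 5)
# The current pace he cites: 2x improvement over 10 years.
new_rate = annual_growth(2, 10)

print(f"Old pace: {old_rate:.1%} per year")  # ~58.5% per year
print(f"New pace: {new_rate:.1%} per year")  # ~7.2% per year
```

In other words, by Huang's figures the compounding rate of improvement has fallen by roughly a factor of eight per year, which is why he argues the old forecasting rule no longer holds.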


Analyst Patrick Moorhead of Moor Insights & Strategy said, "Moore's Law, by the strictest definition of doubling chip densities every two years, isn't happening anymore. If we stop shrinking chips, it will be catastrophic to every tech industry." But he notes that there are other ways to improve the performance of devices that relied on Moore's Law, including new advances in software, and new ways to package chips.


18 Comments

1. Finalflash

Posts: 4062; Member since: Jul 23, 2013

Very smooth, using an Nvidia and Intel article to shoehorn an Apple image in. Very nice, the overlords in Cupertino will be pleased.

3. Zylam

Posts: 1769; Member since: Oct 20, 2010

And Samsung being mentioned is perfectly fine? If it was an exynos chip then no crimes against humanity would have been committed by Alan eh? Why don't you guys all team up and just destroy Alan, the iSheep, iPhone Arena and Apple for once and for all? What's the hold up? Do you people cry every

15. middlehead

Posts: 420; Member since: May 12, 2014

Samsung actually MAKES chips, they were being mentioned in that context. If you think that's shoe-horning, you should just stop talking. Apple doesn't MAKE anything and no one from their staff was quoted here, so yes, leading the article with an image of theirs is showing PA's bias.

2. M4HESH

Posts: 5; Member since: Jan 04, 2019

But why is the A12 chip in the thumbnail?

5. deleon629

Posts: 434; Member since: Oct 04, 2014

Because with their battery failures, they need all the chip shrinking they can get

6. monkeyb

Posts: 366; Member since: Jan 17, 2018

It's obviously clickbait for haters. LOL.

9. Natko

Posts: 9; Member since: Feb 05, 2013

Because that photo looks cool.

4. Dr.Phil

Posts: 2235; Member since: Feb 14, 2011

I will say that I don’t get as excited about new CPU announcements like I used to in the golden smartphone era. I think the last chipset I was generally excited about was the Snapdragon 800 (which really is still a very competent chipset in the BlackBerry Passport). I mean you have 5 and 6 year old iPads still humming around and performing the same tasks that your everyday person uses it for. At this point it’s just bragging rights (“Oh look I can play Fortnite at 60 FPS in 4K!”).

7. Leo_MC

Posts: 6132; Member since: Dec 02, 2011

Yeah, but you can also film your kid’s recital in the same 4k@60 and edit it, which is really cool to do with a simple phone.

16. Dr.Phil

Posts: 2235; Member since: Feb 14, 2011

You’re only proving my point since the Snapdragon 800 had the capability to do 4K at 30 FPS. Now granted that’s not 60fps, but my point is that smartphones have basically been getting closer and closer to this peak in performance. So yeah maybe next year you could do 4K video capture at 960fps, but how many people are actually going to be using that? My point is that for the average consumer there is no real need to be excited about chipsets anymore because it doesn’t change the bottom line much. It’s more marketing at this point.

17. Leo_MC

Posts: 6132; Member since: Dec 02, 2011

I have to both agree with you and also disagree. While I agree that even the chip in 6s is strong enough to handle the workload of a modern smartphone, I appreciate the 60 fps (I don't know if it's my eyes, but the difference I see is important), I like all the apps that are able to take advantage of AR (I have an idea myself and I will run it through a developer to see how much it would cost me). I go back and I agree with you: there's not much to be excited about, but it's also the joy of knowing that if you are thinking and building something extraordinary, the phones are going to have the power to run it.

10. Natko

Posts: 9; Member since: Feb 05, 2013

Snapdragon 800 is still their best chip to me. Running great on my venerable Lumia 2520.

8. Ichimoku

Posts: 107; Member since: Nov 18, 2018

Why not simply put the Nvidia logo on the thumbnail?

13. worldpeace

Posts: 3077; Member since: Apr 15, 2016

No, put mythbuster logo.

11. jacky899

Posts: 323; Member since: May 16, 2017

I'm surprised Jensen said that. Sounds kind of humble, considering Nvidia more than doubled in performance within the last 10 years.

14. kartik.07

Posts: 73; Member since: May 04, 2015

They are talking about the future here, that it's not possible to double the performance every other year, which we can see in the industry starting from last year. GPUs and CPUs for both mobile and desktop were just a little more powerful than their last-gen counterparts.

12. worldpeace

Posts: 3077; Member since: Apr 15, 2016

S3 AnTuTu is 19k while S8 AnTuTu is 200k, that's a 10x score within 5 years. But I won't expect the S10 to score 10x the S5's score (that would be 420k). I know it's not about AnTuTu score (why would they use some random and irregular benchmark as the standard? lol), it's more about GFLOPS or other basic stuff.

18. Valdomero

Posts: 592; Member since: Nov 13, 2012

That's why upgrading phones every single year is a waste of money. Unless you're a developer or a blind fan of the brand, replacing an already capable phone for the newest and slightly more powerful phone is just plain crazy. Even upgrading every 2 years is not worth it nowadays; the software is not more demanding on resources than in previous years (maybe a little more, but not by a huge margin). That's another reason Apple is feeling the heat of low sales, besides pricing. Many OEMs are not focusing on raw power, but on new ways to morph the tech into something new for consumers. We now see news of a phone with 9(?) cameras, foldable phones, two-screened phones, mini phones, etc. Nokia, for instance, isn't looking for power supremacy; all their products are oriented to the mid and entry level, because they know the current mid-level chips are powerful enough for everyday tasks. Now we have to wait for Quantum Computing to become portable enough, and then shrinkable enough, for us to see this tech in smartphones. Remember, that's how our modern computing started: huge machines that could only do simple equations, and now we have little machines that we can't live without.
