Huawei's Kirin 970 is official: high-power, AI-enabled mobile SoC

Huawei just made its proprietary Kirin 970 SoC official. The company's CEO Richard Yu took the stage at IFA in Berlin to announce that Huawei's latest chipset will bet heavily on deep learning and AI to deliver a vastly improved user experience in upcoming mobile devices. The first smartphones to feature the new Kirin 970 processor will be the Mate 10 and Mate 10 Pro.

The chipset has been developed in conjunction with Huawei's subsidiary HiSilicon and is built on a 10nm process, which enables it to deliver up to 20 percent better performance than the Kirin 960. What's more interesting is that the CPU is paired with a powerful new 12-core GPU, as well as an NPU (Neural Processing Unit) dedicated solely to handling AI tasks on the device. Huawei's vision of the future involves the deep integration of AI in every smartphone, and not only on the software level.

Huawei plans to employ its on-device AI for a multitude of tasks, including memory allocation, UI rendering, camera image processing, load balancing, and task scheduling. These tasks will be handled by the NPU and supporting software, which will also work in conjunction with Huawei's cloud AI to deliver the full experience the company is envisioning, Yu said on stage.
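
For a sense of what "handled by the NPU and software" can look like from an app's point of view, here is a minimal sketch. It assumes a TensorFlow Lite model and Android's NNAPI delegate, which lets the OS route supported operations to whatever accelerator the vendor driver exposes (such as a dedicated NPU); Huawei's own HiAI SDK is the first-party route and is not shown here. The model path and tensor shapes are hypothetical.

```java
import java.io.File;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.nnapi.NnApiDelegate;

public class OnDeviceInference {
    // Runs one inference pass, letting NNAPI offload supported ops to a
    // hardware accelerator (NPU/DSP/GPU) when the vendor driver provides one.
    static float[][] classify(File modelFile, float[][][][] input) {
        NnApiDelegate nnapi = new NnApiDelegate();
        Interpreter.Options options = new Interpreter.Options().addDelegate(nnapi);
        Interpreter interpreter = new Interpreter(modelFile, options);
        try {
            // Hypothetical shapes: 1x224x224x3 image in, 1x1001 class scores out.
            float[][] output = new float[1][1001];
            interpreter.run(input, output);
            return output;
        } finally {
            interpreter.close();
            nnapi.close();
        }
    }
}
```

Operations the delegate cannot claim simply fall back to the CPU, which matches the article's framing of the NPU as an offload target working alongside the CPU rather than replacing it.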

Here's what you can expect from the Kirin 970 at a glance:

  • 8-core (4x Cortex-A73 and 4x Cortex-A53) configuration
  • Clock speed of up to 2.8 GHz
  • 64-bit architecture
  • 10nm manufacturing process
  • LPDDR4 RAM at 1866 MHz
  • Dual SIM LTE


The first phones to take advantage of Huawei's new SoC will be the Huawei Mate 10 and Mate 10 Pro, which will be launching on October 16th. To get up to speed with all the rumors surrounding the upcoming phablet, head over to our Mate 10 rumor review.


36 Comments

1. Ozsikhs

Posts: 58; Member since: Sep 03, 2015

Awesome

2. davthom123

Posts: 118; Member since: Mar 02, 2015

This and the Apple A11 will dominate the second half of the year, or at least the last quarter, in terms of SoC performance. Maybe it's time for Samsung to cut ties with Qualcomm and focus on its Exynos chips; if Qualcomm wasn't involved, Samsung would probably have created a custom Mali G72-powered Exynos chip for the Note 8 rather than bother about parity with the Snapdragon 835.

4. sgodsell

Posts: 6897; Member since: Mar 16, 2013

Another ignorant person. The 835/836 are very fast and can sustain high-performance graphics without any throttling or performance degradation. That's why they can support a 4K display at a sustainable 60 fps. Can the A11 do that? Nope. The A10 was rated at half the speed of the SD 821, never mind the speed of the 835/836. http://kyokojap.myweb.hinet.net/gpu_gflops/

5. sgodsell

Posts: 6897; Member since: Mar 16, 2013

Also, if Apple's GPUs were the fastest, then why get rid of Imagination's PowerVR GPUs at all? The A11 is supposed to be the last one before we see Apple's own GPU.

10. mikehunta727 unregistered

Because they want to make their own in-house GPUs for better vertical integration.

22. KingSam

Posts: 1377; Member since: Mar 13, 2016

It's the CPU where A chips shine. Apple can choose not to go all out with the GPU because iPhones and iPads have controlled screen resolutions.

6. Ninetysix

Posts: 2933; Member since: Oct 08, 2012

Oh look, it's sgodsell spewing his nonsense again. Should I believe you or GSMARENA? You're saying it's half the speed of the SD 821 but beats it in benchmarks. The A10 must be really awesome :) http://www.gsmarena.com/apple_iphone_7-review-1497p5.php# "The 1080p offscreen tests which help us determine the raw performance put the A10 GPU on top of any other GPU we've tested so far. The 3.0 test gives the A10 GPU about 20% more power over the Adreno 530 in the OnePlus 3 and 30% over the latest Mali-T880MP12 inside the Exynos-powered Galaxy Note7. It is also 50% better than the PowerVR GT7600 inside the iPhone 6s, as promised"

9. sgodsell

Posts: 6897; Member since: Mar 16, 2013

It's always so funny seeing these benchmarks do off-screen tests at 1080p, especially on a smartphone that doesn't have a 1080p display. Not to mention all those benchmark apps come from Apple's controlled store. I always wait for the results from the jailbreak tests, which run unbiased benchmarks. You can even compile and run a lot of the benchmark tests for yourself. I have C source code that performs a graphics test, and when run on a Pixel XL it performs two times faster than on an iPhone 7. Also, if the iPhone 7's A10 SoC is the fastest, then why drop Imagination's PowerVR GPUs in the first place, especially when these controlled apps from Apple's manipulated app store show it to be the fastest? Like, why drop the fastest GPU on the market? And why not put a full HD display on the iPhone 7, or now the 7s, if it's truly that much faster than the competition? Come on, even lower-end brand-new devices in the Android camp have full HD displays and can be purchased for around $200 or less. But we both know why Apple still uses an HD display on its iPhones: it's for bragging rights in benchmark scores compared to smartphones that have 4 times the number of pixels of an iPhone 7, 7s, or 8. Plus the new iPhones that are about to arrive can sell for the same flagship prices with a display that hasn't changed in 4 generations.

11. mikehunta727 unregistered

Dude, you're severely misinformed. That's the exact point of an off-screen benchmark: to take the device's screen resolution out of the equation. And Apple wants to drop PowerVR because they want to make their own in-house GPU for better vertical integration in their development and not have to rely on PowerVR GPUs. Off-screen benchmarks show the GPUs in A-series SoCs can push a higher pixel count no problem, because they are comparable to the Adreno lineup. Imagination Technologies actually has really good mobile GPUs that compete right with the Adreno lineup, so I don't think you know what you're even saying here. The A10 is essentially neck and neck with the Adreno 530 in the S821; the Adreno 530 has a 4-5 FPS advantage in long-term performance after thermal throttling kicks in.

12. kiko007

Posts: 7491; Member since: Feb 17, 2016

You're wasting your breath on that dude. People have been telling him how off base he is since he got banned from Ars Technica for spreading false information. Needless to say... he's uneducated regarding anything other than VR porn.

15. Ninetysix

Posts: 2933; Member since: Oct 08, 2012

Don't make me link some benchmarks between Adreno 530 and A10 on phones with native 1080P screens. I don't want you to ruin your weekend.

13. mikehunta727 unregistered

The GFLOPS metric is borderline useless when comparing across different GPU architectures. Does an AMD Fury X with its 8+ TFLOPS beat a GTX 1070 with around 6-7 TFLOPS in games? No, the 1070 murders the Fury X in games by over 30-40 FPS in many titles, despite the Fury X having over a TFLOP advantage. The A10 GPU is just about neck and neck with the Adreno 530 in the S821, with the Adreno 530 just edging the A10 GPU out by 4-5 FPS in long-term endurance tests, where thermal throttling kicks in. Apple wants to drop Imagination Technologies because they've been working on their own in-house GPU solution that will help vertical integration and deliver better results for Apple (being in-house is all positives in terms of development, software development, etc.). Nowhere near half the performance of the S821, lol. iPhones have had very competitive GPUs inside them every year, and the A11 won't be any different; the A11 will be able to push 4K/60 FPS in non-3D applications and handle it no problem.
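
To make the "GFLOPS don't equal frames" point concrete, here is a tiny back-of-the-envelope sketch. The peak-FLOPS formula (shader ALUs x 2 FLOPs per fused multiply-add per cycle x clock) and approximate public specs for the two cards named above are the only inputs; real game frame rates also depend on memory bandwidth, utilization, and drivers, which the formula ignores.

```java
public class PeakFlops {
    // Theoretical peak GFLOPS = shader ALUs * 2 (one fused multiply-add per cycle) * clock in GHz.
    static double peakGflops(int shaderAlus, double clockGhz) {
        return shaderAlus * 2 * clockGhz;
    }

    public static void main(String[] args) {
        // Approximate public specs: Fury X ~4096 ALUs @ 1.05 GHz, GTX 1070 ~1920 ALUs @ 1.68 GHz boost.
        double furyX = peakGflops(4096, 1.05);   // ~8600 GFLOPS
        double gtx1070 = peakGflops(1920, 1.68); // ~6450 GFLOPS
        System.out.printf("Fury X peak: ~%.0f GFLOPS, GTX 1070 peak: ~%.0f GFLOPS%n",
                furyX, gtx1070);
        // Despite the ~2 TFLOPS deficit on paper, the 1070 wins in most games,
        // which is exactly why peak GFLOPS is a poor cross-architecture yardstick.
    }
}
```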

19. sgodsell

Posts: 6897; Member since: Mar 16, 2013

They certainly don't show the real picture. Geekbench shows the iPhone 7 30 fps higher than the S8, but the S8 can drive a 4.3-million-pixel display at a sustainable 60 fps with some of the most taxing VR apps, whereas the iPhone 7 is pushing only 1 million pixels. If it were to run the same VR apps, everyone would see that 1) the iPhone's crappy low-resolution display sucks for VR, 2) the iPhone 7 would die and throttle down, and 3) the battery would only last an hour before recharging. I guess that is why Apple themselves don't support VR. They don't want their iPhones to look bad in public. Also, Apple wants to control and direct the herd of sheeple in their direction.

20. kiko007

Posts: 7491; Member since: Feb 17, 2016

Dude, STFU. You have no clue what you're talking about.

24. sgodsell

Posts: 6897; Member since: Mar 16, 2013

Kiko007, you are one of those sheeple that will buy whatever Kool-Aid Apple is selling and accept it with all its walled-garden flaws. The funny thing is, I do know exactly what I am talking about, because I use and program for both platforms. Kiko007, all you can say is that I have no clue, but you offer no rebuttal, because what I have said is true. Drink more of that Apple Kool-Aid, kiko007.

25. kiko007

Posts: 7491; Member since: Feb 17, 2016

Either you have mega Autism, or you're a troll. In both cases... STFU... you have no clue what you're talking about.

27. sgodsell

Posts: 6897; Member since: Mar 16, 2013

No, the only trolls here are all the Apple zealots that have to come to non-Apple articles and post things about Apple. Basically, trolls like you, kiko007, mikehunta727, whatev, ninetysix, davthom123, and others. You are such an iDiot that you don't even know that you are the troll here.

28. kiko007

Posts: 7491; Member since: Feb 17, 2016

Whatever fam... go take your meds.

29. Ninetysix

Posts: 2933; Member since: Oct 08, 2012

Hey broseph. I want to give you a chance to redeem yourself. Care to explain to us why the OnePlus 3T with the SD821 is getting beaten by the iPhone 7 Plus in GFX 3.1 Manhattan tests? I thought the "A10 was rated at half the speed of the SD 821"? http://www.gsmarena.com/samsung_galaxy_s8_exynos_8895_vs_snapdragon_835_benchmark_comparison-news-24606.php

34. vincelongman

Posts: 5628; Member since: Feb 10, 2013

Because Qualcomm is focusing on GPU efficiency for VR/AR. Run that benchmark multiple times and the OP3T wins, as the 7 Plus throttles. That being said, like mikehunta727 said, the GFLOPS metric is borderline useless when comparing across different GPU architectures. It's only useful for comparing GPUs with the same architecture, e.g. the 960's G71-MP8 vs the 8895's G71-MP20.

35. Ninetysix

Posts: 2933; Member since: Oct 08, 2012

Uh oh. You corrected sgodsell and don't agree with him. You're now considered a troll.

30. mikehunta727 unregistered

I'm a troll for correcting you and speaking the truth on the subject at hand... ok lol, good one.

32. IOSANDROID

Posts: 120; Member since: Sep 30, 2015

Cool stuff

7. Medoogalaxy

Posts: 232; Member since: May 25, 2017

Learn the difference between hardware and software. Apple's processor is not the most powerful, it's the fastest because of Apple's sterile OS. The Exynos 8895 has a 20-core GPU and a CPU able to handle 4K at 120 fps. Just imagine those numbers!

14. mikehunta727 unregistered

Apple A-series SoCs are actually powerful. The A10 is still a top-2 performer today after nearly a year. The A11 will be the SoC of the year, though.

3. NoToFanboys

Posts: 3231; Member since: Oct 03, 2015

Good stuff.

8. Raoufdj10011

Posts: 7; Member since: Aug 31, 2017

Looks really powerful!

16. Ordinary

Posts: 2454; Member since: Apr 23, 2015

PA, it will have a Mali G72-MP12 as the GPU. You could put that in the article. It will be better than the G71-MP20 if what ARM said is true.

21. Guaire

Posts: 855; Member since: Oct 15, 2014

It will be cheaper and weaker.

33. vincelongman

Posts: 5628; Member since: Feb 10, 2013

Huawei are claiming 20% better performance and 50% better efficiency compared with the 960. That means it would be slower than the 8895's G71-MP20 by probably ~20%. The 960 has a horrible GPU in terms of efficiency; it's literally 810-level bad, see AnandTech's GPU power consumption benchmarks. With the 970 they are aiming to fix that, hence the tiny GPU performance increase.
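
For what it's worth, the "~20% slower" estimate follows from simple ratio arithmetic, sketched below with the Kirin 960's GPU normalized to 1.0. The 1.5x figure for the 8895's G71-MP20 is a hypothetical stand-in for its benchmark lead, not a measured number; only the +20% claim comes from Huawei.

```java
public class KirinGpuEstimate {
    public static void main(String[] args) {
        double kirin960Gpu = 1.0;                  // baseline: Kirin 960's G71-MP8
        double kirin970Gpu = kirin960Gpu * 1.20;   // Huawei's claimed +20% GPU performance
        double exynos8895Gpu = kirin960Gpu * 1.50; // hypothetical ~50% lead for the G71-MP20
        double gap = 1 - kirin970Gpu / exynos8895Gpu;
        // With these inputs the 970 trails the 8895's GPU by ~20%.
        System.out.printf("970 trails the 8895 GPU by ~%.0f%%%n", gap * 100);
    }
}
```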
