Samsung officially unveils the Exynos 9810, touts AI features, advanced 3D "hybrid" face recognition

Back in November, we caught first wind of Samsung's next-gen Exynos chipset, which was revealed in passing in the company's press materials. At the time, however, information about the new SoC was scant, with details withheld for a full reveal further down the line. And here we are now, with an official announcement from Samsung outlining some of the headline features and capabilities of the chipset.

The Exynos 9810 is still built on Samsung's 10nm manufacturing node, contrary to earlier rumors alleging an 8nm process, but the second-generation 10nm FinFET process promises higher performance at a lower power draw. Furthermore, the Exynos 9810 boasts custom CPU cores, an upgraded GPU, and advanced machine-learning and image-processing capabilities, which will be used to deliver better multimedia experiences and tighter security.

Samsung is yet to announce the full spec sheet of the new chipset, but it has revealed that the CPU pairs four third-generation custom cores, dedicated to power-intensive tasks and clocked at up to 2.9GHz, with another cluster of four cores optimized for efficiency. Samsung claims that, thanks to the widened pipeline and improved cache memory of the new chipset, single-core performance will be "enhanced twofold," while multi-core performance will be boosted by "around 40 percent" compared to the Exynos 8895.

The Exynos 9810 will power the Galaxy S9 and S9+ in some markets

As was expected, the Exynos 9810 also boasts advanced neural network-based learning capabilities, which come into play for the new "hybrid" 3D face recognition system, which Samsung claims utilizes both hardware and software to deliver improved security and realistic "face-tracking filters." Yeah, that last bit means Animoji-style shenanigans. Furthermore, the 9810 features dedicated image processing and an upgraded multi-format codec (MFC), which can be used for advanced object recognition and scene categorization, as well as visual and image processing. Samsung says this could deliver advanced stabilization for images and video at up to UHD resolution, real-time shallow depth-of-field simulation at high resolutions, video recording at up to 120fps in 4K, and vastly improved low-light performance. This also bodes well for Bixby Vision, which relies heavily on object recognition to deliver results.

“The Exynos 9 Series 9810 is our most innovative mobile processor yet, with our third-generation custom CPU, ultra-fast gigabit LTE modem and, deep learning-enhanced image processing,” said Ben Hur, vice president of System LSI marketing at Samsung Electronics. “The Exynos 9810 will be a key catalyst for innovation in smart platforms such as smartphones, personal computing and automotive for the coming AI era.”

The Cat.18 LTE modem inside the 9810 is also among the first to offer 6x carrier aggregation (CA), which (in theory) offers 1.2Gbps downlink and 200Mbps uplink speeds, a step up from the previous 5CA modem, which capped out at 1Gbps.
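As a rough sanity check of the headline 6CA number, the math works out as below. The 200 Mbps-per-carrier figure is our assumption (typical of a 20 MHz LTE carrier with 256-QAM and 4x4 MIMO), not a number from Samsung:

```python
# Back-of-the-envelope check on the Cat.18 6CA downlink figure.
# Assumption (not from Samsung): each aggregated LTE carrier
# delivers roughly 200 Mbps downlink.
per_carrier_mbps = 200   # hypothetical per-carrier throughput
carriers = 6             # 6x carrier aggregation
downlink_mbps = per_carrier_mbps * carriers
print(downlink_mbps)          # 1200 Mbps
print(downlink_mbps / 1000)   # 1.2 (Gbps), matching the advertised figure
```

Real-world throughput depends on spectrum holdings and network load, so the 1.2Gbps figure is very much a theoretical peak.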

The Exynos 9810 will be the chipset powering the upcoming Samsung Galaxy S9 and S9+, in some regions at least.

source: Samsung

Related phones

Galaxy S9
  • Display 5.8" 1440 x 2960 pixels
  • Camera 12 MP / 8 MP front
  • Processor Qualcomm Snapdragon 845, Octa-core, 2800 MHz
  • Storage 64 GB + microSDXC
  • Battery 3000 mAh (31h talk time)
Galaxy S9+
  • Display 6.2" 1440 x 2960 pixels
  • Camera 12 MP / 8 MP front
  • Processor Qualcomm Snapdragon 845, Octa-core, 2800 MHz
  • Storage 64 GB + microSDXC
  • Battery 3500 mAh (35h talk time)



1. Phonehex

Posts: 768; Member since: Feb 16, 2016

Exynos > Snapdragon.

2. Wiencon

Posts: 2278; Member since: Aug 06, 2014

That's true and it's a shame that more phones don't use Exynos

34. umaru-chan

Posts: 372; Member since: Apr 27, 2017

It's not true, especially after the SD835. The SD810/820 were inferior, but that's the past. Overall the SD835 is a better mobile processor than its Exynos counterpart, which I believe is designed to do well in benchmarks rather than the real world. Posting big benchmark scores and claiming higher capabilities than QC's processor doesn't mean anything if the real-world performance is on par with QC's. It's Samsung's fault that they can't optimize QC's processors; other manufacturers show how good they are. The Pixel and OnePlus are fast phones and remain fast for the longest time, whereas reviewers constantly say Samsung phones start lagging within 3 months. Shame.

52. HansP

Posts: 542; Member since: Oct 16, 2011

I'm pretty sure Samsung was one of the first manufacturers to announce they stopped optimizing SoC architecture to detect benchmark software. The second part of your comment is such derailed BS it's getting hilarious. AC is a US site, using US phones. That means the phone he's complaining about uses the SD835. The write-up you're linking to even mentions that his UK buddy with an Exynos-based Note 8 isn't suffering from lag. It would make no sense for Samsung to add this stutter deliberately, as they can't sell the Exynos alternative in the regions where they're stuck with the Snapdragon. So, if Samsung can't write code for the Snapdragon, that is all up to Qualcomm not providing proper developer tools for it. Long story short, your arguments are massive own-goals if you're really trying to argue the SD835 is better.

69. ph00ny

Posts: 2069; Member since: May 26, 2011

I've had my Note 8 for several months and I don't have the same issue they're having. At the same time, I don't install things like Facebook, which eats away battery life.

4. vincelongman

Posts: 5758; Member since: Feb 10, 2013

So far it seems:
  • Exynos CPU > Snapdragon CPU
  • Exynos ISP > Snapdragon ISP
  • Exynos GPU < Snapdragon GPU
  • Exynos AI accelerator < Snapdragon DSP
Will be interesting to see reviews

43. Boast_Rider

Posts: 536; Member since: Sep 14, 2017

I think the GPU on 8895 was quite a bit faster than 835. Check this out: Not sure about the 9810, but the G72 even in 12 core was quite formidable in Kirin 970. 18 core in the 9810 will be a monster.

49. vincelongman

Posts: 5758; Member since: Feb 10, 2013

Huawei clock their GPUs higher, e.g. the 970 is a Mali-G72MP12 @ 850 MHz, while the 9810 is a Mali-G72MP18 @ ~700 MHz, so the raw difference is about 25%. The 8895 and 835's GPUs are very close in short tests, more or less on par in performance (the 835 leads in long sustained tests and has lower power consumption). Samsung are claiming the 9810 has 20% higher performance; Qualcomm are claiming the 845 has 30% higher performance WHILE consuming 30% less power. The 845's GPU is on another level compared to the 9810's GPU, but the 9810's CPU is on another level compared to the 845's CPU. So it will be interesting to see reviews
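For what it's worth, the ~25% figure quoted above checks out if you treat raw GPU throughput as scaling with shader cores times clock, a first-order approximation that ignores memory bandwidth, drivers and thermals (and note the ~700 MHz 9810 clock is an estimate, not an official spec):

```python
# First-order GPU throughput estimate: shader cores x clock frequency.
# Ignores memory bandwidth, drivers and thermal throttling.
kirin_970   = 12 * 850  # Mali-G72MP12 @ 850 MHz -> 10200 "core-MHz"
exynos_9810 = 18 * 700  # Mali-G72MP18 @ ~700 MHz (estimated) -> 12600
advantage = exynos_9810 / kirin_970 - 1
print(round(advantage * 100, 1))  # 23.5, i.e. roughly the quoted 25%
```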

54. Boast_Rider

Posts: 536; Member since: Sep 14, 2017

The GPU performance of the 845 and 9810 should still be very close. The G71 is around 10% faster than the Adreno 630, so the 20% gain here should match the 30% there. About power consumption, the G72 should also be a lot better, as AnandTech noted in their 9810 article. The CPUs should be very close in performance too. Both SoCs look to be within 10% of each other, like always (except the 8890 and 7420).

56. vincelongman

Posts: 5758; Member since: Feb 10, 2013

The 8895 only leads by ~10% in some short tests; the 835 leads in long sustained tests and has lower power consumption. So the 9810 and 845's GPUs should be very close in short tests, but again the 845's GPU will be significantly better in long sustained tests, this time probably by a slightly bigger margin. AnandTech say efficiency should be improved, not power consumption. Samsung say 20% higher performance (presumably at the same power consumption), i.e. higher perf/watt, meaning better efficiency, but not necessarily lower power consumption. The CPUs won't be close in benchmarks if Samsung's single-core claims are true; real-world performance will be very close though

57. Boast_Rider

Posts: 536; Member since: Sep 14, 2017

I would wait for the tests on that. But based on how much Kirin improved with the G72 over the G71, my hope is that both will have very close GPUs. About real-world gaming, really no one cares. The games on Android haven't improved in terms of fidelity for 3-4 years now, so that's pointless; even the Snapdragon 625 plays every single game fine. Besides, Android is really bad for gaming, with most games like Asphalt capped at 30 fps and some games really badly optimized for some platforms. Samsung's single-core claims are obviously false; I wouldn't expect more than 50% gains. Besides, there is no single decent benchmark to test that. Geekbench relies on memory bandwidth too, which has been constant since the 810 days (between 25-29 GB/s). Samsung always matches the performance of Snapdragon chips. I wouldn't expect them to beat the SD845 by 40% or so in single-core; that would cause buyer's remorse for US customers.

50. Trex95

Posts: 2383; Member since: Mar 03, 2013

The 20-core G71 in the Exynos 8895 is, as AnandTech mentioned, on par with the iPhone 7 when it comes to gaming performance? Trying it on my S8 Plus, the Exynos gaming performance is not bad, but not as good as the iPhone unless you turn on gaming mode and keep the resolution at 1080p; then it performs well. The good thing about Samsung's GPU is that it doesn't heat up like the Snapdragon GPU.

55. vincelongman

Posts: 5758; Member since: Feb 10, 2013

? The 8895's GPU heats up quicker and uses more power compared with the 835's GPU. See the AnandTech review of 835 vs 8895

60. Trex95

Posts: 2383; Member since: Mar 03, 2013

Only if you play high-intensity games like Asphalt 8; other than that, no.

19. GreenMan

Posts: 2698; Member since: Nov 09, 2015

Apple A Series > Exynos > Snapdragon > Mediatek.

24. Anonymous.

Posts: 423; Member since: Jun 15, 2016

The Apple A series does not trump the Snapdragon in GPU, and I doubt it beats the Exynos ISP.

25. Chrusti

Posts: 103; Member since: Jul 25, 2015

NO. When it comes to emulation, the Exynos drivers are abysmal and can't hold a candle to the Snapdragon.

30. Anonymous.

Posts: 423; Member since: Jun 15, 2016

Isn't it the other way around? The Exynos is rather better suited for emulation than the Snapdragon. It emulates PSP, GameBoy, SEGA, Nintendo and even PlayStation 2 games!

38. Ciro1900

Posts: 591; Member since: Dec 17, 2017

Nope. Snapdragon destroys Exynos and Bionic

45. CreeDiddy

Posts: 2276; Member since: Nov 04, 2011

Poor choice of video examples. That's all that matters...

53. Trex95

Posts: 2383; Member since: Mar 03, 2013

Nokia 8 performs better than the OnePlus 5, Note 8, S8/S8 Plus, U11 and Pixel 2 XL?!

61. makatijules

Posts: 835; Member since: Dec 11, 2017

But the SD has a better GPU, which offers desktop-like performance for graphics. Also, the SD was meant to be a faster, stronger CPU, but the trade-off is lower battery life, which, with fast charging on board, is no big deal to me. Samsung did use the Exynos in the US with the S6. Other than degraded battery life on CDMA networks, I thought the phone performed very well. I just don't understand why they actually have to use the SD.

67. AmashAziz

Posts: 2934; Member since: Jun 30, 2014

Probably because of the licensing fees they'd have to pay Qualcomm so that the Exynos can work on CDMA networks.

3. Hollowmost

Posts: 425; Member since: Oct 10, 2017

4K 120fps capture and playback
4K 120fps capture and playback
I said.... 4K 120fps capture and playback
Where is Apple? Where is SNAPDRAGON? Where is Huawei????
Oh, and there is a possibility that the Galaxy S9 display will be 120Hz

8. vincelongman

Posts: 5758; Member since: Feb 10, 2013

Too bad Sony/Samsung probably don't have a camera sensor capable of 4k120fps ready yet

15. Hollowmost

Posts: 425; Member since: Oct 10, 2017

4K 120fps is achievable only using on-sensor stacked DRAM... Only Sony has this technology...

18. vincelongman

Posts: 5758; Member since: Feb 10, 2013

Oh true! Can't believe I forgot about that Will be great when it becomes more widespread

21. worldpeace

Posts: 3135; Member since: Apr 15, 2016

The DRAM on Sony's camera sensor is still very limited in size: it's 1Gbit, or around 125Mbyte, so it can't save that much raw data (125Mbyte of raw data could be encoded into a 10-40MB file, depending on the encoder used). Btw, the last time Sony used that DRAM in a phone, it could only capture 0.18s of 1000fps at full HD, and if they switched it to 4K@120fps, it could only record a few seconds before the DRAM runs out. Btw, is there any DSLR (or any professional camera) capable of recording 4K@120fps that costs less than $10,000?
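To put the buffer-size concern in numbers, here is a rough estimate of the uncompressed data rate of 4K at 120fps, assuming YUV 4:2:0 at 1.5 bytes per pixel (real sensors may read out RAW Bayer data instead, so treat this as a ballpark only):

```python
# Ballpark data rate of uncompressed 4K @ 120fps video (YUV 4:2:0).
width, height = 3840, 2160
bytes_per_pixel = 1.5          # YUV 4:2:0 chroma subsampling
fps = 120
frame_mb = width * height * bytes_per_pixel / 1e6   # ~12.4 MB per frame
rate_mb_s = frame_mb * fps                          # ~1493 MB/s
dram_mb = 125                                       # the 1 Gbit stacked buffer
print(round(rate_mb_s))               # 1493 (MB/s)
print(round(dram_mb / rate_mb_s, 2))  # 0.08 (s) before an uncompressed buffer fills
```

On-the-fly encoding stretches that window considerably, which is why a few seconds of buffered 4K@120fps is at least plausible.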

29. Dr.Phil

Posts: 2488; Member since: Feb 14, 2011

I believe the ARM Cortex-A72 was the first to tout 4K 120fps. So theoretically, Huawei's Kirin chipsets, which use ARM reference designs, would have that capability, but for whatever reason it may be turned off. I am guessing that, similar to how Samsung was able to push a software update that allowed the Note 8 to capture 240fps in 720p after it launched, you could potentially see the same happen from Qualcomm and Huawei.

31. Anonymous.

Posts: 423; Member since: Jun 15, 2016

"I am guessing that similar to how Samsung was able to push a software update that allowed the Note 8 to capture 240fps in 720p after it launched" That was rather 4K@60fps, which Samsung enabled on the Note 8 via a software update; Samsung had previously disabled that feature because the Snapdragon version could not record 4K@60fps, and they didn't want a disparity between the two SoC models of the Note 8. In fact, the Exynos 8895 in the Note 8 can do 4K@120fps!
