Siri gets pummeled by Google Assistant in new digital assistant 'IQ test'

AI and AR seem to be emerging as the key battlefields for today’s tech industry leaders aiming to become tomorrow’s innovators, with complex deep learning, artificial neural networking, natural language processing, and real-world augmentation methods likely to separate the true trailblazers from the doomed trend followers.

Of course, when it comes to a hot area of research and development like AI-powered virtual assistants, being first isn't everything. Apple's Siri made its iOS debut way back in 2011, but it has failed to progress at the anticipated pace as later market entrants Alexa and Google Assistant rapidly turned up the heat.

Numerous recent surveys and in-depth research reports have shown that Google's voice agent, at least, is superior to Siri in several crucial ways, whether by displaying a better understanding of queries in general or by providing more useful answers to specific questions.

The latest such report focuses specifically on Siri, Google Assistant, Alexa and Cortana’s comparative capabilities on smartphones rather than smart speakers or other smart home-controlling devices, declaring a clear overall winner.

Incredibly enough, Google Assistant, tested on a Pixel XL, was able to understand all 800 questions posed by a team of LoupVentures researchers and analysts. In contrast, Siri on iOS 11.4 misunderstood 11 queries, which still works out to an impressive 99 percent comprehension rate, while Alexa and Cortana stood at “only” 98 percent, with 13 and 19 misinterpreted questions respectively.
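For readers curious how those percentages fall out of the raw numbers, here is a minimal sketch of the arithmetic, using only the 800-question total and the error counts quoted above; the snippet is purely illustrative and not part of the Loup Ventures study.

```python
# Rough arithmetic behind the comprehension figures reported above,
# assuming each assistant faced the same set of 800 questions.
TOTAL_QUESTIONS = 800

# Misunderstood-question counts cited in the article.
misunderstood = {
    "Google Assistant": 0,
    "Siri": 11,
    "Alexa": 13,
    "Cortana": 19,
}

for assistant, missed in misunderstood.items():
    understood_pct = (TOTAL_QUESTIONS - missed) / TOTAL_QUESTIONS * 100
    print(f"{assistant}: {understood_pct:.1f}% of questions understood")

# Siri's 11 misses work out to roughly 98.6%, which the report rounds to 99%,
# while Alexa's 13 and Cortana's 19 land in the neighborhood of 98%.
```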

It’s obviously important to point out that Alexa and Cortana are not baked into a mobile operating system right now; they support iPhones via third-party apps that are understandably not as optimized as their pre-installed rivals.

In other words, the Amazon- and Microsoft-developed digital assistants have a perfectly acceptable excuse for scoring only 61.4 and 52.4 percent, respectively, when it comes to correctly answering questions. We can’t say the same about Siri, which correctly answered 78.5 percent of questions, lagging well behind Google Assistant’s 85.5 percent.

That’s right, 85.5 percent of those 800 questions, covering “local”, “commerce”, “navigation”, “information”, and “command” categories, were not only comprehended but correctly answered by the Pixel XL’s pre-loaded voice assistant. That’s simply mind-boggling, and we can’t wait to see how much better Google can get in areas like natural language processing.

source: LoupVentures
