Siri gets pummeled by Google Assistant in new digital assistant 'IQ test'
Of course, when it comes to a hot area of research and development like AI-powered virtual assistants, being first is not everything. Apple’s Siri made its iOS debut way back in 2011, but it has failed to progress at the anticipated pace as late market entrants Alexa and Google Assistant rapidly turned up the heat.
Numerous recent surveys and in-depth research papers have shown that Google’s voice agent, at least, is superior to Siri in many crucial ways, whether by displaying a better understanding of queries in general or by providing more useful answers to specific questions.
The latest such report focuses specifically on the comparative capabilities of Siri, Google Assistant, Alexa and Cortana on smartphones rather than smart speakers or other smart home-controlling devices, and it declares a clear overall winner.
Incredibly enough, Google Assistant, tested on a Pixel XL, was able to understand all 800 questions posed by a team of Loup Ventures researchers and analysts. In contrast, Siri on iOS 11.4 misunderstood 11 queries, which still amounts to an impressive 99 percent success rate, while Alexa and Cortana stood at “only” 98 percent, with 13 and 19 misinterpreted questions respectively.
It’s obviously important to point out that Alexa and Cortana are not baked into a mobile operating system right now, supporting iPhones via third-party apps that are understandably not as well optimized as their pre-installed rivals.
In other words, the Amazon- and Microsoft-developed digital assistants have a perfectly acceptable excuse for correctly answering only 61.4 and 52.4 percent of questions respectively. We can’t say the same for Siri, which lags behind Google Assistant with accuracy scores of 78.5 versus 85.5 percent.