Third-party Amazon Echo and Google Home apps are a minefield of scary security flaws

Third-party Amazon Echo and Google Home apps are a minefield of scary security flaws
Smart speakers are all the rage right now, with global sales growing at a rate reminiscent of the smartphone industry's early days and two companies rising above the vendor pack to vie for the crown. In their battle for world domination, Amazon and Google are trying everything in their power to stand out and one-up each other, pretty much improving the capabilities of their already impressive virtual assistants every day and constantly expanding their hardware portfolios as well.

Unfortunately, while this intense competition in a relatively new market with huge potential for long-term growth has created the ideal environment for rapid development of innovative experiences and applications, the security and privacy of users may have been seriously neglected. To their credit, both Google and Amazon seem to be limiting the data their voice-controlled devices can collect and the possibilities of mishandling said information, but a number of issues and concerns are still standing.

One very delicate matter that hasn't received a lot of media attention involves the vetting process of so-called Alexa "skills" and Google Home "actions." These features that can be added to the two companies' smart home devices via official stores are developed by third parties, which makes them susceptible to outside attacks.

Be careful what your voice assistant asks you to do

Generally, users ask Alexa or Google Assistant questions, as well as give commands, set alarms, and so on. But as evidenced in a pair of videos uploaded to YouTube by a company called SRLabs, your AI-powered assistant could occasionally ask you to do some stuff instead of the other way around. Or at least that's what you might be led to believe by a malicious Echo or Google Home app.

Security Research Labs, which specializes in, well, security research developed such an app with the intent to reveal the shocking vulnerabilities of both Alexa and Google Assistant-enabled devices. As it turns out, it was indeed extremely easy to plant basic Alexa skills and Google Home actions that could "vish" (voice phish) users' passwords by pretending to be legitimate "lucky horoscope" services with regional restrictions of some sort.

Said apps could trick users into believing their devices and not the actual apps in question were requesting vocal password verification to install software updates. Of course, that's not how updates work on smart speakers, but this is just one example of a security breach that could be exploited in a very serious way. Other examples include asking for an email address corresponding to said password or even financial information.

Eavesdropping is also incredibly easy

Another test that SRLabs ran to verify the strength of Google and Amazon's app approval mechanisms consisted of manipulating the same innocent-looking horoscope "skill" and "action" into listening in on conversations when theoretically deactivated. Predictably enough, simply asking a third-party app to "stop" giving you previously requested information may not stop the listening process as well.


In this case, hacking an Echo and a Google Home work a little differently, but the end result is equally scary. Anything you say around these devices can be used against you in a number of ways that we honestly don't want to talk about in a lot of detail.

Google and Amazon are putting "mechanisms in place" to improve security

Technically, that's only Google's reaction to these troubling revelations made by SRLabs, but Amazon offered Ars Technica a similar albeit less vague statement in which the company highlighted "mitigations" are already in place to "prevent and detect this skill behavior" from now on.

By the way, what SRLabs did was not purely theoretical, publishing malicious apps that passed Google and Amazon's approval process while obviously not exploiting the potential of the skills and actions before contacting the two tech giants and removing the phishing and eavesdropping apps. 

Based on what Google and Amazon are saying, it should no longer be possible for such blatant security vulnerabilities to go unnoticed in the future, but it's probably better to be careful what you download anyway. And remember, don't do anything sketchy that your voice assistant might ask.



1. Derekjeter

Posts: 1546; Member since: Oct 27, 2011

What else is new. It doesn’t matter how much risk these apps or companies put us in, people will still download it , buy it, use it. Specially Apple, Facebook fans.

5. sgodsell

Posts: 7514; Member since: Mar 16, 2013

This is PhoneArena scaring people. First of all no 3rd party voice app can hack any Google Assistant or Alexa device, period. For all those devices, all of those apps are run on the back end servers, of Amazon, Google, or your companies servers. The policies that are in place don't allow you to create and publish apps like the one described here. Lastly if any app that did make it to the store, that did these bad things, and asked for a password, or an email. Then the person using that voice app would have to be pretty dumb. Plus your email can be linked to an app, with the users permission. If it's a weather app looking for an email? Then don't give it permission. Unless that app is going to send you an email for something you really want, then don't give permission for your email, period.

2. iloveapps

Posts: 909; Member since: Mar 21, 2019

This is not new to amazon and google. Their services relies on amassing people data and using it without consent and selling to ad companies.

6. sgodsell

Posts: 7514; Member since: Mar 16, 2013

Your proving how ignorant you really are iloveapps. First of all this is about 3rd party apps, which run on the back end servers. They can't hack you, or your devices, or even get your data, period. Unless your a total idiot, and both Amazon and Google allow apps like what is mention in this article to be published on their voice app stores. Which developers can't make and publish apps like this to begin with. Second, not Google or Amazon are not getting your consent without anyones permission, period. Third, who's selling to ad companies for these voice platforms? Why would Google or even Amazon sell their customers voice platform data? That just doesn't make sense. You may as well say Apple does something similar, especially since it's well known now that anything you say to Siri can be seen by Apple employees, or Apple's contractors. Now that Apple uses ALL Siri customers data (transcripts text), and that is even with the latest opt-out setting. It doesn't matter if Siri customers opt-in or out, it just doesn't matter.

3. Gryffin

Posts: 71; Member since: Dec 19, 2018

if you're not paying for the product, you are the product same goes for google assistant and alexa.

4. miketer

Posts: 535; Member since: Apr 02, 2015

We really really really really don't need this. For all the fans and addicts of such a speaker- seriously, we can live without it. Please.

7. Rahulkaw001

Posts: 32; Member since: Oct 22, 2019

it is very serious issue

Latest Stories

This copy is for your personal, non-commercial use only. You can order presentation-ready copies for distribution to your colleagues, clients or customers at or use the Reprints & Permissions tool that appears at the bottom of each web page. Visit for samples and additional information.