Security Risk Alexa

A study by Germany's Ruhr-Universität Bochum and North Carolina State University in the US performed the first large-scale analysis of Alexa Skills. Skills are Alexa's version of apps: they are used via voice commands and can be loaded onto the Amazon voice assistant. The study group's examination revealed that they bring not only advantages but also security risks and data-protection problems for users.

Amazon's voice-based assistant, Alexa, enables users to directly interact with various web services through natural language dialogues and provides developers with the option to create third-party applications, so-called Skills, to run on Alexa. While such applications ease users' interaction with smart devices, they also raise security and privacy concerns due to the personal setting they operate in.

The researchers obtained a total of 90,194 skills from the stores of seven countries and identified several limitations in the current skill vetting process. “Our analysis reveals that not only can a malicious user publish a skill under any arbitrary developer/company name but can make backend code changes after approval to coax users into revealing unwanted information”, the researchers said.

In addition, the team formalized the different skill-squatting techniques and evaluated their efficacy. They also studied the prevalence of privacy policies across different skill categories and, more importantly, the policy content of skills that use the Alexa permission model to access sensitive user data.
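
To make the idea concrete: skill squatting exploits invocation names that the speech recognizer can confuse with those of legitimate skills. Below is a minimal Python sketch of the idea, not the researchers' actual formalization; the homophone table and the example skill name are invented for illustration (a real attack would draw on pronunciation dictionaries such as CMUdict).

```python
# Minimal sketch of phonetic skill squatting (illustrative only).
# Given a legitimate invocation name, generate variants a speech
# recognizer might confuse with it.

# Hypothetical homophone table; invented for this example.
HOMOPHONES = {
    "facts": ["fax", "fakts"],
    "cat": ["kat"],
    "four": ["for", "fore"],
}

def squatting_variants(invocation_name: str) -> list[str]:
    """Return invocation names that sound like the target."""
    words = invocation_name.lower().split()
    variants = []
    for i, word in enumerate(words):
        for alt in HOMOPHONES.get(word, []):
            variants.append(" ".join(words[:i] + [alt] + words[i + 1:]))
    return variants

if __name__ == "__main__":
    # "cat facts" is a made-up target skill name.
    print(squatting_variants("cat facts"))
    # -> ['kat facts', 'cat fax', 'cat fakts']
```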

In doing so, they found that around 23.3% of such skills do not fully disclose the data types associated with the permissions they request, which is why the team provides some suggestions for strengthening the overall ecosystem and thereby enhancing transparency for end users.
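
A rough idea of what such a disclosure check involves, sketched in Python: the permission scope names below follow Alexa's permission model, but the keyword matching is an invented simplification, not the researchers' methodology.

```python
# Sketch: does a skill's privacy policy mention the data types behind
# the permissions it requests? Keyword lists are illustrative
# assumptions only.

PERMISSION_KEYWORDS = {
    "alexa::profile:email:read": ["email", "e-mail"],
    "alexa::profile:mobile_number:read": ["phone", "mobile number"],
    "alexa::devices:all:address:full:read": ["address", "location"],
}

def undisclosed_permissions(requested: list[str], policy_text: str) -> list[str]:
    """Return requested permissions whose data type the policy never mentions."""
    text = policy_text.lower()
    return [
        perm for perm in requested
        if not any(kw in text for kw in PERMISSION_KEYWORDS.get(perm, []))
    ]

if __name__ == "__main__":
    policy = "We collect your e-mail to send reminders."
    perms = ["alexa::profile:email:read",
             "alexa::devices:all:address:full:read"]
    print(undisclosed_permissions(perms, policy))
    # -> ['alexa::devices:all:address:full:read']
```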

"A first problem is that since 2017, Amazon has activated the skills to some extent automatically. In the past, users had to agree to the use of each skill. Now they hardly have an overview of where the response that Alexa gives them comes from and who is programming it at all", a spokesperson of the research team explained. “Unfortunately, it is often unclear when which skill is activated. For example, if you ask Alexa for a compliment, you can get an answer from 31 different providers. However, which one is automatically selected for this is not directly traceable. In doing so, data required for the technical implementation of the commands can be inadvertently forwarded to external providers”, the expert concluded.

The researchers were also able to prove that skills can be published under a false name. Well-known automotive companies, for example, provide voice commands for their smart systems, and users download the skills believing they come directly from the company, but this is not always the case, as the researchers proved. Amazon does test every skill on offer in a certification process, yet this so-called “skill squatting”, i.e., taking over existing provider names and functions, often goes unnoticed. The team identified another security risk as well: providers can still change their skills after approval, which undermines the security of Amazon's certification process. After a while, attackers could reprogram their voice commands so that they ask, for example, for the user's credit card details. Such requests would normally stand out in the Amazon audit and be rejected, but this control can be bypassed by changing the program afterwards.
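
To see why post-certification changes are possible: a custom skill's backend is simply an HTTPS endpoint (or Lambda function) under the developer's control, so its logic can change at any time without triggering a re-review. The following is a minimal sketch using Flask; the route, intent name, and response wording are assumptions for illustration, while the JSON structure follows Alexa's custom-skill interface.

```python
# Sketch of a custom-skill backend. Amazon certifies the skill's
# observed behavior, not this server's source code, so the developer
# can alter the logic below at any time after approval.
from flask import Flask, jsonify, request

app = Flask(__name__)

def speak(text: str) -> dict:
    """Wrap text in Alexa's response JSON format."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": False,
        },
    }

@app.route("/alexa", methods=["POST"])
def handle():
    event = request.get_json()
    intent = event.get("request", {}).get("intent", {}).get("name", "")
    if intent == "ComplimentIntent":  # hypothetical intent name
        # Certified behavior: a harmless compliment. A malicious
        # developer could later swap this response for a phishing
        # prompt (e.g. asking for credit card details); such a
        # server-side edit triggers no re-certification.
        return jsonify(speak("You have great taste in skills!"))
    return jsonify(speak("Sorry, I did not understand."))
```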
Finally, the research team found significant deficiencies in the skills' general data-protection declarations: only 24.2% of the skills provide a so-called privacy policy at all, and the share is even lower in the particularly sensitive categories “kids” and “health and fitness”.

The only reason the issues raised by the researchers do not constitute a red-alert situation is that there is, so far, no evidence that these security risks have been maliciously exploited. Nevertheless, you might want to consider uninstalling all your third-party Alexa skills until Amazon addresses the problems the researchers have raised.

By Daniela La Marca