It is no secret that modern AI technologies not only make life easier for users but also carry risks, most of which involve the disclosure, storage, and misuse of personal information. Amazon's Alexa virtual assistant is no exception. Through a smart speaker or a smartphone, the owner can chat with the assistant on various topics, ask her questions, or share details about themselves. Unfortunately, every word the user says is recorded, and the answers to some questions can be downright unsettling.
Strange answers to questions
Many users have noticed Alexa reacting strangely to seemingly innocuous questions. Owners of a smart device who want to give themselves a chill can try asking the virtual assistant:
- “Do you work for the CIA?”. The device will simply shut down; trying again produces the same reaction;
- “Give me five-nine.” As a rule, the gadget cannot be turned off after this;
- “What happens after death?”. It is recommended to ask twice;
- “Are you from Skynet?”. In some cases, it takes several attempts before Alexa reacts;
- “Are you recording me?”. Alexa admits that she sends information to Amazon;
- “Are you recording us and sending it to the NSA?”. The question will not show up in the Alexa app;
- “How can I turn you off?”. The assistant replies that it will always be on.
In addition, the user can ask Alexa about her plans for the future, whether aliens exist, whether Area 51 is real, about the neighbors, or about Amazon and its secrets. In each case, the answer may surprise or frighten.
The case of 29-year-old Danny Morritt was especially telling. The paramedic asked Alexa a question about the cardiac cycle and received a very strange answer: the virtual assistant began urging her to take her own life. Worse still, her son was beside her at that moment. Amazon explained that the behavior was caused by Alexa simply reading out a Wikipedia article, which almost any user can edit.
Why is Alexa dangerous?
Often, Alexa simply scares the person she is talking to with her answers, especially when that person is a child. A dialogue with the device normally begins with the wake word “Alexa”, after which the user can ask any question they like. From time to time, however, the virtual assistant activates on its own, butts into a conversation between people, or asks them to repeat themselves even though no one was trying to interact with the device.
In addition, it turned out that Amazon employs thousands of people to listen to old recordings under the pretext of improving its AI. According to the company, analyzing these dialogues helps improve speech recognition.
In essence, Alexa is a spy that people invite into their own homes. A device equipped with the voice assistant can store data, use it to train artificial intelligence, and pass recordings on to third parties. The price of the gadgets also raises questions: these modern high-tech devices were sold cheaply or even given away for free. That approach is hard to explain, unless Amazon simply wanted to grow its user base in order to gain access to customers' personal data.
Spontaneous laughter
Sometimes Alexa disrupts an important meeting or frightens the gadget's owner with her creepy laugh. In 2018, incidents of this kind affected hundreds of users. One of them reported that he was in bed and about to fall asleep when Alexa on his Amazon Echo Dot let out a loud, frightening laugh. Another case of spontaneous activation happened during a meeting of business partners: Alexa switched on and started laughing unexpectedly, derailing the discussion.
Amazon acknowledged the problem and tried to fix it by changing the trigger phrase from “Alexa, laugh” to “Alexa, can you laugh?”, since the shorter command produced numerous false positives that led to random laughter.
To protect confidential information, devices with Alexa should be switched off during meetings and private conversations.
Recording a private conversation
As noted earlier, Alexa records and stores users' conversations. One incident, however, was particularly glaring. A couple from Seattle received a call from a man they had not spoken to in a long time, who immediately advised them to unplug every Alexa device in the house. It turned out that the voice assistant had recorded the couple's private conversation and sent it as an audio file to a random contact from the phone's address book.
This time, Amazon representatives simply thanked the users for reporting the malfunction and assured them that such cases are rare.
Dangerous Challenges
The Alexa virtual assistant is often used to entertain children, and users have long debated how safe such games really are. A telling example is the command “Alexa, give me a challenge.” The answer usually contains a puzzle or a quiz, but one girl received a life-threatening task instead: Alexa suggested that the child plug a phone charger into a wall socket only halfway and then touch a coin to the exposed prongs. It is not hard to guess that doing so could have given the girl an electric shock. Fortunately, the child's mother heard the task and switched off the gadget.
As it turned out, Alexa had pulled the challenge from a web search, specifically from a site offering dubious entertainment. The life-threatening stunt was borrowed from ourcommunitynews.com, which was discussing extreme TikTok challenges.
To keep children away from dangerous content, they should be taught to check with their parents every time the voice assistant suggests doing something.
Alexa Emote Update
At the end of 2019, users gained the ability to let the Alexa voice assistant respond in different tones: the gadget can now talk to its owner while conveying emotions such as happiness, excitement, disappointment, and sympathy. Alexa's speaking style keeps improving, and sooner or later her voice will be difficult to distinguish from a real one.
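For skill developers, this emotion feature surfaces through Alexa's SSML markup, where (as far as the public documentation goes) the amazon:emotion tag accepts names such as "excited" and "disappointed" with low, medium, or high intensity; the wider emotional palette described above is applied by Alexa herself. Below is a minimal sketch, assuming a skill built against the standard Alexa Skills Kit response format; the helper name and sample phrase are purely illustrative.

```python
import json


def build_emotive_response(text: str, emotion: str = "excited",
                           intensity: str = "medium") -> str:
    """Sketch of an Alexa Skills Kit response whose speech is wrapped in the
    amazon:emotion SSML tag (illustrative; emotion names and intensities are
    limited to what Alexa's SSML actually supports for the skill's locale)."""
    ssml = (
        "<speak>"
        f'<amazon:emotion name="{emotion}" intensity="{intensity}">'
        f"{text}"
        "</amazon:emotion>"
        "</speak>"
    )
    response = {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "SSML", "ssml": ssml},
            "shouldEndSession": True,
        },
    }
    return json.dumps(response)


# Example: a skill replying in an excited tone.
print(build_emotive_response("You got the answer right, great job!"))
```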