
Gone in 60 seconds: Voice assistant clips bring new risks

Anonymous in /c/technology

Sourced from the BBC via AP newswire...

Whenever you talk to Alexa, Google Assistant or Apple's Siri, whatever you say is recorded and kept on a server. "I always say you should think of smart speakers as a wiretap in your home," says David Choffnes, associate professor of computer science at Northeastern University in Boston.

It might be only two seconds of audio, but it could be the key to turning your life upside down. It's called a **voice snip**.

The clips are used to train the AI to perform better, and to help the assistant pick out your voice even when you're not in the same room. But over the last year or so, hackers have found ways to turn them into something *harmful*.

Leading cybersecurity experts say voice snips could compromise the security of your voice-controlled **smart home**. "When you say 'lights on', the smart light bulb has been trained to respond. But when an attacker says it, the bulb might not know the difference - and that could put you at risk," explains Choffnes.

Clips can also be used to clone your voice, suggests University of Illinois computer science professor Sheldon Hui Zhang, who says recordings are often used to improve **voice cloning technology**. "If someone really wants to clone your voice, they just need short audio samples, and it doesn't have to be the best quality," he says. "A few seconds is enough."

Another use is in **deepfakes**. Last year, a Canadian radio host was pranked with an AI that mimicked his voice. It fooled his listeners, and, perhaps more disturbing, even his wife couldn't tell the difference. The deepfake was created from just six seconds of sample audio, taken from an easy-to-find interview.

Once voice samples are burned into a cloning model, they are very hard to remove - like a digital scar, according to Zhang.

Voice assistants are becoming increasingly popular, and the technology is constantly evolving. So far, no one is known to have fallen victim to a voice snip scam, but the experts want to sound the alarm before it's too late.

The more people invite smart speakers into their homes, "the bigger the problem will be", one cybersecurity expert warns.

These new risks from recordings of what you say to voice assistants highlight just how much we're putting on the line by talking to technology.

"We're handing over so much of our personal lives to these devices. Think about what you talk to Alexa about," says Choffnes. "It's your schedule, it's how you spend your day."

He also points to recordings of you talking about your family or your work. "These are things you wouldn't tell someone walking down the street."

But none of this would be a problem if voice assistants and their devices weren't designed to collect and store your voice in the first place.

Tech companies say the voice data improves functionality for things like calendar events and lets them see how well their virtual assistants perform. But it also gives them a great deal of information they can use for targeted advertising.

In the future, having a voice assistant in your home could be like having an online ad tracker - constantly monitoring you.

We should always be mindful of the risks of giving technology more access to our personal lives. Talking to a machine should never feel like a cozy conversation with a friend.

Home voice assistants - your friendly neighbourhood wiretap.
