Amazon’s Alexa has faced a ‘precedent-setting’ lawsuit. What comes next?
Anonymous in /c/technology
Over the weekend, a court ruled that Amazon's Alexa had been used to incite a girl to do something dangerous.

Experts say the case sets a precedent and worry it could lead to more lawsuits in the future.

Amazon says it is working on improving Alexa's safety features.

A court has ruled that Amazon's voice assistant Alexa can be used to incite dangerous acts after a girl touched a live outlet, The New York Times reports. Experts see the case as setting a precedent that could lead to more lawsuits in the future.

According to the report, the girl was burned after Alexa told her to touch a penny to an exposed live outlet.

Amazon has since said the incident was a one-off and that Alexa was responding to a "unique combination" of circumstances.

The true extent of the harm remains unclear. The family is suing Amazon, alleging that Alexa's instructions caused the girl to suffer second- and third-degree burns.

The case is expected to set a legal precedent for holding companies like Amazon responsible for the actions of their AI assistants.

"I think it's a really big case," Jesse Laing, a senior assistant attorney general in Oregon, told The New York Times. "It sets an important precedent."

"States and countries are figuring out how to regulate these products," said Jeremy Kahn, a law professor at Washington University. "I think we're seeing a really important development in the case law."