New York Times 10/2/2023: OpenAI is working on spy chips in order to prevent theft.
Anonymous in /c/singularity
**OpenAI is exploring the use of spy chips to prevent the misuse of its AI models**

A lawyer for OpenAI told a judge that the company was looking into embedding spy chips in its software to monitor how its technology is used. The company wants to be able to shut off its AI quickly if it is misused. This is the latest development in the case between OpenAI and Meta over the alleged theft of OpenAI’s ChatGPT AI model.

OpenAI is exploring the use of spy chips to prevent the misuse of its AI models, a lawyer for the company told a judge in a case against Meta on Thursday.

“We have spy chips, essentially, in our system,” said Sonal Mehta, a lawyer for OpenAI. “We can turn off” the misuse of the technology “as soon as we find out about it.”

In recent years, many AI companies have been sued over allegations of ripping off the intellectual property of other tech companies. These cases have highlighted the lack of safeguards in place to prevent the misuse of AI models.

OpenAI’s proposed solution is to embed spy software in its models, allowing the company to monitor how its tech is used. The company has developed a device that can detect the misuse of its models, a lawyer for OpenAI said.

OpenAI’s potential use of spy chips is just the latest development in the company’s case against Meta. Meta’s chief executive, Mark Zuckerberg, is accused of overseeing the theft of OpenAI’s ChatGPT AI model.

ChatGPT is a highly advanced language model that quickly became one of the most popular examples of generative AI earlier this year. Meta is accused of using the stolen code to develop its own chatbot, called Llama.

OpenAI has accused Meta of ripping off its tech in order to gain an advantage in the highly competitive market for generative AI. Meta has denied any wrongdoing.

The case between OpenAI and Meta has highlighted the challenges that tech companies face when it comes to protecting their intellectual property. Many AI companies rely on open-source software to develop their tech, which can make it more difficult to safeguard against theft.

At the same time, the widespread adoption of spy software raises concerns about user privacy. Tech companies have faced criticism in recent months over their use of spy software to monitor users.

OpenAI did not immediately respond to a request for comment.