Apple has suspended its program in which contractors listened to Siri voice recordings in order to improve the voice assistant's accuracy. The suspension comes after Apple was accused of violating user privacy, given that not all users were aware snippets of their recordings were being shared with contractors. The news was first reported by The Guardian. Here's what has happened.
Apple confirmed that it has suspended the program, under which contractors were grading voice recordings and queries sent to Siri for accuracy. Apple said the suspension was being done to protect user privacy, an area where it has often touted its superiority, especially in comparison to rivals like Google and Amazon.
Apple’s statement to Bloomberg said, “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.” This means that going forward there will be an option for users to decide whether or not they want their recordings sent to Apple’s contractors. Apple has not confirmed when this update will be released.
According to the Guardian, the contractors were hearing confidential information, such as medical details, drug deals, and even recordings of couples having sex, as part of the Siri program. These contractors, who graded the recordings, are hired by Apple, and many of them work across the world. Apple says that less than 1 per cent of commands sent to Siri were being heard by these people. But given the number of Apple users, even one per cent is not a small number.
For Apple users, it was not explicitly made clear that their recordings could be passed on to these contractors. The report claims that this was not mentioned in the privacy document either. In its response to the Guardian story, Apple said that the user requests being heard by these contractors were not associated with any particular Apple ID.
The company also said the recordings are “analysed in secure facilities” and that the reviewers have to abide by the company’s “strict confidentiality requirements.”
The recordings came to light thanks to a whistleblower who was working for the firm and remains anonymous. While Apple says the recordings were not linked to any Apple ID, the whistleblower contradicted this. They claimed the recordings included personal user data, such as location, contact details and app data. It was also reported that several accidental triggers for Siri, which can often end up recording conversations, were sent to Apple for listening as well.
Apple is not the only company accused of doing this. Given the popularity of voice assistants and how all companies want to improve their own versions, others are also employing similar tactics. Previously, Bloomberg reported how Amazon employs thousands of people to listen in on Alexa conversations in order to improve the quality of responses by its voice assistant.
According to the report, the recordings are “transcribed, annotated and then fed back into the software as part of an effort to eliminate gaps in Alexa’s understanding of human speech.”
Google also acknowledged this year that its contractors could access recordings made by the Assistant on its Home speakers. The admission came after some of these recordings were leaked. Google does let users view all their recordings made to the Assistant in their Google Account settings, and they have the option of deleting this data individually as well.