Apple admits that private Siri recordings are analysed by humans

A whistleblower has revealed that Apple is hiring contractors to listen to accidental Siri recordings of confidential medical information, drug deals, and couples having sex. Apple responds that less than 1 per cent of recordings are used for "grading".

Siri recordings are heard by third-party contractors, reveals whistleblower. (Express Photo: Mohammad Faisal)

Apple is paying contractors to listen to accidental Siri recordings as part of a process called “grading”. Apple contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control”, reported The Guardian.

The report cites a whistleblower working for one of the contractors, and Apple has responded to the report, confirming that a small portion of Siri recordings is indeed used for improvements. While Apple’s privacy explainer site details that Siri data is sent to “Apple servers” to improve the quality of the service, it doesn’t explicitly mention that humans process it, nor does it mention third-party contractors.

The report claims that the contractors are tasked with grading the responses on a variety of factors, including:

* Whether the activation of the voice assistant was deliberate or accidental
* Whether the query was something Siri could be expected to help with, and
* Whether Siri’s response was appropriate
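
For illustration only, here is a minimal sketch of what a single grading record covering these three factors might look like as a data type. The type and field names are hypothetical, invented for this example rather than taken from Apple's internal tooling:

```swift
import Foundation

// Hypothetical sketch of a grading record covering the three factors
// reported above. Type and field names are invented for illustration;
// this is not Apple's actual internal schema.
struct SiriGradingRecord {
    let recordingID: UUID              // anonymised identifier, not tied to an Apple ID
    let activationWasDeliberate: Bool  // was the trigger intentional or accidental?
    let queryInScope: Bool             // could Siri be expected to help with the request?
    let responseWasAppropriate: Bool   // was Siri's answer suitable for the query?
}
```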

The whistleblower says that Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. The service often mistakes the sound of a zip for its trigger, and it can be activated in other ways too, he adds. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.
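
As a toy illustration of the two trigger paths described above (the wake word, and the Apple Watch's raise-then-speech behaviour), the sketch below encodes the reported logic. The names and structure are assumptions made for this example, not Apple's implementation:

```swift
// Toy model of the two reported activation paths. This is a sketch of
// the behaviour described by the whistleblower, not Apple's code.
enum SiriTrigger {
    case wakeWord       // device believes it heard "hey Siri"
    case raiseToSpeak   // Apple Watch raised, then speech detected
}

func siriTrigger(heardWakeWord: Bool,
                 watchWasRaised: Bool,
                 speechDetected: Bool) -> SiriTrigger? {
    // Wake-word detection can misfire, e.g. on the sound of a zip,
    // which is reportedly how accidental recordings begin.
    if heardWakeWord { return .wakeWord }
    if watchWasRaised && speechDetected { return .raiseToSpeak }
    return nil   // no activation
}
```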

Although Siri is included on most Apple devices, the report highlights the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on,” the report quotes the whistleblower as saying.

Siri recordings are accompanied by the user’s location

The whistleblower says that there have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on.

“These recordings are accompanied by user data showing location, contact details, and app data,” the report quotes the whistleblower, who adds that employees are expected to hit targets as fast as possible. Staff are encouraged to treat recordings of accidental activations as a “technical problem”, but no procedure was said to be in place for dealing with sensitive information.

The report adds that the whistleblower’s motivation for disclosure was a fear of such data being misused: there is purportedly little vetting of who works with the data, high employee turnover, no proper privacy guidelines, and a real possibility of identifying users.

“It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on,” the whistleblower says, emphasizing that Apple should reveal this human oversight to its users and change Siri’s response to the query “Are you always listening?”, which is currently “I only listen when you’re talking to me”.

Apple’s response

Responding to The Guardian report, Apple said, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

The company added that a very small random subset (less than 1 per cent of daily Siri activations) is used for grading, and the recordings used are typically only a few seconds long.
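
Taken at face value, Apple's description amounts to uniform random sampling at a rate below one per cent. A minimal sketch, assuming a simple per-request coin flip (Apple has not published how the subset is actually selected):

```swift
// Illustrative only: selects roughly samplingRate of requests for
// grading. Apple has not disclosed its actual selection mechanism.
func isSelectedForGrading(samplingRate: Double = 0.01) -> Bool {
    Double.random(in: 0..<1) < samplingRate
}
```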

Apple is not alone in employing human oversight for its voice assistant. Amazon and Google have previously been found to employ staff to listen to some Alexa and Google Assistant recordings, respectively. However, Apple values its reputation for user privacy highly, regularly wielding it as a competitive advantage against Google and Amazon. The report notes that in January, Apple bought a billboard at the Consumer Electronics Show in Las Vegas announcing that “what happens on your iPhone stays on your iPhone”.