Apple no longer letting humans grade Siri recordings amid privacy concerns

Apple has temporarily halted the practice of having humans grade Siri recordings following a report that the automated assistant secretly listens to users having sex.

“While we conduct a thorough review, we are suspending Siri grading globally,” the company told The Verge. “Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

The company had been using outside workers around the world to review and grade how the software responded to requests.

A quality-control contractor revealed to The Guardian that Siri inadvertently picks up on some of its users’ most intimate and vulnerable moments.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the contractor said.

“The regularity of accidental triggers on the watch is incredibly high.”

The whistleblower said even an innocuous sound, such as a zipper, could mistakenly trigger Siri to record.

Apple declined to say whether it would also stop saving Siri voice recordings on its servers, The Verge reported.

But when the news first broke, the tech giant downplayed the revelations, telling the Guardian that less than 1% of daily Siri recordings get randomly reviewed by its contractors “to improve Siri and dictation.”

“Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” the company said at the time.