Apple apologized to its users on Wednesday for employing third-party contractors to listen to audio recordings picked up by its Siri voice assistant, including incidents when the voice assistant was accidentally triggered by muffled background noise, The Washington Post reported.
Apple’s apology comes after a whistleblower, a former Apple contractor, exposed the company’s eavesdropping practice to The Guardian newspaper in June. The recording process was quickly suspended thereafter.

Companies including Facebook, Microsoft and Amazon have stated that they listen to recordings gathered through their various programs – from Facebook’s Messenger app and Xbox’s voice recognition to Amazon’s Alexa and Google’s smart microphone.
Though the original intention of having Apple contractors listen to recordings was to grade Siri’s performance, the practice made waves after the whistleblower said the voice assistant routinely recorded people having sex, making drug deals and discussing confidential medical information.
“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said on Wednesday.
“The sound of a zip, Siri often hears as a trigger,” the contractor said.
Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch,” The Guardian reported the contractor saying.
Going forward, the iPhone maker says it will restart the Siri grading program under new guidelines that allow users to opt in. It will not keep users’ audio recordings for grading Siri without their permission.
Apple will also now allow only its own employees, not third-party contractors, to review the audio recordings. It pledged to work to delete “any recording which is determined to be an inadvertent trigger of Siri.”
“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” the company said. “Those who choose to participate will be able to opt out at any time.”