
Apple on Wednesday apologized for how it has handled audio recordings of customers' interactions with its Siri assistant, and announced a series of changes aimed at better safeguarding customer privacy.
Why it matters: In recent weeks it has come to light that several major tech companies, including Apple, Google and Amazon, had been letting workers listen to a portion of virtual assistant conversations as part of efforts to assess and improve quality.
Apple allowed contractors to listen to a small subset of customers' recordings, but put that program on hold earlier this month amid customer concerns.
The iPhone maker announced three significant changes on Wednesday.
- By default, it won't keep audio recordings of Siri interactions, but will use computer-generated transcripts to improve quality.
- Customers will be able to opt in to a program to share their audio files with Apple to help Siri get better. "We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place," Apple said. "Those who choose to participate will be able to opt out at any time."
- When customers do opt in, Apple said only its employees — and not its contractors — will be allowed to listen to the audio files. Apple also said it will endeavor to delete recordings in which Siri was inadvertently triggered.
What they're saying: "We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading," Apple said in a statement. "We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We've decided to make some changes to Siri as a result."
Go deeper: What Apple knows about you