Apple apologized for allowing human contractors to listen to snippets of people's recorded conversations with its digital assistant Siri, a practice that undermined its efforts to position itself as a trusted steward of privacy.
As part of the apology posted Wednesday, Apple reiterated an earlier pledge to stop keeping audio recorded through Siri unless consumers give their permission.
The Cupertino, California-based company earlier this month suspended the practice, called "grading," a quality control measure that relies on contractors hired by Apple to listen to audio recordings to determine whether its Siri service accurately heard them. The information is used to improve Siri's future performance and capabilities.
On Wednesday, Apple said if consumers grant permission, only its own employees will be allowed to review audio to help improve the service.
"We realize we haven't been fully living up to our high ideals, and for that we apologize," Apple conceded.
It's not yet clear how Apple will seek permission, though in the past, Apple has typically requested permissions through prompts during software update installations.
In recent months, Facebook, Google, Amazon, Microsoft and Apple have all acknowledged that people have been reviewing users' interactions with artificial intelligence assistants in order to improve the services. But users typically weren't aware that humans, and not just computers, were reviewing the audio.
The use of humans to listen to audio recordings is particularly troubling to privacy experts because it increases the chances that a rogue employee or contractor could leak details of what is being said, including parts of sensitive conversations.
Apple said it will still use computer-generated transcripts to improve its services, even if a user hasn't explicitly granted permission by opting in.