
Apple delays the rollout of its plans to scan iPhones for child exploitation images

Security and privacy advocates were angered by the controversial plan to scan for child exploitation images on customers’ devices.

September 3, 2021 at 3:57 p.m. EDT
Apple CEO Tim Cook attends the Apple Tower Theatre retail store opening in the Broadway Theater District in Los Angeles on June 24. (Patrick T. Fallon/AFP/Getty Images)

Apple said Friday it would delay the rollout of its controversial plans to scan Apple devices for child exploitation images after security and privacy experts warned the software could open a back door to iPhones, giving governments and even hackers access to the devices without permission.

Apple had touted its method of scanning as a privacy enhancement that set it apart from its competitors, but the company seemed unprepared for the overwhelming backlash. Its top executives blamed the public relations blunder on confusion about the technology Apple was using.

The controversy began last month, when Apple said it would begin rolling out new software meant to catch criminals who trade images of child sexual abuse. While the idea was not new, Apple’s method was. A database of numeric fingerprints representing known images of child sexual abuse would live on customers’ phones, and photos uploaded to iCloud Photos, the company’s cloud storage service, would be checked against it.

How iPhone child-safety photo scanning works — and why privacy advocates are worried

Other technology companies including Facebook, Microsoft and Google scan their own servers for such material. Apple said it would do the scanning on iPhones themselves, using cryptographic methods to ensure it would see a user’s photos only if 30 or more child exploitation images were on the phone.

Once that threshold was crossed, the images would be decrypted and viewed by Apple employees, who would verify that the images constituted child exploitation. Apple would then report the customer to authorities.
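Conceptually, the system Apple described amounts to a threshold match against an on-device blocklist of image fingerprints. The sketch below is a deliberately simplified illustration, not Apple’s implementation: Apple’s actual design used a perceptual hash (NeuralHash) and cryptographic private set intersection so that neither the device nor Apple learns the outcome of any individual match, and the function and variable names here are hypothetical.

```python
import hashlib

# Hypothetical illustration only. Apple's real system used a perceptual
# hash (NeuralHash) and private set intersection, not SHA-256 or a
# plaintext blocklist stored on the device.

MATCH_THRESHOLD = 30  # Apple said 30 or more matches were required


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A cryptographic hash like SHA-256
    only matches byte-identical files, unlike NeuralHash, which is
    designed to match visually similar images."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many uploaded photos match the on-device database."""
    return sum(1 for img in uploaded_images if fingerprint(img) in known_hashes)


def should_flag_for_review(uploaded_images: list[bytes], known_hashes: set[str]) -> bool:
    """Per Apple's description, only once the threshold is crossed would
    the matched images be decrypted for human review."""
    return count_matches(uploaded_images, known_hashes) >= MATCH_THRESHOLD
```

In Apple’s published design, the threshold served as a privacy safeguard: below 30 matches, the cryptographic scheme was meant to keep Apple from learning anything about a user’s photos, including whether any single photo matched.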

Apple spokesman Fred Sainz said he would not provide a statement on Friday’s announcement because The Washington Post would not agree to use it without naming the spokesperson.

While catching people who traffic images of exploited children is a noble cause, the idea of giving Apple customers no choice but to have software on their phones that would look for illegal activity was a step too far for many privacy advocates and security experts.

“It’s encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing,” Evan Greer, director of Internet advocacy group Fight for the Future, said in a statement. “Apple’s current proposal will make vulnerable children less safe, not more safe. They should shelve it permanently.”

Opinion: We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous

Apple said its encryption methods made the software impossible to abuse, and that independent security researchers could audit it.

John Tanagho, executive director of the International Justice Mission’s Center to End Online Sexual Exploitation of Children, disagreed with Apple’s decision to delay the software rollout.

“Apple’s changes are a positive step forward and must not be delayed,” he wrote in a statement. “The world should not elevate the hypothetical and unlikely corruption of child safety solutions over the known and rampant misuse of existing technology to harm children.”

Apple has similarly stepped back from or delayed other privacy changes following an outcry. For instance, after a backlash from the advertising industry, Apple delayed the rollout of new software that forces app developers to ask users for permission before tracking them. It also delayed, and ultimately changed, rules that would have prohibited kids’ apps from using analytics software after the owners of many kids’ apps said the move would put them out of business.