
Financial Times

Apple is delaying the planned rollout of tools to detect images of child sexual abuse on iPhones, after a fierce backlash from privacy campaigners.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement today.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple had originally said it expected its new child safety features to roll out later this year, after the initial release of its latest mobile software, iOS 15, which is expected this month.

The delay dismayed some child protection campaigners. Andy Burrows, head of child safety online policy at the UK charity NSPCC, said Apple’s concession was “incredibly disappointing” and that the company “should have stood their ground”.

After the changes were announced last month, privacy campaigners were particularly alarmed by one of the features, which involved a system for matching files that were being uploaded from a user’s iPhone to iCloud Photos against a database of known child sex abuse imagery.
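For illustration only, the sketch below shows the general shape of checking an uploaded file against a set of known-image digests. Apple's reported design relies on a perceptual "NeuralHash" and blinded on-device matching rather than the plain SHA-256 lookup used here, and the digest set, file path and function names below are hypothetical.

import hashlib

# Hypothetical set of digests for known abuse imagery (placeholder entry only).
KNOWN_IMAGE_DIGESTS = {
    "0" * 64,  # placeholder digest, not a real database entry
}

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def matches_known_database(path: str) -> bool:
    """True if the file's digest appears in the known-image set."""
    return file_digest(path) in KNOWN_IMAGE_DIGESTS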

The American Civil Liberties Union was among those warning last month that any system designed to detect data stored on a phone could also be used against activists, dissidents and minorities by authoritarian regimes.

“Given the widespread interests of governments around the world, we cannot be sure Apple will always resist demands that iPhones be scanned for additional selected material,” the ACLU’s staff technologist, Daniel Kahn Gillmor, said last month. “These changes are a step toward significantly worse privacy for all iPhone users.”

Apple previously defended its plan, which it said involved “state of the art” cryptographic techniques to ensure the company itself could not see what images were stored on any customer’s device.

Apple said the system would only be used for child protection and that the involvement of a team of human reviewers, alongside a minimum number of images that must be detected before an account was flagged, would nearly eliminate the potential for mistakes or abuses.
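In outline, the reported threshold mechanism might look something like the sketch below; the counter, threshold value and function name are hypothetical stand-ins for account-level accounting that Apple has described only at a high level.

# Hypothetical threshold: an account is surfaced for human review only after
# a minimum number of matches, so a single false positive triggers no action.
# The value below is illustrative, not Apple's.
MATCH_THRESHOLD = 30

def should_flag_for_review(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Return True only once an account's match count reaches the threshold."""
    return match_count >= threshold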

But Craig Federighi, Apple’s senior vice-president of software engineering, admitted that the introduction of the child sexual abuse detection system alongside a separate tool, which could warn parents if their children received sexually explicit photos through its iMessage service, was confusing.

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Federighi told The Wall Street Journal last month. “In hindsight, introducing these two features at the same time was a recipe for this kind of confusion.”
