Apple postpones plan to scan iPhones for child exploitation images

Silhouette of a mobile user next to a screen projection of the Apple logo in this illustration from March 28, 2018.

Dado Ruvic | Reuters

Following objections from privacy advocates, Apple said Friday it would postpone its plan to scan users’ photo libraries for images of child exploitation.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple shares fell slightly on Friday morning.

Apple’s announcement that it would check users’ devices for illegal child sexual abuse material immediately sparked controversy. Critics pointed out that a system comparing images stored in an iCloud account against a database of known “CSAM” images contradicted Apple’s messaging about customer privacy.

Rather than scanning the photos themselves, the system compares digital “fingerprints” of a user’s images against the database of known CSAM. When enough matching images are detected on an account, the case is referred to a human reviewer, who can confirm the images and pass the information to law enforcement if necessary.
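To make the mechanism described above concrete, the sketch below shows the general idea of fingerprint matching with a review threshold. It is an illustration only, not Apple’s actual implementation (which reportedly relies on a perceptual hash and on-device matching): the hash function, the threshold value, and the placeholder fingerprint database are all assumptions.

```python
import hashlib

# Hypothetical database of known CSAM fingerprints (placeholder values only).
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c4",
}

# Hypothetical number of matches required before a human reviewer is involved.
REVIEW_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an image.

    A real system would use a perceptual hash that survives resizing and
    re-encoding; a cryptographic hash is used here only to keep the sketch
    self-contained.
    """
    return hashlib.sha1(image_bytes).hexdigest()


def count_matches(photo_library: list[bytes]) -> int:
    """Count how many photos in an account match known fingerprints."""
    return sum(1 for img in photo_library if fingerprint(img) in KNOWN_FINGERPRINTS)


def should_flag_for_review(photo_library: list[bytes]) -> bool:
    """Refer the account to a human reviewer only once enough matches accumulate."""
    return count_matches(photo_library) >= REVIEW_THRESHOLD
```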

Apple’s CSAM detection system had been slated to roll out to customers later this year. It is unclear how long Apple will delay the release following Friday’s announcement.

Despite the concerns over Apple’s plan, this kind of scanning is common practice among tech companies: Facebook, Dropbox, Google and many others run systems that automatically detect CSAM uploaded to their services.