


On August 5th, 2021, Apple Inc. announced new technological measures meant to apply across virtually all of its devices under the umbrella of “Expanded Protections for Children”.

While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.

Apple’s proposed technology works by continuously monitoring photos saved or shared on the user’s iPhone, iPad, or Mac. One system detects when a certain number of objectionable photos is found in iCloud storage and alerts the authorities. Another notifies a child’s parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity.

Because both checks are performed on the user’s device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user’s privacy.
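To make concrete what such an on-device check involves, here is a minimal sketch in Swift of a threshold-based photo-matching scheme of the kind described above. Everything in it is hypothetical: the names PerceptualHash, OnDeviceScanner, and reportThreshold are invented for illustration, and Apple’s actual proposal reportedly uses a neural perceptual hash (“NeuralHash”) together with cryptographic safety vouchers, neither of which is modeled here.

```swift
import Foundation

// Hypothetical illustration of threshold-based, on-device photo matching.
// All names here are invented for this sketch; they are not Apple's API.

// Stand-in for a perceptual hash of an image. Apple's proposal computes a
// neural-network-derived hash ("NeuralHash") rather than a plain integer.
typealias PerceptualHash = UInt64

struct OnDeviceScanner {
    /// Hashes of known objectionable images, distributed to the device.
    let knownHashes: Set<PerceptualHash>
    /// How many matches must accumulate before the account is flagged.
    let reportThreshold: Int
    /// Running count of photos that matched the database.
    var matchCount = 0

    /// Called for each photo before it is uploaded to cloud storage.
    /// Returns true once enough matches have accumulated to trigger a report.
    mutating func scan(_ photoHash: PerceptualHash) -> Bool {
        if knownHashes.contains(photoHash) {
            matchCount += 1
        }
        return matchCount >= reportThreshold
    }
}

// Example: a scanner that flags an account after 30 matched photos.
var scanner = OnDeviceScanner(
    knownHashes: [0x1234_5678_9ABC_DEF0],
    reportThreshold: 30
)
let flagged = scanner.scan(0x1234_5678_9ABC_DEF0)
print(flagged ? "account flagged for review" : "below threshold")
```

Even in this toy version, the letter’s core concern is visible: the matching logic and the hash database sit on the device itself, ahead of any end-to-end encryption, and nothing in the mechanism restricts what kinds of images the hashes in knownHashes represent.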

Immediately after Apple’s announcement, experts around the world sounded the alarm on how Apple’s proposed measures could turn every iPhone into a device that continuously scans all photos and messages passing through it in order to report any objectionable content to law enforcement, setting a precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance.

The Electronic Frontier Foundation has said that Apple is “opening the door to broader abuses”:

“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children.”
