Apple to scan US iPhones for images of child sexual abuse
Apple unveiled plans to scan US iPhones for images of child
sexual abuse, drawing applause from child protection groups but raising concern
among some security researchers that the system could be misused, including by
governments looking to surveil their citizens.
The tool, called “neuralMatch,” is designed to detect known images of child
sexual abuse and will scan images before they are uploaded to iCloud. If it
finds a match, the image will be reviewed by a human. If child
pornography is confirmed, the user’s account will be disabled and the National
Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users’ encrypted messages
for sexually explicit content as a child safety measure, which also alarmed
privacy advocates.
The detection system will only flag images that are already
in the center’s database of known child pornography. Parents snapping innocent
photos of a child in the bath presumably need not worry. But researchers say
the matching tool — which doesn’t “see” such images, just mathematical
“fingerprints” that represent them — could be put to more nefarious purposes.
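Apple has not published neuralMatch’s internals, but a toy perceptual-hash matcher in Python illustrates the general idea: a device can flag near-duplicates of database entries without ever “seeing” the images. Everything here, from the simple average hash to the placeholder fingerprint database, is hypothetical; the real system reportedly derives fingerprints with a neural network.

```python
# Illustrative sketch only: all names and values here are hypothetical.
# A toy "average hash" stands in for a perceptual fingerprint.

def average_hash(pixels: list[list[int]]) -> int:
    """Collapse an 8x8 grayscale grid into a 64-bit fingerprint:
    each bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Fingerprints of known abuse images (placeholder values only).
KNOWN_FINGERPRINTS = {0x8F3A_0000_0000_0001, 0x00FF_00FF_00FF_00FF}

def matches_known_image(pixels: list[list[int]], threshold: int = 5) -> bool:
    """Flag only near-duplicates of database entries; the scanner holds
    fingerprints, never the original images."""
    fp = average_hash(pixels)
    return any(hamming_distance(fp, known) <= threshold
               for known in KNOWN_FINGERPRINTS)
```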
Matthew Green, a top cryptography researcher at Johns
Hopkins University, warned that the system could be used to frame innocent
people by sending them seemingly innocuous images designed to trigger matches
for child pornography. That could fool Apple’s algorithm and alert law
enforcement. “Researchers have been able to do this pretty easily,” he said of
the ability to trick such systems.
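Under the same toy model, and purely as a hypothetical illustration of Green’s concern, an attacker who knows the hash function can build an innocuous-looking input whose fingerprint collides with a database entry. Real neural perceptual hashes are harder targets, but per Green, researchers have demonstrated comparable collisions.

```python
# Hypothetical collision against the toy hash above, not Apple's system:
# construct a pixel grid whose fingerprint equals an arbitrary target.

def forge_grid(target_fingerprint: int, size: int = 8) -> list[list[int]]:
    """Build a grid whose average_hash equals target_fingerprint:
    bright pixel (200) where the target bit is 1, dark (50) where 0."""
    bits = [(target_fingerprint >> (size * size - 1 - i)) & 1
            for i in range(size * size)]
    return [[200 if bits[r * size + c] else 50 for c in range(size)]
            for r in range(size)]

target = next(iter(KNOWN_FINGERPRINTS))
assert average_hash(forge_grid(target)) == target  # collision achieved
```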
Other abuses could include government surveillance of
dissidents or protesters. “What happens when the Chinese government says, ‘Here
is a list of files that we want you to scan for,’ ” Green asked. “Does Apple
say no? I hope they say no, but their technology won’t say no.”
Tech companies including Microsoft, Google, Facebook and
others have for years been sharing digital fingerprints of known child sexual
abuse images. Apple has used those fingerprints to scan for child pornography
in user files stored in its iCloud service, which is not as securely encrypted
as its on-device data.
Apple has been under government pressure for years to allow
for increased surveillance of encrypted data. Coming up with the new security
measures required Apple to perform a delicate balancing act between cracking
down on the exploitation of children and keeping its high-profile commitment
to protecting the privacy of its users.
But a dejected Electronic Frontier Foundation, the online
civil liberties pioneer, called Apple’s compromise on privacy protections “a
shocking about-face for users who have relied on the company’s leadership in
privacy and security.”
Meanwhile, the computer scientist who more than a decade ago
invented PhotoDNA, the technology used by law enforcement to identify child
pornography online, acknowledged the potential for abuse of Apple’s system but
said it was far outweighed by the imperative of battling child sexual abuse.
“Is it possible? Of course. But is it something that I’m
concerned about? No,” said Hany Farid, a researcher at the University of
California at Berkeley, who argues that plenty of other programs designed to
secure devices from various threats haven’t seen “this type of mission creep.”
For example, WhatsApp provides users with end-to-end encryption to protect
their privacy, but also employs a system for detecting malware and warning
users not to click on harmful links.
Apple was one of the first major companies to embrace
“end-to-end” encryption, in which messages are scrambled so that only their
senders and recipients can read them. Law enforcement, however, has long
pressured the company for access to that information in order to investigate
crimes such as terrorism or child sexual exploitation.
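A minimal sketch of that end-to-end property, using the open-source PyNaCl library rather than Apple’s proprietary iMessage protocol (which is more elaborate): only the two endpoint keypairs can recover the plaintext, so any server relaying the message sees ciphertext only.

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# This illustrates the core property only; iMessage's actual protocol
# is proprietary and considerably more complex.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only public keys are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and his public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello, Bob")

# Anything in between (servers, the network) sees only ciphertext.
# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"hello, Bob"
```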
Apple said the latest changes will roll out this year as
part of updates to its operating software for iPhones, Macs and Apple Watches.
“Apple’s expanded protection for children is a game
changer,” John Clark, the president and CEO of the National Center for Missing
and Exploited Children, said in a statement. “With so many people using Apple
products, these new safety measures have lifesaving potential for children.”
Julie Cordua, the CEO of Thorn, said that Apple’s technology
balances “the need for privacy with digital safety for children.” Thorn, a
nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help
protect children from sexual abuse by identifying victims and working with tech
platforms.
But in a blistering critique, the Washington-based nonprofit
Center for Democracy and Technology called on Apple to abandon the changes,
which it said effectively destroy the company’s guarantee of “end-to-end
encryption.” Scanning messages for sexually explicit content on phones or
computers breaks that security, it said.
The organization also questioned Apple’s technology for
differentiating between dangerous content and something as tame as art or a
meme. Such technologies are notoriously error-prone, CDT said in an emailed
statement. Apple denies that the changes amount to a backdoor that degrades its
encryption. It says they are carefully considered innovations that do not
disturb user privacy but rather strongly protect it.
Separately, Apple said its messaging app will use on-device
machine learning to identify and blur sexually explicit photos on children’s
phones and can also warn the parents of younger children via text message. It
also said that its software would “intervene” when users try to search for
topics related to child sexual abuse.
In order to receive the warnings about sexually explicit
images on their children’s devices, parents will have to enroll their child’s
phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get
notifications.
Apple said neither feature would compromise the security of
private communications or notify police.