Apple’s decision to cancel its CSAM scanning draws praise and criticism
When Apple introduced its slate of initiatives to prevent the spread of child sexual abuse material, or CSAM, last year, they were controversial, to say the least. While some praised the company for taking action, there was also no shortage of detractors, some of whom argued that Apple’s plan to scan for illegal content on users’ devices would deal an unacceptably large blow to user privacy.
The backlash caused Apple to delay some of the features in September 2021, and earlier this week, the company confirmed it has abandoned its efforts to create the hashing system that would’ve searched people’s iCloud photo libraries for illegal materials. We contacted some of the organizations that had spoken out either in support of or against Apple’s initiative to see what they had to say now that it’s gone.
The National Center for Missing & Exploited Children
The National Center for Missing & Exploited Children, or NCMEC, was going to be one of Apple’s partners for its image scanning system, with the center providing both the hashes of known CSAM images and assistance with reviewing anything the system found before contacting the authorities.
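The core idea behind such a system — comparing fingerprints of a user’s photos against a database of fingerprints of known images — can be sketched in miniature. Apple’s actual proposal used a perceptual hash called NeuralHash (which tolerates resizing and re-encoding) combined with cryptographic techniques to keep matches private; the toy sketch below substitutes a plain SHA-256 hash and an in-memory set purely to illustrate the matching step, and every name and value in it is hypothetical.

```python
import hashlib

# Hypothetical set of hashes of known images (stand-in values only).
# A real system like Apple's proposal would use a perceptual hash,
# not SHA-256, so that slightly altered copies still match.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known(b"known-image-bytes"))  # True
print(matches_known(b"unrelated-photo"))    # False
```

In Apple’s design, the match would additionally be blinded from the device itself and only become readable to reviewers after a threshold number of matches — a layer this sketch deliberately omits.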
As you might imagine, NCMEC isn’t particularly pleased with Apple’s decision to drop the feature, and the company’s simultaneous announcement of even stronger iCloud privacy measures that will end-to-end encrypt backups doesn’t seem to be helping matters. “The National Center for Missing & Exploited Children opposes privacy measures that ignore the undisputed realities of child sexual exploitation online,” said Michelle DeLaune, the organization’s president and CEO, in a statement to The Verge. The rest of the statement reads:
We support privacy measures to keep personal data secure – yet privacy must be balanced with the reality that countless children are being sexually victimized online every day. End-to-end encryption without a solution in place to detect child sexual exploitation will allow lawless environments to flourish, embolden predators, and leave child victims unprotected.
Proven technology tools exist and have been used successfully for over a decade that allow the detection of child sexual exploitation with surgical precision. In the name of privacy, companies are enabling child sexual exploitation to occur unchecked on their platforms.
NCMEC remains steadfast in calling upon the technology industry, political leaders, and academic and policy experts to come together to agree upon solutions that will achieve consumer privacy while prioritizing child safety.
The Center for Democracy and Technology, the Electronic Frontier Foundation, and Fight for the Future
In August 2021, the Center for Democracy and Technology (CDT) posted an open letter to Apple expressing concern over the company’s plans and calling on it to abandon them. The letter was signed by around 90 organizations, including the CDT. “We’re very excited, and we’re counting this as a huge victory for our advocacy on behalf of user security, privacy, and human rights,” said Mallory Knodel, chief technology officer for the organization, speaking about Apple’s cancellation announcement.
Knodel thinks that Apple’s change of heart may have been in part a response to the urging of CDT and others but also because it saw the winds shifting on the topic of client-side scanning. “Earlier this year, Meta had a similar conclusion when they asked for a human rights impact assessment of their possible decision to move towards end-to-end encryption of their messaging platforms, both on Instagram messenger kids and Facebook Messenger,” she said. When the organization conducting the assessment suggested a similar type of scanning, though, Knodel says Meta was “very, very strong in saying ‘under no circumstances are we going to pursue client-side scanning as an option.’ And that, I think, has helped.”
Other organizations that signed the original letter echoed some of Knodel’s sentiments.
“Encryption is one of the most important tools we have for maintaining privacy and security online,” said Andrew Crocker, senior staff attorney for the Electronic Frontier Foundation. “We applaud Apple for listening to experts, child advocates, and users who want to protect their most sensitive data.”
Meanwhile, Fight for the Future’s Caitlin Seeley George called Apple’s announcement on Wednesday “a huge victory,” adding that “on-device scanning of messages and photos would have been incredibly dangerous — Apple would essentially have forced malware on its users, which would go completely against the company’s ‘pro-privacy’ marketing, would have broken end-to-end encryption, and would not have made anyone safer.”
Knodel hinted, however, that the fight isn’t necessarily over. “As people who should be claiming part of this victory, we need to be really loud and excited about it, because you have, both in the EU and in the UK, two really prominent policy proposals to break encryption,” she said, referencing the Chat Control child safety directive and Online Safety Bill. “With Apple making these strong pro-encryption moves, they might be tipping that debate or they might be provoking it. So I’m sort of on the edge of my seat waiting.”
Not all of Apple’s child protection plans were scrapped. Parents or guardians can enable a communication safety system for iMessage that can scan photos sent to minors for nudity. However, contrary to Apple’s initial announcement, parents aren’t automatically alerted if the minor chooses to look at the image. Instead, the choice of whether to alert a parent is left to the child, though the system makes it very easy to do so.