Does Apple’s New Photo Surveillance Plan Cross the Creepy Line?

Stephen DeAngelis

August 10, 2021

Child pornography is universally recognized as an evil that decent societies cannot and should not tolerate. Yet, a decade and a half ago, journalist Joshua Brockman (@JoshuaOnAir) reported, “The sexual exploitation of children on the Internet is a $20 billion industry that continues to expand in the United States and abroad, overwhelming attempts by the authorities to curb its growth.”[1] Of course, the size of the child pornography industry is only an estimate. As reporter Carl Bialik (@CarlBialik) noted, “Unlike, say, the soft-drink or airline industries, the child-pornography industry doesn’t report its annual sales to the Securities and Exchange Commission.”[2] Despite the inability to determine the actual size of the problem, everyone knows child pornography is an evil deserving of attention and action. Enter Apple. Tech journalist Dev Kundaliya reports, “Apple has announced a new feature for iPhones and iPads that is intended to limit the spread of child sexual abuse material (CSAM) online.”[3]

Apple’s Plans

According to Kundaliya, “Apple says that its upcoming versions of iOS and iPadOS — due to be released later this year — will have ‘new applications of cryptography’ — enabling the company to detect CSAM images as they are uploaded to iCloud Photos, Apple’s online storage. Before an image is stored in iCloud Photos, an on-device matching process will be performed for that image against the database of known CSAM images, compiled by the US National Center for Missing and Exploited Children (NCMEC).” Technology writer Reed Albergotti (@ReedAlbergotti) adds, “Apple unveiled a sweeping new set of software tools that will scan iPhones and other devices for child pornography and text messages with explicit content and report users suspected of storing illegal pictures on their phones to authorities.”[4]
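The on-device matching step Kundaliya describes can be pictured with a toy sketch. Apple's actual system uses a proprietary perceptual hash (NeuralHash) and cryptographic private set intersection; the simple average hash and function names below are illustrative assumptions, not Apple's implementation.

```python
# Toy sketch of on-device fingerprint matching before an image is
# uploaded. NOT Apple's algorithm: a simple "average hash" over an
# 8x8 grayscale grid stands in for a real perceptual hash.

def average_hash(pixels):
    """Compute a 64-bit fingerprint from an 8x8 grayscale grid.

    Each bit records whether a pixel is brighter than the grid's
    mean, so the hash reflects image structure, not exact bytes.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_database(pixels, known_hashes):
    """Return True if the image's fingerprint is in the database."""
    return average_hash(pixels) in known_hashes

# Hypothetical usage: the device runs the check before upload.
dark_corner = [[255 if (r < 4 and c < 4) else 0 for c in range(8)]
               for r in range(8)]
database = {average_hash(dark_corner)}  # stand-in for the NCMEC list
print(matches_known_database(dark_corner, database))  # True
```

The essential design point is that the database contains only opaque fingerprints of known images, so the check can run on the device without the device (or Apple's servers) inspecting image content directly.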

According to journalist Robert McMillan, Apple insists its plans are aimed at curbing child pornography without encroaching on consumer privacy. He writes, “Apple, which has built much of its brand image in recent years on promises to safeguard users’ privacy, says that its new software will further enhance those protections by avoiding any need for widespread scanning of images on the company’s servers, something that Apple currently doesn’t perform.”[5] As you might imagine, reactions to Apple’s plans were immediate — and mostly negative.

Reactions to Apple’s Plans

Media writer Wendy Davis (@wendyndavis) writes, “Apple famously touts itself as the pro-privacy tech company, but plans to roll out new surveillance software that some experts find deeply problematic. … Apple … isn’t asking consumers for permission to install this software.”[6] Davis also published reactions of a few experts who view Apple’s plans as the first step down a slippery slope. For example, Davis reports:

Matthew Green, the security researcher and Johns Hopkins professor who first reported news of Apple’s plans, argued on Twitter that the move has troubling implications for civil rights. ‘Regardless of what Apple’s long term plans are, they’ve sent a very clear signal,’ Green said in a Twitter post. ‘In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. That’s the message they’re sending to governments, competing services, China, you.’ Green adds that the types of ‘prohibited content’ Apple could scan for isn’t limited to images of children. ‘The way this will work is that your phone will download a database of ‘fingerprints’ for all of the bad images (child porn, terrorist recruitment videos, etc.) It will check each image on your phone for matches. The fingerprints are ‘imprecise’ so they can catch close matches,’ he said in a Tweet. ‘Whoever controls this list can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you,’ he adds.
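The “imprecise” matching Green describes can be sketched as comparing fingerprints by Hamming distance rather than exact equality, so slightly altered images still match. The 64-bit values and the threshold below are illustrative assumptions, not Apple's actual parameters.

```python
# Sketch of imprecise fingerprint matching: a fingerprint is flagged
# if it lies within a few bits of any entry on the blocklist.

def hamming_distance(a, b):
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def is_prohibited(fingerprint, blocklist, threshold=5):
    """Flag if the fingerprint is within `threshold` bits of a match.

    Note that whoever supplies `blocklist` decides what gets flagged;
    the device only sees opaque numbers. That is Green's core concern.
    """
    return any(hamming_distance(fingerprint, h) <= threshold
               for h in blocklist)

blocklist = {0xDEADBEEFDEADBEEF}          # opaque, invisible to the user
slightly_edited = 0xDEADBEEFDEADBEEF ^ 0b111  # 3 bits flipped
print(is_prohibited(slightly_edited, blocklist))      # True
print(is_prohibited(0x0123456789ABCDEF, blocklist))   # False
```

The tolerance is what makes the system robust to cropping or recompression, but it is also what makes the contents of the list, rather than any property of the image itself, the real arbiter of what gets flagged.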

Davis notes that security expert Alec Muffett called Apple’s plans, “Absolutely the most important story that broke overnight.” And the digital rights group Electronic Frontier Foundation added, “Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy.”

Journalist Andy Greenberg (@a_greenberg) believes Apple is walking a dangerous tightrope with its new plans. “For years,” he writes, “tech companies have struggled between two impulses: the need to encrypt users’ data to protect their privacy and the need to detect the worst sorts of abuse on their platforms. Now Apple is debuting a new cryptographic system that seeks to thread that needle, detecting child abuse imagery stored on iCloud without — in theory — introducing new forms of privacy invasion. In doing so, it’s also driven a wedge between privacy and cryptography experts who see its work as an innovative new solution and those who see it as a dangerous capitulation to government surveillance.”[7]

Apple’s plans could cost the company a significant amount of business. For example, Nadim Kobeissi (@kaepora), a cryptographer and founder of Symbolic Software, told Greenberg, “I’m not defending child abuse. But this whole idea that your personal device is constantly locally scanning and monitoring you based on some criteria for objectionable content and conditionally reporting it to the authorities is a very, very slippery slope. I definitely will be switching to an Android phone if this continues.”

Concluding Thoughts

In a follow-up article, Kundaliya reports that a group of “industry experts and privacy advocates” has published an open letter asking Apple to halt its current plans.[8] In part, the letter stated, “While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.” Apple has yet to respond to the open letter, but the company is aware of the concerns being raised — such as Albergotti’s concern that Apple’s software might “potentially put innocent users in legal jeopardy.” For its part, Apple insists mistakes will be vanishingly rare. Albergotti reports, “Apple said there is a one-in-a-trillion chance of a person being incorrectly flagged, and it said each instance will be manually reviewed by the company before an account is shut down and authorities are alerted. Users can appeal the decision to Apple.” As more people learn about Apple’s plans, and despite the company’s assurances, I suspect many of them will perceive the surveillance technology as dangerous and a bit creepy.

Footnotes
[1] Joshua Brockman, “Child Sex as Internet Fare, Through Eyes of a Victim,” The New York Times, 5 April 2006.
[2] Carl Bialik, “Measuring the Child-Porn Trade,” The Wall Street Journal, 18 April 2006.
[3] Dev Kundaliya, “Apple hopes new feature will curb spread of child sexual abuse images,” Computing, 6 August 2021.
[4] Reed Albergotti, “Apple is prying into iPhones to find sexual predators, but privacy activists worry governments could weaponize the feature,” The Washington Post, 5 August 2021.
[5] Robert McMillan, “Apple Plans to Have iPhones Detect Child Pornography, Fueling Privacy Debate,” The Wall Street Journal, 5 August 2021.
[6] Wendy Davis, “Apple To Search Users’ iPhones For Illegal Photos,” MediaPost, 5 August 2021.
[7] Andy Greenberg, “Apple Walks a Privacy Tightrope to Spot Child Abuse in iCloud,” Wired, 5 August 2021.
[8] Dev Kundaliya, “Apple urged to halt plans to roll out new photo scanning feature in open letter,” Computing, 9 August 2021.