Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'

An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters).

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter.

Some signatories of the letter, organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), are concerned that Apple's on-device CSAM scanning system could be subverted in nations with different legal systems to search for political or other sensitive content.

"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," reads the letter.

The letter also calls on Apple to abandon planned changes to iMessage in family accounts, which would try to identify and blur nudity in children's messages, letting them view it only if parents are notified. The signatories claim that not only could the step endanger children in intolerant homes or those seeking educational material, it would also break end-to-end encryption for iMessage.

Some signatories come from countries in which there are already heated legal battles over digital encryption and privacy rights, such as Brazil, where WhatsApp has been repeatedly blocked for failing to decrypt messages in criminal probes. Other signers are based in India, Mexico, Germany, Argentina, Ghana and Tanzania. Groups that have also signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Apple's plan to detect known CSAM images stored in iCloud Photos has been particularly controversial and has prompted concerns from security researchers, academics, privacy groups, and others about the system potentially being abused by governments as a form of mass surveillance. The company has tried to address concerns by publishing additional documents and a FAQ page explaining how the image-detection system will work and arguing that the risk of false detections is low.
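In rough outline, the documents describe fingerprinting photos on-device as they are uploaded to iCloud, matching those fingerprints against a database of known CSAM hashes, and flagging an account for human review only after a threshold number of matches. The sketch below is purely illustrative, with invented names and an ordinary cryptographic hash standing in for Apple's actual system (which uses a perceptual NeuralHash and a private set intersection protocol):

```python
# Illustrative sketch only; not Apple's actual NeuralHash / PSI design.
from hashlib import sha256

# Hypothetical stand-in for the database of known-image fingerprints
# supplied by child-safety organizations.
KNOWN_HASHES = {
    sha256(b"known-image-1").hexdigest(),
    sha256(b"known-image-2").hexdigest(),
}

# An account is flagged only once this many uploads match the database,
# which is how the design keeps the risk of false flags low.
MATCH_THRESHOLD = 30

def count_matches(uploaded_images):
    """Count how many uploaded images match the known-hash database."""
    return sum(
        sha256(data).hexdigest() in KNOWN_HASHES
        for data in uploaded_images
    )

def should_flag(uploaded_images):
    """Flag for human review only past the match threshold."""
    return count_matches(uploaded_images) >= MATCH_THRESHOLD
```

A single matching photo does nothing here; only crossing the threshold triggers review, which is the property Apple's documents emphasize when arguing that false detections are unlikely.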

Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although as Reuters points out, it has not said that it would pull out of a market rather than obeying a court order.


Top Rated Comments

stringParameter
38 months ago
Obviously the start of something very sinister here. I just didn't expect Apple to be the ones leading the way :/
Score: 81 Votes
dragje
38 months ago


Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although as Reuters points out, it has not said that it would pull out of a market rather than obeying a court order.

Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system creates a backdoor that opens the possibility that, under the law of any given country, Apple could be forced by court order to look for images of protestors or political symbols, filtering out political dissent for purposes that are not good.

I'm surprised to see Apple doing this because they seem to be the front runners of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered Apple to search for content like that mentioned above.
Score: 81 Votes
Grey Area
38 months ago

Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, worse-for-privacy cloud scanning already takes place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are scanned.

Perhaps the signatories should read the relevant technical documents and FAQs:

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
https://www.apple.com/child-safety/pdf/Expanded_Protections_for_Children_Frequently_Asked_Questions.pdf
https://www.apple.com/child-safety/pdf/Security_Threat_Model_Review_of_Apple_Child_Safety_Features.pdf
https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf
The open letter was published today, so no, this article was not posted earlier.

Maybe something similar was, and if so, great: more and more organizations are protesting. This will not just go away quietly. I am also glad that these protests have come despite the matter involving CSAM, a touchy topic that normally makes it easy to push through whatever measures. That so many have the courage to speak out against Apple on this indicates that Apple crossed a serious line and that "think-of-the-children" is wearing thin as an alibi.

The technical documents do not address the core objections in any satisfying way. Many people, including experts, have read these documents and still oppose the new system.
Score: 60 Votes
Agit21
38 months ago
"build surveillance capabilities into iPhones, iPads, and other products"

That's exactly what this new "feature" is, Tim!
Score: 48 Votes
Wildkraut
38 months ago
w00t, unbelievable, these "Screeching Voices of the Minority."

But I'm sure there are still reasons to side with Apple. Apple is never wrong, Daddy Tim just wants our best.
Score: 43 Votes
sanook997
38 months ago
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.
Score: 35 Votes