Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis

Apple this week announced that, starting later this year with iOS 15 and iPadOS 15, the company will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children, a non-profit organization that works in collaboration with law enforcement agencies across the United States.

The plans have sparked concerns among some security researchers and other parties that Apple could eventually be forced by governments to add non-CSAM images to the hash list for nefarious purposes, such as to suppress political activism.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

To address these concerns, Apple provided additional commentary about its plans today.

Apple's known CSAM detection system will be limited to the United States at launch. To address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that it will consider any potential global expansion on a country-by-country basis, after conducting a legal evaluation for each country. Apple did not provide a timeframe for global expansion of the system, if such a move ever happens.

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system's first layer of protection is an undisclosed threshold of matches that must be exceeded before an account is flagged. Even if the threshold is exceeded, Apple said, its manual review process would serve as an additional barrier: reviewers would confirm that the flagged images are not known CSAM imagery, Apple would not report the user to NCMEC or law enforcement agencies, and the system would still be working exactly as designed.
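The two layers Apple describes, matching uploads against a list of known hashes and flagging an account only past a threshold, can be sketched in a few lines. This is purely illustrative: Apple's system uses a private perceptual-hash algorithm (NeuralHash) plus cryptographic threshold secret sharing, neither of which is reproduced here, and the hash list, threshold value, and function names below are all hypothetical.

```python
from hashlib import sha256

# Hypothetical known-hash list; the real system matches perceptual
# hashes supplied by child-safety organizations, not SHA-256 digests.
KNOWN_HASHES = {
    sha256(b"known-image-1").hexdigest(),
    sha256(b"known-image-2").hexdigest(),
}

# Hypothetical threshold; Apple has not disclosed the real value.
THRESHOLD = 3

def count_matches(photo_blobs):
    """Count how many photos hash to an entry on the known list."""
    return sum(sha256(blob).hexdigest() in KNOWN_HASHES
               for blob in photo_blobs)

def should_flag_for_review(photo_blobs):
    """An account is surfaced for human review only once the number
    of matches exceeds the threshold; below it, nothing is reported."""
    return count_matches(photo_blobs) >= THRESHOLD
```

The point of the threshold is that isolated matches (including false positives, or a handful of injected non-CSAM hashes) never reach a reviewer; only an account with repeated matches is surfaced, and human review is still required before any report is made.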

Apple also highlighted proponents of the system who have praised the company for its efforts to fight child abuse.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Apple acknowledged that there is no silver-bullet answer to the potential for the system to be abused, but the company said it is committed to using the system solely for detecting known CSAM imagery.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

dmx
39 months ago
This system is ripe for abuse and privacy creep over time.

Anyone who it would catch will just turn off iCloud photos anyway, defeating the purpose.

Apple should admit that they made a mistake and cancel the rollout.
Score: 156 Votes
ScottishDuck
39 months ago
US government known not to abuse systems
Score: 100 Votes
transpo1
39 months ago
This is a horrendous idea with so many ways this tech could go wrong.

Limiting it to the U.S. is not a solution and it’s obtuse of Apple to think so. Apple needs to stop now. Get rid of the feature, both the iCloud and Messages versions. No one wants this.
Score: 87 Votes
budafied
39 months ago

Apple's known CSAM detection system will be limited to the United States at launch, and to address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that the company will consider any potential global expansion of the system on a country-by-country basis after conducting a legal evaluation.
Oh, idk. I thought the US government was pretty ****ing dishonest when it comes to privacy. How did that get approved in the first place?

**** Apple for doing this.
Score: 84 Votes
StrangeNoises
39 months ago
And when China, or Russia, or India, give them a big list of hashes they want to be notified of, or you don't get to sell phones in those countries any more?
Score: 83 Votes
J___o___h___n
39 months ago
I’ve nothing to hide, but this just doesn’t seem right to me.

I’m not updating any existing device to iOS 15 until this roll-out is stopped. I don’t want my photos scanned and I don’t want it to happen to my children’s messages. I ensure my children are safe myself. There’s a level of trust, and these sorts of forced policies just don’t agree with me.
Score: 69 Votes