Security Researchers Express Alarm Over Apple's Plans to Scan iCloud Images, But Practice Already Widespread

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

Before Apple detailed its plans, news of the CSAM initiative leaked, and security researchers quickly began expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare the hashes of photos on a user's iPhone against a database of known CSAM image hashes before the photos are uploaded to iCloud. If there is a match, that photograph is uploaded with a cryptographic safety voucher, and once a certain threshold of matches is reached, a review is triggered to check whether the person has CSAM on their devices.
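To make the described flow concrete, here is a minimal sketch of hash matching with a reporting threshold. It is an illustration only: NeuralHash is a proprietary perceptual hash produced by a neural network, not the cryptographic SHA-256 used here, and the database, threshold value, and function names below are hypothetical stand-ins rather than anything Apple has published.

```python
import hashlib

# Illustrative sketch only: NeuralHash is a proprietary perceptual hash, not a
# cryptographic digest, and every name and value below is a hypothetical
# placeholder rather than Apple's actual implementation.
KNOWN_CSAM_HASHES: set[str] = set()   # would be populated from NCMEC's hash database
REVIEW_THRESHOLD = 10                 # the article only says "a certain threshold"


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of the photo's content."""
    return hashlib.sha256(image_bytes).hexdigest()


def upload_photo(image_bytes: bytes, matches_so_far: int) -> int:
    """Upload the photo with a safety voucher recording whether its hash matched."""
    is_match = image_hash(image_bytes) in KNOWN_CSAM_HASHES
    return matches_so_far + (1 if is_match else 0)


def review_triggered(matches_so_far: int) -> bool:
    """Human review begins only once the match count crosses the threshold."""
    return matches_so_far >= REVIEW_THRESHOLD
```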

At the current time, Apple is using its image scanning and matching technology to look for child abuse, but researchers worry that in the future, it could be adapted to scan for other kinds of imagery that are more concerning, like anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children's accounts, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use, because there could potentially be "collisions," where a harmless file shares a hash with a known CSAM image and results in a false flag.
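A toy example can show why collisions worry researchers: when the hash space is small enough, an unrelated input will eventually share a hash with a flagged one. The snippet below uses a deliberately truncated SHA-256 as a stand-in assumption; NeuralHash collisions would instead involve visually unrelated images mapping to the same perceptual hash, but the false-flag consequence Green describes is the same.

```python
import hashlib
from itertools import count

# Toy collision demo: a deliberately tiny hash space (4 hex characters) is a
# stand-in assumption, not how NeuralHash works, but it shows how an unrelated
# file can end up sharing a hash with flagged content.
def short_hash(data: bytes, hex_chars: int = 4) -> str:
    return hashlib.sha256(data).hexdigest()[:hex_chars]

flagged = short_hash(b"known flagged image")

for i in count():
    candidate = f"harmless file {i}".encode()
    if short_hash(candidate) == flagged:
        print(f"false flag after {i + 1} attempts: {candidate!r}")
        break
```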

Apple, for its part, says that its scanning technology has an "extremely high level of accuracy" to ensure accounts are not incorrectly flagged, and that reports are manually reviewed before a person's iCloud account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple uses screening technology to look for illegal images and disables accounts if evidence of CSAM is detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

macrumorsuser10
39 months ago
Apple should add scanning for:

1. Photos of the confederate flag.
2. Photos of people not wearing Covid masks.
3. Photos of Chinese people disrespecting the Chinese government.
4. Photos of Middle eastern women not wearing burkas.
5. Photos of a group of people with too many whites, not enough blacks.
Score: 74 Votes
Bandaman
39 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
This is always the de facto standard for terrible replies to privacy.
Score: 74 Votes
cloudyo
39 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
You should let law enforcement install cameras in your home then, so they can make sure you are not doing anything illegal while you take a shower, for example. After all, you have nothing to hide, do you?
Score: 57 Votes
Bawstun
39 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
This simply isn’t true. As the article notes, the technology can easily be changed to other things in the future - what if they scanned for BLM supporter images or anti-government images? What if they wanted to scan and track certain political parties?

It’s not about child sex material, everyone agrees that that is wrong, it’s about passing over more and more of our rights to Big Tech. Give them an inch and they’ll take a foot.
Score: 51 Votes
contacos
39 months ago

"If you're not doing anything wrong, then you have nothing to worry about."
Depends on the definition of „wrong“. Sometimes it is up to self serving definitions of dictators
Score: 50 Votes
jarman92
39 months ago
"Other companies already participate in this outrageous invasion of privacy" is not nearly the defense of Apple these people seem to think it is.
Score: 48 Votes