iOS 17 Expands Communication Safety Worldwide, Turned On by Default

Starting with iOS 17, iPadOS 17, and macOS Sonoma, Apple is making Communication Safety available worldwide. The previously opt-in feature will now be turned on by default for children under the age of 13 who are signed in to their Apple ID and part of a Family Sharing group. Parents can turn it off in the Settings app under Screen Time.

Communication Safety first launched in the U.S. with iOS 15.2 in December 2021, and has since expanded to Australia, Belgium, Brazil, Canada, France, Germany, Italy, Japan, the Netherlands, New Zealand, South Korea, Spain, Sweden, and the U.K. With the software updates coming later this year, Apple is making the feature available globally.

Communication Safety is designed to warn children when they receive or send photos that contain nudity in the Messages app. In iOS 17, iPadOS 17, and macOS Sonoma, Apple is expanding the feature to cover video content, and it will also work with AirDrop content, FaceTime video messages, and Contact Posters in the Phone app.

When the feature is enabled, photos and videos containing nudity are automatically blurred in supported apps, and the child will be warned about viewing sensitive content. The warning also provides children with ways to get help. Apple is making a new API available that will allow developers to support Communication Safety in their App Store apps.
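
Apple does not name the API in its announcement, but it presumably corresponds to the SensitiveContentAnalysis framework introduced alongside iOS 17. The following is a minimal sketch of how a third-party app might use it to decide whether to blur an incoming image, assuming the SCSensitivityAnalyzer API and the sensitive content analysis entitlement; it illustrates the approach rather than Apple's exact integration.

```swift
import SensitiveContentAnalysis

// Sketch: decide whether to blur an incoming image before showing it.
// Assumes the app has the com.apple.developer.sensitivecontentanalysis.client
// entitlement; the analysis runs entirely on device.
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user (or a parent, via Screen Time) has not enabled the feature,
    // the policy is .disabled and no analysis should be attempted.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // On analysis failure, fall back to showing the image unblurred.
        return false
    }
}
```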

Apple says Communication Safety uses on-device processing to detect photos and videos containing nudity, ensuring that Apple and third parties cannot access the content, and that end-to-end encryption is preserved in the Messages app.

iOS 17, iPadOS 17, and macOS Sonoma will be released later this year. The updates are currently available in beta for users with an Apple developer account.

Top Rated Comments

mdatwood
14 months ago
Cue the tons of people who confuse what this feature is and does.
Score: 7 Votes
Apple Fan 2008
14 months ago
Having a porn blocker opt-in for kids was weird anyway. Good decision to have it on by default.
Score: 5 Votes
SDJim
14 months ago
As a parent I love these kinds of platform improvements.
Score: 5 Votes
Apple Fan 2008
14 months ago
Replying to: "I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?"
That’s only for kids.
Score: 3 Votes
CarlJ
14 months ago
Replying to: "Essentially it is the same machine looking for something, be it sensitive images (whatever that means) or CSAM or union activity, it doesn't really matter; this machine looks for what someone told it to look for. Also, for children, a notification is sent to the parents IIRC, which infringes on the privacy of the children, especially if it is a false positive."
The two mechanisms are completely different. The CSAM scanning mechanism was never machine learning. It was looking for matches to a specific set of images already in the possession of NCMEC (National Center for Missing and Exploited Children), which is the only entity authorized to catalog such images. No "looking for things that look like naughty bits"; it was only looking for a specific set of images. The technical paper (https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf) that explains the mechanism is freely available.

This mechanism is entirely different from the CSAM detection mechanism, and it does look for nudity, using machine learning. If it finds something it thinks might be nudity, it tells the person holding the phone, right at the point of being about to view the image. The notion of sending messages to the parents was removed very early on, when it was pointed out that some kids might be in unsafe situations (like, say, parents who would harm their kids if they found out their kid was gay). So it isn't sending a notification to anybody; it's just asking the kid if they really want to see the image - that's all.
Score: 3 Votes
Cinder6
14 months ago
I think the feature is fine/good, but the wording is so ... infantile. Or is that only shown for kids?
Score: 2 Votes