Apple to Automatically Blur Sexually Explicit Content for Kids 12 and Under, a Change NCOSE Had Requested

Published on June 9, 2023

WASHINGTON, D.C. (June 8, 2023) — The National Center on Sexual Exploitation (NCOSE) commends Apple for implementing a major child safety change that NCOSE had requested: Apple’s nudity detection and blurring feature will now apply to videos as well as still images and will be on by default for child accounts ages 12 and under.

“This is a major step in child protection by Apple and a victory for families. In partnership with our ally Protect Young Eyes, NCOSE has been calling on Apple for years to proactively turn on all child safety features as the default. We are grateful to Apple for enacting this change that will protect children from nudity, sexually explicit images, and pornographic content that can be harmful to minors in and of itself, but that is also often used to groom children for sexual abuse,” said Lina Nealon, Vice President and Director of Corporate Advocacy, National Center on Sexual Exploitation.

Predators send sexually explicit content as a way to build trust with both children and adults, usually asking for images in return. Once sexually explicit content is obtained, the predator may use it for sextortion (to obtain money, additional images, or as leverage for other demands), or may share it with others — possibly even posting the content on social media and pornography sites.

Teens and adults will also be able to opt in to this critically important safety feature. NCOSE hopes Apple will eventually turn the feature on by default for everyone, especially for children ages 13–17 (87% of whom own an iPhone), who are also targeted for sexual exploitation and sextortion at increasing rates and deserve greater protections. Adults, too, are increasingly falling victim to sextortion, image-based sexual abuse, and “cyberflashing.” Apple’s AirDrop feature in particular has been under fire for years as a means to “drop” sexually explicit content onto other people’s phones.

Apple has also made this technology available to developers through an API, a commendable example of cross-platform collaboration around child safety. Discord has said it plans to use the feature on its iOS platform.
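The API in question is Apple’s SensitiveContentAnalysis framework, announced for iOS 17. As a minimal sketch of how a third-party messaging app might use it, the Swift below checks a received image and reports whether it should be blurred; the function name `shouldBlur` and the decision to skip blurring on error are illustrative assumptions, not part of Apple’s API, and a shipping app would also need Apple’s sensitive-content-analysis entitlement.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: decide whether a received image should be blurred.
// Assumes iOS 17+ and the com.apple.developer.sensitivecontentanalysis.client
// entitlement. `shouldBlur` is an illustrative name, not an Apple API.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The framework only analyzes content when the user (or a parent, via
    // Screen Time) has enabled Sensitive Content Warning or Communication Safety.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Classification happens entirely on device; the image is not uploaded.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Error handling is the app's choice; this sketch skips blurring on error.
        return false
    }
}
```

The framework exposes a similar interface for video files, which lines up with the announcement that the blurring now covers videos as well as still images.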

“We urge all companies — especially those most popular with teens like Snapchat and TikTok — to integrate this child safety measure for free through Apple’s API. Apple has set an industry standard and in effect has issued an invitation to its tech peers to join it: we trust other companies will prove they truly care about protecting their most vulnerable users by accepting Apple’s offer,” Nealon said.

While NCOSE applauds Apple for this significant improvement, the Apple App Store remains on the 2023 Dirty Dozen List for deceptive app age ratings and descriptions that mislead families about the content, risks, and dangers that available apps pose to children.

Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-partisan organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, and sex trafficking, and the public health harms of pornography.
