Apple announces new iPhone features to detect child sex abuse

Following a report that the company was building a tool to scan iPhones for child abuse images, Apple has published a post providing more details on its child safety efforts. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce a variety of child safety features across Messages, Photos and Siri.
To start, the Messages app will include new notifications that will warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends a child an inappropriate image, the app will blur it and display several warnings. "It's not your fault, but sensitive photos and videos can be used to hurt you," says one of the notifications, per a screenshot Apple shared.
As an additional precaution, the company says Messages can also notify parents if their child decides to go ahead and view a sensitive image. "Similar protections are available if a child attempts to send sexually explicit photos," according to Apple. The company notes the feature uses on-device machine learning to determine whether a photo is explicit. Moreover, Apple does not have access to the messages themselves. This feature will be available to family iCloud accounts.
Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing and Exploited Children (NCMEC), which will in turn work with law enforcement agencies across the US. "Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind," the company claims.
Rather than scanning photos as they're uploaded to the cloud, the system performs matching on the device against a database of "known" images provided by NCMEC and other organizations. The company says the database stores a hash of each photo, which acts as a kind of digital fingerprint for it.
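Apple hasn't published its full implementation, but conceptually the matching step amounts to a set-membership check: fingerprint an image, then ask whether that fingerprint appears in the known-image database. The minimal Python sketch below illustrates the idea; the `image_fingerprint` function is a stand-in using SHA-256, which only matches byte-identical files, whereas a production system like Apple's would use a perceptual hash designed to survive resizing and recompression.

```python
import hashlib

# Hypothetical stand-in for a perceptual image hash. SHA-256 is used here
# purely to keep the example runnable; it only matches byte-identical files.
def image_fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Fingerprints of known images, as would be supplied by NCMEC and others.
# The byte strings below are placeholders, not real data.
known_hashes = {
    image_fingerprint(b"known-image-1"),
    image_fingerprint(b"known-image-2"),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-image set."""
    return image_fingerprint(image_bytes) in known_hashes

print(matches_known_database(b"known-image-1"))  # True
print(matches_known_database(b"holiday-photo"))  # False
```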
A cryptographic technique called private set intersection allows Apple to determine whether there's a match without seeing the result of the process. In the event of a match, an iPhone or iPad will create a cryptographic safety voucher that will encrypt the upload, along with additional data about it. Another technique called threshold secret sharing ensures the company can't see the contents of those vouchers unless an account crosses an unspecified threshold of known CSAM content. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account," according to the company.
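Apple's post doesn't detail its construction, but threshold secret sharing is a well-studied idea, classically realized with Shamir's scheme: a secret (here, the key material that would unlock the vouchers) is split into shares such that any set of shares below the threshold reveals nothing at all. The Python sketch below is illustrative only; all names and parameters are assumptions, not Apple's actual design.

```python
import random

PRIME = 2**127 - 1  # a large prime; the field size is an assumption for this demo

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` so that any `threshold` shares reconstruct it."""
    # Random polynomial of degree threshold - 1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret = 123456789
shares = make_shares(secret, threshold=3, num_shares=5)
print(reconstruct(shares[:3]) == secret)  # True: three shares meet the threshold
print(reconstruct(shares[:2]) == secret)  # almost certainly False: two reveal nothing
```

In the design Apple describes, each matching image effectively contributes a share, so the vouchers remain opaque until enough matches accumulate; below the threshold, the shares are mathematically useless.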
Only when that threshold is crossed will the technology Apple plans to implement allow the company to review the contents of the vouchers. At that point, the tech giant says it will manually review each report to confirm there's a match. Where one is confirmed, it will disable the individual's iCloud account and forward a report to NCMEC. Users can appeal a suspension if they believe their account has been mistakenly flagged.
Lastly, Siri, as well as the built-in search feature found in iOS and macOS, will point users to child safety resources. For instance, you'll be able to ask the company's digital assistant how to report child exploitation. Apple also plans to update Siri to intervene when someone tries to conduct any CSAM-related searches. The assistant will explain "that interest in this topic is harmful and problematic," as well as point the person to resources that offer help with the issue.
Apple's decision to effectively work with law enforcement agencies is likely to be seen as something of an about-face for the company. In 2016, it refused to help the FBI unlock the iPhone that had belonged to the man behind the San Bernardino terror attack. Although the government eventually turned to an outside firm to access the device, Tim Cook called the episode "chilling" and warned it could create a backdoor for more government surveillance down the road.