Apple pushes back against child abuse scanning concerns in new FAQ


In a new FAQ, Apple has attempted to assuage concerns that its new anti-child abuse measures could be turned into surveillance tools by authoritarian governments. “Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,” the company writes.
Apple’s new tools, announced last Thursday, include two features designed to protect children. One, called “communication safety,” uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child age 12 or younger decides to view or send such an image. The second is designed to detect known CSAM by scanning users’ images if they choose to upload them to iCloud. Apple is notified if CSAM is detected, and it will alert the authorities once it verifies such material exists.
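At its core, the iCloud feature compares a fingerprint (hash) of each uploaded image against a fixed database of hashes of known CSAM. The sketch below is a minimal illustration of that idea only: the hash values, the `knownHashes` set, and the `matchesKnownHash` function are invented for this example, and Apple’s real system (NeuralHash combined with a private set intersection protocol and a reporting threshold) is far more involved and is not modeled here.

```swift
import Foundation

// Hypothetical 64-bit perceptual hashes standing in for a vendor-provided
// database of known CSAM image hashes. Values are placeholders.
let knownHashes: Set<UInt64> = [
    0x9F3A_1C44_B07E_22D1,
    0x0042_FFAA_1234_9ABC,
]

// Count differing bits between two hashes (Hamming distance), so that
// near-duplicate images can still match within a small tolerance.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Return true if an image hash is within `tolerance` bits of any known hash.
func matchesKnownHash(_ imageHash: UInt64, tolerance: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance($0, imageHash) <= tolerance }
}

// Example: a hash one bit away from a known entry still matches.
let candidate: UInt64 = 0x9F3A_1C44_B07E_22D0
print(matchesKnownHash(candidate)) // true
```

The key property critics focus on is that the system matches against whatever hashes are in the database; the code above has no way of knowing what content those hashes represent.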
The plans met with a swift backlash from digital privacy groups and campaigners, who argued that the features introduce a backdoor into Apple’s software. These groups note that once such a backdoor exists, there is always the potential for it to be expanded to scan for types of content that go beyond child sexual abuse material. Authoritarian governments could use it to scan for material expressing political dissent, or anti-LGBT regimes could use it to crack down on sexual expression.
“Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the Electronic Frontier Foundation wrote. “We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”
However, Apple argues that it has safeguards in place to stop its systems from being used to detect anything other than sexual abuse imagery. It says that its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and that the system “only works with CSAM image hashes provided by NCMEC and other child safety organizations.” Apple says it won’t add to this list of image hashes, and that the list is the same across all iPhones and iPads to prevent individual targeting of users.
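Apple’s FAQ does not specify how users or researchers could verify that every device carries the same list. One way such a claim could in principle be checked is by publishing a digest of the database that anyone can recompute; the scheme and every name below are assumptions for illustration, not anything Apple has described.

```swift
import Foundation

// Hypothetical check that an on-device hash list matches a published
// reference digest, i.e. that no device received a targeted, modified list.
// FNV-1a is used here only because it is simple and dependency-free.
func fnv1aDigest(of hashes: [UInt64]) -> UInt64 {
    var digest: UInt64 = 0xcbf29ce484222325            // FNV offset basis
    for value in hashes {
        for shift in stride(from: 0, to: 64, by: 8) {
            digest ^= (value >> shift) & 0xff           // fold in one byte
            digest = digest &* 0x100000001b3            // FNV prime, wrapping
        }
    }
    return digest
}

let onDeviceList: [UInt64] = [0x9F3A_1C44_B07E_22D1, 0x0042_FFAA_1234_9ABC]
let publishedDigest = fnv1aDigest(of: onDeviceList)     // reference value
print(fnv1aDigest(of: onDeviceList) == publishedDigest) // true: lists match
```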
The company also says that it will refuse demands from governments to add non-CSAM images to the list. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it says.
It’s worth noting that despite Apple’s assurances, the company has made concessions to governments in the past in order to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China it’s removed thousands of apps from its App Store, as well as moved to store user data on the servers of a state-run telecom.
The FAQ also fails to address some concerns about the feature that scans Messages for sexually explicit material. The feature does not share any information with Apple or law enforcement, the company says, but it doesn’t explain how it ensures that the tool’s focus remains solely on sexually explicit images.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” wrote the EFF. The EFF also notes that machine-learning technologies frequently classify this content incorrectly, and cites Tumblr’s attempts to crack down on sexual content as a prominent example of where the technology has gone wrong.