Apple delays plans to roll out CSAM detection in iOS 15 after privacy backlash


Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology that it chaotically announced last month, citing feedback from customers and policy groups.
That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology.
In a statement on Friday morning, Apple told TechCrunch:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple’s so-called NeuralHash technology is designed to identify known CSAM on a user’s device without Apple having to possess the image or know its contents. Because a user’s photos stored in iCloud are encrypted, NeuralHash instead scans for known CSAM on the user’s device, which Apple claims is more privacy-friendly than the blanket scanning that cloud providers currently use.
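To make the matching flow concrete: the device computes a perceptual hash of each photo and compares it against a database of hashes of known CSAM, so the matcher never needs the original images behind those hashes. Below is a minimal Python sketch of that idea. It is not Apple’s implementation: NeuralHash is a proprietary neural-network-based perceptual hash, and Apple’s system layers on threshold matching and cryptographic safeguards. The toy “average hash” and all names here are hypothetical, purely for illustration.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is brighter
    than the image's mean brightness. A stand-in for NeuralHash."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_hash(image_hash: int, known_hashes: set[int]) -> bool:
    """On-device check: the device holds only hashes, never the images
    those hashes were derived from."""
    return image_hash in known_hashes

# Hypothetical usage with a 4x4 grayscale "photo":
photo = [[10, 200, 30, 220],
         [15, 210, 25, 230],
         [12, 205, 28, 225],
         [11, 199, 31, 221]]
known_hashes = {average_hash(photo)}  # pretend this came from a hash database
print(matches_known_hash(average_hash(photo), known_hashes))  # True
```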
But security experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, like governments, to implicate innocent people or to manipulate the system to detect other materials that authoritarian nation states find objectionable.
Within a few weeks of announcing the technology, researchers said they were able to create “hash collisions” using NeuralHash, effectively tricking the system into thinking two entirely different images were the same.
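The toy hash from the sketch above shows why collisions are possible in principle: only the relative brightness pattern survives hashing, so any two images sharing that pattern hash identically. The real NeuralHash collisions required adversarially crafted inputs against a far more complex neural hash, so this is only a conceptual illustration.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Same toy perceptual hash as the previous sketch."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two clearly different 2x2 "images" with the same above/below-mean pattern:
bright = [[0, 255], [255, 0]]
dim = [[100, 200], [200, 100]]

assert bright != dim
print(average_hash(bright) == average_hash(dim))  # True: a hash collision
```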
iOS 15 is expected to be released in the next few weeks.
This report has been updated with more detail about NeuralHash and to clarify iCloud Photos are encrypted but not end-to-end encrypted.