Apple Announces Limits to Child Sex Abuse Image Scanning System After Privacy Backlash

Apple on Friday provided new details of how its planned child sexual abuse material (CSAM) detection system would work, outlining a range of privacy-preserving limits following backlash over concerns that the software would introduce a backdoor threatening user privacy protections.

The company addressed concerns triggered by the planned CSAM feature, slated for release in an update for U.S. users later this year, in a 14-page document (PDF) that outlined safeguards meant to prevent the system from erroneously flagging child pornography on Apple devices or being exploited for malicious surveillance of users.

“The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised,” the company said in the document.

Apple’s reference to “possibly-colluding entities” appears to address …