Apple Says It Will Reject Government Demands To Use New Child Abuse Image Detection System for Surveillance (cnbc.com)
Apple defended its new system to scan iCloud for illegal child sexual abuse material (CSAM) on Monday, amid an ongoing controversy over whether the system reduces Apple user privacy and could be used by governments to surveil citizens. From a report: Last week, Apple announced it had started testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It says it can do this without learning the contents of a user's photos stored on its servers. Apple reiterated on Monday that its system is more private than those used by companies like Google and Microsoft because it uses both its servers and software running on iPhones.
Privacy advocates and technology commentators are worried that Apple's new system, which includes software that will be installed on people's iPhones through an iOS update, could be expanded in some countries through new laws to check for other types of images, such as photos with political content, instead of just child pornography. Apple said in a document posted to its website on Sunday that governments cannot force it to add non-CSAM images to the hash list, the file of numbers corresponding to known child abuse images that Apple will distribute to iPhones to enable the system.
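To make the hash-list idea concrete, here is a minimal sketch in Swift of on-device matching against a distributed list of known-image hashes. It is an illustrative simplification, not Apple's actual protocol: the real system uses a perceptual NeuralHash rather than SHA-256, and a private set intersection scheme with a match threshold so the device never sees plain matches. The loadHashList helper and the sample data are assumptions for demonstration.

```swift
import Foundation
import CryptoKit

// Hypothetical helper: in practice the list ships inside the OS image
// and is blinded, so it is never readable on the device.
func loadHashList() -> Set<String> {
    return []  // would be populated from the OS-distributed database
}

let knownHashes: Set<String> = loadHashList()

// Digest a photo about to be uploaded to iCloud. Apple's system uses a
// perceptual NeuralHash so resized or recompressed copies still match;
// SHA-256 here only matches byte-identical files.
func digest(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Flag the upload if its digest appears in the known-image list. In the
// real protocol this comparison happens under private set intersection,
// and matches surface to Apple only past a threshold count.
func shouldFlag(_ photoData: Data) -> Bool {
    knownHashes.contains(digest(of: photoData))
}

// Example: check a photo before upload.
let photo = Data("example image bytes".utf8)
print(shouldFlag(photo) ? "match found" : "no match")
```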