Apple announced this Friday that it will delay the release of its controversial Child Sexual Abuse Material (CSAM) detection technology, unveiled last month. The company told the website TechCrunch that it will “set aside additional time in the coming months” to further analyze and tweak these security features before release.
The postponement was made, according to the company, “based on feedback from customers, advocacy groups, researchers and others.” As previously reported, this feedback from users was largely negative. The civil liberties NGO Electronic Frontier Foundation alone collected more than 25,000 consumer signatures against the measure. The American Civil Liberties Union (ACLU) also asked Apple to drop the plan.
Source: Apple/Disclosure
How does Apple’s CSAM detection technology work?
Called NeuralHash, Apple’s new feature consists of an algorithm supposedly capable of detecting child sexual abuse material on users’ devices without uploading the image or even knowing its content. Since photos stored in iCloud are encrypted end to end, so that not even Apple, the system’s owner, can access them, NeuralHash looks for CSAM directly on computers and cell phones.
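The idea of matching images by fingerprint rather than by content can be illustrated with a toy perceptual hash. This is a simplified sketch, not Apple's NeuralHash (which is a proprietary neural-network-based hash): here a hypothetical `average_hash` turns a tiny grayscale image into a bit pattern, so two visually similar images map to the same fingerprint and can be compared without ever inspecting the pictures themselves.

```python
# Toy perceptual hash (illustrative only; NOT Apple's NeuralHash).
# Each bit records whether a pixel is brighter than the image's mean,
# so small brightness changes leave the fingerprint unchanged.

def average_hash(pixels):
    """Hash a grayscale image given as a list of rows of 0-255 ints."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

original   = [[10, 200], [220, 30]]   # 2x2 "image"
brightened = [[15, 205], [225, 35]]   # same scene, slightly brighter
different  = [[250, 10], [5, 240]]    # visually unrelated image

# The brightened copy hashes identically; the unrelated image does not.
print(hamming_distance(average_hash(original), average_hash(brightened)))  # 0
print(hamming_distance(average_hash(original), average_hash(different)))   # 4
```

A real scanner would compare such fingerprints against a database of known-CSAM hashes, flagging a match without the matcher ever seeing the photo's content.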
However well-intentioned the feature may sound, cybersecurity experts and internet privacy advocates have expressed concern that the system could be abused by powerful actors, such as totalitarian governments, to surveil and oppress their citizens.
Another concern is the possibility of false positives. A user of the social aggregator Reddit reverse-engineered NeuralHash to create “hash collisions”: he exploited the very mathematical algorithm that transforms an image into a string of characters (a hash) to trick the system. By crafting an image different from one in the CSAM database that nonetheless produced the same hash, the expert “deceived” the system into treating the two images as identical.
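A collision of this kind can be sketched with the same sort of toy perceptual hash. This is a simplified stand-in, not the actual NeuralHash attack (the real collisions were crafted against Apple's neural network): here two clearly different images happen to produce the same fingerprint, which is exactly the false-positive scenario critics worry about.

```python
# Toy demonstration of a perceptual-hash collision (illustrative only).
# Two images with different pixel data produce the same fingerprint,
# so a hash-based matcher would wrongly report them as the same picture.

def average_hash(pixels):
    """Fingerprint: 1 bit per pixel, set if brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

image_a = [[0, 255], [255, 0]]      # high-contrast checkerboard
image_b = [[100, 140], [140, 100]]  # low-contrast, visually different

assert image_a != image_b                           # different pictures...
assert average_hash(image_a) == average_hash(image_b)  # ...same hash: a collision
```

Because perceptual hashes deliberately discard detail to survive crops and re-encodes, collisions like this are always possible in principle; an attacker who can compute the hash can search for them on purpose.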