In this episode of the HJ Talks About Abuse podcast, Alan Collins and Feleena Grosvenor discuss Apple’s new system for scanning iPhones for child sexual abuse material.
Organisations including Apple and Facebook have faced criticism for prioritising customer privacy and protecting individuals from hackers and criminals over implementing processes to identify and report child sexual abuse material.
This relates to the previous podcast episode, “Encryption on Tackling Child Sexual Abuse”, in which Alan and Feleena identified the issues with end-to-end encryption on apps such as WhatsApp and Facebook Messenger.
Apple has announced details of a system that, it says, both limits the spread of child sexual abuse material and protects user privacy.
Before an image is stored in iCloud Photos, the system checks it against an existing database of known child sexual abuse images. It is designed to identify not only the original image but also edited or visually similar versions of it.
Apple claims the system has an extremely high level of accuracy, and each report will be manually reviewed to confirm whether there is a match. Where a match is confirmed, Apple would disable the user’s account and report it to the authorities.
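Purely for illustration, the sketch below shows the general shape of this kind of matching: compare a hash of each photo against a set of hashes of known images before upload. Apple’s actual system uses its own perceptual hashing technology (NeuralHash) and cryptographic matching protocols that are not reproduced here; the SHA-256 stand-in, the function names and the distance threshold are illustrative assumptions, not Apple’s implementation.

```python
import hashlib
from typing import Iterable


def image_hash(image_bytes: bytes) -> int:
    """Stand-in hash for illustration: an exact-match digest.

    A real system would use a perceptual hash (Apple's is NeuralHash),
    so visually similar images, including edited or re-encoded copies,
    produce nearby hash values rather than requiring a byte-for-byte match.
    """
    return int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")


def matches_known_material(image_bytes: bytes,
                           known_hashes: Iterable[int],
                           max_distance: int = 0) -> bool:
    """Return True if the image's hash is close to any known hash.

    With a perceptual hash, a small non-zero tolerance is what allows
    edited or similar versions of an original image to be flagged
    before the photo is uploaded to cloud storage.
    """
    h = image_hash(image_bytes)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)
```

In Apple’s design, a flagged match alone is not enough: matches accumulate against a threshold and are manually reviewed before any account action is taken, as described above.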
The limitation is that images are only caught by the system if they are stored in the user’s iCloud Photos account. On the one hand, this limits the system’s effectiveness in tackling the spread of child sexual abuse images; on the other, it limits the negative impact on customer privacy, as users can save their images elsewhere.
Whether the system goes too far or not far enough, it ultimately identifies prohibited content and may encourage other organisations to introduce similar measures.