What’s happened?
Apple has announced that future versions of its operating systems for iPhones, iPads, Apple Watches, and Macs will scan for Child Sexual Abuse Material (CSAM).
So I’ll be able to scan my Apple device for CSAM?
Err… no.
Apple will be scanning for illegal images on your device before they are uploaded to iCloud Photos, by comparing the hashes (sometimes known as checksums) of your photos with a database of known CSAM image hashes. If the hashes match, then there is a good chance that a child sexual abuse image has been found.
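In very rough terms, the on-device check boils down to "is this photo's hash in the list of known bad hashes?". Here's a deliberately simplified Swift sketch of that idea. Be aware that the SHA-256 hash and the readable hash list below are stand-ins for illustration only; Apple's real design uses a perceptual hash it calls NeuralHash, plus a private set intersection protocol, so matching survives resizing and re-encoding, and the device itself never learns whether a match occurred.

```swift
import CryptoKit
import Foundation

// Hypothetical list of known-image hashes, for illustration only.
// In reality the database ships inside the operating system as blinded
// NeuralHash values, not as a readable list like this.
let knownImageHashes: Set<String> = []

// Compute a hash of the photo's raw bytes and check it against the database.
// NOTE: this uses an exact cryptographic hash (SHA-256) purely to keep the
// sketch simple. Apple's actual system uses a perceptual hash ("NeuralHash")
// and private set intersection rather than a plain lookup like this.
func isSuspectedMatch(photoData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hexHash = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hexHash)
}
```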
If suspected CSAM is found, a “cryptographic safety voucher” containing the match result and additional encrypted data about the image is uploaded to iCloud. Apple says it cannot interpret the contents of the safety voucher unless the user’s account has reached a certain threshold for image matches.
“Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.”
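To make the threshold idea concrete, here is a toy Swift model of it. The struct, the function, and the simple counter-based gate are purely illustrative; in Apple's published design the gate is enforced cryptographically (via threshold secret sharing), so below the threshold the vouchers cannot be decrypted at all, rather than a server merely choosing not to look.

```swift
import Foundation

// A hypothetical, heavily simplified model of the threshold behaviour
// described above.
struct SafetyVoucher {
    let matchedKnownHash: Bool
    let encryptedImageInfo: Data   // opaque until the threshold is reached
}

func vouchersEligibleForHumanReview(_ vouchers: [SafetyVoucher],
                                    threshold: Int) -> [SafetyVoucher] {
    let matches = vouchers.filter { $0.matchedKnownHash }
    // Only once the number of matches crosses the threshold do the matching
    // vouchers become interpretable and get passed to manual review.
    return matches.count >= threshold ? matches : []
}
```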
NCMEC?
The National Center for Missing & Exploited Children. It is the United States' clearinghouse and reporting centre for all issues related to preventing online child abuse and exploitation.
NCMEC has previously spoken of the importance of online security and privacy, but argues that there should be exceptions for detecting child sexual exploitation.
Well, I agree with them. People who abuse and exploit children need to be caught.
I think almost everyone wants that, but many in the cybersecurity community are concerned that systems like this could be misused.
How could the system be abused?
Imagine that, once the infrastructure is in place, a government puts pressure on Apple to scan for more than just hashes of CSAM. Imagine that the government wants to identify the owners of iPhones which contain other content that is considered unlawful by the regime.
It’s a lot easier to deny the requests of a government or over-reaching law enforcement agency if no system is already in place to scan users’ devices for banned content. Much harder when it is already up and running.
It seems like this is more complicated than I imagined.
Yes, we all want to stamp out child sexual abuse material. But there are genuine and considerable concerns about the loss of privacy. You may feel you have nothing to hide, but there are plenty of people who have very legitimate reasons to keep their activities or beliefs secret from authoritarian regimes.
Are Apple the only ones scanning for illegal child sexual abuse material?
No. Far from it.
For instance, Microsoft developed PhotoDNA, which is used by NCMEC and online service providers to prevent the redistribution of child sexual abuse images.
The Internet Watch Foundation (IWF) similarly works with internet companies to remove CSAM from servers, and collect evidence for police investigations.
Meanwhile, Google scans for CSAM across its properties including YouTube, and through a content safety API helps companies like Facebook detect child sexual abuse material.
So why the big fuss about Apple doing more about it?
Apple, and other companies, have received considerable pressure from law enforcement agencies and governments to open a backdoor into their systems which would weaken privacy and security.
In many people’s eyes, Apple’s announcement is another step on a slippery slope.
That certainly appears to be the opinion of the EFF:
“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
Apple has talked a lot about privacy in recent years, and used it as a differentiator. Isn’t there a danger that this initiative takes the company in a different direction?
Yes.
And trust me, governments are watching, and will seize the opportunity to push for privacy and security to be weakened even further.