Apple CSAM scanning

Is Apple’s controversial CSAM scanning a precursor to something that Big Brother will HATE?

Earlier this month, Apple announced that they will scan their customers’ devices for Child Sexual Abuse Material (CSAM). That announcement triggered a privacy firestorm that threatens to erode Apple’s reputation for privacy.

But is it possible that this CSAM scanning controversy is a necessary evil, the precursor to something big that will further enhance Apple’s privacy image?

For starters, scanning cloud storage for CSAM is not controversial. Anyone who uploads known CSAM to Microsoft OneDrive, Google Drive, Dropbox, Facebook, etc. will be flagged. Last year, Facebook reported 20 million CSAM images to NCMEC. Guess how many Apple reported?

265

Yup, a grand total of only 265 (compared to 20 million from Facebook). Clearly, Apple has a lot of catching up to do in catching people (pun intended) who store CSAM in their iCloud Photos. In fact, Apple is conspicuous for how little CSAM it reports. It is extremely unlikely that CSAM is not being uploaded to Apple’s servers. More likely, Apple has simply not been catching offenders.

That is the reason why Apple is getting into scanning for CSAM. But what makes Apple’s scanning so controversial is that it is done INSIDE your device, which is where your personal private space resides. Google, Microsoft, Dropbox and Facebook, on the other hand, do the scanning OUTSIDE your device, on their cloud storage servers. This is the line Apple crossed that freaked out a lot of civil libertarians.
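To make the on-device versus off-device distinction concrete, here is a minimal sketch in Python of what matching photos against a database of known-image hashes before upload could look like. It is deliberately simplified and hypothetical: the hash list, the “Photos” folder and the use of SHA-256 are illustrative assumptions. Apple’s actual system uses a perceptual hash (NeuralHash) plus additional cryptographic protections, none of which this sketch attempts to reproduce. Server-side scanning runs essentially the same comparison, just on the provider’s machines after upload.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse images, supplied by a
# clearinghouse such as NCMEC. In this toy sketch it is a set of SHA-256
# digests; the real system uses a perceptual hash so that resized or
# re-encoded copies of an image still match.
KNOWN_CSAM_HASHES: set[str] = set()

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(photo: Path) -> bool:
    """On-device check: does this photo match a known hash?"""
    return file_digest(photo) in KNOWN_CSAM_HASHES

if __name__ == "__main__":
    # "Photos" is an invented local folder, used only for illustration.
    for photo in Path("Photos").glob("*.jpg"):
        if flag_before_upload(photo):
            print(f"{photo} matches a known hash; would be flagged before upload")
```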

But I suspect that Apple may have another trick up their sleeve.

You see, the reason why Microsoft, Google, Dropbox and Facebook can scan for CSAM on their cloud storage servers is that they can see everything you have uploaded to their cloud storage service. And the reason they can see everything is that your files are not end-to-end encrypted: they hold the cryptographic keys to decrypt every file in their cloud storage. To put it simply, none of the files you have uploaded to OneDrive, Google Drive, Dropbox, Facebook, etc. are private from those companies.

So, why doesn’t Apple implement CSAM scanning in the least controversial way, by scanning the images on their cloud storage servers? That is what Google, Microsoft, Dropbox and Facebook currently do. Had Apple done that, there would have been no controversy. Why did they go about it in the most controversial way, by scanning the photos INSIDE your devices?

I suspect the reason is that Apple is going to introduce end-to-end encryption on their iCloud Photos service. Once they introduce it, Apple will NOT have the cryptographic keys to decrypt the photos that you have uploaded to their server. If they don’t have the cryptographic keys, then they cannot see any of your photos. If they can’t see any of your photos, they cannot scan for CSAM images. If they cannot scan for CSAM, then their iCloud Photos service will be a haven for paedophiles.
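Here is a toy illustration of why end-to-end encryption forecloses server-side scanning, using the Python `cryptography` package’s Fernet recipe purely as a stand-in. Apple’s actual iCloud design is not public and certainly does not use Fernet; this only sketches the key-ownership principle: if the key lives on the device, whoever runs the server holds ciphertext it cannot inspect.

```python
from cryptography.fernet import Fernet, InvalidToken

# The key is generated and kept on the user's device. In an end-to-end
# encrypted design, this key is never sent to the provider.
device_key = Fernet.generate_key()
device_cipher = Fernet(device_key)

photo_bytes = b"...raw JPEG data..."
ciphertext = device_cipher.encrypt(photo_bytes)

# What the provider receives and stores: ciphertext only.
stored_on_server = ciphertext

# The server has no copy of device_key, so decrypting with any key of its
# own simply fails.
server_cipher = Fernet(Fernet.generate_key())
try:
    server_cipher.decrypt(stored_on_server)
except InvalidToken:
    print("Server cannot decrypt: it never had the device's key")

# Only the device, holding device_key, can recover the photo.
assert device_cipher.decrypt(stored_on_server) == photo_bytes
```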

This will definitely make law enforcement agencies furious. If Apple pisses them off, you can be sure that the government will come down hard on Apple with new legislation. Therefore, to placate law enforcement, the only way to ensure that their soon-to-be-announced end-to-end encrypted iCloud Photos service wouldn’t be abused by paedophiles is to do the scanning BEFORE anything gets uploaded.
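Putting the two sketches together shows why the ordering matters: in an end-to-end encrypted design, the only place a scan can happen is on the device, before encryption. The pipeline below is hypothetical; the folder name, the empty hash list and the upload stub are invented for illustration, and Apple’s real system is considerably more elaborate.

```python
import hashlib
from pathlib import Path
from cryptography.fernet import Fernet

# Illustrative placeholders: a hash list and a key held only by the device.
KNOWN_CSAM_HASHES: set[str] = set()
device_cipher = Fernet(Fernet.generate_key())

def send_to_icloud(ciphertext: bytes) -> None:
    """Stand-in for the actual upload; the server receives ciphertext only."""
    print(f"uploaded {len(ciphertext)} encrypted bytes")

def upload_photo(photo: Path) -> None:
    data = photo.read_bytes()

    # Step 1: scan on the device, while the plaintext is still readable.
    if hashlib.sha256(data).hexdigest() in KNOWN_CSAM_HASHES:
        print(f"{photo} matched a known hash; flagged before upload")
        return

    # Step 2: encrypt on the device; only then does anything leave it.
    send_to_icloud(device_cipher.encrypt(data))

if __name__ == "__main__":
    for photo in Path("Photos").glob("*.jpg"):  # hypothetical local folder
        upload_photo(photo)
```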

So, I suspect there is a method behind Apple’s apparent madness of introducing on-device CSAM scanning. It generates a privacy firestorm that will no doubt please law enforcement agencies. Then, at the right moment, Apple will announce end-to-end encryption for iCloud Photos. This time, law enforcement agencies will be boxed in, because they cannot argue that Apple is helping paedophiles.

Am I right? Only time will tell. Watch this space.

