Thoughts on $AAPL and NCMEC

I spent about 8 hours over the weekend driving, so I had some time to catch up on podcasts and do some thinking. One of the ones I listened to, This Week in Tech, had an in-depth discussion of $AAPL comparing hashes of photos uploaded to iCloud against the NCMEC (National Center for Missing and Exploited Children) database to look for child porn (a separate on-device feature warns minors about explicit images in iMessage). This is a bit of a cold take because this was hot news a couple weeks back, but a little time and distance, and listening to others try to sort through it in real time, allowed system 2 to take over and really give this some consideration. This is an incredibly canny move by $AAPL and, I believe, reflects well on their leadership team. At the same time, it is terrifying, but probably not for the reasons you think it is or for the people you think it might be terrifying to.

$AAPL has been under constant and increasing pressure, as have almost all tech platforms, from the Five Eyes alliance and other quarters over their use of strong encryption. Governments want to be able to pry into our devices to assist them in all sorts of mundane and extraordinary law enforcement cases because it gives them an easy button for collecting evidence. In the interest of keeping this short, we won't get into the vagaries and legalities of everything they have tried to do, but suffice it to say they want this access badly. So badly, in fact, that governments like Australia's (and others I'm sure, but I can't keep up with all of them) have passed legislation to backdoor encryption.

Every time this subject comes up, as in the FOSTA-SESTA debate, they trot out the old show pony of child pornography. How can you expect people to vote against that? That's a vote for abusing children! Child pornography is undeniably horrible. That's why this move by $AAPL is so brilliant: it is a cryptographically secure way for them to short-circuit that argument. By doing on-device comparison of image hashes, they can identify, flag, and pass on for review or prosecution anyone using their device to store or transmit images matching the pre-identified hashes in the NCMEC database (and ONLY those hashes) while keeping the rest of the data on the device secure. The next time a lawmaker trots out a "think of the children" argument for a fresh plan to force $AAPL to put a back door in their system, $AAPL can say they already scan for child abuse imagery and there is no need to fundamentally break encryption to do it. A brilliant defensive play to protect their core value proposition of privacy.
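To make the mechanism concrete, here is a minimal sketch of what an on-device hash check looks like, with heavy caveats: Apple's real design uses a perceptual hash (NeuralHash) so matches survive resizing or re-encoding, blinds the hash list so the device itself can't read it, and only reveals matches server-side after a threshold is crossed. The toy version below swaps all of that for a plain SHA-256 exact match against a hypothetical local hash set, purely to illustrate the core idea of a membership check against a fixed list.

```python
# A minimal sketch, NOT Apple's implementation. KNOWN_HASHES is a
# hypothetical stand-in for the NCMEC-derived hash list shipped to devices;
# in the real system the list is blinded and matching is perceptual.

import hashlib
from pathlib import Path

# Entries would be distributed by the platform vendor, e.g. "a3f5...".
KNOWN_HASHES: set[str] = set()

def image_hash(path: Path) -> str:
    """Hex digest of the file's bytes (an exact-match stand-in for a
    perceptual hash, which would tolerate re-encoding)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_photos(photo_dir: Path) -> list[Path]:
    """Flag only photos whose hash appears in the known-hash set;
    everything else is never reported."""
    return [p for p in photo_dir.glob("*.jpg")
            if image_hash(p) in KNOWN_HASHES]
```

The property the whole argument rests on is visible in `scan_photos`: the check can only ever fire on hashes already in the pre-loaded set, so the rest of your photo library is never examined or exposed.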

Of course, you can’t have your cake and eat it too. Is there potential for, say, China to pull $AAPL aside and say we also need you to scan against this other database for images critical of Chairman Xi? Sure, and it might even happen, but don’t forget that all iCloud backups and iMessage traffic for Chinese citizens are already stored in facilities in China, and presumably the CCP has the keys to open them. No, the real path to abusing this system is much more insidious, and you will likely never know for sure if it is occurring. Take the case of Jamal Khashoggi. We know with reasonable certainty that phones in his circle (and possibly his own) were compromised by a 0day exploit from the NSO Group, giving Saudi intelligence the visibility they used to track and eventually kill him. Now imagine that instead of going through all that trouble, and of course avoiding the bad press that results from brutally murdering a critic, they simply plant child abuse imagery known to be in the NCMEC database on his phone without his knowledge. This system would quickly identify and flag it, and bam, you now have an open-and-shut case against your critic as a pedophile, which discredits his previous work as well as silencing him. And the Saudis are far from the only regime with the means and motive to do this to dissidents and critics who are otherwise beyond their reach.

Disclosure: I am currently long $AAPL.