As for future costs, however, Apple’s proposed approach embraced at least three worrisome premises: that we don’t fully own the devices that store so much private information about us; that the tech giants that sell us those devices can ethically load them with spyware; and that the evil deeds of a tiny fraction of users justify the mass surveillance of data that millions of totally innocent users put on their phones.
If Apple accepts those premises, and most of its customers go along without objecting, then future iPhones will almost inevitably scan for more than child porn. The logic of catching a few evil actors by denying the cloak of privacy to everyone will inexorably expand to more and more areas that powerful societal factions want to target. Some of those factions will themselves be evil. Many are likely to be illiberal or repressive.
The issue, of course, wasn’t that Apple was looking for ways to prevent the proliferation of CSAM while making as few device-security concessions as possible. The issue was that Apple was unilaterally making a massive choice that would affect billions of customers (while likely pushing competitors toward similar solutions), and was doing so without external public input about possible ramifications or necessary safeguards.
I am not sure how many meaningful changes Apple can make to address its strongest critics.
~
Thanks for reading.