Before an image is uploaded to iCloud Photos, the device computes a NeuralHash of the image and matches it against a database of known hashes of child abuse imagery, provided by child protection organizations like the National Center for Missing & Exploited Children (NCMEC) and others. The system uses a cryptographic technique called private set intersection to detect a hash match without revealing what the image is or alerting the user.
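To make the idea concrete, here is a minimal, purely illustrative sketch of perceptual-hash matching against a set of known hashes. It uses a toy "average hash" and a plain set lookup; Apple's real system uses NeuralHash (a learned perceptual hash) plus private set intersection, so the actual match check is cryptographically blinded and nothing about non-matching images is revealed to the server. None of the names below come from Apple's implementation.

```python
# Illustrative only: toy average-hash + plain set membership.
# The real pipeline (NeuralHash + private set intersection) is far
# more involved; this just shows the shape of "hash, then match".

def average_hash(pixels):
    """Toy perceptual hash: threshold each grayscale pixel against
    the image mean and pack the results into an integer bitmask.
    Visually similar images tend to produce similar masks."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for i, p in enumerate(pixels):
        if p >= mean:
            bits |= 1 << i
    return bits

def matches_known_database(image_pixels, known_hashes):
    """Exact-match lookup against a set of known hashes.
    (In the real protocol this comparison happens on-device and
    the result is cryptographically hidden from the user.)"""
    return average_hash(image_pixels) in known_hashes

# Example: a flat 4x4 "image" and a database seeded with its hash.
img = [10, 200, 30, 220, 15, 210, 25, 230,
       5, 190, 35, 240, 12, 205, 28, 215]
known = {average_hash(img)}
print(matches_known_database(img, known))  # True
```

A toy hash like this is trivially evadable and collision-prone, which is exactly why the debate centers on how robust and how expandable the real matching database is.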
Through a software update rolling out later this year, Messages will be able to use on-device machine learning to analyze image attachments and determine if a photo being shared is sexually explicit. This technology does not require Apple to access or read the child’s private communications, as all the processing happens on the device. Nothing is passed back to Apple’s servers in the cloud.
Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.
Once it's established that Apple can decrypt your photo data and share it with authorities, and authorities know it, I don't see what stops them from asking for access to other kinds of potentially illegal data, or even data that some parties would merely find concerning. Again, I don't know that I even have a problem with using such access to find, for instance, white nationalists before they march on the Capitol again. On the other hand, in the hands of authoritarian regimes these tools could be used to stamp out activism that doesn't align with the current government.
In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a “really bad idea” because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that’s uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.
Be responsible in what you do and say, to whom, and how. Reflect, slow down, and think about what you share. Assume that someone is always listening.
Apple could remind us to protect ourselves by dropping the pretense of privacy.
In early 2019, an eavesdropping bug in group FaceTime calls would have let attackers activate the microphone, and even the camera, of the iPhone they were calling and eavesdrop before the recipient did anything at all. The implications were so severe that Apple invoked a nuclear option, cutting off access to the group calling feature entirely until the company could issue a fix. The vulnerability—and the fact that it required no taps or clicks at all on the part of the victim—captivated Natalie Silvanovich.
“The idea that you could find a bug where the impact is you can cause a call to be answered without any interaction—that’s surprising,” says Silvanovich, a researcher in Google’s Project Zero bug-hunting team. “I went on a bit of a tear and tried to find these vulnerabilities in other applications. And I ended up finding quite a few.”
Scam apps have made their way onto both Apple's App Store and the Google Play Store before, with some making millions of dollars. Now, however, Apple is being accused of actively promoting apps that reportedly do little or nothing, and yet can charge users up to $500 (AU$676) per year.
Apple today announced a new Apple Music for Artists feature called Shareable Milestones, which is designed to allow Apple Music artists to share key milestones and successes with their fans on social media.
Over the last few weeks, a growing number of frustrated Apple users have been sharing issues they’re experiencing with iTunes Match, Apple’s service that lets users upload songs to iCloud from other sources, such as CDs.
The support documents walk through the many different display setups that can be used with the GPUs and how to use AMD’s Infinity Fabric Link technology for increased performance and faster data transfer between the modules.
If you work out regularly and want something to free up your wrist, the ActionSleeve 2 is a great choice.
Projects gives teams and workplaces a complete set of features for delegating and tracking tasks, helping them save time, meet deadlines, and stay on schedule.
The optimistic me is looking forward to the day after tomorrow. The pessimistic me is waiting to see what happens tomorrow before planning for the day after tomorrow.
And that's why I'm probably not going to buy the MagSafe battery pack just yet.
:-)
~
Thanks for reading.