Apple's protective measures for children might open a backdoor to worse
Apple will begin scanning photos stored on its cloud storage service, iCloud, as part of the upcoming iOS 15, iPadOS 15, watchOS 8, and macOS Monterey updates.
The feature, neuralMatch, is designed to scan photos and material stored on the cloud for evidence of child abuse. When it detects a match, it alerts a team of human reviewers, who determine whether or not law enforcement should be involved.
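Apple hasn’t published neuralMatch’s internals, but systems like this generally work by fingerprint matching: each photo is reduced to a hash and compared against a database of hashes of known abuse material, and only a match triggers human review. Here’s a minimal sketch of that flow in Python; the hash function, database contents, and function names are hypothetical stand-ins, not Apple’s actual implementation:

```python
import hashlib

# Hypothetical database of fingerprints of known abuse material.
# In a real system this would hold hashes supplied by child-safety
# organisations, not values invented for a sketch.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint (SHA-256 stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_photo(image_bytes: bytes) -> bool:
    """Return True if the photo should be queued for human review."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

photo = b"...raw image data..."
if scan_photo(photo):
    print("Match: flag for the human review team.")
else:
    print("No match: nothing is reported.")
```

A real matcher would use a perceptual hash that survives resizing and re-compression; SHA-256 here only matches byte-identical files, which is exactly why it’s just a placeholder.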
This also applies to Siri, which will “intervene” whenever someone searches for such content through those systems. Siri will also assist in reporting content by pointing users to the right websites.
With the scan in place, Apple will also begin warning younger users and their parents in the Messages app when something isn’t suitable to be sent. Using machine learning, Apple intends to prevent explicit content from being sent to or from its younger customers; if a parent is connected via parental controls, the parent gets a notification and the child gets a warning that the notification will be sent.
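Apple hasn’t detailed the exact logic, but the behaviour it describes maps onto a simple decision tree: an on-device classifier flags the image, the child gets a warning, and a parent is notified only if parental controls link the two accounts. A rough sketch, with the classifier and account fields as hypothetical placeholders:

```python
from dataclasses import dataclass

@dataclass
class Account:
    is_minor: bool
    parent_linked: bool  # parental controls connect a parent account

def classify_explicit(image_bytes: bytes) -> bool:
    """Hypothetical stand-in for Apple's on-device ML classifier."""
    return b"explicit" in image_bytes  # placeholder heuristic, not real ML

def warn_child(message: str) -> None:
    print(f"[child warning] {message}")

def notify_parent(message: str) -> None:
    print(f"[parent notification] {message}")

def handle_incoming_image(recipient: Account, image_bytes: bytes) -> None:
    # Adults, and images the classifier passes, are untouched.
    if not recipient.is_minor or not classify_explicit(image_bytes):
        return
    warn_child("This image may be sensitive. View anyway?")
    if recipient.parent_linked:
        # Per Apple's description, the child is warned that the
        # parent notification will be sent before it goes out.
        warn_child("If you view this, your parents will be notified.")
        notify_parent("Your child received a flagged image in Messages.")

child = Account(is_minor=True, parent_linked=True)
handle_incoming_image(child, b"explicit placeholder bytes")
```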
In 2016, Apple rejected the FBI’s request for a ‘backdoor’ into the San Bernardino shooter’s phone. Now, the Electronic Frontier Foundation (EFF) has raised concerns that this Apple-approved backdoor could be used by certain governments to censor users sharing LGBT+ content or to hunt down dissenting members of the population.
In a 2019 article, the EFF explained that adding a client-side scanning service for explicit content is a sure-fire way to break privacy protections.
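The EFF’s argument is structural: end-to-end encryption only protects content in transit between devices, so a scanner running on the device itself sees everything before encryption happens. This toy example illustrates the point; the ‘encryption’ is a deliberately insecure XOR placeholder, and the scanner and reporting hook are invented for illustration:

```python
def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Placeholder for real E2E encryption (XOR is NOT secure)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def client_side_scan(plaintext: bytes) -> bool:
    """Hypothetical scanner running on the sender's own device."""
    return b"forbidden" in plaintext

def report_to_operator(plaintext: bytes) -> None:
    print("Flagged before encryption ever happened:", plaintext)

def send_message(plaintext: bytes, key: bytes) -> bytes:
    # The scan runs BEFORE encryption, so "only the endpoints can
    # read this" no longer means "no third party learns what was
    # said": whoever controls the scanner's match list controls
    # what gets reported, without touching the cryptography at all.
    if client_side_scan(plaintext):
        report_to_operator(plaintext)
    return encrypt(plaintext, key)

send_message(b"this message contains forbidden words", key=b"secret")
```

Swapping the match list is a policy change, not a cryptographic one, which is precisely the expansion risk the EFF describes.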
In its 2021 article about Apple’s new system, the EFF reiterates that this breaks a promise Apple has repeatedly stood by in regard to privacy.
A backdoor can never be locked again once it’s opened: a system built to prevent nefarious behaviour can itself be put to nefarious use, turning a once-secure system into something that actively harms its users.
iMessage (or just Messages) is no longer the secure service it purports to be. The detection method gives Apple deeper insight into its customers’ data and content, and while the intention is good, the method is likely to be abused by powers above Apple in parts of the world less interested in ensuring the safety of children than in oppressing others.
However, speaking with The Guardian, Hany Farid – developer of PhotoDNA, a similar piece of software – says he isn’t worried about the feature, since other programs already use comparable scanning, including WhatsApp, which scans messages for harmful links or files.
Meanwhile, Matthew Green, a cryptography professor at Johns Hopkins University, claims this is only the beginning of ‘mission creep’, a term describing how objectives shift over a long period of time.
In a Twitter thread, Green also brings up the fact that the American government, along with other nations, has asked for this kind of backdoor access before, for ‘security’ purposes.
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. https://t.co/mKdAlaDSts
— Matthew Green (@matthew_d_green) August 5, 2021
In a lengthy Twitter thread, the head of WhatsApp, Will Cathcart, commented on Apple’s plans for scanning photos, calling them a “surveillance system”.
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
— Will Cathcart (@wcathcart) August 6, 2021
He states that WhatsApp has managed to report over 400,000 incidents of child abuse shared on the platform over the last year without breaking end-to-end encryption, but fails to mention that WhatsApp was itself in the news only recently, when The Information reported that Facebook (WhatsApp’s parent company) was looking into circumventing E2E for advertising purposes.
Cathcart rejects the notion, but Facebook staff aren’t exactly the most trusted voices at the moment, considering the company’s recent banning of political researchers in the US (among other things), while none of the participants in that research were banned.
We're not pursuing homomorphic encryption for @WhatsApp. I've been asked this before (link below). We should be skeptical of technical claims that apps like ours could see messages in "good" cases only. That's just not how technology works. https://t.co/Y55p25QmR8 https://t.co/mZgNYyorkm
— Will Cathcart (@wcathcart) August 3, 2021