“This Is About Control, Not Children”: Eric Weinstein Calls Out Apple’s Virtuous Pedo-Hunter Act
Last week, Apple announced that it would begin analyzing images on users' devices before they are uploaded to iCloud in order to identify child pornography and report it to the authorities, sending privacy advocates through the roof.
No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.
They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
— Edward Snowden (@Snowden) August 6, 2021
Apple defended the decision – claiming there is less than a "one in one trillion chance per year" of incorrectly flagging a given account.
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices,” the company said in an announcement.
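The matching step Apple describes can be illustrated with a minimal sketch. To be clear about assumptions: Apple's actual system uses a perceptual "NeuralHash" combined with a private set intersection protocol and threshold secret sharing, none of which is reproduced here; this substitutes a plain SHA-256 digest and an ordinary Python set purely to show what "on-device matching against a database of known hashes" means. All hash entries below are hypothetical placeholders.

```python
# Simplified, hypothetical sketch of on-device hash matching.
# NOT Apple's implementation: the real design uses a perceptual NeuralHash
# (robust to resizing/recompression) and a blinded, cryptographically
# protected database, not a cleartext set of SHA-256 digests.
import hashlib


def digest(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: a cryptographic digest of the bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


# Database of known hashes. In Apple's design this is "an unreadable set of
# hashes securely stored on users' devices"; here it is a plain Python set
# seeded with a made-up example entry.
known_hashes = {digest(b"example-flagged-image")}


def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the on-device database."""
    return digest(image_bytes) in known_hashes


print(matches_database(b"example-flagged-image"))  # True
print(matches_database(b"ordinary-photo"))         # False
```

Note the design choice critics seize on: the device-side code is agnostic about *what* the database contains. Swapping in a different set of hashes repurposes the same matching machinery, which is precisely the "fully built system" the EFF warns about below.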
Privacy advocates have pointed out the obvious slippery slope of allowing big tech to infiltrate our personal lives under the guise of fighting [evil thing] – and the next thing you know, Apple is hunting dissidents for human rights abusers, reporting who owns what guns, or flagging people taking 'suspicious' routes that deviate from their normal pattern. As the Electronic Frontier Foundation notes:
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. — EFF
They’re hypocrites anyway
Hitting it on the nose, as usual, is Eric Weinstein – who calls out Apple’s Tim Cook for scanning our *private* photos for child porn, while utterly ignoring Jeffrey Epstein’s high-level child-sex trafficking and ties to intelligence.
This is an easy tell: people who have **less** than **zero** interest in elite pedophiles connected to intelligence, want intelligence on everyone, dominated by ordinary non-pedophiles. If you’re dumb enough to believe this i can’t help you.
This is about control, not children. pic.twitter.com/wx1j5qI6Ix
— Eric Weinstein (@EricRWeinstein) August 8, 2021
Prove me wrong @alexstamos, @apple and Tim Cook: join me in calling for hearings into the Intelligence Community, its connection to Jeffrey Epstein and state protected pedophilia in our highest “elite” Echelons.
No? Thought so. This is sickening. Creepy Cowardice? That’s iPhone. pic.twitter.com/PonA18O75E
— Eric Weinstein (@EricRWeinstein) August 8, 2021
Either that, or drop the privacy campaign. You can’t have it both ways.
— Eric Weinstein (@EricRWeinstein) August 8, 2021
Also critical of Apple is WhatsApp head Will Cathcart – who says he’s “concerned” about Apple’s announcement, calling it “a setback for people’s privacy all over the world.”
I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world.
People have asked if we’ll adopt this system for WhatsApp. The answer is no.
— Will Cathcart (@wcathcart) August 6, 2021
More notable hot takes:
Here. I said it in @Telegraph. Any kind of data may be potentially targeted. Apple’s system is a world-precedent in the area of remote inspection of private data/files. This is a huge power/capability. https://t.co/bnGQkvrhWU pic.twitter.com/0kKBNW19ft
— Lukasz Olejnik (@lukOlejnik) August 7, 2021
Maybe Apple will reverse course and maybe they don’t. But I don’t think they have come to terms with the deep well of antipathy and mistrust this move is going to create. It’s going to be with them for years, undoing billions of dollars in priceless customer trust and marketing. pic.twitter.com/ZgbQl1vYPC
— Matthew Green (@matthew_d_green) August 7, 2021
We previously obtained a sample of malware deployed in Xinjiang that scans phones for 70,000 specific files related to politics, Uighurs, etc. Why wouldn’t Chinese authorities try to get Apple to leverage its new child abuse system to instead fulfil this https://t.co/NZxIJIuOC8
— Joseph Cox (@josephfcox) August 6, 2021