“This Is About Control, Not Children”: Eric Weinstein Calls Out Apple’s Virtuous Pedo-Hunter Act

Last week, Apple announced that it would begin analyzing images on users’ devices before they’re uploaded to iCloud in order to identify child pornography and report it to the authorities, sending privacy advocates through the roof.

No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.

They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk

— Edward Snowden (@Snowden) August 6, 2021

Apple defended the decision – claiming there is ‘less than a one in one trillion chance per year of incorrectly flagging a given account.’

“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices,” the company said in an announcement.
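For the technically curious, what Apple describes boils down to hashing each photo with a perceptual hash, checking it against a blacklist, and tripping a threshold before anything gets reported. Below is a loose Python sketch of that flow – the function and variable names are stand-ins, not Apple’s NeuralHash or any real API, and this toy version skips the cryptography (blinded hashes, private set intersection) that keeps the real database and match results unreadable on the device:

import hashlib
from math import comb

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like Apple's NeuralHash; a
    # cryptographic hash is used here only to keep the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

# In Apple's design this arrives as an unreadable, blinded set of
# NCMEC-provided hashes; here it is just a plain set for illustration.
known_csam_hashes: set[str] = set()

MATCH_THRESHOLD = 30  # illustrative; the real cutoff is Apple's to set

def scan_before_upload(photos: list[bytes]) -> bool:
    """Count database matches; flag the account only past the threshold."""
    matches = sum(perceptual_hash(p) in known_csam_hashes for p in photos)
    return matches >= MATCH_THRESHOLD

def false_flag_probability(n_photos: int, p: float,
                           t: int = MATCH_THRESHOLD) -> float:
    """Binomial tail: odds of t or more coincidental matches, assuming
    each photo independently false-matches with probability p."""
    return sum(comb(n_photos, k) * p**k * (1 - p)**(n_photos - k)
               for k in range(t, n_photos + 1))

The false_flag_probability helper shows where a ‘one in one trillion’-style figure can come from: set the threshold high enough and the binomial tail of coincidental matches becomes astronomically small – assuming, of course, that per-image false matches really are rare and independent.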

Privacy advocates have pointed out the obvious slippery slope of allowing big tech to infiltrate our personal lives under the guise of fighting [evil thing] – and the next thing you know, Apple is hunting dissidents for human rights abusers, reporting who owns which guns, or flagging people who take ‘suspicious’ routes that deviate from their normal patterns. As the Electronic Frontier Foundation notes:

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. -EFF
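The EFF’s point is easy to see in code: in any client-side scanner, what gets scanned and what gets matched against are just data and flags, not architecture. A hypothetical sketch (none of these names are Apple’s):

import hashlib
from dataclasses import dataclass

def phash(image_bytes: bytes) -> str:
    # Any perceptual hash will do; the scanner is agnostic about which.
    return hashlib.sha256(image_bytes).hexdigest()

@dataclass
class ScanPolicy:
    hash_database: set        # today: NCMEC CSAM hashes
    scan_all_accounts: bool   # today: only photos queued for iCloud

def client_side_scan(photos: list, policy: ScanPolicy) -> list:
    """Flag whatever the currently configured database describes."""
    return [p for p in photos if phash(p) in policy.hash_database]

# Widening the "narrow backdoor" is a data change, not a redesign:
# today    = ScanPolicy(ncmec_hashes, scan_all_accounts=False)
# tomorrow = ScanPolicy(any_other_hashes, scan_all_accounts=True)

Nothing in that architecture enforces the current policy; only the contents of the database and the value of the flag do.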

They’re hypocrites anyway

Hitting the nail on the head, as usual, is Eric Weinstein – who calls out Apple’s Tim Cook for scanning our *private* photos for child porn while utterly ignoring Jeffrey Epstein’s high-level child-sex trafficking and his ties to intelligence.

This is an easy tell: people who have **less** than **zero** interest in elite pedophiles connected to intelligence, want intelligence on everyone, dominated by ordinary non-pedophiles. If you’re dumb enough to believe this i can’t help you.

This is about control, not children. pic.twitter.com/wx1j5qI6Ix

— Eric Weinstein (@EricRWeinstein) August 8, 2021

Prove me wrong @alexstamos, @apple and Tim Cook: join me in calling for hearings into the Intelligence Community, its connection to Jeffrey Epstein and state protected pedophilia in our highest “elite” Echelons.

No? Thought so. This is sickening. Creepy Cowardice? That’s iPhone. pic.twitter.com/PonA18O75E

— Eric Weinstein (@EricRWeinstein) August 8, 2021

Either that, or drop the privacy campaign. You can’t have it both ways.

— Eric Weinstein (@EricRWeinstein) August 8, 2021

Also critical of Apple is WhatsApp head Will Cathcart – who says he’s “concerned” about Apple’s announcement and calls it “a setback for people’s privacy all over the world.”

I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world.

People have asked if we’ll adopt this system for WhatsApp. The answer is no.

— Will Cathcart (@wcathcart) August 6, 2021

More via Thread Reader App:

Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.
We’ve worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it’s shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption.
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.
We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?
Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy?
What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?
There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.
Apple once said “We believe it would be in the best interest of everyone to step back and consider the implications …”
…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.
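On Cathcart’s question about whether the scanning can be ‘error proof’: perceptual hashes can’t be, by design, because they deliberately map visually similar – and sometimes entirely unrelated – images to the same value. A toy average-hash (nowhere near NeuralHash’s sophistication, but with the same structural failure mode) makes the point:

def average_hash(pixels: list) -> tuple:
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image's mean brightness.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(p > mean for p in flat)

img_a = [[200, 10], [10, 200]]  # bright diagonal on a dark background
img_b = [[255, 90], [90, 255]]  # different pixels, same relative pattern
assert average_hash(img_a) == average_hash(img_b)  # collision, by construction

Real systems mitigate this with match thresholds and human review – which is exactly why Cathcart’s question about auditable error rates matters.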

More notable hot takes:

Here. I said it in @Telegraph. Any kind of data may be potentially targeted. Apple’s system is a world-precedent in the area of remote inspection of private data/files. This is a huge power/capability. https://t.co/bnGQkvrhWU pic.twitter.com/0kKBNW19ft

— Lukasz Olejnik (@lukOlejnik) August 7, 2021

Maybe Apple will reverse course and maybe they don’t. But I don’t think they have come to terms with the deep well of antipathy and mistrust this move is going to create. It’s going to be with them for years, undoing billions of dollars in priceless customer trust and marketing. pic.twitter.com/ZgbQl1vYPC

— Matthew Green (@matthew_d_green) August 7, 2021

We previously obtained a sample of malware deployed in Xinjiang that scans phones for 70,000 specific files related to politics, Uighurs, etc. Why wouldn’t Chinese authorities try to get Apple to leverage its new child abuse system to instead fulfil this https://t.co/NZxIJIuOC8

— Joseph Cox (@josephfcox) August 6, 2021
