Furious WhatsApp says your iPhone IS spying on you – and 'it's Apple's fault'

WHATSAPP'S chief has blasted Apple over its plans to check iPhone users' photos for child sex abuse imagery.

In a series of tweets, Will Cathcart said that his messaging app would not be adopting the safety measures, calling Apple's approach "very concerning".


Apple last week unveiled plans to scan U.S. iPhones for images of child sexual abuse.

The move has drawn applause from child protection groups but raised concerns among security researchers and tech experts.

Those concerned claim the system could be misused – particularly by governments who may be looking to spy on their citizens.

Following the unveiling of the plans on August 6, Cathcart tweeted: "This is the wrong approach and a setback for people's privacy all over the world.

"People have asked if we'll adopt this system for WhatsApp. The answer is no."


The tool, called neuralMatch, is designed to detect known images of child sexual abuse and will scan such images before they are uploaded to iCloud.

If the system finds a match, the image will be reviewed by a human.

Once child sex abuse content has been confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will, however, only flag images that are already in the center's database of known child sex abuse images.
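The matching step described above can be sketched in a few lines. This is a minimal illustration, not Apple's actual code: real systems use perceptual hashes (Apple's neural hash, Microsoft's PhotoDNA) that match visually similar images rather than exact byte copies, and the fingerprint value below is a made-up placeholder.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images.
# In the real system this comes from the National Center for Missing
# and Exploited Children; the value here is a placeholder.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    # Illustrative only: SHA-256 matches exact copies, whereas a
    # perceptual hash survives resizing and recompression.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    # On-device check before the photo is uploaded to iCloud;
    # a match is escalated to human review, not acted on automatically.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Apple also said a single hit is not enough: an account is only escalated to human review after a threshold number of matches is reached, a safeguard the toy sketch above omits.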


Cathcart continued: "Child sexual abuse material [CSAM] and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.

"Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy.

"We've had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It's not how technology built in free countries works."


Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images.

Apple has used those to scan user files stored in its iCloud service – which is not as securely encrypted as its on-device data – for child sex abuse imagery.

The company has been under government pressure for years to allow for increased surveillance of encrypted data.

Coming up with the new security measures required Apple to strike a delicate balance between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

"Apple's expanded protection for children is a game changer," John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement.

"With so many people using Apple products, these new safety measures have lifesaving potential for children."

Meanwhile, the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple's compromise on privacy protections a shocking about-face for users who have relied on the company's leadership in privacy and security.

More than 50,000 people have signed an online petition to stop the plans, including security and privacy experts, researchers and legal experts.


Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography.

That could fool Apple's algorithm and alert law enforcement, Green said.

He added that researchers have been able to trick such systems pretty easily.
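Why such systems are easy to trick comes down to quantisation: a perceptual hash keeps only a coarse pattern of the image, so many visibly different images share the same fingerprint. A toy "average hash" (an illustrative stand-in, far simpler than any production system) makes the point:

```python
def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image's mean brightness.
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two clearly different "images" (here, flat lists of brightness values)...
img_a = [10, 10, 200, 200]
img_b = [40, 60, 140, 180]

# ...that hash identically, because only the bright/dark pattern
# survives quantisation: both yield (0, 0, 1, 1).
average_hash(img_a) == average_hash(img_b)
```

An attacker exploiting this would craft an innocuous-looking image whose fingerprint lands on a database entry, which is exactly the framing scenario Green warns about.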

Other abuses could include government surveillance of dissidents or protesters.

"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."

