Apple to Begin Scanning Your Photos and Encrypted Messages

Apple is using the excuse of protecting children from sexual abuse to snoop through users’ images, but the tool, called NeuralHash, could also be misused by others, including governments, to surveil citizens. If child pornography is detected, the user’s account will be disabled and the National Center for Missing and Exploited Children notified. Researchers say innocent people can easily be framed by sending them seemingly innocuous images designed to trigger matches for child pornography; other abuses could include government surveillance of dissidents or protesters. Apple also plans to scan users’ encrypted messages for sexually explicit content as a child-safety measure.

Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, called “NeuralHash,” is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the center’s database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
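To illustrate how this kind of fingerprint matching works in general, here is a minimal sketch in Python. This is not Apple’s NeuralHash algorithm (which is proprietary and neural-network based); it is a toy “average hash,” a common perceptual-hashing technique: each pixel is compared to the image’s mean brightness to produce a 64-bit fingerprint, and two fingerprints are considered a match when their Hamming distance (number of differing bits) falls under a threshold. All function names and the threshold value are illustrative assumptions.

```python
# Toy perceptual-hash matching sketch. NOT Apple's NeuralHash; an
# "average hash" used here only to show the general idea: similar
# images produce nearby fingerprints, which are compared against a
# database of known hashes rather than against the images themselves.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: brighter than average -> 1, else 0.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, threshold=5):
    """Flag the image if its fingerprint is close to any known hash."""
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocklist)

# A known image and a slightly brightened copy still match, because
# the relative bright/dark pattern (and thus the fingerprint) survives.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
brighter = [[min(p + 10, 255) for p in row] for row in original]
blocklist = {average_hash(original)}
print(matches_blocklist(brighter, blocklist))  # True
```

Because matching is approximate by design (so that resized or re-encoded copies of a known image are still caught), researchers can also deliberately craft unrelated-looking images whose fingerprints land within the threshold of a blocklisted hash, which is the basis of the framing attacks described below.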

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

Read full article here…




1 Comment

Wynn · 2 months ago

Create the problem, then have a solution that screws the public.