

Apple Will Scan iPhones for Child Pornography

August 7, 2021


Apple has announced the rollout of a new feature later this year that will scan photos and text messages on Apple devices for known images of child sexual abuse.

And some people have a problem with that.

One expert in cybersecurity said Apple “has gone out of its way to make this as privacy friendly as possible.

“There will be part of the program that has access to data, what they call hashes of imagery, in other words, the picture reduced to a numeric formula. Apple will use that numeric formula to look for things, images, that match it on your device.”
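The matching scheme described in the quote above can be sketched in a few lines. Note this is only an illustrative sketch using an ordinary cryptographic hash (SHA-256); Apple's actual system reportedly uses a perceptual hash computed on-device and matched against a database supplied by child-safety organizations, and every name and value below is hypothetical.

```python
import hashlib

# Hypothetical set of hashes of known images (illustrative values only).
# In the real system the hash list comes from child-safety organizations,
# not from hashing arbitrary bytes like this.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Reduce the picture to a 'numeric formula' (a hash) and check
    whether it matches any entry in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_image(b"known-image-bytes"))  # True: matches the set
print(matches_known_image(b"vacation-photo"))     # False: no match
```

The key property is that only a compact numeric fingerprint is compared, never the picture itself; a cryptographic hash like the one above, however, only matches byte-identical files, which is why a perceptual hash is used in practice.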

But security watchdogs are concerned the new software could be exploited by hackers and foreign governments.

Will Cathcart, CEO of WhatsApp, the encrypted messaging platform that leaves no trail of messages after deletion, said he “was concerned.”

Really? Was Cathcart concerned when the Trumps and Jared Kushner used his platform to communicate with the Saudis and other corrupt officials, endangering U.S. national security, only to later delete the messages?

In an interview, Cathcart said: “Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.”

We get it, Cathcart: if there is child porn on someone’s phone, unless they shared it, they should not be investigated?

Apple shot that down, saying “the program is designed so the chances of it making a mistake … of it saying something is child pornography when it’s not … are infinitesimally small.

“We want to help protect children from predators who use communication tools to recruit and exploit children, and limit the spread of Child Sexual Abuse Material (CSAM).”

If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing & Exploited Children will be notified.

The features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.