Apple Will Scan iCloud For Child Abuse Imagery: But What About Privacy?


Tech giant Apple has announced that its iPhone, iPad and Mac software updates planned for this autumn will include a new feature to fight child abuse and exploitation. However, since the software will scan images on the devices, questions about privacy have surfaced…

There’s a fine line…

Apple said that Siri and search features will receive an update, providing parents and children with information to help them seek support in “unsafe situations.” The program will also “intervene” when users try to search for child abuse-related topics.


Apple will also warn parents and children when they might be sending or receiving a sexually explicit photo using its Messages app, either by hiding the photo behind a warning that it may be “sensitive” or adding an informational pop-up.

There’s a fine line in this instance: how much privacy are Apple users owed? On the one hand, child abuse must be fought by every means available. On the other, many innocent users will have their data filtered and scanned. And as one professor has warned, there’s great potential for misuse.

Professor Matthew Green stated in a series of tweets that, overall, this is not a good idea. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear,” he began.


He further explained: “Initially I understand this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems.” In other words, a single photo matching the “description,” so to speak, won’t be flagged; only when multiple matching photos are found will the alarm be raised with Apple.
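To make the idea Green describes more concrete, here is a minimal sketch of threshold-based perceptual-hash matching. To be clear, this is not Apple’s actual NeuralHash system, and the hash function, distance cut-off and reporting threshold below are all hypothetical; it only illustrates the principle that individual matches are tolerated and an alarm fires only once several photos match a blocklist.

```python
# Hypothetical sketch: threshold-based perceptual-hash matching.
# A toy 2x2-or-larger "average hash" stands in for a real perceptual
# hash such as NeuralHash; the threshold value is invented here.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def count_matches(photos, blocklist, max_distance=4):
    """Count photos whose hash is 'close enough' to any blocklisted hash."""
    return sum(
        any(hamming(average_hash(p), b) <= max_distance for b in blocklist)
        for p in photos
    )

REPORT_THRESHOLD = 3  # hypothetical: report only after several matches

def should_report(photos, blocklist):
    """A single match stays silent; multiple matches trigger a report."""
    return count_matches(photos, blocklist) >= REPORT_THRESHOLD
```

Note how a near-duplicate image (slightly brightened, re-compressed, etc.) still hashes close to the original, which is exactly what makes perceptual hashing useful for matching known images and also what makes false positives possible.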


Although the original intention is noble, the end game might not be, given the technology’s capability to evolve into something that can pinpoint a person’s exact location. And there are other questions that even loving parents have raised.

It’s nothing new, and there’s certainly no shame in it, to snap a photo of your child having a blast during bath time. If you’ve taken a couple, would that cause the red flags to start waving at Apple HQ?

Much to be concerned about

Similar technology has certainly been abused in the past. That was the case with NSO Group, an Israeli firm that makes government surveillance software. Its Pegasus spyware was meant to be used to fight criminals and terrorists. Instead, it was reportedly used in hacking attempts on around 50,000 phone numbers belonging to activists, government leaders, journalists, lawyers and teachers around the globe.


All in all, there’s much to be concerned about. In our bid to win the battle against child exploitation, we must not lose sight of the parallel fight for our right to privacy.

What do you make of all this? Tell us in the comments!