Apple Plans to Scan U.S. iPhones for Child Abuse Imagery

by Allison Davis

Apple has announced that it intends to install software on iPhones in the United States to scan for child abuse imagery.

How it would work

Essentially, Apple’s “neuralMatch” algorithm will continuously scan photos stored on users’ iPhones that have been backed up to iCloud.

Photos will be converted into strings of numbers using a process called hashing. Each string will then be compared against a database of hashes of known child sexual abuse images.

If the automated system detects potentially illegal imagery, Apple will enable the photos to be decrypted and will proactively alert a team of human reviewers. If the material is verified to be illegal, the team will then contact law enforcement to take action.
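The matching step described above can be sketched in a few lines of Python. This is a simplified illustration only: it uses an ordinary cryptographic hash (SHA-256), which matches exact byte-for-byte copies, whereas Apple’s actual system is reported to use a perceptual hash (“NeuralHash”) so that visually similar images produce matching values. The database contents and function names here are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known illegal images (hex digests).
# The entry below is simply the SHA-256 digest of the bytes b"test",
# included so the example is self-contained and checkable.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(image_bytes: bytes) -> str:
    """Convert an image's raw bytes into a fixed-length string of hex digits."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches an entry in the known database."""
    return hash_image(image_bytes) in KNOWN_HASHES
```

The key property is that only hashes are compared, never the photos themselves; a human reviewer is brought in only after a match is flagged.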

The system has already been trained on 200,000 sex abuse images collected by the US non-profit National Center for Missing &amp; Exploited Children.

Although cloud-based storage systems and social networking sites already scan for child abuse imagery, existing processes become more complex when attempting to access data on private devices.

Apple argues that its system is less invasive because the screening is done directly on the phone, and a notification is sent back only if a match with known abusive imagery is found.

Why is this happening now?

Tech companies such as Apple and Facebook have long defended their increased use of encryption in their products and services to prevent unauthorized access to digital information. However, pressure from law enforcement for greater access has only intensified since Apple’s 2016 court battle with the FBI over access to a suspect’s iPhone following the terrorist shooting in San Bernardino, California.

Apple is seeking a compromise between its own promise to protect customers’ privacy and the ongoing demands of governments for access.

The proposal is Apple’s response to growing requests from U.S. governments, law enforcement agencies, and child safety campaigners for better assistance in criminal investigations, including those involving terrorism and child sexual abuse material.

What are the potential effects?

While some feel an action like this comes too late, security researchers have raised concerns about the potential for greater surveillance of millions of personal devices.

While they support efforts to combat child abuse, many are concerned that by opening this back door, Apple risks emboldening governments around the world to seek access to citizens’ data.

While the system is currently only trained to detect child sex abuse, in the future it could be adapted to scan for virtually any other targeted imagery and text.

As the system develops, it has the potential to go well beyond the original intent.

Further, many are concerned that Apple’s plans will increase pressure on other tech companies to use similar software and techniques.

Despite its good intentions and its promise that what happens on your iPhone stays on your iPhone, Apple has been accused of paving the road to mandated security weaknesses around the world, and of normalizing the scanning of individuals’ personal lives and private communications whenever the intentions are deemed valid enough.
