Opinion | Apple wants to protect children. But it creates serious privacy risks.

Apple last week announced plans to introduce new tools that will allow it to search iPhones for images related to child sexual abuse and exploitation. Apple is billing these tools as part of a child safety initiative, and indeed they can help make the online world a safer place for children, a goal that could not be more laudable.

But these tools, which are expected to become operational in the coming months, also open the door to troubling forms of surveillance. Apple should refrain from deploying them until we can better study the technology and understand its risks.

Apple’s plan has two main components. First, parents can choose to have their children’s iMessage accounts scanned for nude images sent or received, and to be notified if this happens in the case of children under 13. All children will receive warnings if they attempt to view or share a sexually explicit image.

Second, the company will analyze the photos stored on your iPhone and compare them against digital fingerprints of known child sexual abuse material provided by organizations such as the National Center for Missing and Exploited Children. Apple says it will do this only if you also upload your photos to iCloud Photos, but that is a policy choice, not a core technological requirement.
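
To make the matching idea concrete, here is a deliberately simplified sketch in Python. The database values and helper names are invented, and it uses an ordinary cryptographic hash; Apple’s actual system relies on a proprietary perceptual hash and cryptographic protocols that are considerably more complex.

```python
import hashlib

# Hypothetical fingerprint database of known abusive images, as might be
# supplied by an organization such as NCMEC. These values are invented.
KNOWN_FINGERPRINTS = {
    "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Check a photo's fingerprint against the known-material database.

    A plain SHA-256 digest is used purely for illustration: it matches
    only byte-identical copies, and changing a single pixel would defeat
    it. Apple's system instead uses a perceptual hash designed to
    survive small alterations.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_FINGERPRINTS

# A scanner would walk the photo library and flag any matches, e.g.:
# for path in photo_library_paths:          # hypothetical list of paths
#     with open(path, "rb") as f:
#         if is_known_image(f.read()):
#             queue_for_human_review(path)  # hypothetical reporting step
```

The matching step itself is ordinary; what is contested is where it runs.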

The technology involved in this plan is fundamentally new. While Facebook and Google have a long history of scanning photos that people share on their platforms, their systems don’t process the files on your own computer or phone. Since Apple’s new tools have the power to process files stored on your phone, they pose a new threat to privacy.

In the case of the iMessage child safety feature, the intrusion on privacy is not especially serious. At no point are Apple or law enforcement notified of a nude image sent or received by a child (again, only the parents of children under 13 are notified), and children are given the chance to back out of a potentially serious mistake without their parents being informed.

But the other technology, which allows Apple to scan the photos on your phone, is more alarming. While Apple has committed to using this technology only to search for child sexual abuse material, and only if your photos are uploaded to iCloud Photos, nothing in principle prevents this type of technology from being used for other purposes and without your consent. It is reasonable to ask whether law enforcement in the United States could compel Apple (or any company that develops such capabilities) to use this technology to detect other kinds of images or documents stored on people’s computers or phones.

While Apple is introducing the child sexual abuse detection feature only in the United States for now, it is not hard to imagine that foreign governments will be keen to use this type of tool to monitor other aspects of their citizens’ lives, and could pressure Apple to comply. Apple does not have a good track record of resisting such pressure; in China, for example, it moved Chinese users’ data onto servers inside the country. Even some democracies criminalize broad categories of hate speech and blasphemy. Would Apple be able to resist demands from legitimately elected governments to use this technology to help enforce those laws?

Another concern is that the new technology has not been sufficiently tested. The tool relies on a new algorithm designed to recognize known images of child sexual abuse even when they have been slightly altered. Apple says the odds of this algorithm accidentally flagging legitimate content are extremely low, and it has added some safeguards, including having Apple employees review flagged images before they are reported to the National Center for Missing and Exploited Children. But Apple has allowed few if any independent computer scientists to test its algorithm.
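
The “slightly altered” requirement is what makes the false-positive question hard. A perceptual hash maps similar-looking images to similar bit strings, so matching tolerates a small number of differing bits rather than demanding exact equality. The sketch below, with an invented hash width and threshold, shows the tradeoff: a looser threshold catches more altered copies of known material but also raises the odds of flagging an unrelated, legitimate photo.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two hash values differ."""
    return bin(a ^ b).count("1")

def matches(candidate: int, known: int, threshold: int = 6) -> bool:
    """Treat hashes within `threshold` differing bits as the same image.

    The threshold here is invented for illustration. Too tight, and the
    system misses cropped or re-encoded copies; too loose, and it starts
    flagging innocent photos that merely hash nearby. Without independent
    testing, Apple's claim of a vanishingly small error rate is hard to
    verify.
    """
    return hamming_distance(candidate, known) <= threshold

# Example with invented 16-bit hashes: an altered copy differing in a
# couple of bits still matches, while an unrelated image usually won't.
original = 0b1011001110001101
altered  = 0b1011001010001001   # 2 bits flipped by re-encoding
print(matches(altered, original))  # True
```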

The computer science and policymaking communities have spent years examining the kinds of problems raised by this sort of technology, trying to strike the right balance between public safety and individual privacy. Apple’s plan upends all of that deliberation. Apple has more than a billion devices in the world, so its decisions affect the security plans of every government and every other technology company. Apple has now sent a clear message that it is safe to build and use systems that directly scan people’s personal phones for prohibited content.

Protecting children from harm is an urgent and crucial goal. But Apple has created a model for doing this that may well be abused for decades to come.

