Apple’s controversial new detection system makes zero mistakes – says Apple

October 14, 2021

Any iPhone running the iOS 15 operating system – available this fall – will check whether its user uploads child pornography images to the iCloud storage service. It is an extremely sensitive topic for Apple, the company that sells the iPhone as the privacy-friendly alternative to Google's Android. The feature is controversial because the scanning is done on the device itself, without the owner asking for it.

In a background conversation with European journalists this week, Apple said that the system will also work outside the United States. What does that mean for Dutch users? And couldn't the technology be misused by investigative services or meddling governments?

Five questions about Apple’s detection technique.

1 How does it work?

The internet is overflowing with images of children being sexually abused. Unlike Google and Microsoft, which scan their storage services on the server side, Apple will not scan its servers for child pornography but will do the check on the phone itself. Feeling "morally obliged" to combat child abuse, the company has built a detection system into the iOS 15 operating system. As soon as the owner of an iPad or iPhone uploads photos to iCloud, Apple's storage service, the images are matched against a database of known child pornography.

Apple's algorithm, NeuralHash, does not look at the content of the images but generates a hash: a sum over all the pixels that is translated into a short digital code. If this digital fingerprint matches material in the database of previously found child pornography kept by NCMEC (National Center for Missing & Exploited Children), the photo is marked as suspicious. The system sounds the alarm at – for the time being – thirty suspicious pictures per phone, after which a team of human reviewers checks whether a user is indeed trying to store child pornography in the cloud.

Only if that is the case will NCMEC be alerted. "We expect zero false reports," Apple said.
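For the technically curious, the sketch below shows that matching-and-threshold flow in Python. It is emphatically not Apple's implementation: the real NeuralHash is a perceptual hash computed on the device, and the matching happens under cryptographic protections so that Apple learns nothing about an account until the threshold is crossed. The function names, data structures and the SHA-256 stand-in are illustrative assumptions; only the threshold of roughly thirty matches comes from Apple's own description.

```python
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # roughly the number of matches before human review


def image_fingerprint(photo: bytes) -> str:
    # Stand-in fingerprint. The real NeuralHash is perceptual (robust to
    # resizing and recompression); an exact SHA-256 is used here only to
    # keep the sketch self-contained and runnable.
    return hashlib.sha256(photo).hexdigest()


def flagged_count(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many uploaded photos match the database of known material."""
    return sum(1 for photo in photos if image_fingerprint(photo) in known_hashes)


def needs_human_review(photos: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Only once the per-account threshold is crossed would reviewers look at
    the flagged photos and, if they confirm the match, alert NCMEC."""
    return flagged_count(photos, known_hashes) >= MATCH_THRESHOLD
```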

2 Can it go wrong?

Photos that parents take themselves of children in the bath or on the beach are not in the dataset and will not be detected, according to Apple; the company puts the chance of the algorithm being wrong at less than once in 1,000 billion per account. But researcher Asuhariet Ygvar extracted an earlier version of Apple's hashing algorithm, which makes it possible to give different images the same fingerprint. That leaves room for manipulation.
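To see why that is possible, the toy example below implements a very simple perceptual hash (a block-average hash, not NeuralHash) and then constructs a second, heavily altered image that keeps exactly the same fingerprint. NeuralHash is far more sophisticated, but it faces the same trade-off: a hash that is deliberately insensitive to small changes can also be made to collide. Everything in the sketch, from the 8x8 block hash to the synthetic "images", is an assumption for illustration only.

```python
import numpy as np


def average_hash(img: np.ndarray, size: int = 8) -> int:
    """Toy perceptual hash: average the image down to size x size blocks and
    emit one bit per block (1 = brighter than the mean of all blocks)."""
    h, w = img.shape
    bh, bw = h // size, w // size
    blocks = img[: bh * size, : bw * size].reshape(size, bh, size, bw).mean(axis=(1, 3))
    bits = (blocks > blocks.mean()).astype(int).ravel()
    return int("".join(map(str, bits)), 2)


rng = np.random.default_rng(0)
original = rng.random((64, 64))  # pretend this is a grayscale photo

# High-frequency "noise" whose mean is zero inside every 8x8 block: the pixel
# content changes a lot, but every block average (and therefore the hash)
# stays exactly the same.
noise = rng.random((64, 64))
noise -= noise.reshape(8, 8, 8, 8).mean(axis=(1, 3)).repeat(8, axis=0).repeat(8, axis=1)
tampered = original + noise

assert average_hash(original) == average_hash(tampered)
print(f"both images hash to {average_hash(original):#018x}")
```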

Apple says such computer errors come with the territory; that is why the NeuralHash reports are checked by human reviewers. "We won't pass anything on to NCMEC if nothing can be found. The worst that can happen is that people on our review team waste their time on images that have nothing to do with child porn," said one of the Apple experts.

The system will also kick in if someone sends you prohibited material via WhatsApp, for example, and your phone automatically saves it to iCloud.

3 Can the system be abused?

Experts fear function creep: the detection system could be expanded to search for other types of images as well. Apple could be forced to modify the hash database to search, for example, for beheadings – to track down supporters of terrorist organizations.

"Once such a mechanism has been built, more parties will want to use it," says Jaap-Henk Hoepman, associate professor of privacy design at Radboud University, "including governments." In China, Apple does not oppose forced censorship or government surveillance.

Apple states that it will only use hash databases that have been approved by multiple countries. The company considers NeuralHash more transparent and privacy-friendly than scanning on servers. And if you don't use iCloud, the detection system does not kick in at all, Apple points out. "Anyone who signs up for iCloud is clearly informed about the check."

Alex de Joode, risk manager at internet exchange AMS-IX, sees no harm in it. "You also have your PC scanned by Microsoft, right? That goes much further; such virus scanners look into every file." De Joode helped the Online Child Abuse Expertise Agency (EOKM) build the infrastructure with which Dutch hosting services can check their servers for child pornographic images.

"The pain of child abuse victims should not be underestimated," tweeted Alex Stamos, one of the leading American experts on combating child pornography. "Privacy experts tend to overlook real-world issues," said Stamos, who was previously responsible for security at Facebook. He regrets that Apple has opted for its own approach and did not consult other tech companies, such as Facebook, that are also trying to prevent the distribution of child pornography.

4 How does it work in the Netherlands?

The detection technique will first be introduced in the United States, but will eventually also work outside the US, Apple said. The system goes by the region that an iPhone or iPad user chooses when setting up a new device. The Online Child Abuse Expertise Agency has already been in contact with the technology company. But Apple will not be able to report possible offenders to us, says EOKM director Arda Gerkens by telephone. "NCMEC is partly an investigative service that may receive reports about possible perpetrators; EOKM is not. We are not even allowed to hold data about perpetrators, because of privacy legislation."

According to Gerkens, the Dutch police already receive tens of thousands of reports from the NCMEC every year, but they do not have the capacity to investigate all those cases.

Apple's methodology, including the threshold of thirty files, strikes a good balance between privacy and the fight against child abuse, she says; the detection primarily helps curb the spread of images from older, known abuse cases. Possession and distribution of such material are prohibited and perpetuate the cycle of sexual abuse.

Gerkens does not think that private companies such as Apple should be responsible for investigation; that is a task for politicians. The EU is preparing legislation that would require messaging and cloud services to be scanned for prohibited material. New laws are also being drafted in other regions, such as India and the United Kingdom.

5 Why is Apple proposing this now?

Apple is getting ahead of new legislation, and in doing so makes it easier to introduce mandatory scanning.

The announcement, in mid-summer, was timed to get the inevitable discussions with privacy experts and law enforcement out of the way before iOS 15 hits the market in late September. By then the new iPhones will also be in the shops, good for two-thirds of Apple's turnover ($275 billion in 2020, or 234 billion euros). Apple prefers to emphasize what your new phone can do, not what it can't.

What is confusing, Apple acknowledges, is that iOS 15 tackles sexual abuse in two different ways. The NeuralHash technique targets the possession and distribution of child pornography (via iCloud). In addition, iMessage, the chat app on the iPhone, gets protection against grooming, the manipulation of minors: parents can shield their children's phones with a filter that blocks nude photos. That latter function is controversial because it lets parents interfere in their children's private communication.

The announcement of two detection systems at once in iOS 15 was unfortunate, Apple software chief Craig Federighi told The Wall Street Journal. “It caused unnecessary misunderstanding.”