Friday 6 August 2021

Fury at Apple's plan to scan iPhones for child abuse images and report 'flagged' owners to the police after a company employee has looked at their photos

 Data privacy campaigners are raging today over Apple's plans to automatically scan iPhones and cloud storage for child abuse images and report 'flagged' owners to the police after a company employee has looked at their photos.  

The new safety tools will also be used to look at photos sent by text messages to protect children from 'sexting', automatically blurring images Apple's algorithms detect could be child sexual abuse material [CSAM].

While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.   

The plans have been blasted as a 'huge and regressive step for individual privacy' over fears the system could easily be adapted to spot other material and is open to abuse.   

Privacy campaigners have said they fear Apple's plans to scan iPhones for child abuse images will be a back door to access users' personal data after the company unveiled a trio of new safety tools on Thursday

Its Messages app will use on-device machine learning with a tool known as 'NeuralHash' to look for sensitive content. In addition, iOS and iPadOS will 'use new applications of cryptography to limit the spread of Child Sexual Abuse Material online'

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user's photo album. 

Instead, the system will look for matches, securely on the device, based on a database of 'hashes' - a type of digital fingerprint - of known CSAM images provided by child safety organisations. 
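To make the 'digital fingerprint' idea concrete, here is a minimal sketch of hash-list matching in Swift. It is not Apple's NeuralHash: NeuralHash is a perceptual hash designed to survive resizing and re-encoding, whereas the SHA-256 digest below only matches byte-identical files, and loadKnownHashes() is a hypothetical placeholder for the database supplied by child safety organisations.

```swift
import Foundation
import CryptoKit

// Illustrative sketch of hash-list matching only - not Apple's NeuralHash.
// A real perceptual hash tolerates resizing and re-encoding; the SHA-256
// digest here only matches byte-identical files.

/// Hypothetical loader for the database of known-image fingerprints that
/// child safety organisations would supply (shipped inside the OS in
/// Apple's design). Returns an empty placeholder set in this sketch.
func loadKnownHashes() -> Set<String> {
    return []
}

/// Hex-encoded digest of a photo's raw bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// True if the photo's fingerprint appears in the known-image database.
func isFlagged(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Example: check one photo before it would be uploaded to cloud storage.
let knownHashes = loadKnownHashes()
if let photo = try? Data(contentsOf: URL(fileURLWithPath: "/tmp/example.jpg")),
   isFlagged(photo, against: knownHashes) {
    print("Match found - would be escalated for human review")
}
```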

As well as looking for photos on the phone, cloud storage and messages, Apple's personal assistant Siri will be taught to 'intervene' when users try to search topics related to child sexual abuse. 

Child safety campaigners who for years have urged tech giants to do more to prevent the sharing of illegal images have welcomed the move - but there are major privacy concerns emerging about the policy.

There are concerns that the policy could be a gateway to snoop on iPhone users and could also target parents innocently taking or sharing pictures of their children because 'false positives' are highly likely. 

Others fear that totalitarian governments with poor human rights records could, for instance, harness it to convict people for being gay in countries where homosexuality is a crime.

Security researcher Alec Muffett said Apple was 'defending its own interests, in the name of child protection' with the plans and 'walking back privacy to enable 1984'.

He raised concerns the system will be deployed differently in authoritarian states, asking 'what will China want [Apple] to block?' 

Greg Nojeim of the Center for Democracy and Technology in Washington, DC said that 'Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship.'

This, he said, would make users 'vulnerable to abuse and scope-creep not only in the United States, but around the world.'

'Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services.' 

Matthew Green, a security professor at Johns Hopkins University, said the plan was a 'really bad idea' and open to abuse. 

He said it could be an issue 'in the hands of an authoritarian government,' adding that the system relies on a database of 'problematic media hashes' that consumers can't review.

Speaking to The Financial Times, which was first to report the news early Thursday, Green said that regardless of the intention, the initiative could well be misused. 

'This will break the dam — governments will demand it from everyone,' Green told the news outlet. 

'The pressure is going to come from the UK, from the US, from India, from China. I'm terrified about what that's going to look like', Green told WIRED.

Ross Anderson, professor of security engineering at Cambridge University, branded the plan 'absolutely appalling'. 

'It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops', he said. 

India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation said in a post that 'Apple's compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it is a shocking about-face for users who have relied on the company's leadership in privacy and security.'   

The post said Apple's system would 'break key promises of the messenger's encryption... and open the door to broader issues', describing the system as Apple 'watching over the user's shoulder'.


Under the plan, when a child receives a sexually explicit photo, the photo will be blurred and the child will be warned and told it is okay if they do not want to view it.

The child can also be told that their parents will get a message if they view the explicit photo.

Similar measures are in place if a child tries to send a sexually explicit image. 
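As a rough illustration of the Messages flow described above, the decision logic might look something like the sketch below. This is not Apple's implementation: looksSexuallyExplicit(_:) stands in for the on-device machine-learning classifier, and the parental-notification rule is reduced to a simple opt-in flag.

```swift
import Foundation

// Hedged sketch of the Messages safety flow described above; not Apple's code.

struct ChildAccount {
    let parentalNotificationsEnabled: Bool   // opted in by the parent
}

enum IncomingPhotoAction {
    case showNormally
    case blurWithWarning(willNotifyParentsIfViewed: Bool)
}

/// Placeholder for the on-device machine-learning check the article
/// describes; always returns false in this sketch.
func looksSexuallyExplicit(_ photoData: Data) -> Bool {
    return false
}

/// Decide how to present an incoming photo to a child account: blur it and
/// warn the child, and - where the parent has opted in - warn that the
/// parents will be notified if the photo is viewed anyway.
func handleIncomingPhoto(_ photoData: Data, for account: ChildAccount) -> IncomingPhotoAction {
    guard looksSexuallyExplicit(photoData) else {
        return .showNormally
    }
    return .blurWithWarning(willNotifyParentsIfViewed: account.parentalNotificationsEnabled)
}
```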

In addition to the new features in the Messages app, iOS and iPadOS will 'use new applications of cryptography to help limit the spread of [Child Sexual Abuse Material] online, while designing for user privacy,' the company wrote on its website

'CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.'

It was not clear if the system would still work if iCloud syncing was turned off.  

Additionally, Apple is updating Siri and Search with expanded information on what to do if parents and children 'encounter unsafe situations,' with both technologies intervening if users try to search for CSAM related topics. 

The updates will be a part of iOS 15, iPadOS 15, watchOS 8 and macOS Monterey later this year. 

Apple said Thursday it will scan US-based iPhones for images of child abuse

Apple outlined CSAM detection, including an overview of the NeuralHash technology, in a 12-page white paper published on its website.

The company has also posted a third-party review of the cryptography used by Apple. 

Other tech companies, including Microsoft, Google and Facebook have shared what are known as 'hash lists' of known images of child sexual abuse.

In June 2020, 18 companies in the Technology Coalition, including Apple and the aforementioned three companies, formed an alliance to get rid of child sexual abuse content in an initiative dubbed 'Project Protect'.

Child protection groups and advocates lauded Apple for its moves, with some calling it a 'game changer.' 

'Apple's expanded protection for children is a game changer,' John Clark, President & CEO, National Center for Missing & Exploited Children, said in a statement.

'With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.'

Julia Cordua, the CEO of anti-human trafficking organization Thorn, said that Apple's technology balances 'the need for privacy with digital safety for children.'  

Former Attorney General Eric Holder said Apple's efforts to detect CSAM 'represent a major milestone' and demonstrate that child safety 'doesn't have to come at the cost of privacy.'   

In January 2020, Jane Horvath, a senior privacy officer for Apple, confirmed the company scans photos uploaded to the cloud to look for child sexual abuse images

In January 2020, Jane Horvath, a senior privacy officer for the tech giant, confirmed that Apple scans photos that are uploaded to the cloud to look for child sexual abuse images. 

Speaking at the Consumer Electronics Show, Horvath said other solutions, such as software to detect signs of child abuse, were needed rather than opening 'back doors' into encryption as suggested by some law enforcement organizations and governments.

'Our phones are small and they are going to get lost and stolen', said Ms Horvath.

'If we are going to be able to rely on having health and finance data on devices then we need to make sure that if you misplace the device you are not losing sensitive information.'

She added that while encryption is vital to people's security and privacy, child abuse and terrorist material were 'abhorrent.'

The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.

Apple was one of the first major companies to embrace 'end-to-end' encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
