What Apple can do next to combat child sexual abuse


In May 2019, Melissa Polinsky, head of Apple's global investigations and child safety team, faced questions from UK investigators working on an inquiry into child sexual abuse. During two hours of questioning, Polinsky admitted that Apple employed just six people on its global team responsible for investigating child abuse images. Polinsky also said the technology Apple uses to scan for existing child abuse images online was "effective."

Fast forward two years, and Apple's work on tackling child sexual abuse material has come off the rails. On September 3, the company made a rare public U-turn when it paused plans to introduce a system that would look for known child sexual abuse material (CSAM) on the iPhones and iPads of US users. "We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in a statement, citing the "feedback" it had received.

So what does Apple do next? The company is unlikely to win over everyone, or please everyone, with whatever it does next, and the fallout from its plan has created plenty of mess. The technical complexity of Apple's proposal has reduced some of the public debate to blunt statements for or against, and inflammatory language has, in some cases, polarized the discussion. The stakes are high: the European Commission is drawing up child protection legislation that could force technology companies to scan for CSAM.

"It was about time [for Apple] to do some kind of content moderation," says Victoria Baines, a cybersecurity expert who has worked on child safety investigations at Facebook and Europol. US law requires technology companies to report any CSAM they find online to the National Center for Missing & Exploited Children (NCMEC), a US non-profit child safety organization, but Apple has historically lagged behind its competitors.

In 2020, NCMEC received 21.7 million CSAM reports, up from 16.9 million in 2019. Facebook topped the 2020 list, filing 20.3 million reports. Google made 546,704; Dropbox 20,928; Twitter 65,062; Microsoft 96,776; and Snapchat 144,095. Apple submitted just 265 CSAM reports to NCMEC in 2020.

Baines says there are multiple "logical" reasons for the disparity. Not all technology companies are built alike. Facebook, for instance, is built around sharing and connecting with new people. Apple's primary focus is its hardware, and most people use the company's services to communicate with people they already know. Or, put more bluntly, nobody can search iMessage for children to send explicit images to. Another issue is detection: the number of reports a company sends to NCMEC can reflect how much effort it puts into finding CSAM. Better detection tools can mean more abusive material is found. And some technology companies simply do more than others to root out CSAM.

Detecting existing child sexual abuse material largely involves scanning what people send or upload once it reaches a company's servers. Codes, known as hashes, are generated for photos and videos and compared against hash lists of previously identified child sexual abuse material. The hash lists are compiled by child protection organizations such as NCMEC and the UK's Internet Watch Foundation. When a match is confirmed, the technology company can take action and report the finding to NCMEC. The most common approach uses PhotoDNA, developed by Microsoft.
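As a rough illustration of that server-side matching step, here is a minimal Python sketch that compares the hash of an uploaded file against a blocklist of known hashes. It is a hypothetical example, not how any company's production system works: the blocklist contents, directory name, and escalation step are invented, and real tools such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, rather than an exact cryptographic hash like the SHA-256 used here for simplicity.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of hashes supplied by a child-protection body.
# Real systems use perceptual hashes (e.g. PhotoDNA) so that edited or
# re-encoded copies of a known image still match; SHA-256 is a stand-in.
KNOWN_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_hash(path: Path) -> bool:
    """Return True if the file's hash appears on the blocklist."""
    return sha256_of_file(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Hypothetical upload directory; matches would be escalated for
    # human review and, where confirmed, reported to NCMEC.
    for upload in Path("incoming_uploads").glob("*"):
        if upload.is_file() and matches_known_hash(upload):
            print(f"Match found: {upload} - escalate for review")
```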

Apple's plan to scan photos uploaded to iCloud for CSAM flipped this approach on its head, using some clever cryptography to move part of the checking onto people's phones. (Apple has scanned iCloud Mail for CSAM since 2019, but does not scan iCloud Photos or iCloud backups.) The proposal proved controversial for a number of reasons.


