In August, Apple detailed several new features intended to stop the dissemination of child sexual abuse material (CSAM). Backlash from cryptographers and privacy advocates to Edward Snowden himself was nearly instantaneous, largely tied to Apple’s decision not only to scan iCloud photos for CSAM, but also to check for matches directly on the iPhone or iPad. After several weeks of sustained protest, Apple is backing down. At least for now.
The company said in a statement on Friday: “Last month, we announced plans to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple did not offer further guidance on what form those improvements might take or how the input process will work. But privacy advocates and security researchers are cautiously optimistic about the pause.
“I think this is a smart move by Apple,” said Alex Stamos, former chief security officer at Facebook and co-founder of the cybersecurity consulting firm Krebs Stamos Group. “This problem involves an extremely complicated set of trade-offs, and it was highly unlikely that Apple would arrive at the best solution without listening to a wide variety of stakeholders.”
CSAM scanners work by generating cryptographic “hashes” (a kind of digital signature) of known abusive images, then combing through large quantities of data for matches. Many companies already do this in some form, including Apple for iCloud Mail. But in its plan to extend that scanning to iCloud Photos, the company proposed taking the additional step of checking those hashes on your device if you have an iCloud account.
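To make the matching step concrete, the short Python sketch below illustrates the general idea under simplifying assumptions: it uses a plain SHA-256 digest and a placeholder `KNOWN_HASHES` set, whereas Apple’s system relies on NeuralHash, a perceptual hash designed to survive resizing and re-encoding. It is a rough illustration of hash matching in general, not Apple’s implementation.

```python
# Minimal sketch of hash-based matching in general terms. A real CSAM
# scanner uses a perceptual hash (e.g. NeuralHash) rather than SHA-256;
# this only shows the "compute a digest, look it up in a set" step.
import hashlib
from pathlib import Path

# Placeholder digest standing in for a database of known hashes, which
# in the systems described above is supplied by a clearinghouse.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: Path) -> bool:
    """Check whether the file's digest appears in the known-hash set."""
    return file_digest(path) in KNOWN_HASHES

if __name__ == "__main__":
    photo = Path("example.jpg")  # hypothetical local file
    if photo.exists():
        print(photo, "matches known set:", is_known_match(photo))
```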
The introduction of that feature, which compares images on the phone against a set of known CSAM hashes provided by the National Center for Missing and Exploited Children, immediately raised concerns that the tool might one day be used for other purposes. “Apple would be deploying CSAM scanning onto everyone’s phones, and governments can and will subvert that surveillance tool to make Apple search people’s phones for other material as well,” said Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory.
In the past, Apple has rejected multiple requests from the US government to build a tool that would let law enforcement unlock and decrypt iOS devices. But the company has also made concessions in countries like China, where customer data lives on state-owned servers. At a time when lawmakers around the world are stepping up efforts to weaken encryption more broadly, the introduction of the CSAM tools felt especially fraught.
Johns Hopkins University cryptographer Matthew Green said: “They obviously find this politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is. If they feel they must scan, they should scan unencrypted files on their servers.” That is the standard practice of other companies, such as Facebook, which regularly scans not only for CSAM but also for terrorist and other prohibited content. Green also suggested that Apple make iCloud storage end-to-end encrypted, so that it could not view those images even if it wanted to.
The controversy surrounding Apple’s plan was also technical. Hashing algorithms can produce false positives, mistakenly identifying two images as a match when they are not. Those errors, called “collisions,” are especially worrying in the context of CSAM. Soon after Apple’s announcement, researchers began finding collisions in the iOS “NeuralHash” algorithm that Apple planned to use. Apple said at the time that the version of NeuralHash available to study was not exactly the same as the one that would ship with the feature, and that the system is accurate. Paul Walsh, founder and CEO of the security firm MetaCert, said collisions may not have much practical impact, because Apple’s system requires 30 matching hashes before it triggers any alert, after which human reviewers can distinguish what is CSAM from what is a false positive.
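To see why such a threshold matters, the sketch below models false matches as independent binomial events. The 30-match figure comes from the description above; the library size of 100,000 photos and the one-in-a-million per-image collision rate are assumptions chosen purely for illustration.

```python
# Rough model of false matches as independent coin flips (binomial).
# The 30-match threshold is from the article; the library size and the
# per-image collision rate below are assumed values for illustration.
from math import comb

def prob_at_least_k(n: int, p: float, k: int) -> float:
    """P(X >= k) for X ~ Binomial(n, p), via the complement of the lower tail."""
    lower_tail = sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k))
    return 1.0 - lower_tail

library_size = 100_000   # assumed number of photos in a user's library
collision_rate = 1e-6    # assumed per-image false-positive probability

# Chance of at least one spurious match vs. chance of crossing the 30-match bar.
print(prob_at_least_k(library_size, collision_rate, 1))   # ~0.095
print(prob_at_least_k(library_size, collision_rate, 30))  # ~0.0 (vanishingly small)
```

Under these assumed numbers, a single collision somewhere in a large library is plausible, but accumulating 30 of them by chance is effectively impossible, which is the argument Walsh is making.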