Apple’s photo scanning program triggered strong protests from policy organizations


More than 90 policy groups from the United States and around the world have signed an open letter urging Apple to abandon its plan to have Apple devices scan photos for child sexual abuse material (CSAM).

“The undersigned organizations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on August 5, 2021, to build surveillance capabilities into iPhones, iPads, and other Apple products,” the letter to Apple CEO Tim Cook says. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

The Center for Democracy & Technology (CDT), which announced the letter, quoted CDT Security and Surveillance Project co-director Sharon Bradford Franklin as saying, “We can expect governments will take advantage of the surveillance capability Apple is building into iPhones, iPads, and computers. They will demand that Apple scan for and block imagery of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

The open letter was signed by groups on six continents (Africa, Asia, Australia, Europe, North America, and South America). US-based signatories include the American Civil Liberties Union, the Electronic Frontier Foundation, Fight for the Future, the LGBT Technology Partnership & Institute, New America's Open Technology Institute, STOP (the Surveillance Technology Oversight Project), and the Sex Workers Project of the Urban Justice Center. Signatories also include groups from Argentina, Belgium, Brazil, Canada, Colombia, the Dominican Republic, Germany, Ghana, Guatemala, Honduras, Hong Kong, India, Japan, Kenya, Mexico, Nepal, the Netherlands, Nigeria, Pakistan, Panama, Paraguay, Peru, Senegal, Spain, Tanzania, and the United Kingdom. The complete list of signers is here.

Scanning of iCloud Photos and Messages

Apple announced two weeks ago that devices with iCloud Photos enabled will scan images before they are uploaded to iCloud. Because an iPhone uploads each photo to iCloud shortly after it is taken, the scanning happens almost immediately if the user has previously turned on iCloud Photos.

Apple says its technology “analyzes an image and converts it to a unique number specific to that image” and flags a photo when its hash is identical or nearly identical to any hash that appears in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) once about 30 CSAM photos are detected, a threshold Apple set to ensure “less than a one in one trillion chance per year of incorrectly flagging a given account.” The threshold may change in the future in order to maintain that one-in-one-trillion false-positive rate.
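To make the matching and threshold logic concrete, here is a minimal sketch in Swift. It is not Apple's implementation: NeuralHash is a neural perceptual hash whose details are not reproduced here, so the 64-bit hashes, the Hamming-distance cutoff, and the database values below are hypothetical stand-ins used only to illustrate "identical or nearly identical" hash matching combined with a roughly 30-match reporting threshold.

```swift
import Foundation

// Minimal sketch only. Apple's NeuralHash is a neural perceptual hash; here a
// plain 64-bit hash plus a Hamming-distance check stands in for "identical or
// nearly identical". Hash values, cutoff, and names are hypothetical.

let knownCSAMHashes: Set<UInt64> = [0x1A2B_3C4D_5E6F_7081, 0x0F0E_0D0C_0B0A_0908]  // placeholders
let reportingThreshold = 30  // the ~30-match threshold described above

/// Number of differing bits between two 64-bit hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// True when a photo's hash is identical or within `maxDistance` bits of any
/// hash in the known database.
func matchesKnownHash(_ photoHash: UInt64, maxDistance: Int = 2) -> Bool {
    knownCSAMHashes.contains { hammingDistance($0, photoHash) <= maxDistance }
}

/// Flags an account only once the number of matching photos crosses the threshold.
func shouldFlagAccount(photoHashes: [UInt64]) -> Bool {
    photoHashes.filter { matchesKnownHash($0) }.count >= reportingThreshold
}

// Example: two photos, one of them a near-match, stays well below the threshold.
print(shouldFlagAccount(photoHashes: [0x1A2B_3C4D_5E6F_7080, 0xFFFF_FFFF_FFFF_FFFF]))  // false
```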

Apple is also adding a tool to the Messages application that will “analyze image attachments and determine if a photo is sexually explicit” without giving Apple access to the messages. The system is optional and must be enabled by parents; when it is turned on, it will “warn children and their parents when receiving or sending sexually explicit photos.”
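The decision flow of this opt-in Messages feature can also be sketched. Apple has not published the classifier or an API for it, so the types, score, threshold, and settings below are invented for illustration; the sketch only shows the opt-in gating and the on-device warning and notification steps described above.

```swift
import Foundation

// Minimal sketch of the opt-in flow only. Apple's on-device classifier and its
// APIs are not public; the score, threshold, and settings types are invented.

struct AttachmentScanResult {
    let explicitScore: Double  // hypothetical confidence from an on-device model
}

struct CommunicationSafetySettings {
    let enabledByParent: Bool  // feature is off unless a parent opts in
    let notifyParents: Bool
}

/// Decides, entirely on-device, whether to warn the child and notify parents.
func handleIncomingAttachment(_ scan: AttachmentScanResult,
                              settings: CommunicationSafetySettings,
                              threshold: Double = 0.9) {
    guard settings.enabledByParent else { return }          // opt-in gate
    guard scan.explicitScore >= threshold else { return }   // below threshold: do nothing

    print("Warn child before displaying the photo.")         // message content stays on-device
    if settings.notifyParents {
        print("Queue parent notification.")
    }
}

// Example: with the feature enabled, a high-scoring attachment triggers both steps.
handleIncomingAttachment(AttachmentScanResult(explicitScore: 0.95),
                         settings: CommunicationSafetySettings(enabledByParent: true,
                                                               notifyParents: true))
```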

Apple said the new systems will roll out later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will initially be available only in the United States.

Both scanning systems concern the open letter's signatories. On the Messages scanning that parents can enable, the letter says:

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children's rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the "parent" and "child" accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child's safety and well-being. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts and to detect images that are objectionable for reasons other than being sexually explicit.
