Craig Federighi, Apple’s senior vice president of Software Engineering, defended the company’s photo analysis system in a recent interview with The Wall Street Journal and described the measures that protect it from possible abuse.
Mr. Federighi clarified that last week’s announcement covered two new tools. The first is designed to identify known child sexual abuse images stored in iCloud, while the second gives parents visibility into the kinds of photos their children send and receive. According to the vice president, Apple has built a reputation for protecting user privacy; the company continues to work in this direction, and the child-protection features will not change that. At the same time, Apple and many other companies are under constant pressure from government agencies around the world, which increasingly insist on solutions to combat child sexual abuse material. Apple developed such a solution, but it drew a flurry of criticism from industry peers and ordinary users.
Apple vice president defends checking user photos on iPhone
The public is concerned that the technology could be used for purposes that have nothing to do with child protection. Mr. Federighi flatly rejected these accusations and said that Apple’s design is protected from such encroachment by “several levels of control”. One of the mechanisms Apple is implementing notifies a dedicated team at the company if a user tries to upload known child abuse images to the iCloud cloud service. Unlike other cloud providers, Apple does not scan user content stored on its servers; instead, processing takes place directly on the iPhone, where the system locally compares photos against a database of known samples.
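A rough sketch of that on-device matching step is shown below. Apple’s actual system relies on a perceptual “NeuralHash” and cryptographic safety vouchers, none of which is reproduced here; the hash function, names, and values in this snippet are placeholder assumptions for illustration only.

```swift
import Foundation

// Stand-in for the perceptual hash derived locally on the device.
// A real perceptual hash is designed to survive resizing and re-encoding.
func perceptualHash(of photoData: Data) -> String {
    String(photoData.hashValue, radix: 16)
}

// Database of known abuse-image hashes shipped to the device (illustrative values).
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6"]

// Matching runs only for photos queued for iCloud upload;
// nothing is checked for users who never upload.
func matchesKnownSample(_ photoData: Data, uploadingToiCloud: Bool) -> Bool {
    guard uploadingToiCloud else { return false }
    return knownHashes.contains(perceptualHash(of: photoData))
}
```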
This means that if a person does not upload anything to iCloud at all, Apple receives no notifications. If, however, several dozen matches accumulate for a specific account, the company’s employees conduct the necessary review, and if the suspicion is confirmed, they notify not law enforcement but the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that acts as a clearinghouse for reports of child abuse. It is also important to note that at no point during the review can any employee access the user’s entire photo library.
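The threshold behaviour described above could look roughly like the sketch below. The type name and the exact threshold value are assumptions standing in for the “several dozen” matches mentioned in the interview, not Apple’s code.

```swift
// A single match means nothing on its own; only once enough matches
// accumulate for one account does human review trigger, and a confirmed
// case goes to NCMEC rather than to law enforcement.
struct AccountReviewState {
    private(set) var matchCount = 0
    let reviewThreshold = 30 // assumed placeholder for the "several dozen" described above

    // Returns true once the account crosses the threshold and needs human review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= reviewThreshold
    }
}
```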
Apple defends iPhone photo scanning
Critics of the initiative argued that images unrelated to child protection could be fed into the system as samples in order to track, for example, political activity. In response, Mr. Federighi said that the database of prohibited images is drawn from several sources: not only NCMEC but also other specialized non-profit organizations, at least two of which operate in different jurisdictions. A photo is added to the database only if it comes from several of these sources at once. In addition, representatives of these organizations and independent experts will audit the database of prohibited images.
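In effect, the multi-source rule amounts to an intersection of the organizations’ lists, as the sketch below illustrates. The organization names, hash values, and function names are hypothetical, not drawn from Apple’s implementation.

```swift
// A hash ships in the on-device database only if it appears in the lists of
// at least two independent child-safety organizations.
func buildShippedDatabase(sources: [String: Set<String>], minimumSources: Int = 2) -> Set<String> {
    var counts: [String: Int] = [:]
    for hashes in sources.values {
        for hash in hashes { counts[hash, default: 0] += 1 }
    }
    return Set(counts.filter { $0.value >= minimumSources }.keys)
}

// Only "h2" appears in both lists, so only "h2" would be shipped.
let shipped = buildShippedDatabase(sources: [
    "NCMEC": ["h1", "h2"],
    "OtherNonProfit": ["h2", "h3"],
])
```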
The vice president also recalled that the company is introducing another, related feature. It will allow parents to protect their children from receiving or sending intimate images through the iMessage app. When the system detects such an image, parents receive a corresponding notification; Apple itself receives nothing.
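The notification flow described above stays within the family, as the simplified sketch below suggests. The function names and the classifier input are assumptions; the detection runs on the child’s device and no report is sent to Apple.

```swift
// Hypothetical flow: only the parents of a child account are notified,
// and only when an explicit image is detected on the child's device.
func handleIncomingImage(isSexuallyExplicit: Bool, isChildAccount: Bool, notifyParents: (String) -> Void) {
    guard isChildAccount, isSexuallyExplicit else { return }
    notifyParents("A sensitive image was received on your child's device.")
    // Nothing is sent to Apple at any point in this flow.
}
```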
Mr. Federighi attributed the criticism of the innovations to the fact that the company announced the two tools at the same time, which led many participants in the discussion to conflate the functions of the two systems.