Is the cost of child safety too high? Apple takes heat over a controversial new feature

A recent privacy scandal has caught everyone's attention. Apple's new CSAM detection feature, aimed at preventing child abuse, has split the world into two camps: the worried and the actively optimistic (quite coincidentally, most Apple employees belong to the optimistic camp).

We at AdGuard have been thinking hard about this. The safety of minors is critical, but it is entirely possible that the new feature will violate personal privacy, so special means of protection are needed. And that is where we may be able to help.

Update (September 3, 2021):

We will undoubtedly hear more about CSAM detection in the future, but for now it is good to see how much force public opinion carries.

The basics

Let's clear this up before you ask: no, we did not misspell anything. SCAM means fraud, but CSAM is the abbreviation for Child Sexual Abuse Material. Apple: "We want to protect children from predators who use communication tools to recruit and exploit them."

The intention is genuinely good, but the outcome may not be. Who would object to protecting children from such brutal treatment? So why is everyone so upset?

The main problem lies in how Apple applies the new technology: it scans all images on your devices across the Apple ecosystem (iPhone, iPad, MacBook, and so on) and looks for elements of child abuse. If Apple detects any, it reports the account to law enforcement.

Of course, nobody at Apple is flipping through your photo albums the way a person would. Let's take a look at how the new technology works.

First, they take images collected by child safety organizations and verify that they really do qualify as CSAM. At this stage, the work is manual.

Then they convert this set of images into hash values. A hash is a string of symbols that describes the content of a picture. Even if the picture is modified, cropped, or resized, its hash stays the same (a toy sketch of this idea follows the walkthrough below).

These hashes are shipped to your devices. The CSAM hash database is hard-coded into the OS image, so a copy of it is always present on your device. The original images are never stored on your device (and who would want to keep such a collection anyway?); only the hash database is.

Next, a hash value is calculated for every picture you store in Apple's iCloud.

Those hashes are then compared with the CSAM hashes, and the comparison happens on the device itself.

One more point worth mentioning: photo uploads to iCloud are enabled by default. Users who don't want them have to disable the feature manually.

If a match is found, Apple manually reviews your account. If the reviewer confirms that a picture really belongs to the CSAM image database, Apple files a report with NCMEC, the National Center for Missing & Exploited Children, an organization founded in 1984 that works with law enforcement agencies.
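Apple has not published the internals of its NeuralHash algorithm, so, purely as an illustration of the general idea, here is a toy "average hash" sketch in Python. Everything in it (the function names, the grid size, the matching rule) is our own invention, not Apple's design: the picture is shrunk to a small grid, every cell is marked as brighter or darker than the average, and the resulting bit string serves as the fingerprint that gets compared against the known-hash database.

```python
# Toy illustration of perceptual hashing and on-device matching.
# This is NOT NeuralHash; it is a simple "average hash" meant only to show
# why a resized or lightly edited copy of a picture keeps the same fingerprint.

def average_hash(pixels, grid=8):
    """Hash a 2D list of grayscale values (0-255); image sides must be multiples of `grid`."""
    cell_h, cell_w = len(pixels) // grid, len(pixels[0]) // grid
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            block = [pixels[y][x]
                     for y in range(gy * cell_h, (gy + 1) * cell_h)
                     for x in range(gx * cell_w, (gx + 1) * cell_w)]
            cells.append(sum(block) / len(block))    # average brightness of the cell
    mean = sum(cells) / len(cells)
    # One bit per cell: brighter or darker than the overall mean.
    return "".join("1" if c >= mean else "0" for c in cells)

def hamming_distance(a, b):
    """Number of bits that differ between two hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(image_hash, known_hashes, max_distance=0):
    """The on-device check: is this picture's hash (almost) in the known-hash set?"""
    return any(hamming_distance(image_hash, k) <= max_distance for k in known_hashes)

# A synthetic 64x64 gradient stands in for a photo; the 32x32 copy is a "resized" version.
original = [[x * 4 for x in range(64)] for _ in range(64)]
resized = [[original[y * 2][x * 2] for x in range(32)] for y in range(32)]

known_hashes = {average_hash(original)}                        # the database shipped with the OS
print(matches_database(average_hash(resized), known_hashes))   # True
```

Because the fingerprint encodes only the coarse light and dark pattern, copies that have been resized or slightly edited still match, while the database itself contains only strings of bits, not the images.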

Robots make mistakes

Machine-learning algorithms operate on probabilities ("the following behavior is most likely" or "this is probably what appears in the picture"). Artificial intelligence cannot judge with certainty: there will always be a certain share of false positives, which in our case means pictures wrongly identified as belonging to the CSAM set.

The recognition algorithm has other weak spots too. Neural networks, for example, suffer from overfitting. They analyze a training data set to learn patterns and correlations, and later use those patterns to find correlations in new data. Overfitting occurs when the training set is too small or too simplistic: the AI "adapts" to the data it has learned and can no longer analyze other data or real-world examples. That is how it ends up thinking apples can only be red, or fails to tell a chihuahua from a blueberry muffin.

Users have already found different pictures that share the same hash value. These are natural NeuralHash collisions, and that is exactly what worries us.
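Collisions are easy to picture with the toy hash above: two images whose pixel values differ everywhere but whose pattern of brighter-than-average and darker-than-average regions is the same get identical fingerprints. NeuralHash is vastly more sophisticated, but the collisions people have found are the same phenomenon at a higher level of abstraction. The snippet below reuses the average_hash function from the earlier sketch.

```python
# Two very different "images": one high-contrast, one nearly flat,
# but both brighter on the left half than on the right half.
image_a = [[220 if x < 32 else 30 for x in range(64)] for _ in range(64)]
image_b = [[140 if x < 32 else 110 for x in range(64)] for _ in range(64)]

# The pixel values differ everywhere, yet the light/dark pattern is identical,
# so the toy hash cannot tell the pictures apart: a collision.
print(average_hash(image_a) == average_hash(image_b))   # True
```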


Apple has considered this problem, however, which is why people are supposed to be involved in reviewing suspicious cases. But who are these people? Who do they report to? What exactly are their responsibilities? Are their intentions pure?

People make mistakes

Only a year ago, Facebook was exposed for hiring contractors to transcribe audio messages of its services' users. No one explained to them where the recordings came from or what the transcriptions were for; they were simply told to transcribe. Some workers suffered psychological trauma from what they had to listen to, from people arguing to far more intimate exchanges. What we think of as private "pillow talk" is hardly something we would be happy to have a hired stranger listening to.

In fact, this is a very common practice among social media companies. Facebook drew criticism mainly because of how chaotically the work was organized. In August 2019, Google, Amazon, and Apple let customers disable the default setting that sent voice assistant recordings back to the company. That's right: people could listen to what others were asking Siri and Alexa and report back to Apple and Amazon. On top of that, a 2020 study found more than 1,000 words and phrases that sound like a voice assistant's name or wake phrase. Alexa, for example, can mistake the word "election" for its own name and send your discussion of a presidential election straight to Amazon's contractors.

How it works, and how likely are mistakes?

So at what point do people chosen by Apple, rather than robots, start looking at your pictures?

This is where the probabilities mentioned earlier come into play.

A picture's hash is compared with the CSAM hashes, and the result of the comparison is stored in a so-called safety voucher: data that "encodes the match result along with additional encrypted data about the image".

What is this additional data? It is a picture: what Apple's official CSAM documentation calls a "visual derivative", essentially a low-quality version of your photo. What follows is only an approximate description of the process, because nobody outside Apple really knows the details.
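Since nobody outside Apple knows those details, the structure below is purely our guess at what a safety voucher might conceptually contain and at how the number of matching vouchers could gate human review. Every field name and the threshold value are assumptions made for illustration, not Apple's published design.

```python
# Speculative sketch only: field names and the threshold are our assumptions.
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    matched: bool                 # result of the on-device hash comparison
    encrypted_derivative: bytes   # encrypted low-quality "visual derivative" of the photo

ASSUMED_REVIEW_THRESHOLD = 30     # illustrative number, not a published value

def needs_human_review(vouchers: list[SafetyVoucher]) -> bool:
    """Escalate an account to a human reviewer only after enough vouchers report a match."""
    return sum(v.matched for v in vouchers) >= ASSUMED_REVIEW_THRESHOLD
```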

One in a trillion. Sounds reassuring, doesn't it?

Let me point out, however, that this is the probability of wrongly flagging a particular account for child abuse: an account is flagged only after a human reviewer confirms that a photo really belongs to the CSAM database. The probability of a false-positive match for a single photo hash is already much higher, which means the chance that your photos get looked over by a stranger because of a system error is higher as well.

And even with odds of one in a trillion of being misjudged, think of how many pictures people upload to iCloud. We can make a rough estimate: the number of photos uploaded to Facebook in 2021, in less than a full year, is expected to reach ten trillion. Facebook does have more users than Apple, but then again, not every user uploads every photo from their phone to social media.

First, these probabilities multiply with the sheer number of uploaded photos. Second, even if the odds of being eaten by a shark were one in a billion, you still wouldn't want to be the "lucky one". The stakes here are just as serious, and not even the resulting fame would make it worth it.
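To make the first point concrete, here is a back-of-the-envelope calculation. Both input numbers are placeholders we picked for illustration, not figures Apple has published; the point is only that a tiny per-photo error rate, multiplied by billions of uploads, still leaves real people whose pictures get a second look.

```python
# Back-of-the-envelope: how a tiny per-photo error rate adds up at scale.
# Both numbers are illustrative placeholders, not published figures.
per_photo_false_positive = 1e-9        # assumed chance one photo wrongly matches
photos_uploaded = 10_000_000_000       # assumed total uploads across all users

expected_false_matches = per_photo_false_positive * photos_uploaded
print(f"Expected wrongly matched photos: {expected_false_matches:.0f}")   # 10

# Chance that at least one of *your* photos is wrongly matched,
# assuming you upload about 20,000 pictures over the years.
your_photos = 20_000
p_at_least_one = 1 - (1 - per_photo_false_positive) ** your_photos
print(f"Chance at least one of your photos is flagged: {p_at_least_one:.2e}")
```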

The consequences of being wrongly flagged are severe. Even if justice prevails in the end, or Apple admits its mistake, a suspicion of child abuse can ruin your reputation or land you on some HR department's blacklist.

Why are people worried about CSAM detection?

Let's summarize.

Possible algorithm errors, with devastating consequences for a person's everyday life and career.

Software bugs, not to be confused with the first point. At the current stage of development, machine errors are considered normal; software bugs are normal too, since there is no such thing as bug-free software. The cost of a mistake varies, though, and errors that lead to leaks of personal data generally carry the highest price.

An opaque system. Apple is notorious for its unwillingness to disclose details about its products. The only choice users have is to believe that Apple's intentions are good, that it genuinely values user privacy and wants to protect it.

A lack of trust. After all of Apple's (and other companies') privacy failures and misdeeds, why should we keep believing them?

The potential for scope creep. The technology could be extended to analyze and detect other types of data. Under the umbrella of "protecting children", there will be ever more opportunities to dig through your data.

The possibility of abuse. Bad actors could plant photos on your iPhone that are specially crafted to match a specific hash value. Incidentally, libraries of images engineered to produce specific hash values have already been compiled.

The photo on the right has been manually altered so that it deliberately produces the same hash value as the photo on the left.

Given all of the above, you can see why we are racking our brains over ways to let users control whether Apple analyzes their pictures. We ran a poll on our social media accounts, and the overwhelming majority of respondents (about 86%) want to disable CSAM scanning. We don't believe they are all child abusers; they simply see the potential risks.

We are considering using AdGuard DNS to prevent safety vouchers from being uploaded to iCloud and thereby neutralize CSAM detection. How would that work? It depends on how the CSAM detection feature ends up being implemented, so we can't promise anything specific yet.

And if Apple starts working with third parties, who knows what that image library will turn into? Every step of the process can, in principle, be blocked, but we can't yet say which solution is best, let alone simply roll it out in AdGuard DNS; more research and testing is needed. We could also block iCloud authorization altogether for all AdGuard DNS users. That is still a rather extreme measure, but it remains an option. Which raises the question: why use iCloud on your phone at all? Depending on how events unfold, we do recommend you consider that.
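As a very rough illustration of what a DNS-level approach could look like, here are two rules in standard AdGuard DNS filtering syntax. The hostnames are deliberately fictional placeholders, because, as said above, we cannot commit to specific endpoints before more research and testing.

```
! Hypothetical rules; the hostnames are placeholders, not real Apple endpoints.
! Block the (hypothetical) host that would receive safety vouchers:
||csam-vouchers.example.com^
! The more radical option: block iCloud photo uploads altogether:
||photos-upload.icloud.example.com^
```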
