No, Apple does not check if you have child pornography with the Finder

For several months, a rumor has been circulating: the CSAM (Child Sexual Abuse Material) detection function, originally intended to detect child pornography images in Apple customers’ photo libraries, was said to have been enabled in the macOS Finder. It is not.

The first thing to explain is simple: faced with the various negative reactions to its technology, Apple officially abandoned its project to detect child pornography. The Finder therefore does not attempt to detect this type of image and does not send the corresponding data to Apple’s servers. The developer Howard Oakley explained this well on his blog, The Eclectic Light Company. However, the Finder does send certain information to Apple.

Diagram of the Visual Look Up process (via Howard Oakley)

Data detection in images

The rumor actually has a basis: the Visual Look Up technology. It detects the subject of certain images, such as cats, dogs, paintings, etc. We are not going to detail all the detection steps, but your Mac analyzes the content of each image displayed, performs various processing operations and creates neural hashes. These are compared to other neural hashes and, at the end of the process, sent to Apple’s servers to get the results, via the mediaanalysisd process. The technology works if you open an image with Preview, but also with Quick Look, for example.
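Visual Look Up itself relies on private system components, so it cannot be called directly. But Apple’s public Vision framework exposes a comparable on-device classifier, which gives an idea of the first, local step of the process. Here is a minimal Swift sketch (the file path is a placeholder, and this public API is only an analogue of the private pipeline, not Visual Look Up itself):

```swift
import AppKit
import Vision

// Load an image from disk (the path is a placeholder).
guard let image = NSImage(contentsOfFile: "/path/to/photo.jpg"),
      let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    fatalError("Could not load the image")
}

// VNClassifyImageRequest runs Apple's built-in classifier entirely on-device.
let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try? handler.perform([request])

// Print the five most confident labels, e.g. "dog" or "animal".
for observation in request.results?.prefix(5) ?? [] {
    print(observation.identifier, observation.confidence)
}
```

This public API covers only the local classification; the comparison of neural hashes against Apple’s servers, via mediaanalysisd, is the part Howard Oakley traced and is not exposed to developers.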

Preview is able to display the breed of the dogs in this image.

It is not a unique identifier

The question you might ask yourself is simple: “What is the difference with the detection of child pornography images?” It is legitimate, but it comes from a misunderstanding: a neural hash is not a unique identifier, nor even an identifier of an image. It is the result of various processing operations on part of an image, which allows neither identifying a particular image nor going back to the original image. It helps verify that what macOS has detected is probably a dog (and “probably” is important), but that’s it.
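To see why, it helps to look at a deliberately simplified example. The Swift sketch below is not Apple’s NeuralHash (which comes out of a neural network and is not documented); it is a classic “average hash”, a toy perceptual fingerprint that reduces any image to 64 bits:

```swift
import CoreGraphics

// Toy perceptual hash: shrink the image to 8×8 grayscale pixels, then
// record, for each pixel, whether it is brighter than the average.
// This is NOT Apple's NeuralHash, only a simplified illustration of
// the same idea: a small fingerprint that similar images share.
func averageHash(of cgImage: CGImage) -> UInt64 {
    let size = 8
    var pixels = [UInt8](repeating: 0, count: size * size)
    pixels.withUnsafeMutableBytes { buffer in
        let context = CGContext(data: buffer.baseAddress,
                                width: size, height: size,
                                bitsPerComponent: 8, bytesPerRow: size,
                                space: CGColorSpaceCreateDeviceGray(),
                                bitmapInfo: CGImageAlphaInfo.none.rawValue)!
        context.interpolationQuality = .medium
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: size, height: size))
    }
    let average = pixels.reduce(0) { $0 + Int($1) } / pixels.count
    var hash: UInt64 = 0
    for (index, pixel) in pixels.enumerated() where Int(pixel) > average {
        hash |= 1 << UInt64(index)
    }
    return hash // 64 bits: nowhere near enough to rebuild the image
}
```

Two visually similar images produce similar or identical hashes, which is exactly what makes this kind of fingerprint useful for “probably a dog” answers and useless for pointing at one specific file.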

Tests in a virtual machine clearly show that only the neural hashes are transmitted, and they simply do not make it possible to determine the content of the complete image or to identify it. And the complete images are obviously not sent to Apple, if only because the volume of data would be absolutely gargantuan.

If you do not want to send this data to Apple, it is possible to deactivate the function. In System Settings > Siri & Spotlight, under Search results, you have to uncheck Siri Suggestions, and the option will disappear.

The option to uncheck.

