Apple postpones launch of child abuse image scanning

In August, Apple announced that the next iOS update would introduce a scanning feature that searches photos stored on the device for child abuse images and reports any matches to the authorities. Following massive criticism, Apple has now postponed the feature's launch.

Criticism of Apple’s plans

Unlike the other leading tech corporations, Apple had in the past always placed great importance on the privacy of its devices and software. The response to the announcement of an automatic scanning feature that cannot be disabled was correspondingly negative: critics see it first and foremost as a significant intrusion into users’ privacy and, beyond that, as an instrument that, once implemented, invites arbitrary state action and could, for example, also be used to automatically detect politically undesirable content. Apple was therefore repeatedly asked to reconsider its decision.

Launch postponed

The company has now responded and, at the very least, postponed the introduction of the new feature. The project is set to continue, however: Apple stated that it wants to take time to consider the objections of advocacy groups, researchers and customers and possibly incorporate them into a revised version of the feature. It is not yet known exactly how Apple intends to revise it or when the release is now planned, so further developments remain to be seen.

Simon Lüthje

I am co-founder of this blog and am very interested in everything that has to do with technology, but I also like to play games. I was born in Hamburg, but now I live in Bad Segeberg.
