Apple won’t deploy its paedophile-hunting tool yet

The feature would have scanned photos uploaded to iCloud for sexual abuse imagery of children. Privacy experts heavily criticized the plan.

Apple first announced in early August that it was working on a feature, aimed at catching paedophiles, that would analyze photos uploaded from iPhones to the company’s iCloud storage and flag images depicting child abuse.

However, following criticism of the system, Apple has decided not to introduce it for the time being. The feature works by running an algorithm on the device itself: before a photo is transferred to iCloud, it is checked to determine whether the image about to be uploaded is illegal.

The examined image is compared against an encrypted database of known abuse images seized by the authorities. If the system judges the image in question to be a match, it alerts a human reviewer, who can then notify the authorities.
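For illustration only, the sketch below shows the matching step described above. Apple’s actual system uses NeuralHash, a perceptual hashing algorithm designed to survive resizing and recompression, combined with cryptographic techniques so the device never learns the database contents; the `image_fingerprint` function, the `KNOWN_FINGERPRINTS` set, and the single-match flagging here are simplified stand-ins, not Apple’s implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the on-device fingerprinting step. Apple's real
# system uses NeuralHash, a perceptual hash; a cryptographic hash is used
# here only to keep the sketch self-contained, so it would only match
# byte-identical files.
def image_fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Placeholder for the database of fingerprints of known abuse images
# supplied by the authorities (empty here; in Apple's design this list is
# stored in encrypted form on the device).
KNOWN_FINGERPRINTS: set[str] = set()

def should_flag_for_review(photo: Path) -> bool:
    """Before a photo is uploaded to iCloud, compare its fingerprint
    against the known-image database; a match is escalated to a human
    reviewer rather than reported automatically."""
    return image_fingerprint(photo) in KNOWN_FINGERPRINTS
```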

The company’s August announcement of the feature, however, was met with a storm of criticism.

Several experts opposed its introduction, among them Edward Snowden, who exposed the US mass-surveillance scandal. Critics argued that in the wrong hands the system could be repurposed for remote surveillance, which government agencies could use against their political opponents.

Apple did not initially address the concerns, but its tone today is quite different: as the company wrote, the decision to hold off on the rollout shows that it has listened to the expert feedback.

The manufacturer also stated that it would revise the feature, and only then decide exactly how it might be introduced.
