Apple Scans and Reviews Your Pictures and Messages

Apple is now hunting for signs of child abuse in messages and photos.

Apple is compromising iPhone privacy in the name of stopping child abuse. The company plans to scan user photos with an algorithm that looks for evidence of child sexual abuse. If the algorithm finds a match, it sends the photo to a human reviewer. The idea that Apple employees could end up looking at perfectly legitimate photos of a user’s children is fundamentally worrying.

Apple Reviews Your Pictures and Messages

Shortly after the first reports were published, Apple confirmed that it had built software that scans for child abuse imagery. In a blog post titled “Expanded Protections for Children”, the company outlined its plans to curb child sexual abuse material (CSAM).

As part of these plans, Apple is introducing new technology in iOS and iPadOS that “allows Apple to detect known CSAM images stored in iCloud Photos”. In essence, scanning happens on the device for all media stored in iCloud Photos. If the software flags an image as suspicious, it sends it to Apple, which decrypts the image and reviews it. If Apple finds that the content is, in fact, illegal, it notifies the authorities.
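
Apple has not published the matching code, but conceptually this kind of detection compares a compact “fingerprint” of each photo against a database of fingerprints of known abuse images, tolerating small edits such as resizing or recompression. Below is a minimal Python sketch of that general idea using a simple average hash; every name and value in it is illustrative, not Apple’s (the real system uses a neural-network hash called NeuralHash):

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail and threshold each pixel
    against the mean, producing a 64-bit perceptual fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p >= mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits where two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical fingerprint database, standing in for the real list of
# known-CSAM hashes maintained by child-safety organizations.
KNOWN_FINGERPRINTS = {0x8F3C21D4A0B5E6F7}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag a photo whose fingerprint is close to a known one, so that
    minor edits (resizing, recompression) still produce a match."""
    fp = average_hash(path)
    return any(hamming_distance(fp, k) <= max_distance
               for k in KNOWN_FINGERPRINTS)
```

Apple’s actual pipeline is considerably more elaborate than this: matching runs against an encrypted hash database, and the result is wrapped in a cryptographic “safety voucher” so the device itself never learns the outcome.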

Apple claims there is “less than a one in one trillion chance per year of incorrectly flagging a given account”.
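
The post does not explain where such a small number comes from, but account-level error rates this low generally result from requiring many independent matches before an account is ever flagged. A back-of-the-envelope estimate in Python, with purely hypothetical parameters (Apple has not disclosed its exact threshold or per-image error rate):

```python
from math import comb

# Purely hypothetical numbers: a library of 10,000 photos, a one-in-a-million
# per-image false-match rate, and an account flagged only after 30 matches.
n, p, t = 10_000, 1e-6, 30

# For p this small, the binomial tail P(at least t false matches) is
# dominated by its first term:
approx = comb(n, t) * p**t * (1 - p) ** (n - t)
print(f"{approx:.2e}")  # on the order of 1e-93: astronomically unlikely
```

The point of the sketch is simply that a per-image error rate that sounds high becomes vanishingly small once dozens of independent matches are required at once.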

Source: Android Authority
