Apple has built a feature into iOS 15.2 that helps protect children from sexual predators. Back in August, it was revealed that Apple planned to incorporate this feature into an upcoming version of iOS.
The planned feature seeks to protect children from falling victim to sexual predators, and there are indications that Apple is already testing it in full swing. Cult of Mac reports that the feature is included in the iOS 15.2 developer beta, released on Tuesday by the Cupertino company.
According to information so far, the system will detect when children try to send or receive nude photos on an iPhone and, if that happens, alert their parents. Importantly, the function can be switched on and off; using it will not be mandatory. In addition, Apple uses on-device machine learning to identify the images, meaning that the photos in question never leave the phone.
According to the report, when the feature is turned on, the child will see a blurred image in the Messages app on iPhones and iPads. If they try to view it, they will be warned that the content may not be appropriate for them.
According to the report, the warning is worded so that children can understand it: sensitive photos and videos “show the private body parts that you cover with bathing suits.” If the child opens the image anyway, the parent is notified immediately. A similar process occurs when sending: if the child sends the image despite the warning, the system likewise reports it to the parents.
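The flow described above (blur the image, warn the child, and notify a parent only if the child proceeds anyway) can be sketched as simple decision logic. This is purely an illustrative model of the reported behavior; all names are hypothetical, and Apple's actual implementation is not public.

```python
# Hypothetical sketch of the reported warn-then-notify flow.
# None of these names come from Apple's software.

from dataclasses import dataclass, field

@dataclass
class CommunicationSafety:
    feature_enabled: bool = False                 # off by default, per the report
    parent_notifications: list = field(default_factory=list)

    def receive_image(self, is_sensitive: bool, child_opens_anyway: bool) -> str:
        """Return how the incoming image is presented to the child."""
        if not self.feature_enabled or not is_sensitive:
            return "shown"
        # Sensitive image with the feature on: blur it and warn first.
        if child_opens_anyway:
            # Child proceeded past the warning: notify the parent.
            self.parent_notifications.append("child viewed sensitive image")
            return "shown after warning"
        return "blurred"

safety = CommunicationSafety(feature_enabled=True)
print(safety.receive_image(is_sensitive=True, child_opens_anyway=False))  # blurred
print(safety.receive_image(is_sensitive=True, child_opens_anyway=True))   # shown after warning
print(safety.parent_notifications)
```

The same pattern would apply in the sending direction: the warning comes first, and the parent is contacted only if the child continues despite it.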
According to the report, the feature will be turned off by default, and a parent will need to activate it through Family Sharing.