With iOS 15, the Messages, Siri, and Photos apps are gaining stricter measures to prevent child abuse. The same features will also come to watchOS and macOS.
As phones, the internet, and social media reach ever-younger users, the opportunities for child abuse have widened as well. Apple intends to strengthen its protections for children.
Built into the iOS software
The new measures Apple is launching with iOS 15 are designed to alert children and parents to possible abuse. The same measures are reportedly coming to watchOS 8 and macOS Monterey.
One measure blurs images containing possibly explicit sexual content in the Messages app. Once on-device machine learning detects and flags such an image, the child is shown a warning. If the child chooses to open the image anyway, the parents are notified; the same applies if the child tries to send such a photo. The measure is exclusive to family iCloud accounts.
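The flow described above can be sketched roughly as follows. All names and the dummy classifier are hypothetical; Apple has not published this logic, so this is only an illustration of the decision steps, not the real implementation.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    name: str
    in_family_icloud: bool  # the feature only applies to family iCloud accounts

def classify_image(image_bytes: bytes) -> bool:
    """Stand-in for the on-device classifier that flags possibly
    explicit images. Here: a trivial dummy rule for illustration."""
    return image_bytes.startswith(b"EXPLICIT")

def handle_incoming_image(account: ChildAccount, image: bytes,
                          child_opens_it: bool) -> list:
    """Return the sequence of actions taken for one incoming image."""
    events = []
    if not account.in_family_icloud:
        return events  # measure not active for this account
    if classify_image(image):
        events.append("image blurred")
        events.append("warning shown to child")
        if child_opens_it:
            events.append("parents notified")
    return events
```

For example, a flagged image that the child opens anyway produces all three events, while an ordinary photo, or any photo on a non-family account, produces none.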
On iOS and iPadOS, another measure is triggered when photos depicting child abuse are uploaded to iCloud: a report is sent to the National Center for Missing and Exploited Children and the user's account is disabled. This is where law enforcement steps in.
Siri, for its part, has been updated to warn users who attempt abuse-related searches, and to point them toward resources for reporting abuse and getting help.
The artificial intelligence systems behind these measures evaluate images against criteria supplied by the National Center for Missing and Exploited Children. Apple cannot inspect the comparisons in any way: the entire process runs on the device's own machine-learning engines, and only the outcome is reported.
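Conceptually, the on-device check works like fingerprint matching: each photo is reduced to a hash and compared locally against a database of hashes of known abusive material, so the photo itself never has to leave the device. A minimal sketch, using an ordinary cryptographic hash and made-up database entries (Apple's actual system uses a perceptual hash, which also tolerates small edits to an image):

```python
import hashlib

# Hypothetical database of hashes of known abusive images, as would be
# supplied by the National Center for Missing and Exploited Children.
# The byte strings here are placeholders for illustration only.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """On-device check: hash the photo and compare locally.
    Only the yes/no result, never the photo, would ever be reported."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

Because only hashes are compared, an ordinary photo reveals nothing about itself during the check, which is what allows the process to stay entirely on the device.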