Today Apple announced that it will start scanning users' iCloud Photos libraries for known CSAM (child sexual abuse material). The company is also introducing new child safety features designed to protect children from viewing inappropriate content.
The first change is a set of new parental controls in the Messages app. If a child receives a sexually explicit image, the device will warn them before the image is shown and explain what to do next. Parents can choose to be notified about the incident and will be able to see whether their child viewed the image. This feature is meant to help kids stay safe and keep them from viewing inappropriate content.
Apple will also begin CSAM detection for iCloud Photos in the US. Before a photo is uploaded to iCloud Photos, the device checks it against a database of hashes of known Child Sexual Abuse Material (CSAM). If an account accumulates enough matches against that database, an Apple employee manually reviews the flagged images. If the review confirms CSAM, Apple disables the user's account and files a report with the National Center for Missing & Exploited Children (NCMEC), which works with law enforcement. Apple says there is less than a one-in-one-trillion chance per year of the system incorrectly flagging a given account, and users who believe their account was flagged by mistake will be able to appeal. CSAM is a serious problem, and this system is a real step toward getting that content off Apple's platform.
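To make the matching idea concrete, here is a minimal sketch of threshold-based hash matching. It is not Apple's actual implementation: Apple uses its NeuralHash perceptual hash and a private set intersection protocol, neither of which is public as code, so the function names, the SHA-256 stand-in, and the threshold value below are hypothetical placeholders for illustration only.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for a perceptual image hash. Apple's real system
// uses NeuralHash so visually similar images map to the same hash; a plain
// SHA-256 digest is used here only to keep the sketch self-contained.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

struct ScanResult {
    let matchCount: Int        // how many pending uploads matched the database
    let flaggedForReview: Bool // true only once the match threshold is crossed
}

// Compare each pending upload against a database of known hashes and flag the
// account for human review only after the number of matches crosses a threshold.
// Requiring multiple matches is what keeps the per-account false-positive rate low.
func scan(uploads: [Data],
          knownHashes: Set<String>,
          reviewThreshold: Int) -> ScanResult {
    let matchCount = uploads.filter { knownHashes.contains(imageHash($0)) }.count
    return ScanResult(matchCount: matchCount,
                      flaggedForReview: matchCount >= reviewThreshold)
}

// Usage with placeholder data: an empty database, no uploads, arbitrary threshold.
let knownHashes: Set<String> = []   // hashes of known CSAM (placeholder)
let result = scan(uploads: [], knownHashes: knownHashes, reviewThreshold: 30)
print(result.flaggedForReview)      // false until the threshold is met
```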
Apple is also updating Siri and Search guidance. If users encounter CSAM or other inappropriate content, Siri and Search will show them how to report it and how to stay safe online. If users search for CSAM-related content, Siri and Search will explain that the content they're searching for is harmful and problematic, and will provide links and other resources to discourage them from viewing it.
Apple will release these new features and updates by the end of this year. They will help children and other users stay safe online and will help curb the spread of inappropriate content.