Apple’s New Technology to Detect Abusive Photos

Apple is set to release software that detects explicit photos as part of an update to iOS 15, and the announcement is raising alarms in the tech world. The feature could pose a serious threat to the privacy of your iOS device: if someone capable got hold of this kind of scanning technology and used it for the wrong reasons, the consequences are hard to predict. The intention is good, but the system is untested and not ready to be deployed yet.

Before you upload an image to Apple’s iCloud Photos, it is scanned on the device to see whether it matches known images in databases maintained by organizations such as the National Center for Missing & Exploited Children (NCMEC). A separate feature scans Messages on your children’s iOS devices for explicit photos and can notify you if a child chooses to view one. If a photo in your camera roll matches a known image, the match is reported to Apple, which will disable your iCloud account, report the material to NCMEC, and, if it matches one of their images, pass it on to law enforcement.
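
At a high level, the matching step works like a set-membership check on image fingerprints. The Swift sketch below is a simplified illustration, not Apple’s implementation: it uses a plain SHA-256 digest where Apple uses its perceptual NeuralHash, and the knownCSAMHashes set, the digestHex and matchesKnownImage helpers, and the file path are all placeholders for this example.

import Foundation
import CryptoKit

// Illustrative only: Apple ships a blinded NCMEC hash database on-device;
// this plain string set is a stand-in for that database.
let knownCSAMHashes: Set<String> = []

// SHA-256 stands in for Apple's NeuralHash. NeuralHash is a perceptual hash
// that tolerates resizing and re-encoding; a cryptographic digest does not.
func digestHex(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Check a photo before it is queued for iCloud upload.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownCSAMHashes.contains(digestHex(of: photoData))
}

// Example: scan a local file (path is a placeholder).
if let data = FileManager.default.contents(atPath: "/tmp/example.jpg") {
    print(matchesKnownImage(data) ? "match found" : "no match")
}

In the real system the comparison is also done privately, so the device does not learn which hashes are in the database and Apple learns nothing about photos that do not match; the point here is only the database-lookup structure of the check.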

The announcement of Apple’s new CSAM detection tool, made without public discussion, also sparked concerns that the technology could be misused.
