Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized over privacy concerns. After putting the feature on hold indefinitely, Apple has now confirmed that it has abandoned its plans to roll out the CSAM detection system.