07.12.2022, 19:54, Source: 9to5Mac

Apple confirms that it has stopped plans to roll out CSAM detection system

Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized due to privacy concerns. After putting it on hold indefinitely, Apple has now confirmed that it has stopped its plans to roll out the CSAM detection system.

Read more at 9to5Mac

