Aug 31, 2023, 10:45 p.m., Source: MacDailyNews

Apple explains its CSAM-scanning about-face: ‘Scanning for one type of content… opens the door for bulk surveillance’

Apple killed an effort to design an iCloud photo scanning tool for detecting child sexual abuse material (CSAM) in the storage service by…

Read more at MacDailyNews


JustMac.info © Thomas Lohner - Imprint - Privacy Policy