09.08.2021, 19:10, Source: 9to5Mac

Apple confirms CSAM detection only applies to photos, defends its method against other solutions

Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos. The company also continues to defend its implementation of CSAM detection as more privacy-friendly and privacy-preserving than other companies' approaches.
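At a high level, the approach Apple has described matches hashes of photos being uploaded to iCloud Photos against a database of known CSAM hashes supplied by child-safety organizations, and an account is flagged for human review only after a threshold number of matches. The Swift sketch below illustrates nothing more than that threshold-based hash-set matching idea; the function names, the hash set, and the threshold value are illustrative assumptions, SHA-256 stands in for Apple's perceptual NeuralHash, and the real system layers on cryptographic techniques (private set intersection, threshold secret sharing) that this sketch omits entirely.

    import Foundation
    import CryptoKit

    // Illustrative stand-ins: in the real system the hash database is
    // provided in blinded form and is never readable on the device.
    let knownHashes: Set<String> = ["d2a84f4b8b650937ec8f73cd8be2c74a"]
    let matchThreshold = 30  // assumed value for illustration only

    func imageHash(_ data: Data) -> String {
        // SHA-256 as a placeholder; a cryptographic hash would not
        // survive resizing or re-encoding the way a perceptual hash
        // such as NeuralHash is designed to.
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    func matchCount(for photos: [Data]) -> Int {
        photos.filter { knownHashes.contains(imageHash($0)) }.count
    }

    // Only once the match count crosses the threshold would an account
    // be surfaced for review; below it, in the real design, individual
    // matches remain cryptographically hidden even from Apple.
    func shouldFlag(_ photos: [Data]) -> Bool {
        matchCount(for: photos) >= matchThreshold
    }

The privacy argument Apple makes rests on doing this matching against uploads to its own service rather than scanning entire server-side photo libraries, which is the approach it contrasts itself with.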

Read more at 9to5Mac

