Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos.
The company also continues to defend its implementation of CSAM detection as more privacy-preserving than the approaches taken by other companies.