Apple sued for failing to implement tools that would detect CSAM in iCloud

Apple is being sued by victims of child sexual abuse over its failure to follow through on plans to scan iCloud for child sexual abuse material (CSAM), The New York Times reports. In 2021, Apple announced it was working on a tool that would detect CSAM, flag images depicting such abuse, and notify the National Center for Missing and Exploited Children…