Key negotiators in the European Parliament have announced a breakthrough in talks to set MEPs' position on a controversial legislative proposal aimed at regulating how platforms should respond ...
Two years ago, Apple first announced a photo-scanning technology aimed at detecting CSAM—child sexual abuse material—and then, after receiving widespread criticism, put those plans on hold. Read ...
Apple Inc. (NASDAQ:AAPL) is facing a $1.2 billion lawsuit filed on Saturday in U.S. District Court in Northern California for discontinuing its child sexual abuse material detection feature. What ...
A child protection organization says it has found more cases of abuse images on Apple platforms in the UK than Apple has reported globally. In 2022, Apple abandoned its plans for Child Sexual Abuse ...
The CSAM detection system was designed to preserve user privacy and data encryption, but it also introduced many potential new attack vectors that could be abused by authoritarian governments. For example, if ...
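The snippet above is cut off, and the design it refers to was only ever documented at a high level. Below is a minimal sketch of threshold-gated flagging, the broad privacy mechanism Apple described publicly (an account is surfaced for human review only after a set number of matches); the threshold value, class names, and data structures are illustrative assumptions, not Apple's implementation.

```python
# Minimal sketch of threshold-gated flagging: an account is surfaced for
# review only after its matches against a known-hash set cross a threshold.
# The threshold value, names, and structures are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Set

MATCH_THRESHOLD = 30  # assumed value for illustration


@dataclass
class AccountState:
    matched_hashes: Set[int] = field(default_factory=set)

    def record_upload(self, image_hash: int, known_hashes: Set[int]) -> bool:
        """Record one upload; return True only once the threshold is crossed."""
        if image_hash in known_hashes:
            self.matched_hashes.add(image_hash)
        return len(self.matched_hashes) >= MATCH_THRESHOLD


if __name__ == "__main__":
    account = AccountState()
    known = set(range(100))   # hypothetical database of known hashes
    for h in range(40):       # 40 uploads, all matching
        flagged = account.record_upload(h, known)
    print(f"flagged={flagged}, matches={len(account.matched_hashes)}")
```

The point of the gate is that a single false-positive match never surfaces an account on its own; much of the criticism in the coverage above concerns who controls the known-hash database that feeds it.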
Major year-over-year increase in CSAM detection and prevention highlights expanded safety innovation in the wake of explicit GenAI content. WASHINGTON, Dec. 18, 2025 /PRNewswire/ -- DNSFilter, a global ...
Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple ...
The European Union has formally presented its proposal to move from a situation in which some tech platforms voluntarily scan for child sexual abuse material (CSAM) to something more systematic -- ...
A legal opinion on a controversial European Union legislative plan set out last May, when the Commission proposed countering child sexual abuse online by applying obligations on platforms to scan for ...
Apple has encountered monumental backlash over a new child sexual abuse material (CSAM) detection technology it announced earlier this month. The system, which Apple calls NeuralHash, has yet to be ...
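NeuralHash is a neural-network-based perceptual hash, and Apple has not published a reference implementation; as a stand-in, the sketch below uses a far simpler average hash to illustrate the general perceptual-hashing idea of matching near-duplicate images against a database of known hashes. The function names, toy image, and distance threshold are assumptions for illustration, not Apple's algorithm.

```python
# Illustrative perceptual-hash matching using a simple "average hash".
# NOT NeuralHash: this only shows the general match-against-known-hashes idea.
from typing import List


def average_hash(pixels: List[List[int]]) -> int:
    """One bit per pixel: 1 if the pixel is above the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_hashes(image_hash: int, known: List[int], max_distance: int = 4) -> bool:
    """Flag the image if its hash is within max_distance bits of any known hash."""
    return any(hamming_distance(image_hash, h) <= max_distance for h in known)


if __name__ == "__main__":
    # Tiny 4x4 grayscale "image" and a hypothetical known-hash database.
    image = [
        [200, 210, 30, 25],
        [190, 205, 40, 35],
        [180, 195, 20, 15],
        [170, 185, 10, 5],
    ]
    known_db = [0b1100110011001100]
    h = average_hash(image)
    print(f"hash={h:016b} flagged={matches_known_hashes(h, known_db)}")
```

Because matching is approximate rather than byte-exact, small edits to an image can still match; that same property is why researchers were able to construct hash collisions against NeuralHash soon after the model was extracted from iOS.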