Apple is being sued for $1.2 billion by a potential group of 2,680 victims over its failure to follow through on its 2021 promise to deliver a tool that would detect and flag Child Sexual Abuse Material (CSAM) stored in iCloud, reports The New York Times.

In 2021, Apple announced a tool designed to curtail CSAM across iCloud. The company said the tool would flag images depicting such abuse and notify the National Center for Missing and Exploited Children. However, citing concerns over user privacy on iCloud, Apple later withdrew the plan. That withdrawal is largely the basis of the lawsuit.

The lawsuit says Apple announced “a widely touted improved design aimed at protecting children,” then failed to “implement those designs or take any measures to detect and limit” this material. It argues that by failing to deploy such a tool, Apple is forcing victims to relive their trauma and causing them immense mental anguish.

According to the NYT report, the lawsuit was initiated by a 27-year-old victim of CSAM, who filed her suit under a pseudonym. The report notes that the lawsuit recounts her ordeal of being molested as an infant by a relative, who then shared images of the abuse online. She says she still receives law enforcement notices nearly every day about someone being charged with possessing those images.