Snapchat wants New Mexico's child safety complaint dismissed


Snapchat has claimed that the child safety complaint filed by the state of New Mexico is fundamentally flawed. The social media and messaging platform wants the lawsuit dismissed, arguing that the investigative methods the state used to allegedly expose how child predators lurk on Snapchat were themselves deceptive.

Snapchat is a breeding ground for predators, claims New Mexico

Snapchat is currently facing legal action from the state of New Mexico. The New Mexico Department of Justice has accused Snapchat of harboring child predators on its platform.

New Mexico recently completed an investigation that concluded Snapchat’s features “foster the sharing of Child Sexual Abuse Material (CSAM) and facilitate child sexual exploitation”. The state’s investigation also revealed a “vast network of dark web sites dedicated to sharing stolen, non-consensual sexual images from Snap.”

In one of its most sensational claims, New Mexico reportedly alleged that Snapchat is “by far” the biggest source of images and videos on the dark web sites it has seen. The state even called the app “a breeding ground for predators to collect sexually explicit images of children and to find, groom, and extort them.”

Child safety complaint by New Mexico relies on flawed approach, claims Snapchat

In the lawsuit, New Mexico claimed that suspicious and concerning accounts sought out young teens. Specifically, the attorney general’s office operated a decoy account purportedly belonging to a 14-year-old girl.

A user named Enzo (Nud15Ans) allegedly sought out the account, after which the app allegedly suggested over 91 users. A concerning number of these suggested accounts belonged to adults looking for sexual content.

Snapchat has reportedly filed a motion to dismiss the lawsuit, claiming New Mexico’s “allegations are patently false.” According to Snapchat, it was the decoy account that searched for and added Enzo. Furthermore, Snapchat claims it was the attorney general’s operatives who sought out and added accounts with questionable usernames.

Snapchat has also vehemently denied storing CSAM on its servers. The company has stated that it reports such material to the National Center for Missing and Exploited Children.
