Snap has accused New Mexico's attorney general of intentionally seeking out adult users looking for sexually explicit content in order to make its app seem unsafe, according to a filing asking the court to dismiss the state's lawsuit. In the document shared by The Verge, the company questioned the veracity of the state's allegations. The attorney general's office said that while it was operating a decoy account purportedly belonging to a 14-year-old girl, the account was added by a user named Enzo (Nud15Ans). From that connection, the app allegedly suggested over 91 users, including adults looking for sexual content. Snap, however, said in its motion to dismiss that those "allegations are patently false."
It was the decoy account that searched for and added Enzo, the company wrote. The attorney general's operatives were also the ones who looked for and added accounts with questionable usernames, such as "nudenude_22" and "xxx_tradehot." In addition, Snap accused the office of "repeatedly [mischaracterizing]" its internal documents. The office apparently cited a document when it claimed in its lawsuit that the company "consciously decided not to store child sex abuse images" and when it suggested that Snap doesn't report and provide those images to law enforcement. Snap denied that this was the case and clarified that it's not allowed to store child sexual abuse materials (CSAM) on its servers. It also said that it turns over such materials to the National Center for Missing and Exploited Children.
The New Mexico Department of Justice's director of communications was not impressed with the company's arguments. In a statement sent to The Verge, Lauren Rodriguez accused Snap of focusing on the minor details of the investigation in an "attempt to distract from the serious issues raised in the State’s case." Rodriguez also said that "Snap continues to put profits over protecting children" instead of "addressing… critical issues with real change to their algorithms and design features."
New Mexico concluded that Snapchat's features "foster the sharing of child sexual abuse material (CSAM) and facilitate child sexual exploitation" after a months-long investigation. It reported finding a "vast network of dark web sites dedicated to sharing stolen, non-consensual sexual images from Snap" and said that Snapchat was "by far" the biggest source of images and videos among the dark web sites it had seen. The attorney general's office called Snapchat "a breeding ground for predators to collect sexually explicit images of children and to find, groom and extort them." Snap employees encounter 10,000 sextortion cases each month, the office's lawsuit said, but the company allegedly doesn't warn users so as not to "strike fear" among them. The complaint also accused Snap's upper management of ignoring former trust and safety employees who had pushed for additional safety mechanisms.