
The Real Social Dilemma: Child Sex Trafficking

[Image: A person with long hair covering their face, looking downward in a blurred hallway, appearing sad or deep in thought.]

The common phrase "You can't believe everything you see on social media" rings truer than ever. Between the rise of "fake news" and the urge to turn any headline into a sarcastic meme, social media has become a playground for false narratives. Most platforms have stepped up efforts to monitor misleading sources, and many have become more transparent about their policies and procedures. The general guidelines are straightforward: don't engage in hateful conduct, don't post graphic content, and don't threaten other users. Yet while the rules for posting are black and white, how social media companies enforce those rules is a gray area.

Twitter is one of the most popular social media platforms in the world, with more than 330 million users and close to 5,000 employees. Users communicate via "tweets," brief posts of up to 280 characters. Twitter has a zero-tolerance policy against violent threats and child sexual exploitation—at least it says it does.

According to a lawsuit filed on January 20, 2021, Twitter refused to take down pornographic images and videos of a teenage sex trafficking survivor because an investigation did not find a violation of the company's policies. The lawsuit was filed in the United States District Court for the Northern District of California on behalf of John Doe, a minor residing in Florida.

In 2017, when John Doe was 13 or 14 years old, he began chatting with an individual on Snapchat who claimed to be 16 years old and to attend Doe's school. After conversing, Doe and the individual exchanged nude photos. The exchange, however, eventually turned into blackmail: the individual began demanding more sexually explicit pictures and videos of Doe and threatened him if he did not comply.

In 2019, a compilation video of Doe's child sexual abuse material surfaced on Twitter. Several of Doe's classmates saw the videos; he faced vicious harassment and bullying and became suicidal. Doe and his mother alerted Twitter, local law enforcement, and school officials about the material, but according to the lawsuit, Twitter ignored their pleas to take the content down. The content was not removed until January 2020, when a federal agent with the Department of Homeland Security contacted Twitter.

Doe's 11-count lawsuit includes claims for negligence, gross negligence, distribution of private sexually explicit materials, and intrusion into private affairs. The complaint alleges that while some offensive content can be easily reported through Twitter's accessible report function, users must clear additional hurdles to report child sexual abuse material. It further asserts that Twitter's search suggestions even help users find illegal child sexual abuse material, and that Twitter knowingly benefits from sex trafficking ventures and exploitation.

Twitter may limit its users' tweets to 280 characters, but the limit on child sexual abuse material on its platform should be zero. The attorneys at Aylstock, Witkin, Kreis, & Overholtz, PLLC are committed to bringing justice to survivors of sex trafficking and childhood sexual abuse and exploitation. Please contact our office for more information.

Source: John Doe v. Twitter, Inc., Case No. 3:21-cv-00485 (N.D. Cal. 2021) (ECF Doc. 1)