Twitter refused to take down child pornographic images and videos of a young sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies,” a scathing lawsuit alleges.
The lawsuit, filed by the victim and his mother in the Northern District of California on Wednesday, alleges that Twitter made money from the shared clips, which included a 13-year-old being sexually abused.
Identified only as “John Doe,” the now-17-year-old victim says he was manipulated into sharing nude images of himself with an older student at his school.
Once the images were sent “the correspondence changed to blackmail” and threats were made to share the photos with the victim’s “parents, coach, pastor, and others in his community” if he did not continue to send additional material, which later featured another child.
Sometime during 2019, a “compilation video” featuring Doe became available and circulated on Twitter through at least two accounts.
The teenager shockingly discovered the material being shared in January 2020 after “he learned from his classmates that [the] videos of him and another minor were on Twitter and that many students in the school had viewed them.” He immediately informed his parents.
The mother, identified as “Jane Doe,” and John himself decided to take up the issue with Twitter directly. John also filed his own complaint, telling Twitter: “These videos were taken from harassment and being threatened. It is now spreading around school and we need them taken down as we are both minors and we have a police report for the situation.”
He also sent a photo of his driver’s license, confirming his identity after Twitter requested that he prove who he said he was.
One week later – on January 28 – Twitter finally responded, stating that it found no problems with the sexually explicit videos, and that they would remain on the website for everyone to see.
“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” Twitter said, while insisting that “your safety is the most important thing.”
Doe – clearly outraged – responded: “What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.”
He also included his case number from a local law enforcement agency; however, Twitter ignored him and refused to take down the illegal child sexual abuse material.
A couple of days later, Doe’s mom connected with an agent from the Department of Homeland Security and they successfully had the videos taken down on January 30.
“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” states the suit, which was filed by the National Center on Sexual Exploitation and two law firms.
“This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”
The lawsuit further alleges that Twitter knowingly hosts users who exchange child sexual abuse material on the platform, and that the company profits by interspersing ads between tweets advertising or requesting the material.