A new lawsuit alleges that Twitter, which recently purged thousands of right-leaning accounts for allegedly promoting conspiracy theories and “violence,” repeatedly refused to take down several images of child pornography featuring a then-13-year-old minor, saying the images didn’t violate its company policies.
The lawsuit, filed this week in the Northern District of California, alleges the social media giant turned a blind eye to multiple reports of child pornography after images of the then-13-year-old boy were circulated on the platform beginning in 2019.
According to the New York Post, which cited the lawsuit filed Wednesday by the boy and his mother, the boy was coerced into sharing nude images of himself on Snapchat with sex traffickers posing online as teenage girls. He was then blackmailed into sending more explicit images and video clips of himself engaged in sexual acts after the traffickers reportedly threatened to share the original images with his parents, his coach, and his pastor. Under threat, he also included another child in some of the images before finally blocking the traffickers online.
Sometime later, the photos he’d sent began popping up online, including on Twitter. The boy, who’s now 17 and living in Florida, and his mother allege that they reported the issue to the police and to the boy’s school, and that they made multiple complaints to Twitter explaining that the images were of a minor. But according to the lawsuit, the platform failed to remove the images, saying that they “reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”
The victim says in the lawsuit that he even included the case number from his report to law enforcement to prove the validity of his claims, but that Twitter still refused to remove the images until a federal officer from the Department of Homeland Security became involved in the case.
“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” the suit states.
A paragraph in Wikipedia's page titled 'Censorship by Google' reads:
Lolicon content
As of 18 April 2010, Google censors "lolicon," a Japanese term meaning "attractive young girls", on its search results, hiding meaningful results regarding lolicon material, even if the user types words along with the term which would typically lead to explicit content results; the terms "loli" and "lolita" also suffer from censorship in regards to this content.
Maybe I'm paranoid, but this wording sounds rather off to me.