YouTube promoted videos of scantily clad underage girls as young as 5 or 6 to users who viewed “erotic” material on the platform, according to a disturbing study by Harvard researchers.
The team of researchers at the Berkman Klein Center for Internet and Society at Harvard University found the trend while examining YouTube's recommendation algorithm as it behaved for users who viewed certain types of sexual content.
Thefreethoughtproject.com reports: The study showed that after regular users watch erotic videos, they are recommended videos of women dressing as young girls before the algorithm eventually shows them videos of “girls as young as 5 or 6” wearing bathing suits or getting dressed.
As CNBC points out:
According to the piece, YouTube's recommendation system changed to no longer link some of the revealing videos together, but the company told the New York Times it was "probably a result of routine tweaks to its algorithms, rather than a deliberate policy change." YouTube also said that turning off its recommendation system on videos of children would "hurt 'creators' who rely on those clicks" but did say it would limit recommendations on videos it deems putting children at risk, the report said.
The Berkman Klein Center didn't immediately respond to a request for comment on whether the researchers will be publishing anything on the discovery. Google did not immediately respond to a request for comment.
This report, while horrifying enough, comes just months after YouTube was discovered to be a haven for pedophiles.
AT&T and Hasbro are among the major corporations that announced earlier this year that they will no longer purchase advertising on YouTube because it allows this pedophile content to flourish.
“Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube,” an AT&T spokesperson told CNBC.
“Hasbro is pausing all advertising on YouTube, and has reached out to Google/YouTube to understand what actions they are taking to address this issue and prevent such content from appearing on their platform in the future,” read a statement from Hasbro.
While YouTube censors videos of US war crimes, guns, and other videos that peacefully question the establishment narrative, they have long allowed this vile content to exist.
As TFTP has reported numerous times, children are the last ones YouTube appears to be concerned with, instead targeting those who’d dare challenge the status quo.
While there is certainly a free speech issue at hand with some of these videos, YouTube has no problem deleting and demonetizing channels that expose government crimes and corruption. So why do they ignore and allow these actual bad actors and promote videos of children to people seeking out pornography?
The Free Thought Project has even been a target of this censorship on multiple occasions. On the same day we were banned from Facebook and Twitter in October of last year, YouTube doled out a strike to us as well—for a three-year-old video that had appeared on dozens of other mainstream media channels.
The company has been known to target peaceful activists for challenging the paradigm, so there is no question that it has the capability to remove these videos of child exploitation. However, it appears utterly unwilling and unconcerned with doing so. And, in fact, it appears to be promoting them through the algorithm.
Indeed, one mother was horrified to find that a video of her 10-year-old innocently playing in a backyard pool was picked up by the YouTube algorithm, which promoted it to more than 400,000 people who were viewing erotic videos, according to the NY Times.
“I’m really scared of it,” said Christiane C. “Scared of the fact that a video like this fell into such a category.”
While YouTube will likely claim this is an unintentional function of its algorithm, the intent does not change the end result: unsuspecting users will have this content pushed on them—a de facto promotion of pedophilia.
Although YouTube announced that they made a shift in the algorithm to try to prevent this, the study points out that they refused to change the one thing that would successfully do so.
As the NY Times points out, YouTube has not put in place the one change that researchers say would prevent this from happening again: turning off its recommendation system on videos of children, though the platform can identify such videos automatically.