Companies including Nestle and Fortnite maker Epic Games have suspended advertising on YouTube over claims their ads appeared next to offensive content on the video platform.
It follows allegations from a YouTube vlogger that he had discovered a “wormhole” into a “soft-core paedophile ring” on the site.
US-based Matt Watson, who posts videos to his channel MattsWhatItIs, said he had found instances where paedophiles were targeting videos of young girls on the site.
He said they would use the comments section to make suggestive remarks and flag moments when the girls appeared in compromising positions, such as when performing gymnastics.
While noting the videos themselves were not sexual in nature, Mr Watson claimed a “glitch” in YouTube’s algorithm meant it was possible to find them “in about five clicks”, and that recommended content would quickly surface dozens of similar clips of young girls, many featuring similar comments.
Some of the videos appeared alongside advertising from companies such as Disney and Nestle, Mr Watson said.
He said that in some cases he had uncovered links to child pornography posted in the comments, which were deleted after being reported to YouTube.
In a statement, Nestle confirmed it was “pausing” advertising on the platform while an investigation takes place.
“An extremely low volume of some of our advertisements were shown on videos on YouTube where inappropriate comments were being made,” a spokesman said.
“While investigations are ongoing directly with YouTube and our partners, we have decided to pause advertising on YouTube globally, already effective in North America and several other markets.
“We will revise our decision upon completion of current measures being taken by Google to ensure Nestle advertising standards are met.”
In a statement, Epic Games said: “We have paused all pre-roll advertising. Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”
A YouTube spokesman said: “Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube.
“We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
YouTube, which is owned by Google, said in the last 48 hours it had taken an aggressive approach, beyond its normal protections, to disable comments and it had terminated more than 400 channels due to comments left on videos.
It is not the first time YouTube has been hit by issues around advertising. In 2017, the company apologised after advertising from some Government agencies and companies such as Marks & Spencer and L’Oreal appeared next to extremist videos on the site.