Fortnite adverts pulled from YouTube by Epic Games over child exploitation controversy

Epic Games pulls Fortnite ads from YouTube following exploitation accusations

By Amanda Zelauskas

A Nestlé spokeswoman said the food maker had chosen to "pause" YouTube advertising globally while the issue was investigated.

Epic Games, GNC and Nestlé, among others, said they removed advertising from the platform after YouTube creator Matt Watson posted a video highlighting the issue and accusing the company of "facilitating the sexual exploitation" of children.

According to a report from The Verge, pedophiles are frequenting videos of children, or re-uploading stolen copies of them, making lewd comments and time-stamping points in the videos that, in their eyes, feature sexually gratifying images.

In a statement to Bloomberg, a spokesperson for Nestlé US said it has paused all pre-roll advertising. "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments", a spokeswoman for YouTube told Bloomberg. "There's more to be done, and we continue to work to improve and catch abuse more quickly", she added. After a user clicks on several bikini videos, YouTube's recommendation engine begins suggesting videos featuring minors, Watson said. AT&T, which had only recently resumed advertising on YouTube after a two-year boycott, has pulled its ads as well. The executives also told ad buyers the company will deliver a timeline within 24 hours outlining new restrictions and product changes, one of the people said.

In its report, Wired named some of the major companies which had advertisements playing alongside the pedophilic content, including "Alfa Romeo, Fiat, Fortnite, Grammarly, L'Oreal, Maybelline, Metro: Exodus, Peloton and SingleMuslims.com", as well as Google itself.

YouTube users who clicked on videos of young girls engaged in what could be viewed as sexually suggestive activity, such as gymnastics, were then recommended similar clips by YouTube's algorithms, blogger Matt Watson said in a 20-minute video posted Sunday that has been viewed almost 1.8 million times.

While YouTube has long struggled with unsafe and pedophilic content, the site has responded decisively to this latest exposure of an apparent predatory culture across the platform.

Almost a dozen companies have terminated or paused their ads on YouTube. We asked YouTube whether it has identified any problems in the algorithms that contributed to the issue, but received no answer to that question.

YouTube is trying hard to clear out disturbing content.
