New YouTube AI to mix up recommendations and more

The target is also flat-earthers and conspiracy theorists, as the company looks to sharpen up its AI.

Striking a positive tone about the changes, Google recently revealed that it will be changing the way YouTube handles "see next" and suggested video content. According to the company, "people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles".

While snickerdoodles are serious business, the company is also looking to "reduce the spread of content that comes close to—but doesn't quite cross the line of—violating our Community Guidelines. To that end, we'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

Former Google employee Guillaume Chaslot, who on Twitter recalled a personal story about a friend falling into a black hole of conspiracy and alien videos, calls the change "historic". He points out that some videos can affect people to the point where they withdraw from society, and notes that the current AI is designed to maximise each user's watch time. Because of that overall goal of increasing views, he says, the current YouTube AI promotes 10x more flat earth videos than "round earth" ones.

Read it if you dare.
