Tech giants profit from extreme content, programmer says

From Nine To Noon, 9:09 am on 21 March 2019

YouTube knew its algorithm pushed viewers towards divisive and extreme content, a former programmer for the tech giant says.

Until 2013, Guillaume Chaslot was one of those responsible for designing the algorithms at YouTube, which he says encourage the most extreme and divisive content because that is what drives the most traffic.

Laptop showing YouTube logo being used in Turkey. Photo: AFP

YouTube fired Chaslot in 2013 because, he says, he agitated for a change to how its algorithms were framed. He has since founded the watchdog group AlgoTransparency.

Prime Minister Jacinda Ardern has been vocal in her criticism of Facebook for allowing the accused Christchurch shooter's video to be live streamed and subsequently uploaded to other platforms such as YouTube.

The managers of five government-related funds, worth more than $90 billion, have added their voices to calls for Twitter, Facebook and Google to take action against the spread of obscene content.

“Divisive content is extremely good to keep people watching, to make them click, so YouTube is going to say ‘oh, people want divisive content’ - but they don’t really want divisive content. It’s just easier to click on it but it doesn’t mean that they really want that,” Chaslot told Nine to Noon.

He says he noticed conspiracy theory content was an effective way to increase “watch time” on YouTube when he was working on the video platform's algorithm.

The algorithm recommended viewers watch other conspiracy videos “billions of times”.

Pizzagate - the conspiracy theory that went viral during the 2016 US presidential election cycle - was an example of this, Chaslot said.

“So, if you go on YouTube and you do 'your own research' as they call it then you only see one side of the argument and I’d seen that for Pizzagate in 2016, they were the only company saying that Pizzagate was real.”

He says programmers did not intend for the algorithm to focus on extreme or divisive content, but it has learned to do so - because it works.
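The dynamic Chaslot describes can be pictured with a toy ranking function. The sketch below is a hypothetical Python illustration, not YouTube's code: the candidate videos, categories and predicted watch-time scores are invented, and the only point is that a ranker whose sole objective is predicted watch time will surface whatever content holds attention longest.

```python
# Toy illustration (not YouTube's system): if the only ranking signal is
# predicted watch time, whatever category best holds attention rises to
# the top of the recommendations, with no notion of accuracy or harm.
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    category: str
    predicted_watch_minutes: float  # hypothetical model output

def rank_by_watch_time(candidates):
    """Rank purely on the engagement objective - the behaviour Chaslot describes."""
    return sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)

candidates = [
    Candidate("Measured news recap", "news", 3.1),
    Candidate("Cooking tutorial", "howto", 4.2),
    Candidate("Shocking conspiracy 'they' don't want you to see", "conspiracy", 9.7),
]

for c in rank_by_watch_time(candidates):
    print(f"{c.predicted_watch_minutes:>5.1f} min  {c.title}")
```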

Ms Ardern wants governments around the world to work together to pressure tech giants into action.

About 4000 people saw the original live video before anyone reported it as harmful content.

Chaslot says it is probable that AI detection tools failed to recognise the content for what it was once it found its way onto the YouTube platform. 

“To be fair, the video might be very difficult to identify with deep learning and with this AI technique, because it really looks like a video game. And these AIs are trained to know that video games are fine, there is a lot of video game content on YouTube so it might be one reason why it was so difficult to categorise.”
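His explanation of the moderation failure can be sketched in the same spirit. The snippet below is a hypothetical Python illustration, not any real detection system: the classifier output, labels and threshold are assumptions, and it only shows how a pipeline that treats a confident "video game" label as safe would fail to flag footage that merely resembles gameplay.

```python
# Hypothetical sketch of the failure mode Chaslot describes: a moderation
# pipeline that trusts a "video game" label would wave through first-person
# footage that the classifier mistakes for gameplay. Scores are made up.
def predicted_labels(video_id):
    # Stand-in for a deep-learning classifier's output distribution.
    return {"video_game": 0.81, "graphic_violence": 0.12, "other": 0.07}

SAFE_LABELS = {"video_game", "other"}

def should_flag_for_review(video_id, threshold=0.5):
    labels = predicted_labels(video_id)
    top_label = max(labels, key=labels.get)
    # If the most confident label is considered safe, the video is not
    # flagged, even though a lower-ranked label points at the real problem.
    return labels[top_label] >= threshold and top_label not in SAFE_LABELS

print(should_flag_for_review("example"))  # False: misclassified as gameplay
```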

The problem is that platforms such as Facebook, Google and YouTube have no financial incentive to reform, he said.

“All of the platforms now have realised some parts of the problem: Mark Zuckerberg talked about borderline content, he was going to pull down borderline content.

“Twitter acknowledges the problem and says they will do better. YouTube in January said they were going to recommend less conspiracy theories but the incentives are against that - it’s very difficult for them to act in this direction because it will decrease the watch time and decrease the performance of the platform.”

He says it is unlikely, and probably undesirable, that Facebook will disable its live streaming function.

“If you do that then you cut interaction with the user, what’s great with live stream is people can interact in real time and for 99.999 percent of cases this is a good thing, so it’s very tricky to do this kind of thing.”

But tougher legislation to control the tech media giants is needed, he says.

“I think we need more legislation, [but] that’s very tough to do because to do some better legislation you need to know the technical specificities of the platform and legislators usually are not very well trained in that.

“So, technically it is very difficult but there are ways to do legislation that pushes platforms to do the right thing.”