YouTube has taken a lot of flak recently for being unsafe. Google, which owns the company, maintains that the platform is a positive resource, but has reassuringly not disregarded these issues, acknowledging that it has to work faster and that there is still a lot to do on safety. Google's EMEA president Matt Brittin commented, saying,
“High-profile headlines about issues give people concern and in the case of some of the content on YouTube they were accurate, what they sometimes miss is proportionality. No impression served on video that shouldn’t be served on is acceptable, but proportionally, in terms of the huge volume of impressions they were getting, these were very small numbers.”
This comes in the wake of a lot of negative press. In particular, YouTube came under fire for how it dealt with the New Zealand terror attacks. A lot of content from the attack surfaced on the platform and remained there for up to 24 hours, which many believe is unacceptable.
The problem is that the amount of content uploaded to the platform is astronomical, which makes it difficult to police. It wasn't that long ago that the company came under fire for inappropriate comments on videos aimed at children. That issue caused the platform to lose a lot of ad spend from some very prominent brands, such as Nestlé and McDonald's. Brittin said,
“I don’t think you can ever do enough to make everything as safe as possible. You have to have zero tolerance but that doesn’t mean to say you can achieve zero occurrences in any walk of life – whether that’s crime on the street or bad actors in the technology world,” he added
“If you look at the violent extremist content, where we started and where we are now, that is huge progress and demonstrates that improved policies, enforcement and people can get us to a better position. There’s a lot to do, we’ve invested a lot. Never satisfied, zero tolerance, but also constantly trying to look at how we can improve on speed.”
The company claims that more than 80 percent of uploaded videos that violate its strict policies are removed from the site before they are even flagged. But the New Zealand incident pushed YouTube's staff to its limits, so much so that they were relying on AI to remove offensive videos, which were being uploaded at a rate as fast as one per second. When you consider that, it's perhaps unsurprising that some videos slipped through the net.
The site has invested in technology to push a counter-terrorist narrative: if a user is watching pro-terror videos, anti-terror videos will be suggested to them to show that there are two sides to the story. It is hoped this will help reduce cases of radicalisation. The site also has algorithms that identify when videos are posted by terror organisations and help stop those videos from spreading.
The problem is that as fast as YouTube's technology develops, so does the cleverness of people who would seek to use the site for nefarious purposes. The site has played a role in political discourse, and as a result, social media platforms, not just YouTube but the likes of Facebook too, have come under increased government scrutiny. A UK government report suggested that the big players are not coping with tackling these issues.
Brittin suggested that YouTube will never be entirely safe from these issues and pointed out that trust in advertising platforms had been low long before the internet took off. He commented,
“It’s incumbent on us to be in a position where people see we have got the expertise and a will to make changes that help us get to a better outcome together. It’s a responsibility of the ad industry, a technology player like Google, and policymakers. We’re not anti-regulation, we’re pro better regulation and up-to-date regulation. We don’t want to be policing the internet, it’s up for governments to define the framework we work within.” He continued,
“We’re not [ungoverned by rules]. YouTube’s got very serious responsibilities just like any publication or press would have. But they’re different responsibilities because they’re different types of property.”