A new YouTube policy has taken effect that allows gaming creators to upload videos containing scripted or simulated violence without those videos being automatically age-restricted, as they often were before.
The change brings gaming in line with how YouTube already treats other scripted entertainment formats, like television and movies. Gaming videos that include scripted or simulated violence may now be approved outright rather than age-gated, meaning they’ll be open to everyone, not just viewers signed in to an account stating they’re over the age of 18. If the violence is extreme and the sole focus of a video, like a finishing move in Mortal Kombat, the video may still be age-gated.
Overall, the policy means there “will be fewer restrictions for violence in gaming,” but YouTube claims it will “still maintain our high bar to protect audiences from real-world violence,” according to a product update.
The new policy doesn’t extend to advertising guidelines, though. If a video is considered too violent for advertisers, it still runs the risk of being demonetised, even if it’s fine by YouTube’s content standards. YouTube CEO Susan Wojcicki knows this is an issue for creators, many of whom rely solely on YouTube’s AdSense program to earn a living, and she addressed those concerns in a recent letter to creators.
“We’re working to identify advertisers who are interested in edgier content, like a marketer looking to promote an R-rated movie, so we can match them with creators whose content fits their ads,” Wojcicki wrote. “In its first month, this program resulted in hundreds of thousands of dollars in ads on yellow icon videos [referring to an icon that appears to creators when their videos are demonetised].”
“We know there’s a difference between real-world violence and scripted or simulated violence – such as what you see in movies, TV shows, or video games – so we want to make sure we’re enforcing our violent or graphic content policies consistently,” the company said by way of explanation for the policy change on one of its support pages.
YouTube hinted a new video game policy was in the works late last month, when Wojcicki said the company was trying to find advertisers willing to run ads against more “edgy” content. As noted above, though, the guidelines on advertiser-friendly content haven’t changed: videos that show gratuitous amounts of violence will still go largely un-monetised.
In September, YouTube reached a settlement with the Federal Trade Commission over its violation of the Children’s Online Privacy Protection Act (COPPA), which required it to pay a $170 million fine and put in place a series of new rules for creators to comply with. These rules require creators to mark videos (or, if need be, entire channels) that are directed at kids. This, in turn, will limit data collection, put an end to personalised ads on kids’ content, disable comments and, creators say, reduce their revenues.
Creators will also lose out on a number of key YouTube features, including click-through info cards, end screens, notification functions and the community tab.
YouTube creators say they don’t have enough clarity on where to draw the line between content that’s made for kids and content that merely attracts kids; family vlog channels and some gaming videos, for example, may appeal to kids and adults alike. And now that YouTube’s new policy and content-labelling system is in place, a creator the FTC decides is in violation can themselves be held liable for future COPPA violations. YouTube’s advice to creators on how to proceed? Consult a lawyer.
In her recent letter, Wojcicki acknowledged the fallout from these changes but offered no further clarity, only promises of updates to come.
“We know there are still many questions about how this is going to affect creators and we’ll provide updates as possible along the way,” Wojcicki said.
Overall, YouTube’s new policy is sure to please most gaming creators. The platform doesn’t have the best track record with the gaming community, as we saw back in September when it removed verification from a slew of esteemed channels.
Starting today, scripted or simulated violence in video games will be treated the same as violence in other scripted content like movies & TV.
This means future gaming uploads w/ scripted or simulated violence may get approved instead of age-restricted → https://t.co/N2tJf3ersR https://t.co/W0jB8pr8ax
— TeamYouTube (@TeamYouTube) December 2, 2019