According to reports from anonymous sources, Google will begin vetting more strictly the YouTube channels that are part of its Google Preferred premium advertising program. The same sources say the company will use both human moderators and machine learning to identify videos that shouldn't be part of Preferred bundles.
This move is by all accounts a response to advertisers' concerns over inappropriate videos that have recently flooded the platform, including disturbing content featuring children, as well as offensive behavior from YouTube stars like Logan Paul, who was kicked off the Preferred program after uploading a video of a dead body in Japan's Aokigahara forest — a clip that sparked widespread controversy.
Google touts Preferred as a collection of "the most popular YouTube channels among US 18- to 34-year-olds" and "the most engaging and brand safe content on YouTube," organized into categories like fashion, pop culture, and recipes. But the latest Logan Paul hullabaloo apparently caught YouTube off-guard: Paul removed the video himself, but only after it had been widely viewed and copies continued to spread across the platform.
Vetting premium videos more closely is also part of a larger moderation crisis for Google, which said last month that it was expanding its staff of moderators to 10,000 people. Though this most recent report offers few details, it's not an altogether surprising move for the company to make.
A spokesperson for YouTube stated that “we built Google Preferred to help our customers easily reach YouTube’s most passionate audiences and we’ve seen strong traction in the last year with a record number of brands. As we said recently, we are discussing and seeking feedback from our brand partners on ways to offer them even more assurances for what they buy in the Upfronts.”