YouTube: The Newest Political Issue?

YouTube’s ‘new’ logo as of August 29, 2017. Photo: YouTube

For a channel to appear on TV, it takes years of work to obtain the proper FCC licenses and approvals. But for a channel to appear on YouTube, it takes a person with a smartphone, five minutes, and some ideas. Big news organizations are kept in check by their advertisers: they need to hold at least somewhat mainstream opinions to stay profitable. Advertisers on YouTube, by contrast, rarely even see the videos their ads run on, and as you might guess, they don’t like this.

Let’s step back. Over roughly the past year, YouTube has been cracking down on which types of videos count as “advertiser friendly.” This move was made to combat extremism and hate speech, and it makes sense that advertisers should have more control over whether or not their companies are associated with certain content creators. After all, they are the ones paying for the ads!

This move created some problems within the YouTube community, however. The extremism-catching net was cast a little too wide, and began demonetizing videos with even the slightest political affiliation. This wasn’t good: many top creators depend heavily on YouTube as a source of income, and they were suddenly seeing their revenue cut by as much as 80%, with no explanation as to why.

So that’s the background information. If you still feel like you don’t understand the situation entirely, or just want to know more, you can read about it here, or watch videos about it here and here.

So what’s the solution? It doesn’t involve changing anything about the ads. It involves changing ‘the algorithm’, the system that recommends videos to keep viewers engaged on the site as long as possible. First, put ads back on all videos. If advertisers appear on some videos they don’t like, tell them to suck it up; with a platform as big as YouTube, I promise they aren’t going to stop buying ads any time soon. Next, use the auto-generated subtitles on videos to determine whether they are factually inaccurate. Even though I strongly oppose any form of extremism, including in media, if a video is factually accurate there’s nothing we can do. Finally, put a visible flag on all videos that do not contain facts and do not cite their sources, and stop pushing them to people. There’s no reason the suggestion algorithm should be surfacing them if they are plain lies.
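To make the idea concrete, here is a toy Python sketch of the last two steps: flag videos that score poorly on a transcript check and cite no sources, and drop flagged videos from the suggestion pool. Everything here is hypothetical, including the `fact_check_score` field, which stands in for whatever transcript-analysis model would actually produce such a score; this is a sketch of the proposal, not anything YouTube runs.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    cites_sources: bool
    fact_check_score: float  # 0.0–1.0, hypothetical score from subtitle analysis

def needs_flag(video: Video, threshold: float = 0.5) -> bool:
    """Flag a video that scores poorly on the transcript check and cites no sources."""
    return video.fact_check_score < threshold and not video.cites_sources

def rank_recommendations(candidates: list[Video]) -> list[Video]:
    """Drop flagged videos from the suggestion pool entirely."""
    return [v for v in candidates if not needs_flag(v)]

candidates = [
    Video("Sourced explainer", cites_sources=True, fact_check_score=0.9),
    Video("Unsourced conspiracy", cites_sources=False, fact_check_score=0.2),
]
suggestions = rank_recommendations(candidates)
print([v.title for v in suggestions])  # only the sourced explainer survives
```

Note that under this rule a low-scoring video that does cite its sources is flagged-exempt, matching the argument above: if the claims can be checked against sources, suppression isn’t the answer.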

This will overhaul YouTube as a platform. Creators will become more transparent and check their sources. Advertisers will be happier as the quality of content improves. And users will be able to trust that the content they watch is factually correct.

This isn’t going to be easy for YouTube. But if they could write 1,000,000 lines of code to suggest videos that keep you engaged and on their site as long as possible, then they can do the same thing to keep the content quality high.
