YouTube needs to be more transparent with its creators



YouTube is currently facing a problem that Twitch is finally beginning to address.

Both companies want to increase transparency between executives and their core creator bases, but YouTube isn’t taking any tangible steps to ensure that happens. Twitch’s and YouTube’s updated codes of conduct effectively come down to one philosophy that all users are expected to abide by: Don’t be a jerk. The difference is that Twitch explains what being a jerk might look like, while YouTube is much vaguer about its rules.

Twitch goes on to spell out what some of those actions might be; YouTube’s rules leave plenty of room for interpretation. Therein lies the problem.

Robert Kyncl is YouTube’s chief business officer, and he oversees the platform’s creators. Kyncl sat down with longtime YouTube creator Casey Neistat to talk about the company’s new guidelines and the ongoing problems facing creators. Kyncl told Neistat that the company listens to all feedback from creators and advertisers, adding that it’s disheartening to hear that both groups feel like they’re in an “us vs. them” scenario with the company.

YouTube is looking to appease both creators and advertisers by introducing new rules that creators need to abide by in order to maintain good standing with the company and receive payment for their videos. The company’s CEO, Susan Wojcicki, published a blog post earlier this year calling transparency between YouTube executives and creators one of the company’s most important goals for 2018.

It’s something YouTubers are now asking the company for in the wake of the Logan Paul controversy.

There are consequences

Logan Paul changed the way YouTube creators are viewed. Paul uploaded a video on Dec. 31 that included the body of a man who had recently died by suicide in Japan’s Aokigahara forest. Paul’s name made headlines as news outlets started pointing to disturbing behavior seen in his previous videos. Critics demanded an answer from YouTube over the content, asking why a creator like Paul was being touted as the company’s golden boy.

Paul’s scandal came after a year of other major problems the company was facing with its content. It began with coverage of its top creator, PewDiePie, uploading a video that contained anti-Semitic imagery, and ended with questions over unsettling content being targeted at children.

YouTube needed to act. It imposed new consequences on Paul and suggested that it would take further action against the most problematic creators. The company’s decisions were widely applauded by critics and reporters, but creators came back to one particular word that haunted much of the community in 2017: adpocalypse.

Adpocalypse refers to an ongoing period of skepticism over YouTube’s handling of monetization. After major advertisers threatened to pull their ads following reports that they were being placed alongside disturbing videos, the company went into lockdown mode. Creators didn’t know if they were going to be paid, or whether their videos would be affected. Despite YouTube pointing to its community guidelines, YouTubers suggested the company needed to be more transparent.

YouTube seemed to listen. CEO Susan Wojcicki announced on Dec. 4, 2017, that the company would begin publishing regular reports detailing how its community guidelines are enforced, but she didn’t commit to making those guidelines clearer. How is YouTube planning to define what parts of prank culture are allowed and what aren’t? How does YouTube determine what type of sensationalist, drama-fueled content is acceptable and what isn’t? How does YouTube determine which videos will be demonetized and which users will face more severe consequences?

All Wojcicki had to say was:

We understand that people want a clearer view of how we’re tackling problematic content. Our Community Guidelines give users notice about what we do not allow on our platforms and we want to share more information about how these are enforced. That’s why in 2018 we will be creating a regular report where we will provide more aggregate data about the flags we receive and the actions we take to remove videos and comments that violate our content policies. We are looking into developing additional tools to help bring even more transparency around flagged content.

YouTube’s policies are expected to govern more than 1.5 billion users, which may be why the rules are so vague. Here’s the company’s current policy on dangerous content:

While it might not seem fair to say you can’t show something because of what viewers might do in response, we draw the line at content that intends to incite violence or encourage dangerous or illegal activities that have an inherent risk of serious physical harm or death.

Videos that we consider to encourage dangerous or illegal activities include instructional bomb making, choking games, hard drug use, or other acts where serious injury may result. A video that depicts dangerous acts may be allowed if the primary purpose is educational, documentary, scientific, or artistic (EDSA), and it isn’t gratuitously graphic. For example, a news piece on the dangers of choking games would be appropriate, but posting clips out of context from the same documentary might not be.

“Dangerous or illegal activities that have an inherent risk of serious physical harm or death” is why Tide Pod challenge videos (in which people eat a Tide Pod) are not allowed on YouTube. What about videos that feature teens or young adults climbing to the top of skyscrapers? Is that dangerous content? Does it encourage others to do the same? These videos are allowed on YouTube — and are monetized — but technically violate the company’s rules … or do they? That’s the issue — we don’t know.

YouTube representatives have said the company relies on machine learning algorithms and user reports to tackle content, but the vagueness of the rules leaves a lot of room for debate over what is and isn’t acceptable.

When it comes to people’s financial lifelines, vague policies aren’t enough. YouTube can’t just say it’s going to be more transparent; the company has to be more transparent. Otherwise, creators will never change their content to better reflect the company’s values, and YouTube can’t reasonably expect them to overhaul what’s been working for them.

Transparency is difficult, but it’s integral to a community’s growth and a company’s success. YouTube can no longer get away with pretending to be open with its creators while only taking action when critics call the company out. Rules need to be enforced, but they also need to be clear.

YouTube has a duty to tell its creators what is and isn’t allowed, and to do so better than it has for the past few years. Transparency is the only way forward.

I never thought I’d be saying this, but YouTube needs to take a page out of Twitch’s book.

What’s Twitch doing differently?

Twitch introduced new guidelines aimed at tackling harassment and sexually suggestive content on the site, in an effort to be more transparent with its streamers. As soon as the new guidelines were introduced, Twitch users started pointing out that the rules were still too vague.

“Vague ‘update’ on rules,” one person commented. “Overuse of words like ‘intent’ to purposely muddy the waters on what they’ll actually action against. Just Twitch attempting to curb complaints about how biased the enforcement of the rules, by being able to say ‘see! We have these guidelines.’ but they’re guidelines that are fully up for extremely biased enforcement.”

Despite the upset within the community, Twitch did address exactly how enforcement of those rules will work. The community guidelines state:

For minor first and second suspensions, your access to the site will be restricted for 24 hours, but there is no fixed time for the length of subsequent suspensions or suspensions for severe violations. We will consider appeals of indefinite suspensions based on our review of your conduct, account standing, and any other information we have. Our Customer Support and Moderation teams review all requests for reinstatement, and we process these along with all other support requests.

And when people asked how Twitch defines harassment and hateful conduct, the company specified those rules, too:

Hateful conduct is any content or activity that promotes, encourages, or facilitates discrimination, denigration, objectification, harassment, or violence based on race, ethnicity, national origin, religion, sex, gender, gender identity, sexual orientation, age, disability, medical condition, physical characteristics, or veteran status, and is prohibited. Any hateful conduct is considered a zero-tolerance violation and all accounts associated with such conduct will be indefinitely suspended.

Harassment is any content or activity that attempts to intimidate, degrade, abuse, or bully others, or creates a hostile environment for others, and is prohibited. Depending on the severity of the offense, your account may be indefinitely suspended on the first violation.

Just in case those guidelines weren’t clear enough, Twitch published an entire FAQ page specifically going over common harassment cases, outlining words, actions and behavior patterns that could result in an indefinite ban. Twitch listened to its streamer base, acknowledged there wasn’t a specific enough description of what constitutes harassment, and acted accordingly. It’s not perfect — nowhere close — but it’s a start.

It’s the same mentality that Mixer, Microsoft’s streaming platform, has worked with since it launched a couple of years ago. Mixer co-founder James Boehm told Polygon that when the platform was first created, streamers told the company they wanted to be in the know. The only way for creators to know what they can and can’t stream is to have a solid, go-to list of rules outlining acceptable content. Boehm said he heard that feedback and applied it to the company’s core philosophy.

“When we took to drafting our conduct and terms of service, we wanted to make it really clear what was okay and what was not,” Boehm said. “That way we could build a community that everyone could partake in; they knew the rules, knew how to behave. That’s translated into a community that has grown and become increasingly positive, which is something people are noticing.”

Mixer’s and Twitch’s guidelines on what constitutes harassment, hateful content and subject matter that doesn’t belong on their platforms will help streamers make more informed decisions. Although it took Twitch some time, and a series of negative stories pointing to the company’s previously erratic guidelines, to get there, executives are finally working on improving the experience for everyone.
