While critics allege YouTube puts profits over public safety, product head Neal Mohan insists that the Google-owned video site is working to be a better content moderator, in part because it is good for business.
Why it matters: Users spend billions of hours watching videos on YouTube, and the site's content recommendations shape how that time is spent. Facebook and Twitter tend to get more attention on content moderation, but YouTube remains an equally important information battleground.
Driving the news: YouTube is announcing Monday that two million creators are now enrolled in its programs that enable them to get paid. Mohan said a huge part of his focus is finding ways to make sure those who play by the rules are rewarded.
- "99.9% of creators are looking to do the right thing," Mohan told Axios, noting that YouTube has paid out $30 billion over the last three years.
- In addition to the 14-year-old program that shares ad money for popular videos, YouTube has also added ways for creators to sell merchandise or be directly compensated by users.
Between the lines: YouTube still faces challenges in making sure it is the creators "doing the right thing" who are benefiting the most, rather than spreaders of viral misinformation.
- It's not just those getting paid by Google who can benefit from gaming the system. Creators with a large enough following can make money indirectly even if they've been "demonetized" — removed from YouTube's own payment programs.
- In the "vast, vast majority of cases that's a good thing," Mohan said, though he acknowledges that it does create opportunities for some creators to profit from borderline content that doesn't meet YouTube's bar.
The big picture: On one of the biggest topics of the moment, COVID-19 misinformation, Mohan pointed to the company's enforcement of its policies, its collaborations between creators and health authorities, and the dedicated spots YouTube has set aside for authoritative information.
- "I hope that we are perceived as ultimately a positive voice here," Mohan said.
- Critics, though, point to a vast array of videos that have promoted hesitancy around masks and vaccines. Some were eventually taken down, while others have been allowed to remain on the site.
- Mohan noted that the landscape is ever-changing and the company's work around COVID-19 misinformation is ongoing.
- "The work is never done," Mohan said. "I have learned that there is always a new vector of misinformation that will pop up."
By the numbers: YouTube recently started sharing its "violative view rate," the share of views that go to policy-violating content. Both tech companies and their critics see this as a more meaningful metric than the total amount of content removed.
- As of the fourth quarter of 2020, YouTube said that rate was 0.16–0.18%, meaning that out of every 10,000 views on YouTube, only 16–18 come from rule-violating content.
Meanwhile: Mohan said he continues to put a lot of effort into YouTube Shorts, which he says is more than just a TikTok competitor.
- Mohan said he is trying to add features that take advantage of YouTube's existing strengths, including making it easy for creators to remix clips from existing YouTube videos into Shorts.
- "You should look for more of those," Mohan said.