
Marco Rubio: Why is YouTube Shutting Down Conservative Videos?

This week, U.S. Sen. Marco Rubio, R-Fla., sent a letter to YouTube Chief Executive Officer (CEO) Susan Wojcicki raising concerns about YouTube’s latest censorship of religious and politically conservative speech. Specifically, Rubio questioned three incidents and demanded that the Google-owned company clarify how each incident resulted in a violation of YouTube’s community guidelines.

The letter is below.

Dear Ms. Wojcicki:

I write to you regarding a concerning series of actions by YouTube to censor or otherwise restrict the speech of its users, particularly those from religiously or politically conservative backgrounds. A combination of high-profile moderation actions has recently made headlines and raised questions about a pattern of apparent political and religious bias on YouTube’s part.

Millions of Americans rely on YouTube every single day to share, learn about, and discuss political and cultural issues. While it is a private company entitled to make its own rules and regulate speech with whatever ideological slant it would like, YouTube also enjoys special protections under Section 230 that shield it from civil liability for the content it hosts.

When Section 230 was initially passed in 1996, it was with the intent to defend small, upstart internet companies from litigious attacks. Now, however, Section 230 effectively immunizes from legal consequence market-dominant firms like YouTube – enormous internet platforms with a unique responsibility to maintain the openness of online discourse. It is with this particular obligation in mind that I ask for clarification about three separate moderating incidents that appear to have restricted users’ speech in an unfair way inconsistent with your own rules:

1. On August 7, respected Presbyterian minister and bestselling author Carl Trueman gave a livestreamed talk to the Sacramento Gospel Conference, broadcast on Immanuel Baptist Church’s YouTube channel. During the presentation, his address was shut down twice – once for an ostensible copyright issue, and a second time for an unexplained “content violation.” No explanation was given, and nothing about the talk – an analysis of American cultural attitudes toward sex – could be construed as encouraging hatred or violence. What specific rule did he violate?

2. Earlier in the month, you suspended Senator Rand Paul for pointing out the inefficacy of cloth masks in preventing the spread of the coronavirus. His comments were nearly identical to ones on PBS from Michael Osterholm, Director of the Center for Infectious Disease Research and Policy at the University of Minnesota – and Biden’s own former COVID advisor – who pointed out the “disservice to the public” medical professionals have made in failing to point out the ineffectiveness of “face cloth coverings.” Had he made them on YouTube, would you have also suspended Osterholm for those comments?

3. According to your own rules, YouTube “doesn’t allow content that spreads medical misinformation that contradicts local health authorities’ or the World Health Organization’s medical information about COVID-19.” What specifically did Senator Paul say that contradicted WHO guidelines, which have similarly high standards for fabric masks?

4. Given the evolving nature of the guidelines provided by local health authorities and the WHO, how does YouTube update its moderation policies regarding when a user’s speech should be restricted?

5. You also recently removed a video of Congresswoman Nicole Malliotakis’s press conference announcing a lawsuit against Bill de Blasio’s vaccine passport policy, claiming it “violate[d] community guidelines.” What specific community guidelines did it violate?

6. Who is in charge of policing these guidelines, and what accountability standards exist when those content moderators fail to police them in an impartial manner? What rules exist on the books to maintain impartiality?

7. YouTube’s community guidelines state that after “reviewers decide that content violates our community guidelines,” that content is subsequently removed from the platform. How does YouTube determine that content has reached a violation threshold that merits removal? Is this threshold based on the subjective views of “reviewers,” or are there specific criteria that must be present?

8. How many individuals are involved in the review process, and what is the appeals process – and average determination timeline – for individuals who feel that their content has been wrongfully removed?

I would appreciate a reply no later than September 1, 2021. Thank you for your attention to this critical matter.

Author

  • Kevin Derby

    Originally from Jacksonville, Kevin Derby is a contributing writer for Florida Daily and covers politics across Florida.
