The social media giant Meta, parent company of Facebook, is making considerable changes to its content moderation policy. This month the company announced that it would end its third-party fact-checking program in the U.S. and replace it with a Community Notes model.
“The changes we are making are aimed at simplifying our policies to allow for more expression on the platforms and help users understand what the rules of the road are,” said Kate Ross, Public Policy Manager at Meta.
Asked about the reaction to these changes, Ross said the response has been positive so far, especially from communities looking for more opportunities for free expression.
Ross added that the new Community Notes model will “democratize the process,” shifting the experience on Meta’s platforms toward user control. Users can now decide what they would like to see and interact with, including political content.
But what about content that is against the law, such as terrorism, fraud, or scams? Ross said that Meta has “great relationships with law enforcement regarding active threats” and coordinates with law enforcement on those requests, but the company also wants to ensure it is protecting its users’ free expression and privacy.
Meta will continue to proactively remove illegal content, such as child exploitation and direct terrorism content, but will roll back its policy of removing content that does not violate the law. Content that violates Meta’s policies will still be removed if reported, but the company will no longer proactively search for such content. By ending the automatic removal of content that is controversial but not illegal, Meta aims to minimize the number of posts it takes down, enhancing free expression on the platform.
In recent years, Meta has seen pushback from the public over content it has taken down related to COVID-19. Ross shared that the Biden administration had requested Meta remove COVID-19 posts, and that Meta CEO Mark Zuckerberg pushed back against these requests in a letter he sent to the House Judiciary Committee. Ross confirmed that, as a private company, Meta will continue to push back against U.S. government requests to remove content.
If you’d like to learn more about Meta’s new content moderation policy in the U.S. and sign up to be a Community Notes contributor, visit the announcement page.