Key Points
- YouTube will now allow accounts banned for spreading misinformation about COVID-19 and the 2020 election to apply for reinstatement.
- The move represents a major policy shift from prioritizing “accuracy” to focusing on “free expression.”
- The change comes as Google faces intense political and regulatory pressure, including antitrust lawsuits.
- The decision reverses years of policy under which YouTube actively removed misinformation and partnered with fact-checkers.
Google’s YouTube is rolling back its strict misinformation policies, announcing that it will now allow accounts previously banned for spreading false information about COVID-19 and the 2020 U.S. election to apply for reinstatement. The move marks a dramatic pivot for the company, which once championed its role in providing “accurate information” but is now shifting its focus to protecting “free expression.”
Google’s new stance comes as the company faces heavy regulatory and political pressure. It recently lost two major antitrust cases brought by the Department of Justice and is also in talks to settle a lawsuit from President Trump over the suspension of his social media accounts after the Jan. 6 Capitol riot.
This is a sharp reversal from just a few years ago. Throughout the pandemic and following the 2016 election, Google and YouTube aggressively promoted their fact-checking initiatives, partnering with third-party organizations to label false claims and removing content that violated their policies.
At the time, the company said it had a responsibility to protect users from conspiracy theories and misinformation.
In a letter to Congress explaining the new policy, a Google lawyer stated that the company has an “unwavering” commitment to “free expression” that “will not bend to political pressure.” While the company is changing its approach to removing content, it says it will continue to invest in other tools, such as watermarking for AI-generated content and “Community Notes,” which let users add context to videos.