On January 7, 2025, Meta, the parent company of Facebook and Instagram, announced it was ending its fact-checking program in hopes of restoring and promoting free speech on its platforms.
According to Meta’s public announcement, “Over time we ended up with too much content being fact checked that people would understand to be legitimate political speech and debate. Our system then…reduced distribution. A program intended to inform too often became a tool to censor.”
Now, instead of full-screen warnings that flag posts for misinformation, Meta has adopted a less intrusive, user-based Community Notes model, which “[indicates] there is additional information for those who want to see it.”
In this moderation program, users submit notes on a post, and others with diverse perspectives rate the note on its helpfulness. Ultimately, the goal is to provide context so people can make their own decisions regarding the credibility of posts online.
Some students are wary of these policy changes.
“Misinformation is already spread through social media like crazy, and now without fact checking, that’s going to make everything 10 times worse. I’ve always taken everything online with a grain of salt, but most people on Instagram and Facebook will end up believing everything they see,” junior Ariena Thurairajah said.
AP History and Psychology teacher Anna Driver noted the controversy surrounding the removal of fact-checking.
“It’s a double-edged situation. Users may feel that people will be able to put any information out there without the concern of somebody saying no, but it could also increase the possibility of misinformation and hateful speech.”
Additionally, Meta plans to allow more speech on its platforms. New changes include removing automated systems that scan for policy violations, lifting restrictions on heavily debated topics such as immigration, and pushing more political content.
“It’s not right that things can be said on TV or the floor of Congress, but not on our platforms,” Meta said.
“After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy,” Meta founder Mark Zuckerberg added.
“Fact-checkers have just been too politically biased and have destroyed more trust than they’ve created, especially in the U.S.”
As the responsibility of fact-checking shifts entirely to users, it is more important than ever to critically evaluate the news, comments, and videos seen online. Discerning fact from fiction is now in users’ hands.