Facebook is sick and tired of fake news, so it's rolling out some features to make sure the news that appears on your timeline is real – and not just some crazy hoax.
The new features were announced Thursday. For example, if you see something that looks fake, you can report the post and let Facebook know you think it's a hoax.
Facebook is working with third-party fact checking organizations that follow Poynter’s International Fact Checking Code of Principles.
Stories that are reported to be fake will be sent to those organizations to be fact-checked. If the organization decides it's a hoax, the story will get flagged. There will also be a link explaining why the story is flagged as fake.
So fake news stories and hoaxes will still be out there, they'll just have a warning attached to them.
Facebook says it isn't trying to keep people from expressing their thoughts and opinions, though.
"We believe in giving people a voice and that we cannot become arbiters of truth ourselves," Facebook says. "So we’re approaching this problem carefully."
For that reason, Facebook is focusing on "the worst of the worst, on the clear hoaxes spread by spammers."
And if you still want to share stories that are flagged as fake, nobody will stop you. You will see a small warning message pop up, though.
Facebook has a video explaining how the whole thing works. You can check that out below:
These new features do not come without criticism.
Some people worry the organizations doing the fact-checking might have some political bias that ends up limiting or even censoring some stories. You can read more about those concerns here, here and here.