In its ongoing effort to stem the misinformation finding its way into users’ feeds, Facebook is testing a new “i” (information) button that uses Wikipedia entries to give readers context on the source of a news story.
Facebook has been under fire since the 2016 presidential election for the proliferation of fake or slanted news popping up in its feeds and being disseminated to users. As part of its ongoing effort to address the problem, it is testing the new button on links to give readers more information on where the news is coming from.
The information button, when clicked, will pull up a Wikipedia entry for the news source. The whole purpose of the new feature, Facebook said, is to help people make “informed decisions” on what they read and share.
In addition to the Wikipedia entry, users will get trending and related content, and details on how the article is being shared. “Helping people access this important contextual information can help them evaluate if articles are from a publisher they trust, and if the story itself is credible,” the company said.
Facebook will continue to tweak the process by listening to user feedback and working with publishers “to provide people easy access to the contextual information that helps people decide which stories to read, share, and trust, and to improve the experiences people have on Facebook.”
Facebook Product Manager Sara Su acknowledged that the Wikipedia entries could be targeted for abuse, given that they can be edited by the public. “Vandalism on Wikipedia is a rare and unfortunate event that is usually resolved quickly,” she told TechCrunch. “We count on Wikipedia to quickly resolve such situations and refer you to them for information about their policies and programs that address vandalism.”
While the company continues to modify how news is displayed and which sources it comes from, it clearly still has a long way to go. After the shooting in Las Vegas claimed 59 lives and injured more than 500 people, Facebook’s Safety Check feature was overrun with posts pointing to fake news stories.