After being targeted, and partly blamed, for the rise in "fake news" stories posted on its site, Facebook is fighting back against fake news. Its weapon of choice? A 'more info' button. Facebook's new idea for combating fake news is to show users more information about the sources of the stories they are reading. As part of a News Feed update, Facebook plans to add more context to the links people see. Users will be able to click a button and see information from the publisher's Wikipedia page, a link to follow that publisher's Facebook page, and other related links.
Facebook plans to have its system do this automatically, which means there will not be a person behind a computer compiling information on each news source, and no human error introduced along the way. Facebook hopes its users will use the information to better understand where their news is coming from, and will not be fooled by phony stories or individuals with bad intentions.
The idea seems like an honest attempt by Facebook to take a step in the right direction. The problem is that the company seems to be overlooking the real reason 'fake news' stories have such an effect on particular groups of users on its site: people will believe what they want to believe. No amount of additional information about a story's source will change that. Once someone truly believes an idea, confirmation of that idea is welcome, regardless of where or whom it comes from. This is why targeted ads from Russian misinformation sources worked so well before and after the 2016 presidential election. The ads were aimed at specific groups of users, and the ads and articles played on every one of the targeted population's fears. How did Russian sources know the fears of Americans living in a particular region of the country? Good question, and one that will hopefully be answered before the next presidential election, so the United States will know what to look out for next time. Not that it may matter much, because people will probably always believe whatever they want to believe, regardless.