Most people agree fake news is a problem on social media. When people pass around headlines without ever checking the sources, bad, wildly inaccurate news travels around the world at top speed. Now Facebook is taking action by launching a resource meant to help users identify fake news.
At this point, the “tool” is essentially a list of tips on how to identify fake news, along with options for what to do about it. Unfortunately, say critics, this “tool” doesn’t do much good. The problem is not that people don’t know fake news exists; it’s that they don’t want to do the work of reading the article in the first place, much less work through a list of ways to determine whether the article, or the site that posted it, is legitimate.
Worse, the critics say, the tool doesn’t address one of the key issues with fake news: all the corroborating sites. When a story breaks and gains traction online, other sites are set up to grab and republish that content, regardless of the nature of that content. It’s about getting clicks and increasing time on site, not about vetting the material.
So, in the rare event the average social media user does “look it up,” they will find a bunch of other sites all saying the same thing. They don’t understand that it’s not several sources validating the story; it’s several sites simply copying the unvetted information whole cloth.
That challenge hasn’t stopped Facebook execs from being optimistic about their new tool. Adam Mosseri, VP of News Feed at Facebook, is on the record as saying he believes people will become more discerning consumers of online information … or, at least, he hopes they will be.
It’s a catch-22 for Facebook. Its platform is one of the easiest ways for fake news purveyors to spread their content, so it stands to reason that anyone who wants to stop fake news, or at least slow it down, has to address it on Facebook. Then again, Facebook needs the clicks and the shares, so it doesn’t want to shoot itself in the foot economically. It’s a tightrope to walk, so the company has essentially put the onus on the viewer to determine whether what they’re reading is real.
So here we are, back at the beginning: most people, who don’t want to do the work of reading much more than a headline, are now being asked to do even more work so that Facebook can avoid taking responsibility for policing its own content.