Mark Zuckerberg says that, despite all the fake news it disseminated, Facebook didn’t influence the election. He even called that “a pretty crazy idea.”
It would be easy for any media outlet, including Facebook, to claim that each voter is responsible for filtering information in order to make a responsible decision. That’s the essence of individual agency, an important principle in democratic elections.
Zuckerberg later contended that 99% of the content on Facebook is “authentic.” Of course, with nearly 1.8 billion monthly active users, even the 1% that isn’t authentic means millions of pieces of fake content every day.
Examining Facebook’s role in spreading false and misleading information isn’t so crazy. Especially if it could have affected an election.
Any major media outlet – be it the New York Times, Fox News or Facebook – has great power to influence its consumers. Although Facebook itself doesn’t produce news, it disseminates content – in this case, via individuals who share news, opinion pieces, blog posts and photos with their friends. An active individual with hundreds of friends might have influence equal to a small-town newspaper.
Facebook’s own influence is through its “Trending” box in the upper-right corner of the desktop version of the site. (Trending topics pop up on mobile Facebook apps when you use the Search box.)
“Trending” substitutes the wisdom of the masses – Facebook users, that is – for selections that would be made by professional journalists at major news outlets.
The Trending topics on Facebook used to be curated. Real human beings looked through the stories that were listed by an automated system, and then found sources and wrote headlines for the Trending box. These curators also screened out fake news and hoaxes.
But Facebook fired those editors following a scandal about their biases in selecting stories. Fake news and hoaxes began to proliferate this year. The computer algorithms that detect what’s trending had no artificial intelligence to discern when a story was simply made up.
At the same time, partisan websites pushed out stories that were largely based on false information. And those misleading stories were some of the most popular ones. They were sensational and they played into what people were already inclined to believe.
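The problem with ranking purely by popularity can be sketched in a few lines. This is an illustrative toy, not Facebook’s actual system: it simply counts how often each story is shared and surfaces the most-shared ones, with no check on whether a story is true.

```python
from collections import Counter

def trending_topics(shared_stories, k=3):
    """Rank stories purely by share count (popularity alone).
    Illustrative sketch only -- not Facebook's real algorithm."""
    counts = Counter(shared_stories)
    return [story for story, _ in counts.most_common(k)]

# Hypothetical share data: a fabricated story shared more widely
# than real reporting, as sensational hoaxes often are.
shares = (
    ["sensational hoax"] * 5
    + ["factual report"] * 3
    + ["local news item"] * 2
)
print(trending_topics(shares))
```

Because nothing in the ranking measures accuracy, a hoax outranks real reporting whenever it is shared more often – which is exactly what the human curators used to catch.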
The more users there are and the more actively they use Facebook, the more money Facebook can make from ads and “suggested posts.”
It’s wishful thinking to believe that average readers will filter out fake news from real news. Worse still, readers these days seem unable to distinguish facts from opinions, credible sources from rumor-mongers, or innuendo from actual reporting. To do so, they would need media literacy and critical thinking skills, and the time and energy to apply them.
I couldn’t find any solid data about this, but critical thinking skills don’t appear to be commonplace among, say, high school graduates. Even people with college degrees don’t necessarily have solid skills in critical thinking, though they certainly should.
Having those skills doesn’t mean that people apply them, either. There’s a very human tendency to engage with information that aligns with what we already believe; we buy into news that conforms to our view of the world.
The Wall Street Journal graphically demonstrates this with its “Blue Feed, Red Feed” page, showing side by side how a liberal Facebook user’s content might look compared to that of a conservative’s. (A sample is the featured image above.)
Facebook has been criticized for creating echo chambers where each of us shares only news and information we agree with, and we unfriend or unfollow people who offer opinions and information that we don’t like. The Trending box makes that worse by promoting stories based on popularity rather than quality of content.
If I’m having a good day and one of my friends posts a meme that is a hoax, or a so-called news article from a propaganda site posing as journalism, I take the time to point it out.
But who am I, against millions of fake and misleading pieces of content?
I’d like to do something about this. I would like to help people learn how to question the information they’re getting, and decipher whether it’s high quality or not.
Many people will never want to learn these skills. But some will. I just have to figure out how to reach them.
Today’s penny is a 2015, the year that Facebook fired its “Trending” curators.