Facebook Is Asking Users to Vote On News Organization Trustworthiness

In a move announced on January 19, 2018, Facebook wants users to help decide which news on the platform is fake. In essence, Facebook is shrinking its own editorial role and asking a selected group of people to participate in a new trustworthiness vote, in an effort to prioritize “high-quality” news. These people reportedly form a “representative group” of individuals on Facebook.

The decision comes after Facebook continued to receive harsh criticism for allowing disinformation to spread rampantly across the platform. Mark Zuckerberg wrote, in a blog post accompanying the announcement, that Facebook is not comfortable deciding which sources are most trustworthy in a “world with so much division,” adding, “We decided that having the community determine which sources are broadly trusted would be the most objective.”

In a statement to NowThis, a Facebook spokesperson said, “as with any change to how News Feed works, we want to make sure that our community has a voice in the process. That’s why we’re relying on a representative group of people on Facebook to inform our scoring system for news.”

With Facebook widely regarded as the number one source of news among social media platforms, the announcement raised a few alarms. It also opens up a plethora of new concerns and questions. Namely, exactly who are these individuals who will be voting, and what portion of Facebook’s users do they actually represent?

Will Judgement Calls Be Biased?

That question aside, this second one is undoubtedly at the top of my list. Recently, we’ve seen President Donald Trump call out news sites, journalists, and credible establishments such as the New York Times in an effort to discredit their reporting on his actions and on the stories of our time.

Now, I have no idea which side of the political spectrum you land on, but for this question, it doesn’t matter. We could potentially see conservatives marking every news article from the Washington Post as not credible while marking every article from Fox News as trustworthy. The same concern applies from the other side, with disgruntled liberals marking every article from Fox News as untrustworthy.

Indeed, “we know from social science research that people’s ideologies and people’s social identities shape what they believe is true and what they believe is credible.”

The good news? Facebook has stated that it is aware of such biases and will take them into consideration so they won’t affect rankings “too much.”

“Even if there are some biased people in the random sample responding for ideological reasons, it won’t affect ranking of a given publisher because no one group can. We only value publishers more (or less) if many different groups of people agree the publication is trusted (or distrusted).” – Facebook Spokesperson to NowThis
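The spokesperson's claim, that no single group can sway a publisher's ranking, suggests some form of cross-group aggregation. Facebook has not published its formula, so the sketch below is purely illustrative: it scores a publisher per demographic group and takes the minimum, so a publisher rises only when every group rates it as trusted. The group names, ratings, and use of a minimum are all assumptions for this example.

```python
# Illustrative-only sketch of "no one group can sway the ranking."
# Each group's ratings are averaged, and the publisher's final score is
# the minimum across groups, so broad agreement is required for a boost.
# Group names and rating values are invented for this example.

def consensus_score(group_ratings):
    """group_ratings: dict mapping group name -> list of 0..1 trust ratings."""
    per_group = {g: sum(r) / len(r) for g, r in group_ratings.items() if r}
    return min(per_group.values()) if per_group else None

ratings = {
    "group_a": [0.9, 0.8, 1.0],
    "group_b": [0.2, 0.3, 0.1],  # one group distrusts the publisher
}
print(round(consensus_score(ratings), 2))  # 0.2 -- distrust drags the score down
```

Under this hypothetical scheme, a coordinated campaign by one ideological bloc could lower a score but could not inflate one, which matches the spirit of the quote.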

Will Judgement Calls Be Accurate?

This second concern comes down to human error. How many times have you been fooled by a story that you later found out was completely untrue? Think of the many, many individuals who are still fooled by articles from The Onion, believing even those are true.

A month ago, I conducted a little experiment in our own office, sharing a story with colleagues via HipChat. It claimed that a morgue employee had decided to take a nap and that a new employee mistook him for a car accident victim and sent him to be cremated. The article I shared was from the URL www.abcnews.go.com-us, rather than the actual ABC News site’s URL (www.abcnews.go.com).

They were fooled.


Read: BuzzFeed’s article debunking the story.

Now, if this article fooled a room full of web development professionals (even for just a few minutes) before they caught on to the fake URL and the obvious absurdity of the story, just think how easy it is to fool individuals with far less internet experience.

What’s The Intended Outcome Of This Voting System?

Facebook’s main goal is to prioritize news articles and publishers in the News Feed. News sources voted to have higher credibility will, therefore, receive top billing, while less credible sources (according to voters) will be buried.

How It Works

  1. First, the sample group will be asked if they recognize a news source.
  2. Then, they’ll rate each publication they recognize on the following trustworthiness scale:
    1. Entirely
    2. A lot
    3. Somewhat
    4. Barely
    5. Not at all
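The two steps above, filtering out sources a respondent doesn't recognize, then scoring the rest on the five-point scale, can be sketched as follows. Facebook has not published how it converts the scale into numbers, so the numeric mapping and the simple averaging here are assumptions for illustration only.

```python
# Hypothetical sketch of the two-step survey: respondents who don't
# recognize a source are excluded, and the rest rate it on the published
# five-point scale. The numeric values assigned to each answer and the
# averaging are illustrative assumptions, not Facebook's actual formula.

SCALE = {
    "Entirely": 1.0,
    "A lot": 0.75,
    "Somewhat": 0.5,
    "Barely": 0.25,
    "Not at all": 0.0,
}

def trust_score(responses):
    """responses: list of (recognizes: bool, rating: str or None) tuples."""
    ratings = [SCALE[r] for knows, r in responses if knows and r in SCALE]
    if not ratings:
        return None  # unrecognized sources receive no score at all
    return sum(ratings) / len(ratings)

sample = [
    (True, "Entirely"),
    (True, "Somewhat"),
    (False, None),        # step 1 filter: respondent doesn't know the source
    (True, "Not at all"),
]
print(trust_score(sample))  # 0.5
```

Note how the step-1 recognition filter matters: a source nobody has heard of gets no score rather than a bad one, which is presumably why the survey asks about recognition first.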

Seems simple, no? I am, personally, a little worried that this tactic is going to backfire on Facebook. Leave your comments and opinions below!
