Political Awareness In The Facebook Age

Published in Online Spin, July 22, 2016.

We’ve just finished one presidential convention and are heading toward another -- and it’s not my job to discuss the candidates or their positions. Instead, I want to discuss the way we come to conclusions about the candidates and their positions: the quality of our data, the depth of our knowledge, the level of our ability to make informed decisions.

The problem is filter bubbles, and I’ve talked about them before.

When I go on Facebook, if I have a history of reading Web sites and articles that trend liberal, surprise! My News Feed shows me liberal stories. If my history is conservative, the reverse. (The Wall Street Journal has an excellent demonstration of this if you want to see it in action.)

Surrounded by validation of our existing opinions, it becomes unfathomable that anyone could think differently from us.

We lament, “Who are these idiots voting for the other party? Every single news item just proves our party is right!” But it would be more accurate to say every news item we see proves our party is right -- while every news item the other side sees proves their party is right.

So we become increasingly polarized, increasingly intolerant of anyone whose opinions don’t match our own, increasingly drawn to divisiveness and away from understanding.

And the problem is not that Facebook has a political bias as a company. The problem is that its success metrics cause it to curate content expressly biased for each of us.

Facebook lives and dies through our eyeballs. And our eyeballs are obtained through an algorithmic pandering that caters to every one of our human behavioral quirks: that we’re more likely to look at stuff we agree with, that we’re more likely to respond to negative content than to positive content, that we prefer simplistic and hyperbolic headlines to thoughtful and complex investigations.

Eyeballs. Pageviews. Unique visitors. Time on site. Quarterly results. Nowhere in those metrics is there a category for “a more-informed and aware public.”

Historically, the job of helping us be more informed and aware falls to the Fourth Estate: the news media. But the news media can’t get our attention, because we spend all our time on Facebook.

As a society, we need to be exposed to neutral information (to the extent that information can be neutral). We need to be able to see content that is consistently substantive, that we will sometimes agree with and that will sometimes challenge us, depending on our particular biases.

But last month, Facebook changed its algorithm to favor friends-and-family content at the expense of news content. The New York Times reported, “The side effect of those changes… is that content posted by publishers will show up less prominently in news feeds, resulting in significantly less traffic to the hundreds of news media sites that have come to rely on Facebook.”

I know we’re terrified of the government getting involved in free speech. (According to The Daily Dot, “Even in the midst of the ‘trending topics’ scandal, polls showed a meager 11 percent of Americans were comfortable with the government imposing regulations regarding content on social networking sites like Facebook.”) But there is a public good at stake here, and we can find a happy medium.

If your site has more than a certain number of visitors who spend more than a certain amount of time there, and if your site can be considered a primary source of information and can be proven to influence voter behavior, then you should be required to surface a certain amount of content that could be considered neutral, much the way public TV stations are required to make space for public service announcements.

We don’t all need to agree with each other. But when we disagree, we should at least know why.