The results of the 2016 United States presidential election shocked many Americans. Every major news outlet—including Fox News—wrongly predicted former Secretary of State Hillary Clinton’s victory. The New York Times reported that President-elect Donald Trump himself was even surprised by his victory. Many have blamed this shock on an increasingly entrenched feature of modern life: the sealed-off nature of our online interactions and the media we consume. While the causes of this problem—frequently referred to as the “echo-chamber” or the “liberal [or conservative] bubble”—are complex and varied, the biggest culprit is Facebook, and that company bears a responsibility to address it.
There was considerable discussion within Facebook about whether the company had affected the outcome of the election, according to The New York Times. Several high-level executives voiced concerns, but Facebook CEO Mark Zuckerberg argued on Thursday, Nov. 10, that the claim that Facebook influenced the outcome of the election was “a pretty crazy idea.”
The company boasts a membership of 200 million Americans, and a 2016 Pew Research Center survey indicates that 44 percent of American adults get their news from the site. Those numbers make Facebook the biggest media company in the U.S.
Thus far, the site has been reluctant to step into that role. In a Saturday, Nov. 12 Facebook post, Zuckerberg wrote, “I believe we must be extremely cautious about becoming arbiters of truth ourselves.” The idealism of this sentiment is admirable, but it is deeply problematic. When so many Americans use its product as a news source, Facebook does not have the luxury of pretending it does not influence the opinions and the quality of information absorbed by the American people.
Facebook could have influenced the election in several ways, but the most important is by locking people into fixed political perspectives. It is far too easy to drown out people you disagree with on the platform—two clicks and the person will never appear in your news feed again.
Facebook presents itself as a marketplace of ideas, but a more apt analogy for the way most Americans use it is an intellectual IMAX theater: a place you can enter, select the version of reality you’d like to experience, and then sit back and let others deliver it to you—whatever their agenda.
Facebook must change this situation in order to be consistent with its own values. When Facebook was criticized for keeping Peter Thiel—the prominent Silicon Valley Trump supporter—on its board, Zuckerberg defended him, saying, “We care deeply about diversity.” A company that truly cared about diversity would not allow millions of people to lock themselves into hermetically sealed echo chambers of opinion.
The degree to which the material users read on Facebook changes their opinions is questionable. But this may be slightly beside the point: most people in the country were surprised by the election’s outcome, and a popped liberal bubble might have spared them the shock. Social media claims to make people more connected, yet in this instance we were systematically ignorant of the views of millions of our compatriots.
Exposing Facebook users to a greater diversity of opinion would not necessarily change their minds, but it might make them more open to doing so. Reading the same point of view over and over again creates the impression that everyone agrees with it, making people on both sides more likely to close their ears and dig in their heels when confronted with opposing points of view.
Social media not only allowed Trump to reach millions of voters with virtually no campaign budget—it also allowed both candidates to make themselves and each other more vehemently unpopular than any candidates in history. Facebook bears a significant share of the responsibility for this unprecedented public response.