Fake News and Conspiracy Theories

Francis Fukuyama

We are at a very dangerous and unprecedented point in American politics today. As noted in my last post, something strange has happened to the Republican Party. It has been shifting from a party built around ideology and ideas to one that resembles a cult of personality focused on one individual, former President Donald Trump. Its core support base is today furious about a claim that simply isn't true: the notion that Trump won the Nov. 3 election by a landslide, and that it was stolen by "radical, far-Left Democrats" in one of the most outrageous frauds in American history. This is what Trump told the crowd before the January 6 storming of Congress, and why 43 Republican Senators were not willing to vote to convict the former President last Saturday.

Over the previous decade, the party had been moving steadily from a party built around Reagan-era concerns with taxes and regulation to one based on identity. To be a Trump Republican meant thinking of oneself as a non-elite despised by liberals, someone defined by race, by conservative values, or by place of residence. As Sarah Palin, the Trump before Trump, once said, "real Americans" don't live in big, crowded cities, but come from places like the small towns from which she hailed.

But over the past year or two, even this identity framing has been inadequate to explain what has happened. There is some deeper sickness spreading, perhaps better understood psychologically than sociologically. According to recent polls, 72 percent of Republican voters believe that there was serious fraud in the election. One-third to one-half have a favorable view of QAnon theories, according to which the Democratic Party is a cabal of pedophile Satanists who traffic in young children. More than 50 percent are distrustful of the Covid vaccines being offered, despite the fact that they were introduced by the Trump administration itself. How do you make the journey from a Howard Jarvis angry about his rising property taxes, to this kind of fantasy?

Of course, much of this shift can be attributed to Trump himself. If you have a charismatic demagogue who is constantly propagating conspiracy theories, then his followers will fall in line. Our particular problem is that America has never had a president of this sort before.

But there is good reason to believe that technology has also played a large role in this transformation. In cyberspace, you can create vast fantasy worlds that are disconnected from any of the institutions that structured and filtered our lives in the pre-digital age. The rules of evidence that are used in judicial proceedings, or that guided mainstream media, don't apply. People desperately want to believe that Trump won the election, and social media provides endless confirmation that this narrative is true. The consequence of today's fake news isn't the spinning of an inconvenient story as politicians have always done; it is having a sizeable minority of voters believing that their democratic institutions are fraudulent and that the current president was not legitimately elected.

There has consequently been a huge amount of discussion over the past four years about how to deal with the problem of fake news and conspiracy theories on the internet. But before we can come up with solutions, we have to understand what the problem is that we are trying to solve.

For some digital activists, the problem is the existence of bad information on the internet itself. They are motivated, of course, by the perfectly understandable concern over the growth of political extremism in America. They have supported the moves by the large internet platforms like Twitter and Facebook to shut down Trump, QAnon, and the dispensers of false information about Covid. But they worry that the extremists are re-gathering on smaller, encrypted platforms like Signal or Telegram, or on new ones like the struggling Parler.

I share the concern about the growth of extremism, but I think that cyber policy should address the part of that growth that is due to the technology, and not the part that is rooted in society itself. America has a First Amendment that protects the right of free expression, regardless of how repugnant it is. The American understanding of this right is more absolutist than in other developed democracies, but nonetheless reflects the proper view that governments should not be the arbiters of speech, except when that speech constitutes an actual incitement of violence or criminal activity (as did Trump's on January 6). There has always been extremism in American politics; what we should want is to marginalize it, not to stamp it out.

The difficulty of policing conspiracy theories was illustrated by an article in the British edition of Wired magazine, written by a journalist who was both researching the spread of conspiracy theories and working as a part-time yoga instructor. She noticed that anti-vaxxer diatribes and false information about the Covid epidemic were seeping into the yoga forums she followed, escalating all the way up to outright QAnon theorizing. There is, she notes, a certain natural affinity between New Age thought and conspiracy theorizing: both are distrustful of traditional approaches to health, and yoga is dominated by gurus and other influencers who, to say the least, are not subject to conventional rules of evidence. It is hard to see how any third-party regulator, whether the government or the platform itself, could prevent the spread of this kind of information.

The really dangerous thing that the large internet platforms have done goes way beyond the neutral hosting of bad content, however. Rather, Twitter, Facebook, and Google have deliberately accelerated the flow of false information and amplified it on an unprecedented scale. This is done in the name of business models that prize virality and attention, regardless of the quality or effects of the information transmitted. Under huge pressure from civil society groups, they have now tried to turn in the opposite direction, taking down controversial posts and, in Twitter’s case, famously deplatforming Donald Trump.

While we may approve many of these new actions in the short run, they are not a sustainable path for any modern liberal democracy. The large platforms (of which there are three: Twitter, Facebook, and Google) have neither the capacity nor the legitimacy to act as arbiters of democratic political discourse. In order to legitimate its own content curation decisions, Facebook has created an Oversight Board to review them. This is putting lipstick on a pig: what legitimates the Oversight Board, or ensures that it reflects broader democratic choice? This simply cannot be done by a private company.

The first object of any public policy concerning potential political harms posed by the internet should be not the quixotic effort to ban fake news and conspiracy theories, but rather reducing the power of the large platforms to either amplify or silence certain political voices. Over the past year, I have been running a Stanford Working Group on Platform Scale. The group originally started as an Antitrust Working Group, but we quickly realized that antitrust is too narrow a framework in which to address the problems of platform scale. Antitrust has evolved to address issues of monopoly and anticompetitive behavior, but is not the right instrument to deal with the political harms engendered by the large internet platforms.

We published a White Paper last fall that recommends a different approach to the problem, which we label "middleware." Middleware is software that sits between the platform and the user, and allows the latter to control the kinds of information served up by the platform. Rather than being determined by the platform's non-transparent algorithm, the user's feed would be customizable through the outsourcing of content curation to a competitive layer of middleware companies. Middleware companies can reflect the myriad interests and tastes of platform users: colleges and universities can form a consortium to direct students and faculty to reliable sources of information; people who want to buy American or seek environmentally friendly products on Amazon could use a middleware provider that would guide them. The rank-ordering of search results on Google might reflect the deliberate preferences of the searcher, and not simply the targeted advertising interests of Google.
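To make the architecture concrete, the middleware idea can be sketched in a few lines of code: the platform exposes a raw feed, and a user-chosen third party supplies the ranking rule rather than the platform's own engagement-driven algorithm. This is a minimal illustrative sketch, not the White Paper's design; all names (`Post`, `trusted_sources_scorer`, `curate`) are hypothetical.

```python
# Minimal sketch of a "middleware" curation layer. The platform supplies
# raw posts; the user picks a third-party scoring function that decides
# the ordering, instead of the platform's opaque engagement ranking.
# All names here are illustrative, not a real platform API.

from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass
class Post:
    source: str       # e.g. the domain or account the post came from
    text: str
    engagement: int   # the platform's own virality signal (ignored here)

# A middleware provider is, at its simplest, a user-chosen scoring function.
Scorer = Callable[[Post], float]

def trusted_sources_scorer(trusted: Set[str]) -> Scorer:
    """Middleware that ranks posts from user-trusted sources first."""
    def score(post: Post) -> float:
        return 1.0 if post.source in trusted else 0.0
    return score

def curate(feed: List[Post], scorer: Scorer) -> List[Post]:
    """Re-rank the platform's feed using the user-selected middleware."""
    return sorted(feed, key=scorer, reverse=True)

feed = [
    Post("viral-rumors.example", "Shocking election claim!", engagement=9000),
    Post("apnews.com", "Election results certified", engagement=120),
]

ranked = curate(feed, trusted_sources_scorer({"apnews.com"}))
print([p.source for p in ranked])
# The trusted source comes first, despite far lower engagement.
```

The point of the sketch is that the scoring function is swappable: a university consortium, a "buy American" service, or any other provider could supply its own `Scorer` without the platform controlling the ranking.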

If this approach works, it would solve the central problem of today’s internet. It would reduce the underlying power of the large platforms to control political speech. As the report indicates, there are a variety of other approaches to doing this, including regulation, privacy law, data portability, and the like. All have serious defects, in either their expected effects, or in the political difficulties of implementing them.

If the middleware idea were to take off, it would not solve the problem of fake news and conspiracy theories. There would be anti-vaxxer middleware providers, or perhaps a QAnon-based one that would keep users locked up in narrow filter bubbles. But the objective of public policy should not be, to repeat, to eliminate such constitutionally-protected speech. Rather, it should follow a public health model of reducing the incidence of infection. By restricting bad information to clearly labeled channels, we might be able to get to a world in which the disease can be contained. The patient may still be sick, but at least will still be alive.

Frankly Fukuyama | Technology | Democracy