In recent years, experts and analysts from government, the private sector, civil society, and academia have raised concerns about the threat misinformation poses to their societies. They have variously maintained that mis/disinformation pollutes the public sphere, degrades democracy, heightens polarization, and undermines the liberal international order (Adler and Drieschova Reference Adler and Drieschova2021; Bennett and Livingston Reference Bennett and Livingston2018; Ecker et al. Reference Ecker, Lewandowsky, Cook, Schmid, Fazio, Brashier, Kendeou, Vraga and Amazeen2022; Kakutani Reference Kakutani2019; Tenove Reference Tenove2020; Wardle and Derakhshan Reference Wardle and Derakhshan2017).
These perspectives on misinformation are far from universally held. In fact, they have faced often vehement opposition from several quarters. In the United States, leftist free speech advocates have bridled at heavy-handed policing of controversial ideas by the government and tech platforms (Burgis Reference Burgis2021; Kuo and Marwick Reference Kuo and Marwick2021). More prominently, critics on the political right have accused technology companies, government agencies, and academic researchers of engaging in deception and advocating censorship (“Censorship-Industrial Complex” 2023). In Europe, there has been a greater effort at regulation, yet populist and illiberal politicians weaponize the idea of misinformation or selectively downplay the issue (Hameleers and Minihold Reference Hameleers and Minihold2022). The backlash against arguments highlighting the supposed threat of misinformation has resonated among segments of the public, threatened the ability of scholars to better understand the flow of (mis)information, and helped fuel attacks on the mainstream press by the second Trump administration (Rutenberg Reference Rutenberg2025).
One way to make sense of these controversies over misinformation is to depict it as a familiar partisan fight, at least in the American context. Over the past decade, Democratic and centrist elites, rightly or wrongly, have generally agreed that misinformation is a problem in need of a solution. Republican elites, in contrast, downplay misinformation and argue that the real aim of those raising concerns about misinformation is to stifle speech they disagree with (Myers and Frenkel Reference Myers and Frenkel2023). Both sides seek to gain advantage when it comes to urging tech companies to intensify or refrain from moderating content (Smith Reference Smith2020).
Yet this partisan framing, although accurate, overlooks important differences between proponents of the dominant “misinformation-as-threat” perspective and the insurgent “misinformation-as-hype” camp. The former, whom I term “incumbents,” saw themselves as protecting their political system and its (democratic) values from an onslaught of harmful discourse. Incumbents’ position within established institutions and their credentialed training led them to embrace the notion that whether information is accurate is knowable through positivist methods of analysis and that misinformation can and should be countered.
Members of the insurgent camp, whom I call “challengers,” positioned themselves as outsiders seeking to weaken incumbent institutions and came predominantly from the political right. Rejecting the authority of “experts,” they instead framed the problem as one accessible to laypeople without special training, or they sought to elevate the authority of alternative voices who went against institutional expertise. Proponents of both these ideal-typical approaches sought to advance their institutional and ideological interests in the arena of public discourse via their preferred ways of assessing information. A third group, made up of “skeptics,” were institutionally (and usually ideologically) aligned with incumbents. Yet they were ambivalent about the proper epistemological approach to misinformation and raised concerns about incumbents’ power, credentials, and potential biases.
Central to the divisions between these discursive coalitions is their stance on what constitutes legitimate authority to make determinations about misinformation. By this understanding, debates about specific pieces of (mis)information, which are usually viewed in isolation, serve as proxies for two more important areas of dissensus. First is a question of methodology: On what basis can we determine how much untruth is circulating and harming the public discourse? Second is the question of authority: Who should define the rules and norms of the new information environment? These questions are related. The struggle over authority is fought in part in the realm of epistemology, as actors frame the problem in accordance with their area of competence. If misinformation is deemed to be a technical problem amenable to assessment only by credentialed experts in particular fields, incumbents can claim the authority to determine what is true or false and the prerogative to make decisions on the governance of online discourse. If it is instead viewed as a matter open to lay interpretation, or if institutional authorities are seen as inherently biased, then alternative forms of knowledge can be brought to bear and, challengers maintain, they should exercise more control over the public sphere.
Which methodology is applicable and, by extension, whose authority is enhanced depend on the dimension of misinformation in question. Although attention to misinformation usually focuses on the factual content of individual claims, the discourse on misinformation is broader, encompassing questions of attribution, prevalence, consequences, and policies. Some dimensions—content, consequences, and policy—are difficult for incumbents to construct as lying exclusively in the domain of technical expertise, because they involve questions of interpretation and value judgments on which non-experts also have a say. Incumbents relying on their preferred methods are vulnerable to critique by skeptics and challengers based on shared criteria that can yield different conclusions.
In the domains of attribution and prevalence, where misinformation claims involve inaccessible data and sophisticated methodologies, credentialed experts have exclusive skills to make valid empirical assessments. Challengers cannot directly contest incumbent claims and are left to invoke alternative criteria. Framing their objections in grassroots and “democratic” guises and using populist epistemologies (Mede and Schäfer Reference Mede and Schäfer2020), they level broad-brush salvos about the legitimacy of incumbent claimants. This form of “asymmetric” contestation tends to play out differently from discursive struggles waged on more “symmetric” terms.
At a time in which online communication is abundant and central to efforts to shape public opinion, this essay offers a new framework to clarify the contestation over misinformation. It moves beyond the common observation that we live in a “post-truth” era to conceptualize when and how groups marshal different forms of knowledge. It reveals that some features of the information landscape naturally favor conventional forms of expertise, whereas others create openings for heterodox forms of knowledge. In exploring how actors coalesce, frame arguments, and select tactics, it lays out the lines of contention in the politics of misinformation.
The object of analysis in this study is the rhetorical tactics of actors seeking to shape perceptions about misinformation. In this arena, how to define misinformation is itself contested. Most scholarly definitions refer to false or misleading information, but beyond this minimal requisite, there are disagreements around nuances, such as whether misinformation should refer exclusively to online communications or also to those offline; requires an assumption about the messenger’s intentions; or involves the consequences of receiving the message (Osman Reference Osman2024; Osman et al. Reference Osman, Meder, Bechlivanidis and Strong2023). It is also the case that the preferred term of scholars and policy makers has changed over time, from “fake news” to misinformation, even though the underlying phenomenon they refer to remains essentially the same (Farkas and Schou Reference Farkas and Schou2018).Footnote 1 There is also contestation around terms such as disinformation (false information spread deliberately), conspiracy theories, and propaganda, which, in addition to being the objects of study by serious scholars, are also opportunistically wielded as epithets in the struggle for discursive advantage. Because the users of these labels do not always define them, yet they are nevertheless deployed to discredit others, this analysis is concerned with episodes that implicate at least the minimal definition of misinformation given earlier, even if the claimant uses a different term. Because this analysis deals with the ways various actors speak about misinformation rather than interrogating (mis)information itself, I do not check whether they are using the term consistently or correctly, nor do I attempt to adjudicate their claims.
The theoretical framework relies on an analysis of a range of sources. To synthesize the incumbent position, I consulted more than 60 policy documents and reports from government agencies, think tanks, NGOs, and academic researchers complemented by their public statements and news reports on their activities.Footnote 2 I examined party manifestos from Europe and the United States from the Manifesto Project (https://manifesto-project.wzb.eu/) and read transcripts and reports from congressional hearings and EU parliamentary debates. These materials were selected as representative of prominent incumbent positions in the early phase of the period examined, between 2016 and 2020. They were promoted on the platforms of prominent organizations and cited by other incumbents. I treated them as primary documents that reveal the unfiltered views of incumbents as the problem was being constructed and before they encountered major pushback.
For challengers, I read reports on criticism of incumbent discourses to identify prominent politicians and media figures; I then examined their commentary, which was usually on social media and far-right websites. For skeptics, who are predominantly scholars and public intellectuals, I consulted published research, commentaries, and blog posts that critique the motives, interests, claims, or methodology of incumbents.Footnote 3
This essay offers three main interventions into the ongoing study of online misinformation and the discourse that surrounds it. First, it takes a sociological approach to identify how actors come to align themselves with distinct discourse coalitions. Unlike accounts that frame the narrative as the defenders of facts fighting against the peddlers of post-truth (D’Ancona Reference D’Ancona2017; Kakutani Reference Kakutani2019), it considers the interests and motivations of different groups in relation to one another as they vie for strategic advantage. It demonstrates how actors with similar institutional and ideological profiles coalesced around a set of common stances and promoted their views to shape the public discourse on misinformation. With some exceptions, the composition of those coalitions has persisted over time.
Second, by disaggregating the concept of misinformation I demonstrate how contestation among competing coalitions plays out in different ways, depending on the domain of misinformation. I highlight how institutional knowledge and expertise or an outsider/populist stance may be better suited to claims-making depending on the question being asked. Focusing on the differentiated nature of contestation by dimension helps explain, for example, why debates over misinformation content and policy tend to be more contentious than ones about its scale or attribution.
Third, this analysis frames struggles over misinformation as a contest for authority. Situating the misinformation debate as part of broader political and cultural struggles within democracies at a time when trust in institutions is low helps explain why (mis)information is contested so fiercely. The stakes are high not only because the adjudication of misinformation affects whose voices can be aired and at what volume. The issue takes on outsize proportions because whoever prevails in shaping the discourse surrounding misinformation stands to gain authority over the rules governing the public sphere, with implications for the future of free speech and democratic participation.
I begin by examining the evolving popular discourse and scholarly debates about misinformation. I contextualize the emergence of incumbents, a coalition of practitioners and advocates from diverse fields who brought their expertise to bear to understand and communicate about misinformation, followed by two distinct coalitions that leveled political or epistemological critiques at the incumbents. I then introduce a typology that disaggregates questions about misinformation into five distinct domains. This typology elucidates how three distinct modes of discourse were brought to bear, with different implications depending on what aspect of misinformation was at issue. Using examples from the most intense phase of the COVID-19 pandemic, I show how incumbents, skeptics, and challengers engaged to shape the discourse on misinformation in ways that reflected their institutional, ideological, or epistemological positions. In the conclusion, I discuss how the framework contributes to ongoing scholarly debates and the implications of the analysis for the current and future adjudication of misinformation and struggles over the rules governing the public sphere.
Misinformation and Its Malcontents
It has become conventional wisdom that we are living in a “post-truth” era. Although misinformation has been around for a long time (Bernstein Reference Bernstein2021; Osman Reference Osman2024), it burst into public consciousness as a seemingly novel threat following the dual shocks of Brexit and Trump’s electoral victory in 2016. As the ramifications of these two political earthquakes became clear amid the global populist wave, commentators warned about the threat that fake news, misinformation, disinformation, and conspiracy theories pose to democratic discourse. In 2016, Oxford Dictionaries named “post-truth” the word of the year. Books for the general public addressed these concerns, with titles such as Post-Truth (D’Ancona Reference D’Ancona2017), Fantasyland (Andersen Reference Andersen2018), The Death of Truth (Kakutani Reference Kakutani2019), and The Misinformation Age (O’Connor and Weatherall Reference O’Connor and Weatherall2019). A UN resolution from December 2021 focused on “countering disinformation for the promotion and protection of human rights and fundamental freedoms” (United Nations 2021). As late as 2024, the World Economic Forum identified misinformation and disinformation as the “top immediate risk to the global economy” (Chan Reference Chan2024).
Public attitudes have followed accordingly. In a 2018 Eurobarometer survey, 85% of respondents said online fake news is a problem in their country, and 83% agreed it was a problem for democracy (European Commission 2018). A 2020 Pew poll from the United States found that 75% believed it was very or somewhat likely that Russia or other foreign governments would attempt to influence that year’s presidential election (Hartig Reference Hartig2020). According to a 2022 Pew multicountry survey, 70% overall thought the spread of false information online was a major threat, with figures in Germany of 75%, France 74%, Italy 77%, and the United States 70% (Thompson Reference Thompson2022). And a 2023 survey found that 58% of Americans believed AI would increase the volume of misinformation in the 2024 election (Swenson and O’Brien Reference Swenson and O’Brien2023).
Yet these perspectives have not gone unchallenged. Donald Trump notoriously countered the revelation of fake news during the 2016 election by accusing critics of spreading “fake news” (Ross and Rivers Reference Ross and Rivers2018). Legislators, activists, and online personalities, predominantly on the political right, argued that their communications were being suppressed for ideological reasons. Surveys in Europe mirrored those in the United States in showing that conservative and anti-establishment voters are distrustful of fact-checking (Lyons et al. Reference Lyons, Mérola, Reifler and Stoeckel2020). In the United States, pushback against countering misinformation led to lawsuits and congressional investigations of academic misinformation experts (Myers and Frenkel Reference Myers and Frenkel2023).
The scholarly literature has followed a similar trajectory to the public debate, with a surge of works identifying misinformation as a problem and proposing solutions, followed by bouts of skepticism about the claims and implications of the first wave of studies. Studies proliferated in fields such as psychology, political science, economics, communication, and computational social science that demonstrated people’s susceptibility to misinformation and “fake news” (Allcott and Gentzkow Reference Allcott and Gentzkow2017; Martel, Pennycook, and Rand Reference Martel, Pennycook and Rand2020; Muhammed and Mathew Reference Muhammed and Mathew2022; Pennycook and Rand Reference Pennycook and Rand2021; Scheufele and Krause Reference Scheufele and Krause2019; Vosoughi, Roy, and Aral Reference Vosoughi, Roy and Aral2018) and proposing ways to counter it (Ecker et al. Reference Ecker, Lewandowsky, Cook, Schmid, Fazio, Brashier, Kendeou, Vraga and Amazeen2022; Lewandowsky et al. Reference Lewandowsky, Ullrich, Seifert, Schwarz and Cook2012; Lewandowsky and Van Der Linden Reference Lewandowsky and Van Der Linden2021; Pennycook, Cannon, and Rand Reference Pennycook, Cannon and Rand2018; Pennycook and Rand Reference Pennycook and Rand2019).
Yet some scholars critiqued the dominant narrative that misinformation was widespread and problematic. Some raised doubts about the most exaggerated claims about misinformation, seeing them as a category error or moral panic (Bratich Reference Bratich2020; Carlson Reference Carlson2020; Coady Reference Coady, Bernecker, Flowerree and Grundmann2021; Jungherr and Schroeder Reference Jungherr and Schroeder2021). Others questioned the reputed novelty of misinformation or its effects, given people’s relatively low exposure to it online and the barriers to persuasion (Adams et al. Reference Adams, Osman, Bechlivanidis and Meder2023; Allcott and Gentzkow Reference Allcott and Gentzkow2017; Allcott, Gentzkow, and Yu Reference Allcott, Gentzkow and Yu2019; Altay, Berriche, and Acerbi Reference Altay, Berriche and Acerbi2023; Guess, Nyhan, and Reifler Reference Guess, Nyhan and Reifler2018; Scheufele, Krause, and Freiling Reference Scheufele, Krause and Freiling2021). Yet others warned that potential responses to disinformation could be worse than the problem they purported to solve (Freiling, Krause, and Scheufele Reference Freiling, Krause and Scheufele2023; Jones-Jang, Kim, and Kenski Reference Jones-Jang, Kim and Kenski2021; Jungherr and Rauchfleisch Reference Jungherr and Rauchfleisch2024).
Some critical responses to the first wave of misinformation research took aim at institutions, pointing out that (democratic) governments themselves have long spread disinformation in tandem with a complicit press, sometimes at the expense of communities of color (Coady Reference Coady, Bernecker, Flowerree and Grundmann2021; Farkas and Schou Reference Farkas and Schou2019; Kuo and Marwick Reference Kuo and Marwick2021). Others alleged that centrist political interests and technology companies exaggerate the threat of misinformation to minimize their own culpability for social problems and to justify their power (Bernstein Reference Bernstein2021; Bratich Reference Bratich2020; Carlson Reference Carlson2020; Williams Reference Williams2023).
This critical scholarship, by pointing out weaknesses in the early political, journalistic, and scholarly narratives about misinformation, suggests how the collective understanding of the phenomenon has evolved. However, this work does not systematically theorize about the interaction of the contending parties or consider how claims and counterclaims about misinformation are iterative and extend over time. Additionally, the focus on contested facts as the object of most analyses misses the battles taking place in the arena of public discourse, where actors compete not only to proclaim their opinions about “facts” but also to assert their authority to make definitive judgments about (mis)information.
The Constitution of Coalitions
Who sets the terms of debate over misinformation—what it is, how much there is, whether it is a problem, and what to do about it? An inchoate assemblage of researchers and practitioners who were alarmed by the Brexit result and Trump’s unexpected presidency took the initiative to frame misinformation as a problem and claim authority over how to counter it. Subsequently, two groups of critics emerged that rejected the dominant consensus but for different reasons. These three coalitions reflect broad affinities among actors relating to their stances on misinformation and can be distinguished by examining their institutional position, political ideologies, and epistemological approaches.Footnote 4 This section explains how each of these factors implies a particular perspective toward information and its circulation in the public sphere. The next section describes the composition of the coalitions and shows how they articulate those stances.
Institutional position refers to how an actor is situated in relation to the dominant political, economic, and cultural forces in a system (Fligstein and McAdam Reference Fligstein and McAdam2015, 13). A coalition’s position is stronger if it has support from the state, which confers legitimacy on other actors and enforces rules of interaction (68–70). Other institutional forces include financial institutions and actors who control a disproportionate amount of societal resources, as well as cultural elites who are influential in shaping the discourse and dominant norms (Bourdieu Reference Bourdieu1998, Reference Bourdieu and Granovetter2018).
Although influential political, economic, and cultural institutions do not necessarily share the same political ideology, they can be expected to work toward maintaining the machinery and legitimacy of the system of which they are part (Bourdieu Reference Bourdieu and Granovetter2018). In a democracy, this alignment means that the actors in question will stake out positions they believe are favorable toward sustaining democracy. In practice, working to maintain the values of the system—a normative commitment—may conveniently align with the self-interest of actors to maintain the institutional hierarchy of the system and lock in their power within it (Emirbayer and Johnson Reference Emirbayer and Johnson2008; Fligstein and McAdam Reference Fligstein and McAdam2015). Powerful actors also have the means to shape the public agenda in ways that suit their interests (Birkland Reference Birkland, Fischer and Miller2017). They would therefore be expected to defend the rules governing the production and flow of information that enable them to maintain their position.
A second consideration when it comes to misinformation is political ideology. Rather than act to maintain their position in a system (or undermine that of others), coalitions can be characterized by common beliefs about how the political system should operate. Actors seek to advance their ideological interests, usually conceived of in terms of a left–right spectrum, by gaining discursive advantage over ideological rivals and influencing public policy (Snow Reference Snow, Snow, Soule and Kriesi2004).
Historically, a liberal view on information has been associated with promoting openness and free speech as the best way to protect dissenters with unpopular opinions from powerful interests seeking to thwart demands for greater equality (Shiffrin Reference Shiffrin1990). Conservatives were more prone to stifle speech that threatens hierarchy and social order (DelFattore Reference DelFattore1992). These stances were largely reversed owing to a decades-long project by the American right to undermine and supplant the traditional media (Hemmer Reference Hemmer2016). The weakening of news gatekeepers and the development of alternative media companies and personalities gave rise to a self-contained media ecosystem that spread right-wing political views on radio, television, and online (Benkler, Faris, and Roberts Reference Benkler, Faris and Roberts2018; Bennett and Livingston Reference Bennett and Livingston2018).Footnote 5 It enabled conservatives to propagate their messages to enthusiastic audiences and assail their political rivals without being subject to the rigors of fact-checking or basic journalistic norms. The populist right benefited from and thus sought to preserve an unfettered internet, whose affordances it effectively leveraged to mobilize conservative voters (Tripodi Reference Tripodi2022), whereas institutionalists, ranging ideologically from the left to the center-right, favored regulation of online information to protect the institutions under attack (HowTheyVote.eu 2022).
A third factor is epistemological approaches to assessing (mis-)information.Footnote 6 There is an elective affinity between institutional position, political ideology, and preferred epistemological approach. Those who aim to maintain the (democratic) system place faith in the existing methods by which knowledge-producing institutions assess evidence and communicate their conclusions. They rely on well-established procedures and conventions practiced by experts in their field: transparency and replication in science, neutrality in weighing evidence and adversarial justice in the legal system, and investigating and fact-checking in journalism (Bertsou and Caramani Reference Bertsou and Caramani2022).
By contrast, the populist right attacks the legitimacy of institutions and rule by technocrats while granting authority to alternative evidence and uncredentialed “experts” (Caramani Reference Caramani2017; Collins et al. Reference Collins, Evans, Durant, Weinel, Eyal and Medvetz2023; Ylä-Anttila Reference Ylä-Anttila2018). This populist epistemology sees wisdom in popular experience rather than professional credentials and encourages engagement from ordinary social media users, who can circumvent traditional gatekeepers when spreading their (or others’) ideas (Harambam and Aupers Reference Harambam and Aupers2015; Mede and Schäfer Reference Mede and Schäfer2020). Political opportunists may also use populism as a smokescreen by selectively invoking conventional forms of expertise when expedient, despite their professed disdain for elites (Guasti and Buštíková Reference Guasti and Buštíková2020).
Contesting Authority over Misinformation: Incumbents, Challengers, and Skeptics
We can now specify how distinct coalitions of actors coalesced to engage with misinformation based on where they were situated in the landscape of institutional position, ideology, and epistemology. In 2016, a group heavily represented by elected officials, intelligence agencies, business leaders, NGOs, academic institutions, and the media in the United States and Europe depicted misinformation as widespread and consequential, arguing that it posed a threat to open debate and deliberation. I refer to this coalition as incumbents: “actors who wield disproportionate influence within a field and whose interests and views tend to be heavily reflected in the dominant organization” (Fligstein and McAdam Reference Fligstein and McAdam2015, 13). Coming from a broad swath of dominant authoritative institutions, they formulate policy (governments), protect national security (military and intelligence agencies), communicate with the public (politicians and the media), and engage in research (government agencies and academia; Arnoldi Reference Arnoldi, Eyal and Medvetz2023). Early advocacy by policy makers and the media was effective in raising awareness of the problem and prodding reluctant actors in the technology sector to take it seriously (Kurtzleben Reference Kurtzleben2018; Rawlinson Reference Rawlinson2017). Nonprofit fact-checking groups consolidated into consortia and became part of the incumbent coalition (“EDMO—United against Disinformation” n.d.; Poynter 2024).
The incumbent coalition’s discursive unity stemmed in large part from its members’ normative commitments to democracy, which they argued was threatened by the proliferation of misinformation. The EU, responding to Russia’s annexation of Crimea, created the East StratCom Task Force in 2015 to “forecast, address, and respond to Russia’s disinformation campaigns.”Footnote 7 President Obama laid out an early marker of the discourse in the United States in a 2016 press conference with German chancellor Angela Merkel, calling misinformation a threat to “democratic freedoms and market-based economies and prosperity that we’ve come to take for granted” (Solon Reference Solon2016). Officials, including Director of National Intelligence James Clapper and National Security Agency director Mike Rogers, publicly endorsed the idea that Russia’s hacking operation threatened democracy (“Foreign Cyber Threats” 2017; “NSA Chief” 2016). Because the twin shocks of Trump and Brexit benefited the political right, US Democrats were early advocates of the issue, including Hillary Clinton, who called fake news an “epidemic” (Merica Reference Merica2016). Yet the network expanded to become bipartisan, including Republicans such as Senators John McCain and Lindsey Graham (“Foreign Cyber Threats” 2017),Footnote 8 and it encompassed European parties and politicians on the center-left and center-right.Footnote 9 The coalition was transnational, emerging in Western democracies but also involving collaboration with activists and fact-checkers around the world, including in nondemocratic states (Stencel Reference Stencel2019).
Incumbents’ dominant institutional position was closely connected with their role in the production and dissemination of knowledge, a form of credibility they leveraged as they worked to construct a conception of misinformation as the type of problem they were well suited to address (Bacchi Reference Bacchi2009; Sending Reference Sending2015). To assert their authority, they marshaled and repurposed their competence in tasks like data analysis and coding, collecting and analyzing foreign intelligence, and pursuing accountability and verifying sources, backed by academic degrees in statistics, psychology, computer science, communications, and journalism.Footnote 10 Incumbents in different sectors and countries would collaborate, present their ideas at conferences, support each other financially, and publicly echo one another’s arguments.Footnote 11 Collectively, they produced numerous reports, speeches, research studies, position papers, media stories, and testimony at legislative hearings, which were amplified in the mass media (Couldry Reference Couldry2003)Footnote 12—an institution that saw its own interests threatened by misinformation (Waisbord Reference Waisbord2018).
The success of incumbents in naming and framing the problem was evident in the sharp rise of “fake news” in media mentions and on Google searches starting in early 2017; the increase of “misinformation” in 2019, which peaked in the second half of 2022; and public concern captured in opinion surveys (Hameleers, Brosius, and de Vreese Reference Hameleers, Brosius and de Vreese2021).Footnote 13 Yet several questions remained unanswered: On what authority did misinformation experts act? Who granted that authority? And why should people deem their authority legitimate? Conventionally, authority derives from perceived competence and expertise, which in turn derive from qualities like specialized knowledge, experience, and education (Avant, Finnemore, and Sell Reference Avant, Finnemore and Sell2010, 12). Policy makers tend to defer to experts to define their interests in situations of high uncertainty (Radaelli Reference Radaelli1999; Zito Reference Zito2001), for which misinformation in the late 2010s arguably qualifies. Yet misinformation, although not a new problem, was still in formation as a field of scholarship and policy. Authority was contingent, rather than foreordained.
It is not surprising, therefore, that incumbents were challenged. One line of attack centered on their overweening financial, political, cultural, and symbolic power. It was evident that incumbents wielded enormous resources relating to information, including access to secret intelligence and forensic tools (governments and spy agencies); the ability to propagate, throttle, or conceal the information people produce online (tech companies); data analysis for descriptive and explanatory purposes (academia); and the wherewithal to broadcast news and opinions to the public (media). From a democratic pluralist perspective, incumbents’ sheer might, ubiquity, and the perception of consensus raised concerns about the lack of countervailing power and barriers to dissenting opinions being taken seriously. In an era of distrust of elites, they appeared to exemplify that elite (Collins et al. Reference Collins, Evans, Durant, Weinel, Eyal and Medvetz2023; Eyal Reference Eyal2019).
Other doubts about incumbents involved hubris and potential conflicts of interest. Some actors used their authority to offer opinions outside their areas of expertise, as when politicians spoke with the authority of epidemiologists to debunk questionable claims about COVID or journalists pontificated on algorithms controlling social media feeds (Crovitz Reference Crovitz2019; Lavazza and Farina Reference Lavazza and Farina2020). Some critics noted incumbents’ tendency to make unambiguous judgments about (mis)information even when corresponding scientific research was unsettled rather than definitive (Vraga and Bode Reference Vraga and Bode2020). And critics raised concerns about whether self-interest might influence incumbents’ diagnoses or shade their policy recommendations (May Reference May2021).
These critiques were leveled predominantly by two sets of actors. First were challengers, who rejected the premise of misinformation articulated by incumbents and represent populist and illiberal—and usually far-right—political interests. They comprised a coalition of politicians, media personalities, businesspeople, and online influencers, predominantly based in the United States, who aimed to undermine the authority claims of incumbents and shape the rules governing the public sphere in their favor. Although they posed as institutional outsiders, their most prominent representatives came from privileged positions, exemplified by Elon Musk (the world’s richest person) and Tucker Carlson (one of the most popular right-wing media figures). Yet other exponents of these views were socially marginalized in society’s dominant institutions because of their extreme religiosity or fringe views (Adler-Bell Reference Adler-Bell2022b). Challengers’ antisystemic positioning within democracies aligned them with leaders in authoritarian states such as Vladimir Putin in Russia and Viktor Orban in Hungary. They and other incumbent autocrats were engaged in building alternative knowledge-producing institutions that reflected their regimes’ illiberal valuesFootnote 14 (Yablokov and Chatterje-Doody Reference Yablokov and Chatterje-Doody2021) and were eager to collaborate with illiberal movements abroad to undermine the liberal order (Abrahamsen et al. Reference Abrahamsen, Drolet, Williams, Vucetic, Narita and Gheciu2024; Adler-Nissen and Zarakol Reference Adler-Nissen and Zarakol2020).
Challengers’ contestation around misinformation dovetailed with their disdain for authoritative institutions and their aim of displacing incumbents—and the pro-democratic and left/centrist political interests they represent—from power. Intellectuals of the New Right leveled a Gramscian critique of incumbents’ overweening influence and left-centrist ideology, which they maintained allows incumbents to shape the ideological discourse through their presence in the media, academia, and bureaucracy (Abrahamsen et al. Reference Abrahamsen, Drolet, Williams, Vucetic, Narita and Gheciu2024). Right-wing activists such as libertarian tech financier Peter Thiel and Breitbart News founder and Trump adviser Steve Bannon worked to build an intellectual infrastructure for this project in the Trump era, alongside allied think tanks such as the Heritage Foundation, the Claremont Institute, and the Bradley Foundation (“In Preparation for Power” 2022; Mayer Reference Mayer2021).
Challengers drew on a decades-long attack on institutionalized expertise to call into question incumbents’ credentials and claims of authority over misinformation. Right-wing online personalities such as Alex Jones, Jack Posobiec, and Laura Loomer alleged that incumbents were intent on censoring conservatives; they mobilized their large followings to put pressure on elected officials to embrace their claims (Fandos Reference Fandos2018). That liberals aimed to suppress conservative voices online, in league with their ostensible allies in universities and the tech sector in Silicon Valley, soon became an article of faith among Republican legislators, spearheaded by Jim Jordan, Ted Cruz, and Josh Hawley. In Europe, although anti-incumbent narratives were not as pervasive, right-wing populist parties such as France’s National Rally, Spain’s Vox, and Poland’s PiS articulated similar views criticizing fact-checking or Silicon Valley’s supposed censorship of conservative opinion (Charlish and Wlodarczak-Semczuk Reference Charlish and Wlodarczak-Semczuk2021; Luque Reference Luque2020; Pollet Reference Pollet2022).
In contrast to challengers, skeptics generally share the same liberal-democratic values as incumbents but do not necessarily accept incumbents’ premises, interpretations, or the implications thereof.Footnote 15 They comprise academic and independent critics in the United States and Europe who, although inhabiting some of the same institutions as incumbents, harbor a distrust of overweening power, which is evident in their suspicion of the (national security) state and skepticism toward narratives that privilege certain political or financial interests (Coady Reference Coady, Bernecker, Flowerree and Grundmann2021; Williams Reference Williams2023). Self-avowed classical liberals, such as academics Steven Pinker, Niall Ferguson, Richard Dawkins, and Jonathan Haidt, and neuroscientist Sam Harris were vocal opponents of what they deemed overly sensitive and censorious restrictions of speech on the left, especially on college campuses. Unlike challengers, skeptics do not allege that incumbents necessarily harbor malign intentions, but they are cognizant of the potential for hubris or institutional or psychological biases when it comes to collecting evidence and interpreting data (Kuo and Marwick Reference Kuo and Marwick2021; May Reference May2021). They do not question the authority of experts as such but instead raise doubts about transparency, bias, misinterpretation, and exaggeration (Friedman Reference Friedman and Gunn2022; Uscinski Reference Uscinski2023; Williams Reference Williams2023). Because skeptics’ views on information tend to emerge in response to (overwrought) incumbent claims, they often end up superficially aligned with challengers. Yet skeptics reject what they consider the bad-faith and bullying tactics of challengers when they target incumbents (Adler-Bell Reference Adler-Bell2022a). As such, skeptics address the shortcomings of both incumbents and challengers, and they contribute to shaping the public discourse.
The Coherence of Coalitions
This sorting exercise necessarily makes broad sociological claims that simplify a more complex reality. There are ambiguities at the boundaries of coalitions due to overlapping perspectives between incumbents and skeptics (who are both classically liberal) and between skeptics and challengers (both of whom are critical of incumbent power). The coalition an individual aligns with may depend on the issue, whether it be elections, climate change, or vaccine efficacy, and there are inevitably disagreements among actors within the same coalition. Scholars in the incumbent camp might, for example, critique the epistemological or methodological choices of other incumbents researching the same topic even while they push back against skeptics and challengers (Khazarian, Jalbert, and Dash Reference Khazarian, Jalbert and Dash2024).
Furthermore, we cannot map actors’ institutional belonging automatically onto their coalitional affiliation. For example, academics and journalists, although they share professional and social backgrounds with (other) incumbents, have incentives to produce original ideas that may override their assumed coalitional affinities. Tech companies pose a different problem for categorization, because their positions shifted with the political and financial winds. Initially dismissive of the notion that misinformation was a problem, social media companies responded to incumbent criticism by embracing content moderation and investing heavily in trust and safety operations. However, after receiving political pushback from challengers and suffering a decline in tech investment, they became less likely to embrace incumbent narratives and returned to their previous laissez-faire approach. The most extreme example of this evolution is Elon Musk, who arguably became the foremost spokesperson for the challengers after he bought Twitter (Ortutay Reference Ortutay2024).
Despite these qualifications, this categorization represents an important conceptual intervention. It brings clarity to an often-muddled picture by highlighting the strong nexus of power, ideology, and discourse on the topic of misinformation: contending actors found themselves in the same constellation significantly more often than they diverged. These overlapping affinities led coalitions to take consistent stances toward misinformation over time and to position themselves distinctly in relation to the other coalitions. This becomes evident when we examine how actors put forward their arguments.
Disaggregating Misinformation Domains: A Framework
Having established the standpoints of the three coalitions, this section lays out a framework for understanding how they engage in relation to one another. To demonstrate their distinct rationales and rhetorical tactics, I disaggregate misinformation into distinct domains: content, attribution, prevalence, consequences, and policies.Footnote 16 The way contestation plays out within a domain depends on whether it is more conducive to credentialed expertise or lay interpretation. Depending on the domain, incumbents, challengers, and skeptics find themselves engaged in either symmetric or asymmetric contests based on their preferred epistemological stances and methodological approaches.
According to their foundational premise, incumbents view misinformation as a problem subject to technical analysis. It is, to them, the province of skilled experts applying methods based on training and experience, whether quantitative or analytical, to make categorical assessments. A technical approach effectively denies the input of people who lack relevant skills, seeing them as passive consumers of expert judgments, rather than as active participants with a role in shaping them. Incumbents aim to remove questions about misinformation from the realm of politics. Implicitly, they depict their authority as value-neutral and, in some instances, endorse a rigorous scientific approach to assessing online information more akin to studying the physical world than complex and contested social problems (Shu et al. Reference Shu, Cui, Wang, Lee and Liu2019; Zhou and Zafarani Reference Zhou and Zafarani2020).
Challengers, in contrast, question assertions of conventional expertise and embrace subversive forms of knowledge production. Their preferred mode of analysis removes authority from the dominion of credentialed expertise and expands the range of actors who can legitimately contribute to the discourse. This may be embodied in two ways, depending on what mode of argumentation best advances their argument. First, they may invoke conventional populist tropes that privilege popular wisdom over expert consensus and accept unfounded assertions, folk (and conspiracy) theories, and emotions as legitimate forms of claims-making. Second, they may rely on alternative “experts,” who either lack formal credentials or—potentially more persuasively—have the degrees and training of incumbents but advocate heterodox views that place them far from the mainstream of the profession. For example, journalist Lara Logan used her credibility as a former CBS News reporter to spread conspiracy theories on far-right media platforms (Peters Reference Peters2022). Lieutenant General Michael Flynn traded on his career in national intelligence when he championed claims that the 2020 election was stolen.
When evaluating how contestation takes place, I refer to symmetric or asymmetric contests, not to gauge who “wins” or “loses” an imagined audience but to examine how and to whom the contestation is pitched. Do incumbents make their claims unopposed, or are they challenged? Do detractors use ostensibly similar evidentiary standards and reasoning as incumbents do? Do fights play out in mainstream venues where all sides attempt to persuade neutral observers, or do actors focus on appealing to their core supporters? The analysis is not primarily concerned with the effects of these contests, which are difficult to gauge, although public opinion surveys can shed light on overall attitudes.
Although incumbents assert primacy in all domains, contestation in some domains is close to being symmetric. When it comes to evaluating the content of a claim—that is, discerning its factuality—the coalitions engage on the same terms and appeal to the same audiences. They deploy evidence available to the public and refer to knowledge produced by other authorities, whether scientific, medical, investigative, or otherwise—even though they may prefer different sources of evidence and interpret it differently. Likewise, when it comes to policy—how policy makers should respond to misinformation—actors compete in the same arena to shape public attitudes. Because policy preferences rest in part on values and ideological beliefs, incumbents face a natural impediment in their efforts to claim exclusive authority.
On the issues of attribution (Who is responsible?) and prevalence (How widespread is it?), incumbents have an intrinsic claim to authority due to the nonpublic nature of the data and the sophisticated techniques required to conduct analysis. Challengers therefore contest asymmetrically, relying on other modes of critique to make their case: questioning the motives or credibility of incumbent claimants or privileging populist modes of reasoning. They typically do not marshal sources of evidence that directly challenge incumbent claims. Finally, assessing consequences (What problems does it cause?) lies somewhere in between the extremes. Some questions are conducive to technical expertise and analysis, but the challenge of establishing causality and the value judgments required to appraise consequences place it more in the realm of symmetrical contestation.
Table 1 outlines a schema showing how the three coalitions publicly frame, claim, and contest on the five domains of misinformation: content, attribution, prevalence, consequences, and policies. Distinguishing the stances of relevant actors on these dimensions is critical to gaining clarity about the relationship among disputes over (mis)information, epistemological competence, and contests over authority.
Table 1 Coalitions and Positions

Case Study: Contesting Misinformation Claims about COVID-19
How does discursive contestation unfold in these domains in practice? I anchor the empirical analysis in contentious debates in the United States about the COVID-19 pandemic, primarily in 2020 and 2021 when it emerged as a serious public health threat. This relatively bounded period began at a moment when the incumbent position was well established, but skeptics and challengers had already entered the fray. The outbreak of a novel virus provoked difficult and unsettled questions about science, medicine, and public policy, which incumbents tried to address. Yet the pandemic occurred at a moment of media fragmentation and populist ascendancy and produced a wealth of claims that were propagated, debunked, and contested, placing misinformation at the center of many public debates. The stakes were high: personal and policy decisions about masking, social distancing, and vaccine uptake had life-and-death implications.
The implications of the case study are limited in place and time. Although the coalitions as described are constituted transnationally, COVID policies were made at the national level, and contestation was waged in the United States primarily with domestic audiences in mind. Responses to the pandemic were often politicized, and messages put out by scientists, public health officials, and journalists warning the public against the threat of misinformation were challenged online and incorporated into the partisan narratives and political agendas prevalent at that moment. Because the pandemic foregrounded scientific and health expertise, the case might not generalize to contested episodes about other issues. Additionally, the evolution of the public discourse and the changing balance of power among the coalitions mean that these dynamics were subject to change. The developments covered here represent only a snapshot of a longer process.
Content
The most vehement contestation typically centers on the truthfulness or falsity of the content of information. In making truth judgments, incumbents weigh claims against evidence produced and assessed by credentialed experts from nonpartisan institutions, including government agencies, universities, and the private sector. Resistance to incumbent judgments about content appears in several forms. Skeptics, pointing to past episodes of scientific bias or government overreach and malfeasance, are open to a wider range of views than those representing an official consensus. Challengers reject the authority of incumbents to evaluate information and instead promote countervailing discourses based on different evidence, or alternative interpretations of the evidence, to rebut official claims.
When the COVID-19 pandemic arose, government agencies and academic researchers produced guidance based on scientific studies, much of it provisional and subject to change. These findings gave impetus to governments at various levels to close businesses and schools, mandate mask wearing and social distancing, and assess penalties for violating those rules. It did not take long for dissenting messages to circulate on social media, sometimes in concert with street protests. Although scientific researchers studying the virus possessed the wherewithal to rebut these claims directly, others—including local authorities, social scientists, and journalists—conveyed this information secondhand based on trust in the scientific methods and expertise of the practitioners who produced the research (Kreps and Kriner Reference Kreps and Kriner2022). When RNA-based vaccines were produced, anti-vaccination groups circulated misleading claims based on questionable interpretations of the evidence, such as from the VAERS database of vaccine side effects (“Posts Continue” 2022). Governments, fact-checkers, tech companies, and researchers took note and sought to counter their messages; for example, dubbing a group of anti-vaccine accounts the “disinformation dozen” (Center for Countering Digital Hate 2021).
Yet the public availability of data to both incumbents and challengers meant that nonscientists were able to marshal their own interpretations and cast doubt on scientific expertise. Given the provisional and rapidly accumulating nature of the data on COVID, the consensus among credentialed experts shifted over time. Debates about what counted as evidence and contested interpretations of that evidence led to public confusion on a wide array of COVID-related topics (Nagler et al. Reference Nagler, Vogel, Gollust, Rothman, Fowler and Yzer2020). For the most part, contestation around content was based on competing interpretations of existing scientific data, which enabled those with less political or cultural influence to enter the fray and elicit engagement by incumbents on mainstream platforms.
Contestation around supposed COVID misinformation counterposed actors with unequal resources. Incumbents included most of the scientific and medical establishment, which had access to raw data on case numbers, hospitalizations, and deaths and could conduct large-scale epidemiological studies, in addition to their allies in government and the media. Skeptics scrutinized their claims and questioned their interpretation of data (Yglesias Reference Yglesias2020). Among challengers, a small number of doctors and scientists conducted their own experiments or promoted alternative treatments and found support among a political fringe (Huang Reference Huang2021; Piller Reference Piller2021). Libertarian-leaning public health experts in the skeptical camp did not endorse challengers’ claims but criticized incumbent efforts to delegitimize them or suppress their dissemination (Wen Reference Wen2022).
The so-called lab leak theory, about the accidental release of the virus from the Wuhan Institute of Virology in China, was initially deemed an instance of misinformation by incumbents. Public health experts and epidemiologists published studies and made statements asserting that the virus spread from the Wuhan live animal market and labeled the lab leak hypothesis a conspiracy theory (Stolberg and Mueller Reference Stolberg and Mueller2023). Under pressure, Facebook and Twitter removed or demoted corresponding posts (Lima Reference Lima2021). Yet the evidence for either scenario was always circumstantial, and proponents of the market theory who lacked firsthand data, whether from media, politics, or academia, were reliant on the same publicly available data as skeptics and challengers. Skeptics urged caution and argued that the available evidence did not merit squelching debates about a possible lab leak. Challengers went further, accusing public health authorities of orchestrating a cover-up (Colton Reference Colton2021). Swayed by the lack of a “smoking gun” from the market and the accumulation of details about experiments being conducted at the Wuhan Institute, the views of many incumbents eventually shifted such that both theories were considered plausible and worthy of further investigation (Stolberg and Mueller Reference Stolberg and Mueller2023).
Attribution
Second, where attribution is not self-evident, incumbents aim to ascertain the identity and origins of the source of misinformation. One facet of this problem is whether it originates domestically or from abroad. The prospect of a foreign-sponsored disinformation campaign takes on national security connotations when the originator is a perceived adversary, especially Russia or China, and it is often depicted as an attack in violation of a state’s sovereignty (Polyakova and Fried Reference Polyakova and Fried2019).
The issue of attribution is less subject to contestation than content because only select actors are in a position to ascertain where information on the internet originates (Schulzke Reference Schulzke2018). Critically, determinations of attribution are frequently based on nonpublic data available only to experts using highly sophisticated techniques. For example, a RAND study concluded, “Russia and China promoted dangerous conspiracy theories about COVID-19 that likely had a negative impact on global public health” (Johnson and Marcellino Reference Johnson and Marcellino2021). These findings relied on the analysis of 240,000 articles using “latent Dirichlet allocation and OPTICS clustering.” Reports on the coordination of foreign disinformation were produced by data analytics firms like Graphika, nonprofit organizations like Atlantic Council’s Digital Forensics Lab, and academic centers like the Stanford Internet Observatory (Bandeira et al. Reference Bandeira, Aleksejeva, Knight and Le Roux2021; Bush Reference Bush2020; Nimmo, Herbert, and Cheng Reference Nimmo, Hubert and Cheng2021). Technology companies like Microsoft and Meta and specialized offices within intelligence agencies selectively disclosed the discovery of foreign-sponsored information operations without making their data or methodology publicly available (National Intelligence Council 2021). Officials relied on studies like these, as well as governmental sources, when warning the public about the specific threat of “foreign adversarial disinformation” on COVID (“Briefing” 2020).
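To make the cited techniques concrete, the following is a minimal sketch of the kind of pipeline the RAND study names: latent Dirichlet allocation to represent articles as topic mixtures, followed by OPTICS density-based clustering in topic space. It uses scikit-learn; the toy corpus, the number of topics, and all parameters are illustrative assumptions, not the study’s actual data or settings.

```python
# Sketch: LDA topic modeling + OPTICS clustering (illustrative only).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.cluster import OPTICS

# Hypothetical stand-in corpus; the study analyzed ~240,000 articles.
docs = [
    "vaccine conspiracy theory spreads online",
    "public health officials debunk vaccine claims",
    "state media amplifies virus origin conspiracy",
    "researchers trace coordinated posting activity",
    "virus origin conspiracy shared by state accounts",
    "officials warn about coordinated disinformation",
]

# 1. Bag-of-words representation of the articles.
counts = CountVectorizer().fit_transform(docs)

# 2. Latent Dirichlet allocation: each article becomes a
#    distribution over latent topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_weights = lda.fit_transform(counts)  # shape: (n_docs, n_topics)

# 3. OPTICS: density-based clustering of articles in topic space.
#    A label of -1 marks articles treated as noise rather than
#    being forced into a cluster.
labels = OPTICS(min_samples=2).fit_predict(topic_weights)
print(labels)
```

The design point this illustrates is why such findings resist lay rebuttal: each stage involves discretionary parameter choices and intermediate representations that are meaningless without access to the underlying corpus.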
Another question requiring exclusive forms of expertise is whether humans or bots lie behind social media misinformation. Technology companies, although inclined to treat controversial content permissively, nonetheless enforced policies that prohibited “coordinated inauthentic behavior” and mandated its removal from the platform. Facebook and Twitter periodically blocked accounts they identified as bots, sometimes as part of state-backed campaigns, but did not share the methods used to make this determination (Gleicher Reference Gleicher2019; Twitter 2021). State Department officials warned about Russian “covert and coercive malign influence campaigns” early in the pandemic (Glenza Reference Glenza2020) and Russian disinformation about vaccine side effects at a later stage (Barnes Reference Barnes2021).
Critics objected in part as a function of the opacity involved in the production of attribution claims. Skeptics questioned whether revelations of state-sponsored or bot-based campaigns were consistent or thorough, fearing that governments misidentify or selectively publicize disinformation campaigns that identify foreign adversaries or are especially damaging to incumbent authorities (Boyd-Barrett Reference Boyd-Barrett2019). Others critiqued incumbents’ assertions of the greater danger of foreign over domestic misinformation (Zhang Reference Zhang2021) or argued that partisan demand for Russian disinformation was the problem, rather than its supply (Keating and Schmitt Reference Keating and Schmitt2021).
Sometimes other incumbents with access to raw data uncover mistakes made by others. When Hamilton-68, a think tank devoted to exposing Russian disinformation, misidentified tweets from US-based conservative accounts as Russian bots, Twitter’s chief safety officer privately objected, writing, “I think we need to just call this out on the bullshit it is” (Soave Reference Soave2023). Notably, he did not make his objection public: it was only revealed when Elon Musk disclosed it via the “Twitter Files” in an effort to undermine the company’s previous leadership (Taibbi Reference Taibbi2023).
Challengers, who sometimes find themselves accused of amplifying foreign disinformation, reject the premises of those claims but do not have access to the data or methods used to establish attribution. Unable to rebut such accusations directly, they resort to asymmetric tactics, impugning the motives of their incumbent accusers or accusing them of spreading “fake news,” appealing only to a narrow and partisan segment of the population (Desiderio Reference Desiderio2020).
Prevalence
A third dimension of misinformation is prevalence. Incumbent discourse tends to depict misinformation as widespread, especially in comparison with its circulation before the digital age (Timberg and Dwoskin Reference Timberg and Dwoskin2017; United Nations 2021; Vosoughi, Roy, and Aral Reference Vosoughi, Roy and Aral2018). As with attribution, only actors with sufficient resources and expertise such as academic researchers, tech companies, and governments are in a position to identify and quantify (what they deem to be) misinformation, whereas ordinary users and observers are unable to render judgments about data they cannot access or are unable to analyze. Claims of a COVID-19 “infodemic” rested on the supposed volume of misinformation about the virus, based on studies of large-scale social media data (Evanega et al. Reference Evanega, Lynas, Adams, Smolenyak and Insights2020; Kouzy et al. Reference Kouzy, Jaoude, Kraitem, Alam, Karam, Adib, Zarka, Traboulsi, Akl and Baddour2020), as well as anecdotal impressions (Ghebreyesus Reference Ghebreyesus2020).
Skeptics counsel for greater uncertainty, arguing that misinformation and related phenomena such as rumors and conspiracy theories were rampant before the digital age and do not necessarily increase over time (Scheufele, Krause, and Freiling Reference Scheufele, Krause and Freiling2021; Uscinski et al. Reference Uscinski, Enders, Klofstad, Seelig, Drochon, Premaratne and Murthi2022a). They also criticize what they see as incumbents’ tendency to seek predetermined conclusions about scale and warn that bias can shape interpretations of the data (Altay, Berriche, and Acerbi Reference Altay, Berriche and Acerbi2023; Uscinski, Littrell, and Klofstad Reference Uscinski, Littrell and Klofstad2024). This was the case with the “infodemic,” which skeptics argued was based on limited evidence and faulty premises (Krause, Freiling, and Scheufele Reference Krause, Freiling and Scheufele2022; Simon and Camargo Reference Simon and Camargo2021). Challengers cannot easily perform their own data analysis at the same scale to rebut the interpretations of credentialed experts. Rather than challenge claims of the prevalence of misinformation, they instead flip the script by asserting that it is the incumbents who are responsible for spreading misinformation (Carlson Reference Carlson2021b).
Consequences
As with attribution and prevalence, claims about consequences may require technical knowledge or access to exclusive data, but like with content, they are subject to dispute based on interpretations of the data and normative ideas of what is consequential. Incumbent arguments rest on democratic theory, referencing the importance of citizens being able to access correct information to participate responsibly in politics and make informed decisions (Ecker et al. Reference Ecker, Lewandowsky, Cook, Schmid, Fazio, Brashier, Kendeou, Vraga and Amazeen2022). By this logic, misinformation causes “information pollution” that impedes the acquisition of facts (Wardle and Derakhshan Reference Wardle and Derakhshan2017). For example, false beliefs about the spread of COVID-19 or the efficacy of the vaccine were associated with ill-advised public health behaviors (Roozenbeek et al. Reference Roozenbeek, Schneider, Dryhurst, Kerr, Alexandra, Recchia, Van Der Bles and Van Der Linden2020). And although there was disagreement about the effects of specific disinformation operations such as the 2016 election or Brexit, there was a broad consensus among incumbents that misinformation can negatively influence the outcome of elections (Corbet Reference Corbet2022; Cybersecurity and Infrastructure Security Agency 2022).
In some instances, intelligence agencies and tech companies may have access to data that can connect specific actions, such as a hacking operation or disinformation campaign, to an outcome like voting or changes in online behavior. In these cases, information about attribution and prevalence and, when applicable, data on private online behavior are necessary to show cause and effect. But this information is not sufficient, because demonstrating causality requires logical inference in addition to data collection and is vulnerable to critique on its own terms. As such, consequences fall into a category in which incumbents have an advantage, but their assertions remain contestable based on interpretation of observable evidence.
Dissenting voices may challenge incumbents’ claims on the consequences of misinformation directly, calling their interpretations into question or arguing that harms are speculative or unproven (Adams et al. Reference Adams, Osman, Bechlivanidis and Meder2023; Farkas and Schou Reference Farkas, Schou, Terzis, Kloza, Kużelewska and Trottier2020). Skeptics argue that citizens have never been well informed, yet democracy perseveres, or that there are more fundamental threats to democracy than misinformation (Altay, Berriche, and Acerbi Reference Altay, Berriche and Acerbi2023; Williams Reference Williams2023). Others highlight the risk that overreacting to perceived misinformation may have unintended consequences when it comes to free speech or scientific discovery (Freiling, Krause, and Scheufele Reference Freiling, Krause and Scheufele2023). When it came to the pandemic, scholars pointed out that trust in institutions or psychological traits were more important than misinformation when it came to believing false information or engaging in antisocial behaviors (Ternullo Reference Ternullo2022; Uscinski et al. Reference Uscinski, Enders, Klofstad and Stoler2022b).
Meanwhile, challengers, who benefit from the spread of certain kinds of misinformation, tend to elide the question of whether misinformation is harmful and instead (instrumentally and selectively) embrace free speech principles (Missouri Attorney General 2022). Challengers also blamed government authorities for social problems stemming from the pandemic, asserting their own causal claims about consequences (Fox News 2021). Critically, no technical expertise or data analysis is required to engage in these normative arguments.
Policy
The domain of policy, that is, how to respond to misinformation, reveals no consensus among incumbents. Numerous academic papers were produced about the spread of misinformation, some using viral analogies, to aid policy makers in their efforts to convince the public to comply with public health measures (Kouzy et al. Reference Kouzy, Jaoude, Kraitem, Alam, Karam, Adib, Zarka, Traboulsi, Akl and Baddour2020; Lewandowsky and Van Der Linden Reference Lewandowsky and Van Der Linden2021). In efforts to combat COVID-related misinformation, there were disagreements about when and how to limit its circulation, which authorities should be responsible, and how effective any measures would be (Krishnan et al. Reference Krishnan, Gu, Tromble and Abroms2021; Levine Reference Levine2021; Sell et al. Reference Sell, Hosangadi, Smith, Trotochaud, Vasudevan and Gronvall2021; Simpson and Connor Reference Simpson and Conner2020). These divisions cut across countries, government agencies, and sectors. Skeptics tend to look askance at interventions to “correct” misinformation, fearing that they would increase distrust in science and result in censorship or self-censorship (Freiling, Krause, and Scheufele Reference Freiling, Krause and Scheufele2023; Shir-Raz et al. Reference Shir-Raz, Elisha, Martin, Ronel and Guetzkow2023). Insofar as they support proactive policies, they tend to favor minimally intrusive measures that emanate from civil society, such as media literacy training and academic research (Bateman and Jackson Reference Bateman and Jackson2024; Perini and Schie Reference Perini and van Schie2024).
In the domain of policy, challengers engage on equal footing with incumbents. Challengers oppose incumbent initiatives to restrict or counter misinformation on the grounds that elites are inherently untrustworthy and intent on stifling opposing speech (“Censorship-Industrial Complex” 2023; Hendel Reference Hendel2021). When it comes to proposed regulation of online content, they invoke alternative principles but ones that also resonate with democratic values, elevating individuals as citizens who deserve to know the truth but are often impeded by those in power (Carlson Reference Carlson2021a). Divisions within the incumbent camp aided challengers’ arguments that any regulation that might disadvantage them would also be damaging to the public at large.
Conclusion: The Protean Politics of Misinformation
This article has presented a framework to make sense of the sudden emergence of misinformation as a social and political problem a decade ago, followed by an intermittent but sometimes concerted backlash against the dominant actors and their claims. These contests are frequently subsumed into a simple left–right ideological binary, but such an approach overlooks the broader and more consequential struggles over knowledge and authority. This article’s framework, by disaggregating differing approaches to misinformation and the components of misinformation itself, helps make sense of the discursive contests that play out in the public sphere. The division into coalitions maps onto a pro-establishment/anti-establishment political cleavage, rather than the more conventional left–right division (Uscinski et al. Reference Uscinski, Enders, Seelig, Klofstad, Funchion, Everett, Wuchty, Premaratne and Murthi2021), with the addition of one faction (skeptics) that is ideologically aligned with the pro-establishment camp but epistemologically more agnostic. Although the framework addresses stances on misinformation, the tripartite division also reflects perspectives on other contentious matters central to democracy, including race and identity, science (evident in the COVID debate), and foreign policy.
This framework can be applied to these (or other) issue areas to ascertain how coalitions assert or contest misinformation claims. To do so, it is important for researchers to identify who is labeling a particular claim or line of thinking as misinformation and what authority—scientific, institutional, or ideological—they draw on to do so. They should examine who challenges that label, noting whether they provide counter-evidence, appeal to alternative sources of credibility, or frame the label itself as a tool of political control or censorship. Another analytic move is to determine whether actors signal allegiance to specific political factions or whether they present their arguments as neutral, grassroots, or “common sense.” This will enable an analysis of how these dynamics play out across different domains (content, attribution, prevalence, consequences, and policy) and media ecosystems and how they reflect broader ideological commitments regarding truth, expertise, and critical inquiry.
This article yields three main insights into contention about misinformation. First, claims and counterclaims about misinformation are proxy fights over power, which accounts for why contention over social media posts can become so virulent. At issue is not simply whose facts are right but also who should have the authority to decide which facts are right and, ultimately, who has the right to engage in the public sphere. The stakes become apparent when incumbents advocate for the spreaders of misinformation to be downgraded in social media feeds or de-platformed and when challengers target incumbents through ad hominem attacks, conspiracy theories, and intimidation.
Second, clashes that are apparently political are nonetheless fought in epistemological and methodological trenches. At issue is what kinds of knowledge are best suited to make claims about misinformation. Incumbents assert authority based on training and standardized procedures. Challengers favor populist sources of knowledge—the idea that laypeople can access the truth—over “elitist” scientific evidence or promote the claims of alternative or uncredentialed “experts.” Yet on some questions, notably in the domains of content and policy, judgments come down to subjective interpretation based on normative principles. Incumbents have at times arguably strayed beyond their remit, making claims about misinformation even when such a verdict was not warranted by available evidence. Although it is tempting to dismiss the motives and methods of challengers, in instances in which incumbents offered certainty where humility was called for, as in the “lab leak” theory of COVID-19, skeptics and challengers exposed the dangers of groupthink and lack of curiosity among the scientifically credentialed. Their success was evident when incumbents were forced to admit they had overreached. Similarly, when it comes to policy, value judgments come into play as much as evidence, because there is no objective way to weigh the trade-off between unbridled speech and real or conjectured societal harms.
Third, although contenders may act on sincerely held philosophical principles, they also have self-serving motives when it comes to how the circulation of misinformation affects them. Incumbents have an interest in minimizing unwarranted criticism not only because it could degrade the quality of democratic discourse but also because it threatens to undermine their cultural and political power and their ability to enact their favored policies. As the frequent targets of illiberal populists, incumbents have an interest in accusing their accusers of spreading disinformation, even when the evidence for such accusations is uncertain (Uscinski Reference Uscinski2023). By the same token, illiberal challengers are also motivated by self-interest: they stand to benefit from (particular forms of) misinformation. Most egregiously, insurgent opponents of liberal democracy have disseminated false claims to cast doubt on the integrity of democratic elections because they believed it would help them win power (Starbird, DiResta, and DeButts Reference Starbird, DiResta and DeButts2023).
If misinformation contention is a proxy fight to establish authority in the new information dispensation, what is the state of that authority? Credentialed expertise does not automatically imbue individuals or organizations with legitimate authority, yet cultural power endures. Incumbents, for all the populist hostility they provoke, have been able to convince the majority of the US public that misinformation is widespread and, as recently as 2023, that the government or tech companies should restrict false information online (St. Aubin and Liedke Reference St. Aubin and Liedke2023). In Europe, the Digital Services Act represents a consensus among EU member states to subject large online platforms to outside scrutiny.
In the United States, however, challengers have made serious inroads in eroding the position of incumbents. The clarion call that experts are intent on censoring their critics has resonated despite widespread acceptance that misinformation is a problem. Incumbents are divided as they debate the legal and ethical nuances of remedies such as algorithmic transparency, labeling, and media literacy training. Skeptics offer potentially constructive criticism aimed at preserving their vision of democracy. Meanwhile, challengers feed on the prevailing public distrust of elites to attack incumbents and appeal to underdog sentiments. This resistance has had concrete political consequences: first, a retrenchment by technology companies in their efforts to identify, label, and remove “harmful” content (Nix and Ellison Reference Nix and Ellison2023); and second, the construction of rival institutions to authoritatively assess claims from an illiberal perspective.
Perhaps the greatest triumph of challengers is their impact in pressuring large social media companies, which had previously rhetorically embraced the notion that misinformation was harmful, to change course. Before 2016, companies such as Facebook, Twitter, and YouTube allowed users to post on their platforms with only minimal restrictions. Following criticism over Russia’s disinformation campaign against Hillary Clinton, they made concessions by embracing fact-checking and more aggressively removing content deemed harmful according to shifting criteria. Yet their actions were limited due to First Amendment protections of most speech in the United States (Wu Reference Wu2018), and the expenses associated with content moderation ate into their profits. By 2023, facing reduced investor enthusiasm and enjoying political cover from Republicans in Congress, they began allowing previously prohibited categories of content to circulate without restriction and sharply reduced their trust and safety teams and partnerships with fact-checking groups; Mark Zuckerberg even disavowed Facebook’s past practices, which he described as “censorship” (Frenkel and Isaac Reference Frenkel and Isaac2025; McCorvey Reference McCorvey2023; YouTube Team 2023). Meanwhile, the EU’s efforts to regulate content face resistance from US-based companies targeted for enforcement (Windwehr Reference Windwehr2025).
As the Republican Party has embraced illiberal populism, it has invested in building alternative institutions to proactively disseminate its ideas beyond its core supporters, via social media platforms, fact-checking groups, think tanks, and nonprofit organizations.Footnote 17 This project has enjoyed support from party officials and donors and is part of a larger transnational network of far-right populists who collaborate to counter liberal ideas (Abrahamsen et al. Reference Abrahamsen, Drolet, Williams, Vucetic, Narita and Gheciu2024; Garamvolgyi and Walker Reference Garamvolgyi and Walker2023). Alongside this effort, challengers have taken a more direct path toward winning the public discursive war: gaining control over the state and wielding its powers for political gain. With Trump’s return to power, this project appeared poised for success. One of his Inauguration Day executive orders echoed familiar far-right discourse by ordering the federal government to favor free speech and end “censorship” (White House 2025). Meanwhile his political appointees and aggressive treatment of disfavored media point to a concerted assault on the free press (Rutenberg Reference Rutenberg2025).
Even in the face of changing political conditions, the discourse coalitions of incumbents, challengers, and skeptics rest on sociologically rooted identities and ideological beliefs, making them likely to endure in some form. Yet the size of the coalitions may fluctuate as new actors emerge and others cease their activities. Challengers may become emboldened and therefore more vocal and vehement in their discourse, whereas incumbents may be deterred from speaking in support of an unpopular or unsanctioned idea. Tactics may change, as state and financial actors rely more on coercive means to advance their policy and political objectives rather than seek to shape opinions through rhetoric. Finally, the capture of US state institutions by political forces aligned with the challengers portends a future in which the production of authoritative “knowledge” is based on ideological and nonscientific criteria (Sun and Weber Reference Sun and Weber2025). Though deprived of resources and political power, incumbents, confronted with what they view as the Trump administration’s propaganda, will be motivated to continue calling out misinformation to thwart efforts to undermine democracy, with the goal of eventually reclaiming their authority. The balance of power has changed, but the discursive struggle continues.
Acknowledgments
Previous versions of this article were presented in the “Europe in the World” seminar series and the Robert Schuman Centre seminar series at the European University Institute, and at the 2023 European Workshops in International Studies. I would like to thank Stephanie Hofmann, Elizaveta Gaufman, Eva Johais, Veronica Anghel, Waltraud Schelkle, Catherine Hoeffler, Maria Giulia Amadio Viceré, Mert Bayar, three anonymous reviewers, and colleagues at the University of Washington’s Center for an Informed Public.