Information disorder has proliferated on social media. But there are ways to inoculate yourself against misinformation and disinformation. (Amena Saleh / Wisconsin Watch)

Highlights from the story:
  • Information disorder takes many forms but can be broken down into three categories: misinformation, disinformation and malinformation.
  • Human emotions and worldview make people susceptible to believing and spreading misinformation.
  • People should maintain a healthy media diet with an ever-evolving collection of various sources for local, state and national news, as well as specialized publications for topics like economics or technology.
  • This report is the first in a series from Wisconsin Watch disinformation reporter Phoebe Petrovic done in partnership with the Capital Times, UW-Madison and the National Science Foundation.

A viral TikTok claimed Disney World sought to lower the drinking age to 18. President Biden made outsized claims about job creation. A Twitter user impersonating a pharmaceutical giant announced that insulin was free. Russian agents leaked hacked emails from Hillary Clinton’s campaign.

Each example reflects a type of what has collectively come to be known as “information disorder,” a term encompassing the misinformation, disinformation and malinformation that plague society. Its forms include propaganda, lies, conspiracy theories, rumors, hoaxes, hyper-partisan content, falsehoods and manipulated media, according to those who study the disease.

Journalists and academics around the world have dedicated themselves to examining information disorder. And conservative power brokers have begun to attack those efforts, often accusing them of suppressing free speech.

University of Wisconsin-Madison researchers are leading a National Science Foundation-funded project to safeguard democracy by limiting information disorder’s scope and impact. 

As part of the endeavor, Wisconsin Watch, the Capital Times and Snopes are among the organizations that now have dedicated reporters covering how information disorder infects the body politic and destabilizes democracy — and how we can protect ourselves. 

Everyone is susceptible and could pass it along to others. But knowing what to look for and what to do about it can build resilience.

Here’s a guide to help diagnose information disorder, as well as some potential remedies.

Types of information disorder

The concept of information disorder comes from Claire Wardle, a researcher who co-founded First Draft and now leads the Information Futures Lab at Brown University. She developed the term as an alternative to “fake news,” a phrase that at one point described the Russian disinformation campaign during the 2016 presidential election but was co-opted by former President Donald Trump to dismiss legitimate news stories.

Credit: Claire Wardle & Hossein Derakhshan

She identifies three major strains of information disorder: 

Misinformation: False or misleading content spread by people who don’t know it’s false or misleading, or who share it without intent to cause harm. First Draft notes that as people share disinformation without realizing it’s false, it can become misinformation. Research shows people are more likely to share misinformation that aligns with their worldview or signals their belonging to an in-group.

Disinformation: Deliberately false content designed to deceive or to harm. According to First Draft, three main factors motivate people to create and spread disinformation: money, political influence or a desire to cause chaos.

Malinformation: Factual content spread with the intent to cause harm. It’s often private information spread for corporate or personal interest, such as when someone posts intimate photos of an ex-partner online, an act known as revenge porn.

What it looks like 

First Draft further breaks down information disorder into seven common forms. 

  1. Fabricated content: Completely false or made-up content — videos, photographs, stories — created to deceive or do harm.
  2. Manipulated content: Genuine information or imagery altered to deceive. 
  3. Imposter content: Content such as websites or posts impersonating real organizations or people.
  4. False context: Genuine information shared alongside false contextual information, such as old stories, or stories from another place, presented as if they’re new and relevant to current events.
  5. Misleading content: Content that may have genuine elements, or a “kernel of truth,” but is framed, recontextualized or reformulated in deceptive ways.
  6. False connection: Headlines, captions or visual elements that don’t support or accurately represent the content, such as clickbait. 
  7. Satire or parody: Joking content that can dupe people into thinking it’s genuine or serious. This becomes more damaging the further it gets removed from its original source.
Credit: Claire Wardle

Josephine Lukito, a professor at the University of Texas at Austin who studies mis- and disinformation on social media, said Russian troll campaigns favored false context and misleading content, often sharing real news stories with sensationalized framing to rile up a particular group.

She recommends searching Google for a particular news story to see whether similar coverage appears elsewhere. If it’s a genuine event, other news outlets will likely have covered it, too.

Lukito said disinformation is also increasingly spread through videos and images, such as doctored screenshots purporting to show a news outlet posting a story, which can be harder to fact-check.

Other imposter content commonly takes the form of websites or social media accounts, said Mike Wagner, a journalism professor at the University of Wisconsin-Madison. Wagner is the lead investigator for the NSF-funded research project in which Wisconsin Watch and the Capital Times are participating. 

“We’ve had misinformation since we’ve had information, and we’ve had people sharing things that aren’t true since they shared things that are true,” Wagner said.

University of Wisconsin-Madison journalism professor Mike Wagner. (Courtesy of Mike Wagner)

In 2022, after Twitter changed its verification rules to allow anyone to buy a blue check, a tweet purporting to be from a pharmaceutical company announcing it had made insulin free went viral. The company’s stock dropped about 4% — although it’s impossible to attribute the drop solely to the tweet. Months later, the company capped the monthly out-of-pocket price of insulin for all consumers.

Although the person behind the imposter tweet said it was obviously satire, it was effective in part because the blue verification check, username and avatar mimicked the company’s. Other imposters may mimic the real account even more closely, with only small departures. The same pattern holds for websites impersonating news outlets or companies, where a slightly different web address is often the tell.

Why we’re susceptible

Research shows that our emotions and worldview play a significant role in what information we believe and share.

A headline, Wagner said, can hit “you in an emotional way that makes it hard for you to be motivated to think about why that might be wrong.”

Anxiety, in particular, enhances belief in disinformation and leads to its spread. When people feel uncertainty or fear, they’re more likely to pay attention to information that resonates with their present emotions. 

That also plays into a psychological concept known as cognitive dissonance: the discomfort felt when confronted with information contrary to one’s beliefs. It can lead someone to reject credible information that threatens those beliefs and instead seek out comforting, even if false, information.

Even when calm, people tend to believe and share information that supports their worldview, especially if it comes from within their own circles, a phenomenon psychologists refer to as “confirmation bias.”

First Draft has compiled several foundational psychological concepts. They include the “third-person effect,” the assumption that others are more susceptible to misinformation than oneself, and “fluency,” the tendency to believe information that is easier to process.

Madeline Jalbert, who studies misinformation and social cognition at the University of Washington, said in a webinar that humans assume information is true by default. And we’re more likely to believe misinformation when it feels easy to process, such as when it features a clean font and crisp audio, when it lacks cues prompting careful analysis, and when it meets some of the basic criteria we instinctively use to evaluate truth.

The “truth criteria” are:

  1. Compatibility: Whether it’s compatible with other things known to be true.
  2. Coherence: Whether it “makes sense.”
  3. Credibility: Whether it comes from a credible source.
  4. Consensus: Whether other people believe it.
  5. Evidence: Whether it has supporting evidence.

How to protect yourself

“In this digital era, people are both consumers of information and producers of information,” Lukito said.

Because of that, she said, it’s as important to consider what you share as what you consume. That means weighing the veracity of content before sharing it.

University of Texas at Austin journalism professor Josephine Lukito. (Courtesy of Josephine Lukito)

Wagner recommends approaching content with this goal: “Learn the truth.”

Repeating that mantra can help people move beyond personal biases, making them more open to the possibility that their “side” might have flaws and the other side “isn’t the devil,” he said.

It also encourages consideration of the content’s other aspects: the author and their motivations, the source, whether the story supports the headline, whether the cited evidence supports the claim, who is sharing it. The Trust Project, which Wisconsin Watch has joined, developed eight “trust indicators” to consider when reviewing a news story.

“We need to constantly be willing to ask ourselves, ‘Why am I believing this? Does this make sense?’ ” Wagner said.

Lukito recommends having a healthy media diet, an ever-evolving collection of various sources for local, state and national news, as well as specialized publications for topics like economics or technology.

It’s important to distinguish between pieces driven by opinion, speculation or fact, Lukito said, and it’s “absolutely important” that these sources of information have a process for verifying it. It’s even better to seek out some of these sources yourself, rather than letting a social media algorithm feed you all your information.

One key technique is “lateral reading,” which involves evaluating the credibility of a source or piece of content by researching its claims across multiple other sites.

We live in a different media ecology than in decades prior, one that puts new responsibilities on individuals, Lukito said.

“This is a new expectation for citizens that we’ve not seen in the past,” she said. It requires energy and labor, but “I certainly think it’s worth it.”

Have you encountered some mis- or disinformation?

Whether online or in real life, we want to hear about it. Wisconsin Watch and the Capital Times will be reporting together and separately on mis- and disinformation in Wisconsin over the coming months, especially with the 2024 presidential election inching closer.

Email tips, questions or feedback to reporters Phoebe Petrovic (disinfo@wisconsinwatch.org) and Erin McGroarty (emcgroarty@captimes.com).



Phoebe Petrovic is an investigative reporter covering disinformation at Wisconsin Watch and a 2022-2023 Law & Justice Journalism Project fellow. As a Report for America corps member from 2019-2022, Petrovic reported, produced, and hosted "Open and Shut," a podcast series co-published with Wisconsin Public Radio examining the power of prosecutors. Petrovic previously worked at WPR as a Lee Ester News Fellow, “Reveal” from the Center for Investigative Reporting as an editorial intern and NPR's "Here & Now" as a temporary producer. Her work has aired nationally on all of NPR's flagship news magazines. She holds a bachelor’s degree in American Studies from Yale University.