Someday, the pandemic will be a distant memory, but COVID-19 has caused lasting shifts in society. This goes beyond having a new seasonal disease to deal with — Americans have become fundamentally divided over the sources of scientific and political authority, and many of them no longer trust each other at all. We have become overwhelmed by the sheer volume of information available to us, allowing deceptive content to slip through the cracks.
Even when we defeat the pandemic, the next threat will already be in our midst: well-crafted misinformation and a receptive audience waiting to consume more. Fake news is dangerous for the same reasons that COVID-19 became such a problem. We have been drastically underprepared to fight it; at the same time, we have designed our systems in ways that make them easy to infiltrate.
The World Health Organization has its own definition for the concept of an infodemic: “too much information including false and misleading information in digital and physical environments during a disease outbreak.” The most apparent instance of this is what we have seen over the last 18 months of living in the pandemic — we simply receive far more information than we can process.
The exact same thing happens during elections, major world events, and, well, even minor world events. We live in a society that is built around information, and too much information means we have to prioritize what we think about. That is what biases are: mental shortcuts to save time. Remember that infodemics aren’t just about misinformation. Unclear messaging from the government can be just as disastrous because it contributes to a distrust of authority, regardless of whether the government is “right” or “wrong.”
In general, social media is designed to hold your attention for as long as possible. This did not happen by accident; it is simply a matter of profit. People are psychologically inclined to listen to voices that reinforce their existing beliefs, so they remain happy and engaged when their social media feed does exactly that. This also works in the other direction: because users are funneled toward their own biases, the most extreme thinkers gain an audience where they may have had none before. It’s the perfect positive feedback loop.
Once someone has entered one of these rabbit holes, it is difficult to get back out. For example, even though YouTube gives extra promotion to pro-vaccine content from legitimate sources in search results, a user who is already viewing an anti-vaccine video will still likely be recommended another one. The way social networks boost content can directly harm their communities because it keeps people inside echo chambers. At the end of the day, that means more users stay online, and more money changes hands.
The 2020 infodemic had been waiting for us in plain sight. All the pieces were in place — we just needed a crisis big enough to start the machine. The Internet is built the wrong way, and its problems are obvious, but is there even an alternative? Try as we might, we will never find a magical solution that lets us convince everyone to believe what we believe. But maybe we should stop trying to find the cure and start working on the vaccine. Even if we can’t get people out of misinformation rabbit holes, we can help them avoid falling into them in the first place. Students in particular are among the most vulnerable to deception, but many educators are optimistic that changes can be made.
Some researchers have even created games that teach players how to identify fake news stories. These are all steps in the right direction. The strategy for combating misinformation should focus on the next generation of Internet users because they will define American democracy more than anyone else. Even though infodemics will be a part of our lives forever, we can still contain their impact.
Fighting lies is an uphill battle because, in the real world, profit tends to win over all else. Misinformation has disastrous effects on democracy, public health and our ability to trust each other. It turns us against friends, family and the very concept of authority. The systems we use to communicate are fundamentally flawed — but at the same time, it is hard to imagine what we can do to change them.
Twitter, Facebook and Instagram became successful for a reason: they are remarkably good at keeping our attention. Instead of trying to build a completely new media paradise, free from hate and deception, maybe we can change the current system from the inside out. Stopping infodemics from happening might prove impossible, but perhaps we can develop a sort of herd immunity. If we educate enough young Americans in media literacy and fact-checking, we can ensure a safe future for democracy.