Misinformation lost in the 2022 midterms: What was different?
Massive Data Institute Director Lisa Singh joined GU Politics Fellow Katie Harbath, McCourt Institute grantee Leticia Bode and Georgetown Professor Simon Blanchard for a fireside chat about misinformation during the 2022 Midterm Elections.
The lead-up to the 2022 Midterm Elections was fraught with discussions about misinformation, conspiracy theories and the role social media companies play in limiting the spread of “fake news.” However, research from Georgetown experts showed that misinformation did not win this year.
In a fireside chat with McCourt School students, McCourt Institute grantee and Provost Distinguished Associate Professor Leticia Bode argued that platforms and “tech elites” have learned several important lessons since the 2016 and 2020 election cycles and have thus been better able to combat misinformation online. While most Americans were exposed to misinformation during the 2022 election cycle, Bode found that far fewer falsehoods spread online than in past election years, due in large part to better algorithms built to catch misinformation before it spreads.
Katie Harbath, the former public policy director at Facebook and Fall 2022 GU Politics Fellow, added that “after the 2016 election, Facebook attempted to build systems, tools, policies and partnerships to mitigate these risks — and they seem to have worked.”
Addressing bias in algorithms
While social media platforms’ current strategies have worked well to combat misinformation, many McCourt School students expressed concern about how companies plan to address bias in their algorithms.
Harbath underscored the importance of actively addressing bias in algorithms as it becomes evident. “For enforcement, [tech companies] need to continuously make sure that their algorithms are programmed correctly to actively find potentially problematic content. This type of research is still in its infancy,” she said. “Candidates from either side are still struggling to determine what is allowed and what should be deemed misinformation, so that is the first step.”
Professor Lisa Singh added a technical note: many algorithms are trained on cases that are clearly true and clearly false, but once they are put into practice, “there is a much larger middle ground that is often hard for algorithms to parse through.” Because of this, many developers use data from reliable fact-checkers to train their models and adjust them for known political leanings. Singh said that while this is “not an open-and-shut case with an easy solution, it is much more reliable and efficient than individuals fact-checking every claim.”
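To make Singh’s point concrete, the minimal sketch below shows one common way such a system might look: a simple text classifier trained on claims labeled by fact-checkers, with low-confidence predictions routed to human review as the hard “middle ground.” The claims, labels and thresholds are hypothetical, and the generic scikit-learn pipeline stands in for, rather than reproduces, any platform’s actual system.

```python
# Illustrative sketch only: a toy misinformation classifier trained on
# fact-checker labels, with an abstention band for the uncertain "middle ground".
# The claims, labels and thresholds below are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical fact-checked training data: 1 = false claim, 0 = accurate claim.
claims = [
    "Ballots were counted twice in every county",
    "Polls close at 8 p.m. local time in most states",
    "Voting machines secretly changed millions of votes",
    "Mail-in ballots must be postmarked by Election Day",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a standard, simple text-classification baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(claims, labels)

def triage(claim: str, low: float = 0.35, high: float = 0.65) -> str:
    """Label a claim, or send borderline predictions to human fact-checkers."""
    p_false = model.predict_proba([claim])[0][1]  # probability the claim is false
    if p_false >= high:
        return "likely misinformation"
    if p_false <= low:
        return "likely accurate"
    return "uncertain - route to human review"  # the hard "middle ground" Singh describes

print(triage("Millions of ballots were secretly counted twice"))
```

In a sketch like this, the clear-cut cases are handled automatically while the ambiguous middle band still depends on human fact-checkers, which is why Singh frames the approach as more efficient than, but not a replacement for, individual fact-checking.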
Looking ahead to the 2024 Presidential Election
Although misinformation was less prevalent during the 2022 Midterm Elections, many researchers and practitioners are looking ahead to future presidential elections, which have historically been more susceptible to false messaging and foreign interference.
The panelists shared a cautious optimism, which Bode summarized best: “by layering all of our small efforts on top of one another, we just might have a better chance of limiting misinformation in 2024, and that will benefit all of us.”