How misinformation and bias shape what we believe
Alex Edmans discusses the biases that allow misinformation to thrive
Throughout the past several months of the race for the US presidency, psychological biases have been on full display, underscoring the need for both voters and business leaders to understand how misinformation makes us vulnerable to falsehoods.
London Business School Professor of Finance Alex Edmans, author of May Contain Lies, has spent years studying the psychological biases that allow misinformation to thrive, especially during tense election cycles. His insights in this highly acclaimed book, released in May this year, shed light on why people are so susceptible to misinformation, particularly when emotions run high, and how individuals — including business leaders — can become more resilient in their consumption of information.
Confirmation bias: seeing only what we want to see
Confirmation bias is one of the biggest obstacles to objective thinking during an election. As Alex explains: “We have our preferred candidate, and we will interpret all evidence that we see as supporting that candidate, even if the evidence is close to non-existent.” This bias leads people to embrace information that aligns with their beliefs, regardless of the quality or credibility of the source.
In a race as polarised as Harris vs. Trump, with both camps hammering their messages across all media platforms, confirmation bias only deepens the divide, making it harder for voters to accept information that doesn’t align with their views.
This bias, Alex says, is particularly visible this cycle in Trump’s claims of a “stolen” 2020 election, which he lost to President Joe Biden. Those claims still resonate with many Republicans, despite a lack of credible evidence. Alex points out that this bias is so powerful that people are willing to believe even outlandish claims when they align with their existing beliefs. These narratives have the potential to sway undecided voters, particularly those who already distrust the political establishment and might see a candidate as a “victim” of a rigged system.
The 'illusory truth effect': repetition builds belief
Another key bias Alex highlights is the “illusory truth effect”, which reinforces beliefs through repetition. In an election environment, where adverts, social media posts and news cycles repeat messages, even the most dubious claims can start to sound believable.
“The illusory truth effect means that people are more likely to believe repeated claims, even if they’re extreme,” Alex explains.
The US Department of Justice’s recent crackdown on websites allegedly run by Russian operatives demonstrates just how potent this effect can be. These websites, which mimicked major news outlets such as Fox News and The Washington Post, were designed to push pro-Russian, anti-Ukraine content. They also leveraged AI, social media influencers and paid ads to make false narratives more plausible.
Alex explains that this repetitive cycle of content amplifies the illusion of truth, fostering belief in claims that might otherwise be dismissed as absurd. “Sometimes, the more extreme a claim sounds, the more likely people are to believe it,” he notes. This “you couldn’t make it up” effect makes even fringe narratives seem credible.
The high cost of misinformation
For business leaders, misinformation’s influence extends beyond elections, affecting workplaces and corporate cultures. While each executive may only cast one vote, they play a larger role in shaping how their organisations process and discuss information. Alex’s book suggests that creating a psychologically safe culture, where colleagues are encouraged to speak up and respectfully challenge each other’s views, can combat misinformation by ensuring that people hear both sides of an issue.
This may well be crucial in a highly charged election, where employees may bring their political beliefs — and biases — into the workplace. At US search giant Google, for instance, outspoken employee protests over the conflict in Gaza led to controversial dismissals earlier this year.
Leaders who prioritise a balanced, open culture can help employees question their own assumptions, overcome their own biases, and be willing to hear views that differ from their own. This, in turn, helps foster a healthier, more resilient corporate environment, one where employees are encouraged to think critically and draw on the collective wisdom of their colleagues rather than only listening to what they want to hear.
Social media and the role of AI
Social media and AI-driven tools have created fertile ground for misinformation to spread in today’s digital economy. The DoJ’s actions underscore just how advanced these tactics have become, as foreign operatives leverage sophisticated AI to manipulate US voters. US attorney general Merrick Garland described Russia’s interference as “increasingly sophisticated and accelerated”, presenting a growing threat to election integrity as Americans head to the polls on November 5.
While Alex acknowledges that social media companies have the power to combat misinformation, he’s sceptical about their ability — and incentive — to do so. “False information can be produced faster than it can be checked,” he says.
Moreover, the algorithms that keep users hooked often feed them content that reinforces their beliefs, rather than challenging them. “Social media companies feed confirmation bias — they show us the information we like, because that’s what keeps us glued,” he says.
What’s more, as AI and “deepfake” technology advance, misinformation becomes even harder to detect. This 2024 presidential election cycle has seen abundant use of AI-generated content, according to the Council on Foreign Relations, with some of it so realistic that voters may have difficulty discerning fact from fiction.
An example is when Florida governor Ron DeSantis shared a deepfake of former president Trump appearing to hug and kiss his former chief medical advisor Anthony Fauci — supposedly meant as a jab at Trump’s handling of the Covid-19 pandemic. For Alex, since we can’t rely on social media companies to snuff out misinformation, the solution lies in individual empowerment. This involves encouraging people to develop critical thinking skills that allow them to question information, regardless of its source or apparent credibility.
Tools for spotting misinformation
Alex believes that everyone, from voters to business leaders, can benefit from simple tools to identify and combat misinformation. “The key is to give the same scrutiny to something you like as to something you don’t,” he advises. This approach helps counter confirmation bias, as it forces individuals to examine information objectively, regardless of personal preference.
A practical exercise Alex suggests is to imagine the opposite of a claim you’re inclined to believe. For instance, if a headline claims that “red wine leads to longer life”, imagine instead that “red wine leads to shorter life” and ask how you would challenge that statement. A red wine lover might shoot it down by arguing that other factors are at play: perhaps those who drink red wine are poorer than those who drink champagne or spirits, and that poverty, not the wine, explains the shorter life. Having alerted yourself to wealth as an alternative explanation, apply the same test to the headline you like: could people who drink red wine be richer, with that wealth, rather than the wine, being the real cause of the longer life?
This kind of mental exercise helps neutralise confirmation bias. It makes you more resilient to misleading information by teaching you to apply the same level of scrutiny to all claims.
The road ahead
Looking beyond the US presidential election, Alex warns that misinformation may become an even greater challenge in the age of AI and deepfakes. While AI offers tools for detecting false information by cross-referencing with established data, it can also be used to generate convincing fakes that blur the line between reality and fiction.
“People often think that new inventions are either 100 percent amazing or 100 percent terrible, but the reality is usually somewhere in between,” Alex says. He sees AI as both a tool and a threat, making critical thinking more important than ever.
AI’s potential for misuse means that individuals must take ownership of their media literacy, practising skills that help them distinguish fact from fabrication. For Alex, this means fostering a mindset of healthy scepticism and encouraging a culture of information verification that goes beyond simple fact-checking.
Ultimately, the insights in May Contain Lies are a call to action for leaders, voters and citizens to take control of their own information habits. By recognising our biases, scrutinising information critically and questioning assumptions, we can each play a role in countering the spread of misinformation. As with any election, this approach may be critical for anyone seeking to make an informed, unbiased choice.