“If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth.”
These wise words of Carl Sagan, from his book “The Demon-Haunted World: Science as a Candle in the Dark,” offer a grave warning against the allure of pseudoscience and conspiracy theories. Although UFO sightings or claims that a “deep state” secretly runs the government may seem bizarre or laughable to the general public, conspiracy theories can ensnare large portions of the population and quickly devolve into serious societal, democratic, and public health threats.
Conspiracy theorists believe that events are dictated by powerful, shadowy organizations secretly pulling the strings of world affairs. Psychologists propose that conspiracy theories are appealing coping mechanisms that satisfy social-psychological needs for control and predictability. This dependence, however, leads conspiracy believers to reject or selectively ignore irrefutable evidence contradicting their beliefs. Studies have shown that conspiracy theories also increase the very feelings of paranoia and existential threat that believers are trying to remedy. The cyclic nature of conspiratorial thinking is not only dangerous to individuals but also increasingly widespread. A recent poll suggests that up to half of the US population believes in some form of conspiracy theory. Of twelve theories presented in the poll, the one with the most widespread support was the idea that Lee Harvey Oswald didn’t act alone in JFK’s assassination, believed by 54% of participants.
Generative AI models have made the creation and distribution of disinformation far easier. Advances in deep learning and natural language processing allow users to generate hyper-realistic deepfake images and voice cloning from simple text prompts. This content, ranging from entirely GPT-written science papers flooding Google Scholar to fake audio calls from “Joe Biden” dissuading voters from participating in New Hampshire’s Democratic primaries, can be almost indistinguishable from fact. Conspiracy theorists can leverage the torrents of false information flooding social media to justify their beliefs, which has the potential to erode trust in scientific institutions or influence elections.
Instead of using AI to generate disinformation, researcher Thomas Costello and his collaborators recruited large language models to combat it. They theorized that models like OpenAI’s GPT-4 Turbo are well suited to counter a common tactic called the “Gish gallop,” in which conspiracy theorists bombard opponents with a flood of arguments in a short period of time. Their study examined 2000 participants who believed in a conspiracy related to an important event, such as climate change or COVID-19.
The chatbot, or “debunkbot,” based on GPT-4 Turbo, is trained on a vast database of books and studies and programmed to adapt its responses to the arguments presented by each person and their confidence in the theory. After three rounds of back-and-forth conversation lasting only 8.4 minutes on average, 27.4% of participants began expressing uncertainty about their previously held beliefs. The effect held across a range of theories and persisted for up to two months after interacting with the chatbot.
One suggested reason the strategy was so effective is that the chatbot can process torrents of information far faster than a human while remaining polite; human debaters, by contrast, often grow frustrated and overwhelmed. Although conspiracy theorists are unlikely to seek out a debunkbot consultation voluntarily, a possible application is a rapid fact-checking interface in key news outlets and social media platforms. By allowing easy access to interactive data and counterarguments to overheard claims, this tool might sow seeds of doubt and prevent people from falling down the rabbit hole in the first place.
The COVID-19 pandemic and 2024 US election cycle have highlighted how pervasive and deeply consuming conspiracies can be. Although AI and social media have amplified the reach of misinformation, the chatbot’s method of “tailored persuasion” may be a promising tool to break the rabbit-hole cycle. “Debunkbot” shows us that reliance on facts, open civil discourse, and a healthy dose of skepticism are still our strongest remedies against conspiracies.