Fuelling the flames of misinformation - Digital Threat Digest
PGI’s Digital Investigations Team brings you the Digital Threat Digest, SOCMINT and OSINT insights into disinformation, influence operations, and online harms.
I remember studying susceptibility to crime during my Crime Studies postgraduate degree. According to research, many factors, ranging from cognitive biases and emotional vulnerabilities to social environments, influence a person’s likelihood of committing a crime. Interestingly, these same types of factors also influence how vulnerable people are to misinformation and extremist beliefs. In today’s digital world, understanding why some people fall for misinformation or become drawn to extremist ideas has never been more important.
In times of global crisis, fear, anxiety, and anger are powerful drivers in the spread of misinformation. People seek clear answers in a time of chaos, so they’re more likely to accept simplistic or extreme explanations, even if they’re false. The conflicts in Ukraine and Palestine are obvious examples – the fear of war spreading, economic fallout, and the emotional toll of seeing death and destruction can make people more likely to believe misinformation that aligns with their preconceived ideas. When both conflicts broke out, social media was inundated with fake news and accounts, much of which couldn’t be fact-checked in such a short and chaotic period. It was only later, when the online information environment had stabilised and fact-checkers had time to verify content, that a more accurate picture of events began to emerge.
Motivated reasoning—where people process information in a way that satisfies their emotional needs—is another psychological factor that makes it hard to break free from misinformation. This is especially true in politically polarised environments. For example, in the aftermath of the US presidential election in 2020, the "Stop the Steal" movement gained traction among those anxious or angry about their candidate losing. Rather than critically evaluating information, people clung to the voter fraud narrative because it eased their emotional distress. A similar pattern is seen with conspiracy theories surrounding the Israel-Palestine and Russia-Ukraine conflicts, where people believe what makes them feel justified, secure, or aligned with their group.
Media literacy is another key factor. Here at PGI, I feel confident enough to say that we’re less susceptible to misinformation simply because our jobs consist of identifying and understanding fake news and its impact online. But the average person doesn’t always think critically about the sources they’re seeing. In the end, just like with criminal behaviour, there’s no single reason people believe misinformation or extremist ideas. It’s a mix of cognitive biases, emotional vulnerability, social influences, and a lack of critical thinking skills. During times of acute crisis, these factors are even more pronounced.
More about Protection Group International's Digital Investigations
Our Digital Investigations Analysts combine modern exploitative technology with deep human analytical expertise covering the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analyst team has a deep understanding of how various threat groups use social media and follows a three-pronged approach focused on content, behaviour, and infrastructure to assess and substantiate threat landscapes.
Disclaimer: Protection Group International does not endorse any of the linked content.