From predictions to reality: Digital safety in a year of change
Beth Hepworth, Client Director and Reem Awad, Senior Digital Investigations Analyst
“The only thing we have to fear is fear itself”
So, it wasn’t quite as neat as the Roosevelt quote (nothing ever is), but election interference did feel like the ultimate perception hack of 2024. It was always going to feel that way in a calendar year in which some 90 major elections saw more people voting than in any year ever recorded. But while the drumbeat threat of disinformation, deepfakes and distortion dominated the headlines, the reality was that election preparedness prevailed, or adversaries were not as effective as feared (or both).
Whether election interference fears helped focus public attention and scepticism (good!) or prompted a proactive hyper-focus by governments and election integrity stakeholders (also good, as long as it’s not too zero sum), we saw elections in two of the world’s largest (and often controversial) democracies, the US and India, pass without significant election integrity challenges; a handful of major snap elections (often subject to online turbulence given their sudden nature) pass without incident; and plenty of pre-emptive intelligence and signposting serve to reassure citizens and disrupt nascent efforts (really good).
It isn’t all sighs of relief, though: Romania’s Constitutional Court annulled the results of its presidential election, citing significant foreign interference. While this is not the first annulment of election results globally, Romania’s case is unique and sets a precedent in addressing foreign interference, especially via social media. It takes us into a grey area of how to effectively call out and evidence interference.
Online election integrity efforts have focused on safeguarding the process of democratic voter participation, the safety of candidates and election officials, the integrity of voter information, and the veracity of the results and government transitions. Meanwhile, the online election ‘conversation’ has been dominated by local and global ‘wedge issues’ that became major election campaign platforms.
Weaponisation of societal issues is certainly not a new trend, but the way in which such issues have been seeded, amplified, engaged with, and propagated online in 2024 has been truly remarkable. Whether via covert infrastructure (pseudo-media, fake accounts and dark PR companies) or outwardly overt online assets (paid influencers, disgruntled journalists and academics, or even political candidates themselves), both domestic and foreign actors have fanned the online flames that divide communities and incite polarising hostility, causing more problems than fact-checking can ever solve for those pursuing authenticity and integrity online. Such efforts also continue to support the rise of populist governance, which we have witnessed in transfers of power and in the difficulties incumbent governments almost universally faced in 2024 election outcomes.
The important question for the team at PGI is: how will digital spaces be altered by the power shifts that have resulted from election outcomes in such a bumper year of changes and challenges to government?
Those recognising the role the online environment can play in winning or losing national support may seek to harness that power. Digital authoritarianism, a system established to exert comprehensive control over an information environment, is likely to become further entrenched in societies built on authoritarianism. As such, we can expect an increasingly hostile environment for journalists, activists and counter-narratives to authoritarian or right-wing entities, and an uptick in targeted harassment (e.g. doxxing) or online silencing of those who speak in opposition.
Unfortunately, 2025 will not be the year that sees the sudden end of social media as a disinformation tool. We enter the fourth year of the Russia-Ukraine war and the second year of the Israel-Hamas war, and can count a handful of other significant and persistent conflicts where online information and disinformation blur the boundaries with the ‘ground truth’ of the reality and outcome of those conflicts.
Potentially, it will be a robust year for democratic outcomes, but with more than a hint of concern about the way in which power brokers seek to use the information environment, and about the resources available to ensure it remains authentic and safe. So our team cannot rest yet; we will still have much to do in 2025.
Our Digital Investigations team helps clients navigate the digital threat landscape, providing actionable intelligence to help understand, mitigate and respond to platform risks and novel threats. Talk to us if you want to find out how we do it.
We began this year knowing it was going to be a significant one for digital risk and digital safety: an unprecedented number of elections and brand new online safety legislation under implementation, all taking place against a backdrop of both existing and new conflict and war.
Working within the Trust and Safety industry, PGI has had its busiest year to date in 2024, both in our work with clients and in our participation in key conversations, particularly around the future of regulation, the human-AI interface, and child safety.
At their core, artificial systems are a series of relationships between intelligence, truth, and decision making.