Trust & Safety: A look ahead to 2025
James Smith, Head of Trust & Safety
In 2025, the leading global online safety regimes are set to further clarify their specific requirements and enforcement criteria, and service providers will be obliged to respond.
We are due to see the next phase of the EU Digital Services Act (DSA), which brings greater transparency reporting requirements for online services. These include data collection Reporting Templates and Statements of Reasons (explaining content moderation decisions) from July, and further preparation for harmonised Transparency Reports in early 2026. Trust & Safety teams looking to produce strong statements of reasons will benefit from PGI’s deep attribution capabilities, which highlight complex networked harms designed to evade automated content moderation policies and operations.
The UK Online Safety Act (OSA) has several milestone events in 2025. Ofcom published its Illegal Harms Statement on 16 December 2024, including the Illegal Harms Codes of Practice and guidance on illegal content risk assessments. This means that all in-scope service providers are now ‘on the clock’ to complete their illegal harms risk assessments by March 2025. From April, there will be over 40 different technical and procedural implementation requirements for service providers to deliver.
In Australia, we expect more detail on how the Online Safety Amendment (Social Media Minimum Age) Act of 2024 will define, track and enforce a social media ban for under-16s. This is likely to require significant efforts on age and user ID verification. PGI’s digital intelligence capabilities can discover the signs and signals of inauthentic accounts and personas, supporting clients to positively address inauthentic activity on their platforms.
In the US, while the debate around the Kids Online Safety Act (KOSA) continues, another conversation around a possible repeal of Section 230 of the Communications Decency Act offers some interesting future scenarios for ‘big tech’. On one side of the argument, campaigners for online safety reform see repeal as a powerful mechanism to force significant change in platform liability for online harms. Free speech advocates might consider it a threat to the rights of expression, especially if companies follow a repeal with uncompromising restrictions on platform engagement. Some analysts suggest that a Section 230 repeal would prohibitively raise the entry stakes for new tech companies, polarising the landscape around those tech giants that can afford potential litigation. In all cases, PGI’s digital investigations team provides pre-emptive and proactive intelligence, mapping complex harm networks across platforms through signal data and behaviours.
Safety is steadily becoming a key factor in platform users’ consumer choices (if in doubt, take a look at the latest active user base statistics for X). Anecdotally, it is interesting to see that many of the most avid adopters of social media—late Gen X or early Millennials—are now parents, which has shifted their consumer calculus on tech adoption from functionality towards safety. PGI expects the conversation in 2025 to become more focussed on which platform is the safest, or the most compliant with regulation. Advanced features such as real-time monitoring and parental controls are expected to develop further into 2025.
Consumer behaviours will inevitably drive business behaviours, and we can expect serious platform conversations around advertising revenue. Brands are closely tied to consumer trends; where a platform is not perceived as ethically aligned, it may find its advertising revenue affected (for example, brands that have withdrawn from X include Disney, Paramount, NBCUniversal, Comcast, Lionsgate, Warner Bros. Discovery, Apple, and Oracle). Digital investigations, providing proactive intelligence, can help platforms get ahead of revenue and reputational risks.
Our Digital Investigations team helps clients navigate the digital threat landscape, providing actionable intelligence from digital corners where in-house teams do not, or cannot, go. Talk to us if you want to find out how we do it.
2024 has been PGI’s busiest year to date in the Trust and Safety industry, both in our work with clients and our participation in key conversations, particularly around the future of regulation, the human-AI interface, and child safety.