
Trust & Safety: A look ahead to 2025

James Smith, Head of Trust & Safety

2024 has been PGI's busiest year to date in the Trust & Safety industry, both in our work with clients and our participation in key conversations, particularly around the future of regulation, the human-AI interface, and child safety. We also made a clear statement on our company values by joining the WeProtect Global Alliance, as active members in the fight to end child exploitation online.
As busy as this year has been, we have also been thinking about the year ahead. In 2025, Trust & Safety teams face arguably their most intensive year yet: digital threats continue to evolve in complexity and ingenuity at the very same time as regulatory scrutiny begins to bite.

Regulation

In 2025, leading global online safety regulations are set to further clarify their specific requirements and enforcement criteria, and service providers will be mandated to meet these obligations.

We are due to see the next phase of the EU Digital Services Act (DSA), bringing greater transparency reporting requirements for online services. These include data collection Reporting Templates and Statements of Reasons (on content moderation decisions) from July, and further preparation for harmonised Transparency Reports in early 2026. Trust & Safety teams looking to provide strong Statements of Reasons will benefit from PGI's deep attribution capabilities, which highlight complex networked harms designed to overcome automated content moderation policy and operations.

The UK Online Safety Act (OSA) has several milestone events in 2025. Ofcom published its Illegal Harms Statement on 16 December 2024, including the Illegal Harms Codes of Practice and guidance on illegal content risk assessments. This means that all in-scope service providers are now ‘on the clock’ to complete their illegal harms risk assessments by March 2025. From April, there will be over 40 different technical and procedural implementation requirements for service providers to deliver.

In Australia, we expect more detail on how the Online Safety Amendment (Social Media Minimum Age) Act 2024 will define, track and enforce a social media ban for under-16s. This is likely to require significant effort on age and user ID verification. PGI's digital intelligence capabilities can discover the signs and signals of inauthentic accounts and personas, supporting clients to positively address inauthentic activity on their platforms.

In the US, while the debate around the Kids Online Safety Act (KOSA) continues, another conversation around a possible repeal of Section 230 of the Communications Decency Act offers some interesting future scenarios for 'big tech'. On one side of the argument, campaigners for online safety reform see it as a powerful mechanism to force significant change in platform liability for online harms. Free speech advocates might consider it a threat to the rights of expression, especially if companies follow a repeal with uncompromising restrictions on platform engagement. Some analysts suggest that a Section 230 repeal would prohibitively raise the entry stakes for new tech companies, polarising the landscape around those tech giants that can afford potential litigation. In all cases, PGI's digital investigations team provides pre-emptive and proactive intelligence, mapping complex harms networks across platforms through signal data and behaviours.

User choice and brand awareness

Safety is steadily becoming a key factor in platform users' consumer choices (if in doubt, take a look at the latest active user base statistics for X). Anecdotally, it is interesting to see that many of the most avid adopters of social media—late Gen X or early Millennials—are now parents, which has shifted their consumer calculus on tech adoption from functionality towards safety. PGI expects the conversation in 2025 to become more focussed on which platform is the safest, or most compliant with regulation. Advanced features such as real-time monitoring and parental controls are expected to develop further into 2025.

Consumer behaviours will inevitably drive business behaviours, and we can expect serious platform conversations around advertising revenue. Brands are closely tied to consumer trends; where a platform is not perceived as ethically aligned, it may find its advertising revenue affected (for example, brands that have withdrawn from X include Disney, Paramount, NBCUniversal, Comcast, Lionsgate, Warner Bros. Discovery, Apple, and Oracle). Digital investigations, providing proactive intelligence, can help platforms get ahead of revenue and reputational risks.

What's next?

Our Digital Investigations team helps clients navigate the digital threat landscape, providing actionable intelligence from digital corners where in-house teams do not, or cannot, go. Talk to us if you want to find out how we do it.