Hippo-critical behaviour - Digital Threat Digest
PGI’s Digital Investigations Team brings you the Digital Threat Digest, SOCMINT and OSINT insights into disinformation, influence operations, and online harms.
Last week, my social media feeds were filled with news of Israel's synchronised attacks in Lebanon, ranging from news updates and victim testimonies to Hezbollah memes and edgy tankie shitposting. It felt all-consuming – as it should, trivialisation via memefication aside, given the scale of the attack and the number of innocent lives lost – and it seemed to be all anyone was talking about. But this week, in an unexpected turn of events, exploding pagers are out, and shrieking baby hippos are in. My feeds are now dominated by 'Moo Deng', an endangered pygmy hippopotamus from Thailand who has gone viral for being "defiant, sassy, slippery, chubby", and for loving to scream. When, in search of inspiration for this digest, I asked one of my colleagues what his algorithm was showing him, his answer was the same: "Moo Deng".
Since becoming an overnight internet sensation, Moo Deng has been given her own 24-hour livestream and official merchandise. She has been described as "the moment" in Vogue, featured in posts by legendary sports teams such as the New York Mets and Bayern Munich, and has caused brands such as Sephora to hop on the bandwagon and launch targeted pygmy hippo-inspired advertising campaigns. While I'm not one to hate on Moo Deng (she is "the pookie of the moment", after all, and arguably a reflection of all of us on a bad day), I do have an issue with the way our algorithms treat every piece of news as another 'trending topic' for media outlets, brands and corporations alike to capitalise on for a few days before moving on to the next 'big thing'. And it's not just the algorithms – our own attention spans (or lack thereof) mimic the same processes.
Just last week, my colleague wrote about the Kafkaesque nature of social media and the perils of treating politics like just another piece of viral content. Similarly, I've previously written about attention wars and the risks of oversaturating an information environment – but it's really struck a nerve with me this week. Social media algorithms push topical filter bubbles that prioritise entertainment over news content, arguably exacerbating the rise of the attention-deficit 'popcorn brain'. And there's nothing objectively wrong with seeking positive content to offset the overwhelming negativity of the world around us. However, many of us don't seek this content so much as get fed it – which makes the ease with which we all collectively pivoted from a bloody conflict to a baby pygmy hippo alarming and absurd, to say the least.
Hundreds of people have been killed. And while we may not see the harm in disengaging from political content, it’s important to remember that the only ones who benefit from this distraction are those perpetuating these conflicts in the first place. They rely on our popcorn brains to stop questioning, monitoring and investigating them, leaving them free to repeat the same patterns over and over. It almost feels like they’re capitalising on our short attention spans, while we’re capitalising on Moo Deng – who, let’s be real, has no idea what’s going on either.
Subscribe to the Digital Threat Digest

More about Protection Group International's Digital Investigations
Our Digital Investigations Analysts combine modern exploitative technology with deep human analytical expertise that covers the social media platforms themselves and the behaviours and the intents of those who use them. Our experienced analyst team have a deep understanding of how various threat groups use social media and follow a three-pronged approach focused on content, behaviour and infrastructure to assess and substantiate threat landscapes.
Disclaimer: Protection Group International does not endorse any of the linked content.