AI 'slopaganda' has arrived in Ireland and is making extreme political messaging cheap and viral
by Stephen McDermott (https://www.thejournal.ie/author/stephen-mcdermott/) · TheJournal.ie
WHEN THE FACELESS boffins from the Oxford English Dictionary gather in some hotel boardroom in December to tally votes for the Word of the Year, I imagine ‘slopaganda’ will be in the mix.
The vocabulary of the terminally online has become a safe bet in this headline-generating contest – just ask yourself how many times you’ve heard the words ‘goblin mode’, ‘rizz’, ‘brain rot’ or ‘rage bait’ (the four previous winners) spoken aloud.
And slopaganda – a portmanteau of AI slop and propaganda – seems to me a lexically perfect summary of the world in 2026, marrying the crises of catastrophic wars, their wider fallout and the internet’s decay as a result of AI-produced, low-quality content.
The term was coined by academics last year and was not widely known before going mainstream in February around the time that Donald Trump decided to bomb Iran.
The ensuing war led to a wave of viral, Lego-style videos from Iran that have mocked Trump and linked him to Jeffrey Epstein, as well as criticising the United States and its war efforts more broadly.
Trump himself has been no stranger to posting his own slopaganda on social media, sharing everything from videos that depict him as a fighter pilot dumping faeces on protesters, to controversial images of himself as the Pope and Jesus.
AI slop has been widely criticised for killing the internet by flooding social media platforms and other online spaces with cheap, inauthentic and easily replicable content to grab people’s attention.
And while that criticism may be valid, the use of AI slop for political ends feels like a new departure because it has injected the phenomenon with a sense of purpose and real-world consequences.
Leaving aside their aims, history’s most famous pieces of visual propaganda – think Uncle Sam posters or Leni Riefenstahl’s Triumph of the Will – have tended to be technically proficient works of art that were produced by skilled individuals.
But the advent of generative AI means that partisan political messaging has been democratised and can now be made by anyone who’s willing and able to follow a DIY process.
Social media means that propaganda of this nature can then be broadcast to anyone in ways that obscure its origins, making it difficult or even impossible to know who exactly is responsible for it.
This is not just a problem that exists at the level of geopolitical warfare, either.
I can tell you this because of how I use social media: I follow accounts that post extreme content, and that process trains my algorithm to surface a lot of AI-produced posts.
I’ve already seen my fair share of Irish slopaganda, including during the recent fuel price protests, when a slew of social media pages shared AI-generated posters calling for people to form large convoys and block roads leading to Dublin.
I spotted another version at a lower level this week, in the form of a YouTube ‘documentary’ promoted to me on X and Facebook by a channel called Gael Force Media.
It was initially presented to me as a trailer for a ‘documentary’, whose makers were clearly eager for it to go viral and teased the story on social media beforehand.
When it finally landed this week, the ‘documentary’ told the alleged story of a Muslim sex abuser who was involved in running a trafficking and drug operation in Clonmel.
Notably, the footage was almost entirely generated by AI and narrated by a female voice that was also generated by AI.
YouTube has labelled the video as “altered or synthetic content”, though Google did not respond to my queries about whether this material is allowed on the platform when it is presented as factual.
The video treats the AI presenter’s testimony as real, but the story contains few specifics, such as where or when the alleged incidents took place; instead, it pushes well-worn anti-immigrant tropes about Muslim men and sexual violence.
Towards the end, the AI presenter said she received help from an anti-immigrant ‘community safety’ group called Sinne na Daoine, which has links to one of the people behind the YouTube channel.
I reached out to verify the claims and asked Gael Force Media if they could put me in touch with the woman whose story was used for the documentary; I did not receive any response, and gardaí would not confirm details of any investigation when asked, despite the video claiming that they knew about the supposed gang.
The video has only a small number of views, but it shows how fringe groups can use slopaganda to create a simulacrum of reality.
This is often how such material initially appears online, in posts that test the water in obscure pockets of social media feeds.
When videos like this are presented as ‘documentaries’ by channels claiming to be independent media outlets, they can provoke an emotional response, which is ultimately their aim (the media, by contrast, usually seeks to inform).
That is part of what makes slopaganda effective: it is something that exists only to be consumed and then moved past, even as its claims or messaging persist.
The material does not need to be overly convincing if it can introduce the idea that something might be true, in much the same way as other forms of misinformation do.
This is a useful tool for fringe political groups, particularly because that type of material only has to go viral once to be successful.
The mention of Sinne na Daoine towards the end and the links between the group and Gael Force Media were not exactly subtle, but it was clearly designed as an invitation for viewers to find out more about the anti-immigrant group.
This isn’t to overstate the quality of the ‘documentary’: the video is badly produced, blatantly AI-generated, and there are extremely obvious questions around the authenticity of the story it tells.
But it’s also a format that offers a layer of plausible deniability because there is no individual storyteller to whom the narrative can be traced, which makes it harder to hold anyone accountable for what is being claimed.
For those who produce content like this, obscured authorship is part of the appeal of slopaganda: it adds another layer of confusion around motive and origin when a political message comes to be analysed.
In that environment, information can circulate and go viral, regardless of its accuracy.