Stop Calling It Disinformation
We are deep down in Postdigital Propaganda™
Hello.
I’ve been quietly refining my analytical diagnostic: Postdigital Propaganda™. The prefix sounds rhetorical — until the empirical material keeps producing phenomena that existing frameworks can’t name. In the background on constant repeat: Dirty tech by Kim Gordon.
We are living under a postdigital condition: a state in which digital media are no longer experienced as separate, novel, or even particularly visible—but instead fully integrated into the material and social fabric of everyday life.
Meanwhile, the 2026 Iran War is unfolding through AI-generated images, videos, and memes, shaping how this new kind of AI war is experienced. Building on the last edition (Cognitive Warfare), here is a look at current developments through a postdigital propaganda lens.
What the heck is Postdigital Propaganda™?
Postdigital propaganda refers to the infrastructural organisation of political alignment within algorithmically governed platform ecologies. Yeah, I know. In plain terms: it means propaganda is no longer primarily about messages, lies, or even persuasion. It’s about the conditions under which things become visible, feel right, and get endlessly repeated.
The shift looks something like this: from message to environment, from actor to infrastructure, and from persuasion to alignment. Alignment is when the environment itself starts to lean, as platforms consistently surface, reward, and circulate content pointing in the same direction, regardless of who produced it or why.
It’s called postdigital because we are no longer “online.” The Internet, Google, platforms. They are just there. Ambient. Like air. You only notice them when they are gone. When platforms become the environment, propaganda doesn’t just move through them; it emerges from how they work.
So instead of digital propaganda = actors using tools, we get postdigital propaganda = alignment produced inside the system itself. Propaganda becomes a condition, not a campaign.
Four dynamics keep this running
You can see this everywhere right now: White House memes, Lego missiles, Ghibli soldiers, etc. It’s always the same four dynamics. These don’t run in sequence — they reinforce each other:
Platform infrastructure: what you see is determined by a recommendation system — not the sender, not the argument.
Platformized affect: what gets recommended is determined by reaction — outrage, delight, disgust. Feeling is the ranking signal.
Participatory recursion: what circulates is continuously remade — by users, by networks, by state actors.
Synthetic plausibility: what repeats starts to feel real — AI and aesthetics don’t need to convince, only to accumulate.
The Iranian Lego Use Case
Take the Lego propaganda videos making the rounds (How Lego Became a Go-To Meme of the Propaganda Wars, Wall Street Journal). On the surface, it’s absurd: missile strikes, dead soldiers, geopolitical escalation — rendered as cute plastic toys. But that’s exactly why it works. It runs on four dynamics — all at once.
Platform infrastructure: recognizable, frictionless formats — Lego, Minecraft, GTA — fit the feed and travel regardless of who made them. Which matters, because who made them is increasingly beside the point. The White House is running the same playbook, splicing real airstrike footage with Call of Duty overlays and Wii graphics. Iran didn’t infiltrate a neutral medium. The format was already there. What looks like foreign propaganda is structurally indistinguishable from official messaging — authorship collapsed into a shared visual vernacular no one owns.
Platformized affect: the childlike aesthetic doesn’t just lower your guard — it registers as an affective signal before you’ve evaluated anything. The system rewards resonance, not accuracy. Emotional charge determines visibility before origin or intent ever enter the picture. It doesn’t spread because it’s true. It spreads because it hits. It doesn’t arrive pre-disarmed. It arrives pre-amplified.
Participatory recursion: Iran, Russia, China, and the White House remix the same visual language — but so do millions of anonymous accounts that aren’t coordinating with anyone. That’s the point. The format reproduces across actors until the distinction between propaganda and posting collapses. The system processes all of it the same way, indifferent to intent. It stops belonging to anyone. It becomes the condition.
Synthetic plausibility: the mechanism isn’t deception — it’s accumulation. Repetition stabilizes a political world before any single claim is evaluated. The game aesthetic doesn’t just make violence digestible. It reframes what this war is. Not tragedy, not crisis — content. Something to watch, remix, scroll past. By the time you see it, the imaginary already holds. Not because you’ve been fooled, but because it already feels coherent.
That’s the point: the video doesn’t need to convince you. It needs to circulate, register, repeat. And once it does, it stops being a clip and becomes part of the environment in which the war is understood — not as competing narratives, but as a shared affective atmosphere that everyone feeds and no one controls.
What changes
Under these conditions, the success of propaganda is no longer determined by whether something is true, or even persuasive. Postdigital propaganda does not just shape what we see or feel; it also shapes what counts as evidence, and how reality itself can be confirmed or denied (Tech Policy Press). Content succeeds as long as it
fits the feed
hits the feeling
gets picked up
and keeps looping.
So what
We can keep arguing about what is true and what is fake. Keep chasing the next debunk, the next fact-check. But that’s increasingly beside the point.

If the problem is alignment, the task shifts: away from correcting individual claims — and toward understanding, and intervening in, the conditions that make certain directions stick.
Disinformation assumes a problem of content. We’re dealing with a problem of condition.
More
Cascades of A.I. Fakes (New York Times)
The fog of AI (The Atlantic)
Trump’s video game war: AI, memes and a simplistic narrative (Guardian)
War Becomes Spectacle in Trump’s Horrific Propaganda Promoting War in Iran (Truthout)
What else?
Three-quarters (73%) of Gen Z active TikTok users say the content today feels staged and performative, and over half (53%) feel it’s censored (The Harris Poll)
Finding fake followers, TikTok edition (Conspirador)
The material life of Italian Brainrot (Algofolk)
The Story Behind All Those Fruit and Veggie AI Slop Videos (NY Mag)
AI videos of sexualised black women removed from TikTok (BBC)
Generative Propaganda (Working Paper)
Thanks for reading. My name is Marcus. If you’re interested in tailored insights, workshops, consulting, or policy support, or just want to discuss stuff, get in touch. Here is LinkedIn. Here is Bluesky. Ciao



Absolutely terrific post and extremely valuable to me. I had just experienced the effects of the Lego Iran meme on myself, along with some unease about it, and this helped me situate it.
I will think about this for a while, and I also hope to find the time tomorrow to read through the literature list. I appreciate it a lot. Thanks.
I dig this article so much. However, there's something I'm a little on the fence about.
Propaganda can mix facts with spin to nudge alignment over time, like state media remixing Lego aesthetics to normalize war as spectacle. Disinfo demands outright fabrication with deceptive intent, often as a foreign psyop tactic to sow chaos fast. In the Iran War, Russia's participatory vibes might look like propaganda's ambient hum, but when Tehran drops AI fakes of US losses, that's disinfo piercing the fog.
What feels off is how “postdigital propaganda” flattens this: if everything recursive becomes propaganda, do we lose our grip on tracing foreign agency? Your four dynamics nail the platform trap, but doesn’t calling it all “propaganda” risk diluting the intent behind state-sponsored fakes? And doesn’t it dilute the problem of cognitive warfare?