Propaganda Works the Same Way Every Time (Here's the Playbook)

In 44 BC, after Julius Caesar's assassination, Mark Antony stood in the Roman Forum and gave a funeral speech. According to ancient accounts, he didn't directly attack the conspirators. Instead, he displayed Caesar's bloody toga and read Caesar's will (which left money to every Roman citizen); in Shakespeare's famous dramatization of the scene, he repeats the phrase "Brutus is an honorable man" with increasing irony until the crowd is ready to burn the conspirators' houses down. In 1933, Joseph Goebbels took over Germany's new Ministry of Public Enlightenment and Propaganda and began a campaign that used emotional imagery, constant repetition, enemy creation, and false binaries to reshape how an entire nation thought. In 2016, state-sponsored troll farms used social media algorithms to amplify divisive content, create fake grassroots movements, and exploit emotional triggers to influence elections across multiple countries.

Different centuries. Different technologies. Same playbook. And once you see the playbook, you can't be played the same way again.

Why This Exists

Propaganda isn't something that only happened in the past to less sophisticated people. It's happening to you right now, every day, and it doesn't call itself propaganda. It calls itself advertising, content, news, viral posts, and "just asking questions." The techniques haven't changed in two thousand years. The delivery system has.

Edward Bernays -- Sigmund Freud's nephew, and the man who essentially invented modern public relations -- wrote a book in 1928 called simply Propaganda. In it, he argued that the "conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society." He wasn't being sinister. He was being honest. He believed that because most people don't have time to research every issue, someone needs to shape public opinion, and that someone might as well be professionals who know what they're doing. Bernays used Freudian psychology to sell cigarettes to women (rebranding them as "torches of freedom" in a staged stunt at the 1929 Easter parade in New York), to convince Americans that bacon and eggs was a traditional breakfast (on behalf of a pork producer), and to help the United Fruit Company build public support for the 1954 overthrow of Guatemala's democratically elected government. The same man. The same techniques. Applied to commerce, culture, and geopolitics.

Jacques Ellul, in Propaganda: The Formation of Men's Attitudes, made a distinction that matters: propaganda isn't just lying. The most effective propaganda uses true facts arranged to produce a false impression. It doesn't need to invent information. It just needs to control which information you see, in what order, with what emotional framing. That's why it works on smart people. You can fact-check every individual claim and still be manipulated by the structure.

The Core Ideas (In Order of "Oh, That's Cool")

The six techniques that show up every time. Across Roman rhetoric, medieval religious art, totalitarian posters, Cold War broadcasts, and modern social media campaigns, the same six techniques appear with remarkable consistency. They aren't the only techniques, but they're the most common, and recognizing them is the single most valuable thing this article can teach you.

Technique 1: Enemy creation. Every propaganda campaign needs a villain. The Romans had barbarians. The Nazis had Jews, Communists, and "degenerates." The Soviets had capitalists. Modern political campaigns have "the elites," "the immigrants," "the other party." The technique works by giving people a target for their frustration. It doesn't matter if the frustration has legitimate sources -- the propagandist just redirects it toward a convenient enemy. The test is simple: when someone tells you who to blame before they've explained the structural problem, they're using enemy creation.

Technique 2: Repetition. If you hear something enough times, your brain starts treating it as true. Psychologists call this the illusory truth effect, and it's one of the most robust findings in cognitive science. Goebbels understood this intuitively -- the line often attributed to him, that "a lie told once remains a lie, but a lie told a thousand times becomes truth," is probably apocryphal, but it accurately describes how his ministry operated. Modern advertising runs on the same principle. You don't remember a brand because of one ad. You remember it because you've seen it five hundred times. Political slogans work the same way. The content of the slogan matters less than the fact that you can't stop hearing it.

Technique 3: Emotional override. Propaganda targets your feelings, not your logic. This is deliberate. When you're emotional -- angry, scared, outraged, inspired -- you process information differently. You're less likely to question the source, less likely to consider alternative explanations, and more likely to act impulsively. George Orwell described this in his 1946 essay "Politics and the English Language" -- he argued that political language is designed to "make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind." The mechanism is emotional: the language bypasses your critical thinking and goes straight to your gut.

Technique 4: False binary. "You're either with us or against us." "There are two kinds of people." "If you don't support X, you must support Y." The false binary eliminates nuance by forcing you to choose between two options when there are actually many. It's effective because humans naturally prefer clarity to complexity. A world with two sides is easier to navigate than a world with fifty. But the simplicity is a trap. Every time someone presents a complex issue as having exactly two possible positions, they're using this technique.

Technique 5: Appeal to tradition. "This is how we've always done it." "Our ancestors knew." "Traditional values." The appeal to tradition works by connecting a current political agenda to a (usually idealized) version of the past. It's effective because nostalgia is powerful and because questioning tradition feels disrespectful. But "we've always done it this way" is not an argument for continuing to do it this way. Slavery was traditional. Not letting women vote was traditional. The appeal to tradition is especially common in propaganda from conservative movements, but progressive movements have their own version -- the appeal to historical inevitability ("this is the right side of history") is the same technique wearing different clothes.

Technique 6: Manufactured consensus. "Everyone knows." "The people have spoken." "Studies show." "Experts agree." Manufactured consensus makes you feel like you're in the minority if you disagree, and most humans are deeply uncomfortable being in the minority. Social media has put this technique into overdrive -- bot networks, coordinated campaigns, and algorithmic amplification can make a fringe opinion look like a mass movement. When you see "everyone is talking about X," the right question is: "Is everyone actually talking about it, or has the algorithm decided I should think everyone is talking about it?"

The same playbook across eras. Walk through any major propaganda campaign and you'll find these techniques layered on top of each other. Nazi propaganda created enemies (Jews, Communists), repeated key slogans relentlessly, used emotionally charged imagery (heroic Aryan figures, dehumanizing caricatures), presented a false binary (the German nation vs. its enemies), appealed to a mythical Germanic tradition, and manufactured consensus through rallies, controlled media, and suppression of dissent. Soviet propaganda did the same things with different content -- the enemy was the capitalist, the emotional imagery was the heroic worker, the tradition was the revolutionary struggle, the consensus was "the people." Same architecture. Different wallpaper.

Why it works on smart people. This is the part that trips people up. Most people think propaganda only works on dumb people, and that they themselves are immune. That's wrong, and thinking you're immune actually makes you more vulnerable. Propaganda doesn't target intelligence. It targets emotion, identity, and cognitive shortcuts. Smart people have the same emotional wiring as everyone else. They can be outraged, scared, and tribal in the same ways. The difference is that smart people are better at rationalizing why their emotional reaction is actually a logical conclusion. Bernays knew this. He specifically targeted educated audiences because he understood that the educated person who's been manipulated will construct an elaborate intellectual justification for a position they arrived at emotionally.

How to inoculate yourself. Here's the genuinely good news. Research by Jon Roozenbeek and Sander van der Linden at Cambridge University has shown that "prebunking" -- learning to recognize manipulation techniques before you encounter them in the wild -- reduces susceptibility to propaganda by a significant margin. In their studies, participants who learned about common misinformation techniques became measurably better at identifying manipulative content afterward. The mechanism is simple: once you know the technique, it loses its power. A magician's trick only works if you don't know how it's done. The six techniques listed above are the tricks. Now you know how they're done.

The practical version: before you react to any piece of content -- news article, social media post, political ad, viral video -- pause and ask three questions. First: what emotion is this trying to make me feel? Second: what technique is being used (enemy creation, repetition, false binary, etc.)? Third: what information is missing? That three-second pause won't make you immune, but it'll make you dramatically harder to manipulate.

How This Connects

Propaganda is the dark version of rhetoric and persuasion, which is something you'll study in English class. The same tools -- emotional appeal, narrative structure, repetition, audience analysis -- can be used to manipulate or to communicate honestly. The difference isn't in the tools. It's in the intent and the transparency. When a writer uses emotional appeal to help you understand a real experience, that's rhetoric. When a political operator uses emotional appeal to make you hate a group of people you've never met, that's propaganda. Knowing the toolkit lets you tell the difference.

The propaganda playbook also connects directly to negotiation and social influence. The same techniques that work at the societal level -- false binaries, emotional override, manufactured consensus -- show up in personal interactions, workplace politics, and the social dynamics of your school. The kid who says "everyone thinks you're being weird about this" is using manufactured consensus. The friend who says "you're either loyal to me or you're not" is using a false binary. Recognizing the technique doesn't just help you evaluate the news. It helps you navigate your own life.

The School Version vs. The Real Version

The school version: You study propaganda in the context of specific regimes -- Nazi Germany, the Soviet Union, maybe wartime America. You look at posters, analyze speeches, and discuss how "those people" were manipulated. The implicit message is that propaganda is a historical phenomenon that happened to other people in other times.

The real version: Propaganda is a permanent feature of every society, including yours. The techniques are identical across eras -- only the medium changes. Your history class teaches you what propagandists said. The real skill is learning how they said it, so you can recognize the same structure when it shows up in your social media feed, your news consumption, and your political environment. That's not paranoia. It's literacy. And in an age when the delivery systems for propaganda are more sophisticated than anything Goebbels or Bernays could have imagined, it's one of the most important kinds of literacy you can develop.


Part of the History: Pattern Recognition series.

Related Reading: The Stories We Tell Ourselves, The Same 5 Things Keep Happening, Every Empire Falls the Same Way