Psychotechnologies of Influence

I think that most people are unaware of the extent to which their beliefs and daily choices are shaped by advertising and public relations, and by the deceptive and manipulative psychotechnologies they frequently employ. We’re so inundated every day with symbols and messages crafted by professional persuaders that their influence is largely invisible. We’re all targets of corporate social engineers, and there wouldn’t be so many advertisements if they weren’t effective.

The “Father of Public Relations,” Edward Bernays, was a government propagandist during World War I. After the war, realizing that propaganda had peacetime applications, he renamed it public relations and wrote the rulebooks for a new profession: the public relations counsel (in the sense of “legal counsel”). Bernays was a nephew and confidant of Sigmund Freud, and he combined Freud’s teachings about subconscious influence with the techniques of propaganda in such books as Crystallizing Public Opinion (1923) and Propaganda (1928).

Bernays wrote about “the possibilities of regimenting the public mind” and “the conscious and intelligent manipulation of the organized habits and opinions of the masses.” The practitioners of this new science of influence and persuasion, he wrote, “constitute an invisible government which is the true ruling power of our country. We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of.” Over the last century the propaganda industry (advertising, public relations and political consultancy) has become an indispensable part of both commerce and politics. You may never have heard of Edward Bernays, but he was one of LIFE magazine’s “100 Most Influential People of the Twentieth Century.”

Persuasive messages and campaigns that rely on logic and facts aren’t propaganda. Propaganda aims at the gut, not the brain, using deceptive and manipulative techniques to influence and persuade. The techniques of propaganda aren’t the only weapons in the arsenal of the propaganda industry. Rhetorical devices, symbol manipulation, heuristics, and psychological learning theory – specifically classical (Pavlovian) conditioning and operant conditioning – are among the psychotechnologies of influence and persuasion utilized by propagandists. I’ll write about some of these tricks of the trade in Part 2, but I’ll first name and describe the classic techniques of propaganda. Most of these techniques were identified by the Institute for Propaganda Analysis, a public interest group in the 1930s whose stated goal was to “teach people how to think (independently), not what to think.”

Probably the most common propaganda technique is assertion: stating an opinion as if it were a fact. Assertions range from outright lies to cleverly worded messages with no objective factual basis. If you qualify a stated belief with “I think,” “it seems to me,” or “in my opinion,” it’s not propaganda. President Reagan’s famous statement that “government is the problem” is a classic example of assertion. Another of the most frequently used propaganda techniques is ad nauseam – the endless repetition of assertions, slogans, or advertising jingles. A phrase attributed to Hitler’s Minister of Propaganda, Joseph Goebbels, is that “a lie repeated a thousand times becomes the truth.”

Transfer is a term for creating an association, positive or negative, between two unrelated things. (From a psychological point of view, transfer involves classical conditioning.) Using an American flag as a backdrop for a political message is an example of positive transfer. A background visual of burning stacks of money is an example of negative transfer. Bandwagon suggests that we should be on the winning side and avoid being left behind with the losers: “Everybody knows that’s the truth” or “for those who think young.”

Other propaganda deceptions include lies of omission, card-stacking, and distortion, in which facts are cherry-picked to promote the message and any contrary facts are omitted or misrepresented; insidious mixtures of facts and outright lies; and half-truths, in which facts are blended with assertions. Glittering generalities like “national honor” and “best country in the world” are subjective and have no objective basis for definition. Name-calling attempts to reduce a person to a label. With ad hominem, the messenger is attacked to distract from the message, e.g. “You can’t trust anything he says.” Testimonial and appeal to authority attempt to link the message with an admired person or authority, whether Abraham Lincoln or “nine out of ten dentists.” Celebrity endorsements also fall into this category.

Simplification and pinpointing the enemy offer simple explanations for complex issues and propose a culprit for an identified problem, as in Hitler’s scapegoating of the Jews. Appeal to fear and stereotyping also belong to this cluster of techniques – favored by demagogues and xenophobes – and are self-explanatory. The black-and-white fallacy is also related: if you’re not with us, you’re against us. There’s no middle ground.

The result of a successful propaganda campaign is ignorance or deception on a mass scale. If this post has stimulated your curiosity about psychotechnologies and corporate social engineering, I’ve written a book about it: Ad Nauseam: How Advertising and Public Relations Changed Everything – available in paperback online, or as an e-book.

Information v. Propaganda

There’s a war going on in the media between information and propaganda. Propaganda has become so commonplace in our society that most people don’t seem to know it when they see it. The result of a successful propaganda campaign is orchestrated ignorance on a mass scale.

Whether a message is information or propaganda isn’t just a matter of opinion or viewpoint. Information is based on facts, evidence and logic, and aims for the intellect. Propaganda relies on specific deceptive and manipulative techniques, and aims at the gut. Its purpose isn’t to inform, but to influence or persuade – often in the guise of information. Providers of information deal in facts and evidence; propagandists care more about perception than facts. Here are some of the manipulative techniques used by propagandists:

Assertion is stating an opinion – or an outright lie – as if it were a proven fact. It’s not propaganda if you use a qualifier such as “in my opinion” or “I think/believe that ____.” Sometimes propagandists mix lies and opinions with half-truths, and even with selected facts that appear to support their message. Assertion is widely used in advertising, public relations, and political campaigns. Transfer is a term for creating an association, positive or negative, between two unrelated things. Using a giant American flag as a backdrop for a message is an example of positive transfer, whether you’re selling cars or candidates. A blown-up visual of hundred-dollar bills going up in flames, as a backdrop in a political attack ad, is an example of negative transfer. Transfer can be aural or visual, and is a staple of perception management.

Ad nauseam is the technique of incessant repetition. The phrase “A lie repeated a thousand times becomes the truth” has been attributed to Hitler’s propaganda minister, Joseph Goebbels. Political slogans are designed to be endlessly repeated, and to lodge themselves in your mind. Three related propaganda techniques are lies of omission, card stacking and distortion, where facts are cherry-picked to promote the message, and any contrary facts are omitted or misrepresented. Glittering generalities involves the use of emotionally loaded phrases that have no objective basis for definition: freedom fighter, perfect union, best country in the world, master race.

Bandwagon suggests that we should follow the crowd, join the winners, and avoid being left behind with the losers. “Everyone knows” and “anyone with common sense knows _____” can be used to prop up just about any political opinion. Name calling attempts to arouse prejudice or antipathy, sometimes in the form of an unverified assertion (e.g. “Joe Smith is a secret communist”), or in the form of sarcasm and ridicule. A related propaganda technique is ad hominem, in which the messenger is attacked in order to discredit or distract attention from the message. Propagandists are professional perception managers, and name calling and ad hominem are among their favorite tools for molding public perceptions.

Simplification and pinpointing the enemy offer simple explanations for complex issues, and target a culprit for an identified problem. Both techniques were used to devastating effect by the Nazis, to justify the mass murder of Jews. Slogans are usually simplifications. (BUILD A WALL! comes to mind.) A related technique is the black-and-white fallacy, which says that if you’re not for us, you’re against us – no middle ground or room for compromise. Appeal to fear/prejudice relies on stereotypes to fire up emotions – notably fear and anger. Appeal to authority attempts to link its message with people viewed favorably by the target audience. That’s why you often see white lab coats on actors who endorse medical products and services, so they resemble doctors or scientists. It also explains why you hear that “nine out of ten dentists recommend _____.” Celebrity endorsements are a variant of appeal to authority. Even though we all know that they’re staged and scripted, we nevertheless tend to form a positive association between the product and the admired celebrity. If he uses that product, it must be pretty good.

In my book Ad Nauseam: How Advertising and Public Relations Changed Everything, I write about how we became a Propaganda Society over the course of the twentieth century, and how propaganda is most effective when it’s invisible to the people most influenced by it. I also write about other “psychotechnologies of influence,” including rhetorical devices, heuristics and behavior modification. I’ll write about those in a future post.


Mental pollution, Part 2

In my book Ad Nauseam: How Advertising and Public Relations Changed Everything I wrote, “As a psychologist, it disturbs me greatly to see that our society’s primary systematic application of the principles of psychology has been as a tool for commercial and political persuasion, and for the manipulation of behavior in the service of commerce.” Propaganda, which I wrote about in my last post, is only one psychotechnology of influence used by the propaganda industries – advertising, public relations and political consultancy. Behavior modification is another. According to psychological learning theory (behaviorism), there are two means of systematically conditioning behavior: classical conditioning and operant conditioning.

Classical conditioning is exemplified by Pavlov’s experiments with dogs and is a passive mode of conditioning. Knowing that dogs reflexively salivate when presented with food, Pavlov conditioned his dogs to have the same reaction to the ringing of a bell, ringing it whenever food was presented. Over time, the dogs came to associate the two previously unrelated stimuli, learning to salivate whenever the bell was rung. This kind of associative learning is routinely used by advertisers and marketers to get consumers to associate their product or brand with something they already like or want.

Operant conditioning is an active mode of conditioning, in which a targeted behavior is systematically reinforced. If you expect from experience to be rewarded for what you do, it increases the odds that you’ll do it. This is the method used to teach rats to press a lever in their cage to get food, and to train dolphins to jump through hoops. An example of this in TV advertising is, “Call in the next ten minutes and shipping is free.”

As promised in my last post, here are some of the techniques used by propagandists to influence and persuade. Probably the most frequently used technique in the media is assertion – either an outright lie, or stating an opinion as if it were a fact, without first saying “I think” or “in my opinion.” Any ad that says “We’re the best/least expensive” without providing factual evidence falls in this category. I think that the second most frequently used propaganda technique is ad nauseam. A lie repeated and repeated and repeated can come to be perceived as the truth. Three other, related techniques are lies of omission, card stacking and distortion, where facts are cherry-picked to promote the message and any contrary facts are left out or misrepresented. Sometimes the message mixes facts and lies or half-truths; sometimes facts are blended with unsubstantiated opinions (assertions) in a manner designed to obscure the objective truth.

With transfer, a classical conditioning technique, an attempt is made to create an association (positive or negative) between two unrelated things. Using an American flag as a backdrop for a political message is an example of positive transfer. Showing a picture of the opposition candidate with a Hitler mustache superimposed is an example of negative transfer. Bandwagon suggests that we should follow the in-crowd, join the winning side, avoid being left behind with the losers. (Wouldn’t you like to be a Pepper, too?) Glittering generalities involves the use of emotionally loaded generalities that have no objective basis for definition, such as “freedom lover,” “perfect gift for all occasions,” or “best country in the world.”

Name calling can take the form of sarcasm and ridicule, or can employ the assertion technique, such as calling a political candidate a closet Communist, or a secret ISIS supporter, or “weak on crime.” With ad hominem, instead of dealing with the message, the messenger is attacked: “Don’t believe anything he says,” or “fake news.” Simplification offers simple solutions for complex problems, and is often seen in the form of slogans. Pinpointing the enemy and stereotyping were used by the Nazi propaganda machine to stoke the fires of anti-Semitism and to justify Hitler’s genocidal “final solution.”

Appeal to authority attempts to create a positive association. Examples are celebrity endorsements, a politician invoking the name of an icon such as George Washington or Abraham Lincoln, or an actor in a commercial wearing a white lab coat to suggest that she’s a doctor or a scientific expert. “Nine-out-of-ten dentists recommend _______” is another example.

There are other propaganda techniques that you can read about in my book, but these are some of the most commonly used by professional persuaders. Some commercials and political messages use several at once, to disguise the fact that what they deliver is not information. These classic propaganda techniques were identified by the Institute for Propaganda Analysis (IPA), a non-partisan educational organization that, unfortunately, existed only from 1937 to 1942. The IPA distributed information about propaganda analysis to schools and civic organizations. One reason we’ve become a Propaganda Society is that we don’t have anything like the IPA to educate the public at large, and propaganda analysis isn’t taught in our public schools.

In my next post I’ll return to my usual subject matter and look into the pathological condition commonly known as “multiple personalities.”