Could you please put me through to Dr. George Lakoff? I have a really bad case of brainwashing I’m trying to overcome.
New research confirms that repetition of “myths” and slogans helps lodge them in the public mind, and that refuting them often leads only to the public remembering the falsehoods better. Instead, researchers tell us that “education campaigns with an ‘affirmative’ message,” even a negative one, are far more effective at defeating an adversary’s frame.
University of Michigan social psychologist Norbert Schwarz has done experiments showing that people come to believe things they hear repeated often enough, regardless of the source, and even if it all comes from a single source.
“Hearing the same opinion from several sources is more influential than hearing it only once from one source. This is as it should be,” he wrote in an email exchange with HarperIndex.ca. “But, as we showed in a recent paper, hearing it multiple times from the same source is nearly as influential. ‘A repetitive voice sounds like a chorus.’ So a single person or small group can create the impression of broad consensus through sheer repetition.”
He has experimented with exposing research subjects to falsehoods repeatedly. The research points to the conclusion “that hearing the same Bush snippet multiple times on CNN and every other news show gives it a disproportionate weight. And when several members of the administration hit the talk shows with similar statements, it surely creates reality.”
“Research on the difficulty of debunking myths has not been specifically tested on beliefs about Sept. 11 conspiracies or the Iraq war,” wrote Washington Post science and human behavior correspondent Shankar Vedantam on September 4. “But because the experiments illuminate basic properties of the human mind, psychologists such as Schwarz say the same phenomenon is probably implicated in the spread and persistence of a variety of political and social myths.”
This all confirms what we already know: Right-wingers are brainwashed, and repetition is all it takes to do it. Come up with a snappy mantra, even if it’s complete bullshit, and repeat it often enough, and people will believe it even in the face of overwhelming evidence to the contrary. It’s a disturbing example of how people’s better senses can be shunted aside, and it explains a lot about how cults (religious and otherwise) exercise mind control.
Even more disturbing is how easy it is to do:
Schwarz showed volunteers two lists of common beliefs about health and disease prepared by the US Centers for Disease Control and Prevention (CDC). One list was entirely true; the other was entirely false, with each item identified as an untrue common belief, or “myth.” After half an hour, older people “misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.
“Younger people did better at first, but three days later they made as many errors as older people did after 30 minutes. Most troubling was that people of all ages now felt that the source of their false beliefs was the respected CDC.”
A little information is a dangerous thing, but a lot of repeated misinformation is even more so:
The theory may help explain “why large numbers of Americans incorrectly think that Saddam Hussein was directly involved in planning the Sept. 11 attacks.” These findings certainly explain the tight discipline of Stephen Harper’s Conservatives or the Bush Republicans in repeating slogans, charges and epithets like “cut and run,” “support our troops,” or “Canada’s new government,” and their repeated focus on symbolic issues like the threat of terrorism and Arctic sovereignty.
“Things that are repeated often become more accessible in memory, and one of the brain’s subconscious rules of thumb is that easily recalled things are true,” the Post reports. “In politics and elsewhere, this means that whoever makes the first assertion about something has a large advantage over everyone who denies it later.”
Kimberlee Weaver at Virginia Polytechnic Institute did research showing that “the brain gets tricked into thinking it has heard a piece of information from multiple, independent sources, even when it has not.”
This is why I don’t listen to right-wing talk radio, and why I automatically tune out when I hear Dubya droning: I don’t want to turn into one of the dumb, benighted Thirty-Percenters. I already know that their message is brainwash, so avoiding it as much as possible keeps the disinformation out.
I also look at all conservative and even “centrist” mainstream media sources with a jaundiced eye, and now you know why. They are the ultimate repeaters of crapaganda, because they tend to just reproduce it uncritically. This only reinforces the echo chamber effect.
I could recommend a number of books about this phenomenon, which is as pervasive in religion as in politics (and in religions tied specifically to far-right politics). Chris Edwards’s Crazy for God takes on the overt brainwashing of the Moonies; Deborah Laake’s Secret Ceremonies exposes the soft, squishy, deeply disturbing underbelly of the Mormons. I haven’t yet read George Lakoff (yes, I know, I’m shamefully behind), but I’m sure nothing I’d find in Don’t Think of an Elephant! would surprise me, either.
Meanwhile, I’ve come away with some important gleanings:
Don’t repeat the bullshit, even in refuting it;
Come up with a positive message to counter the crap;
Be succinct–in other words, CATCHY.
And above all, cultivate your bullshit detector–and learn to use earplugs.