The Corrosive Effect of Perpetual, Relentless Lies and Deception on the Human Brain

As leading psychologists have discovered, the human mind suffers when faced with a constant barrage of lying: eventually, lies come to be accepted as truth.

The now-standard model was first proposed by Harvard University psychologist Daniel Gilbert more than 20 years ago. Writing in the journal American Psychologist, Gilbert studied how the human brain behaves when confronted with factually inaccurate information. Through his research, Gilbert discovered that before the brain can decide whether a piece of information is true or false, it must first accept that information as true, however briefly, simply in order to comprehend it. Gilbert cited the research of William James, who, in studying the work of the Dutch philosopher Baruch Spinoza, wrote:

“All propositions, whether attributive or existential, are believed through the very fact of being conceived.”

According to Gilbert, people process all information, whether true or false, in two steps:
  1. First, even if only very briefly, we hold the lie to be true: We must accept something in order to understand it. For example, if someone were to tell us—hypothetically speaking, of course—that climate change is a hoax invented by the Chinese, we must for a fraction of a second accept that, indeed, some shadowy tricksters within the Chinese government concocted the idea of climate change and then proceeded to propagate it worldwide.
  2. Only then do we take the second step to complete the mental certification process, by either accepting it (yes, the Chinese did invent climate change!) or rejecting it (no, they didn’t!).

Unfortunately, while the first step is a natural part of thinking—it happens automatically and effortlessly—the second step can be easily disrupted. It takes work: We must actively choose to accept or reject each statement we hear. In certain circumstances, that verification simply fails to take place. As Gilbert wrote in his paper “How Mental Systems Believe”:

“[W]hen faced with shortages of time, energy, or conclusive evidence, [human brains] may fail to unaccept the ideas that they involuntarily accept during comprehension.”

While the mind can eventually dismiss inaccurate information that it initially accepted as true during comprehension, it can be worn down, for a human being possesses only a limited capacity to evaluate the veracity of every statement it encounters. Consequently, when exposed to enough consistent lying over an extended period of time, the human brain slowly becomes less and less capable of discerning fact from fiction.

With Donald Trump in the White House — who, according to Politico, lied once every three minutes and 15 seconds over five hours of speeches and press conferences — the next four years will undoubtedly bring trouble and untold damage for the American brain as it struggles to differentiate between true and false. According to PolitiFact, 70 percent of the statements made by Donald Trump that it checked during his presidential campaign were false, while only 4 percent were completely true and 11 percent mostly true. In comparison, PolitiFact found 26 percent of Hillary Clinton’s statements to be false.

New York tabloid writers who covered Trump’s rise in the 1980s and ’90s observed, from how often, and how pointlessly, he would lie to them, that his lying appears to be more than a mere tactic: it is an ingrained habit. In Trump’s autobiography, his own ghostwriter coined the phrase “truthful hyperbole” to describe the flagrant truth-stretching Trump employed, over and over, to help close sales. Trump apparently loved the wording and went on to adopt it as his own.

Our brains are particularly ill-equipped to deal with lies when they come not singly but in a constant stream. When we are overwhelmed with false, or potentially false, statements, our brains pretty quickly become so overworked that we stop trying to sift through everything. It’s called “cognitive load”—our limited cognitive resources are overburdened. It doesn’t matter how implausible the statements are; if bombarded with enough of them, people will inevitably absorb some. Eventually, without consciously realizing it, the brain simply gives up trying to determine what is true.

Trump actually ventures a step beyond simply maintaining a constant stream of dishonesty: If he has a particular untruth he wants to propagate—not just an undifferentiated barrage—he simply states it, over and over. As it turns out, sheer repetition of the same lie can eventually mark it as true in our heads. It’s an effect known as “illusory truth”.

In the 1970s, a team of psychologists from Temple University and Villanova University in Philadelphia, Pennsylvania, coined the term “illusory truth” to describe the process by which the mind, confronted with the same lie over and over again, finally accepts it as truth. Lynn Hasher, David Goldstein, and Thomas Toppino demonstrated the effect by presenting a group of people with a set of statements to rate as true or false in sessions spaced two weeks apart. The psychologists found that the same false statements were more likely to be rated as true the second and third times the group encountered them.

The Hasher/Goldstein/Toppino study appears to provide empirical evidence for the claim famously attributed to Nazi propagandist Joseph Goebbels:

“If you tell a lie big enough and keep repeating it, people will eventually come to believe it.”

Repetition of any kind—even to refute the statement in question—only serves to solidify it. For instance, if one says, “It is not true that climate change is a hoax propagated by the Chinese,” or tries to refute the claim with evidence, one often perversely accomplishes the opposite of what one intended. Later on, when the brain goes to recall the information, the first part of the sentence often gets lost, leaving only the second. In a 2002 study, Colleen Seifert, a psychologist at the University of Michigan, found that even retracted information—information we acknowledge has been retracted—can continue to influence our judgments and decisions. Even after people were told that a fire was not caused by paint and gas cylinders left in a closet, they continued to use that information—for instance, saying the fire was particularly intense because of the volatile materials present—even as they acknowledged that the correction had taken place. When presented with the contradictions in their responses, they said things such as, “At first, the cylinders and cans were in the closet and then they weren’t”—in effect creating a new fact to explain their continued reliance on false information. This means that when the New York Times, or any other publication, runs a headline like “Trump Claims, With No Evidence, That ‘Millions of People Voted Illegally,’” it perversely reinforces the very claim it means to debunk.

In politics, false information is particularly powerful. If false information comports with preexisting beliefs—something that is often true in partisan arguments—attempts to refute it can actually backfire, planting it even more firmly in a person’s mind. Trump won over certain voters by stoking fears about the Islamic State, immigrants and crime. Leda Cosmides at the University of California, Santa Barbara (UCSB), points to her work with her colleague John Tooby on the use of outrage to mobilize people: “The campaign was more about outrage than about policies,” she says. According to Politico:

“And when a politician can create a sense of moral outrage, truth ceases to matter. People will go along with the emotion, support the cause and retrench into their own core group identities. The actual substance stops being of any relevance.”

Brendan Nyhan, a political scientist at Dartmouth College who studies false beliefs, has found that when false information is specifically political in nature, part of our political identity, lies become almost impossible to correct. When people read an article that began with a blatant lie strongly aligned with and reinforcing a certain partisan worldview, and only later contained a correction refuting the lie, the initial misperception persisted among many of those partisan readers—and, indeed, was frequently strengthened. In the face of a seeming assault on their identity, they often did not revise the fallacy in their minds to conform with the truth: Instead, they doubled down on the very views that had been shown to be wrong.

Consider a 2013 paper aimed specifically at correcting political misperceptions. In the study, a group of people from around the country were first asked about their knowledge of several government policies: For instance, how familiar were they with how electronic health records were handled? They were also asked about their attitudes toward the issues: Were they in favor, or opposed? Everyone next read a news article crafted specifically for the study that described the policy: how electronic health records work, what the objectives of using them are and how widely they are, in fact, used. Next, each participant saw a correction to the article, stating that it contained a number of factual errors, alongside an explanation of what was wrong. But the only people who actually changed their incorrect beliefs as a result were those whose political ideology was already aligned with the correct information. Those whose beliefs ran counter to the correction? They revised instead their belief in the accuracy of a publication that could possibly publish such an obviously bogus correction. It’s easy enough to correct minor false facts, the color of a label, say, if they aren’t crucial to your sense of self. Alas, nothing political fits into that bucket.

The distressing reality is that our sense of truth is far more fragile than we would like to think it is—especially in the political arena, and especially when that sense of truth is twisted by a figure in power. As the 19th-century Scottish philosopher Alexander Bain put it, “The great master fallacy of the human mind is believing too much.” False beliefs, once established, are incredibly tricky to correct. A leader who lies constantly creates a new landscape, and a citizenry whose sense of reality may end up swaying far more than they think possible. It’s little wonder that authoritarian regimes with sophisticated propaganda operations can warp the worldviews of entire populations. “You are annihilated, exhausted, you can’t control yourself or remember what you said two minutes before. You feel that all is lost,” as one man who had been subject to Mao Zedong’s “reeducation” campaign in China put it to the psychiatrist Robert Lifton. “You accept anything he says.”
