Inherent psychological biases

The article below presents an interesting experiment in assumption and bias.

A Magician’s Best Trick: Revealing a Basic Human Bias
An encounter with a magician reveals a lesson: Think critically about whether you’re only intermittently thinking critically

By ROBERT SAPOLSKY
Dec. 31, 2014 12:01 p.m. ET

My family and I recently watched a magician perform. He was not of the sleight-of-hand ilk but, instead, had a stunning ability to psychologically manipulate his audience into doing and thinking what he wanted: con man as performance artist. He was amazing.

Afterward, we had the good fortune of talking about neuroscience and psychology with him. In the process, he offered a demonstration of a small building block of his craft.

The Magician gave a dime to a volunteer, my son, instructing him to hide it in one of his hands behind his back. The Magician would then guess which hand held the dime. He claimed that if you’re attuned to subtleties in people’s behavior, you can guess correctly at way above chance levels.

Not that day, though. The Magician wasn’t doing so hot. After a string of wrong guesses, he mumbled to my son, “Hmm, you’re hard to read.” More rounds, and The Magician was only running 50-50.

They traded roles. And my son turned out to be really good at this game. The Magician looked impressed. After a stretch of correct guesses, he asked: “Did you play rock/paper/scissors when you were a kid?”

“Yes,” my son said.

“Were you good at it?”

“I suppose so.”

“Ah, makes sense, there are similar skills involved.”

Another string of successful guesses; we were agog. The Magician, looking mighty alert, asked: “So, are you trying to imagine what I’m thinking? Or are you focusing on my facial expressions? Or on something I’m doing with my hands?”

The near perfect streak continued; we were flabbergasted. Finally, another guess by my son—“I’m guessing it’s in your right hand.” The Magician opened his right hand displaying a dime. And then opened his left hand, which contained…another dime.

We dissolved with laughter, seeing what dupes we were. Ohhh, he had dimes in both hands the whole time. We started imagining cons built on manipulating a mark into believing that he has an otherworldly skill at something unexpected, and then somehow exploiting that false belief.

The guy had played us every step of the way. First, there was his “poor” performance at guessing—hey, we concluded, this guy puts on his pants one leg at a time. Then he complimented my son with “Hmm, you’re hard to read.” Next, The Magician gave a plausible explanation for my son’s success: “The experience with rock/paper/scissors, of course.” Finally, as my son’s run continued, The Magician indicated it was now a given that my son was virtuosic at this game: The point now was to analyze how he was doing it. Hook, line and sinker.

Something was painfully obvious. If my son had had a string of failures, with the hand containing no dime, we would have instantly used Critical Thinking 101, saying to The Magician: “Hey, open your other hand, let’s make sure both hands aren’t empty.”

But faced with this string of successes, it never occurred to us to say: “Let’s make sure you don’t also have a dime in that other hand.” Instead, I had been thinking: “Wow, my son turns out to be the Chosen One of dime-guessing; my wife and I now have the heavy responsibility of ensuring that he only uses his gift for good; he’ll usher in world peace with his ability; he’ll…”

No wonder I’m embarrassed.

It’s what psychologists call “confirmation bias”: remembering information that supports your opinion better than information doing the opposite; testing things in a way that can only support, rather than negate, your hypothesis; and—the variant we fell for—being less skeptical about outcomes we like than we would about less-pleasing results.

Confirmation bias infests diplomacy, politics, finance and everyday life. This experience offered some wonderful lessons: Think critically about whether you’re only intermittently thinking critically; beware of Ponzis bearing gifts; always examine the mouth and the other hand of a gift horse.
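
As a side note of my own (not from the article), the “testing things in a way that can only support your hypothesis” part can be made concrete with a tiny simulation. The sketch below is purely my own illustration: a confirmatory observer only ever opens the guessed hand, so the rigged two-dime game produces a perfect apparent hit rate, while an observer who also opens the other hand exposes the trick immediately.

```python
import random

# The rigged setup: the magician secretly holds a dime in BOTH hands.
def rigged_hands() -> dict:
    return {"left": "dime", "right": "dime"}

def confirmatory_hit_rate(rounds: int = 10) -> float:
    """Open only the guessed hand each round, so every guess looks like a hit."""
    hits = 0
    for _ in range(rounds):
        guess = random.choice(["left", "right"])
        hands = rigged_hands()
        if hands[guess] == "dime":  # we never look at the other hand
            hits += 1
    return hits / rounds

def other_hand_check() -> bool:
    """Open both hands once; True means the two-dime trick is exposed."""
    hands = rigged_hands()
    return hands["left"] == "dime" and hands["right"] == "dime"

print(f"Apparent hit rate, opening only the guessed hand: {confirmatory_hit_rate():.0%}")
print(f"Trick exposed by also opening the other hand: {other_hand_check()}")
```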

Why we eat spicy foods

I guess I now have a better understanding of my enjoyment of spicy food and my regular use of hot sauce. The WSJ has a piece in its weekend review called “Why We Love the Pain of Spicy Food.” That said, I can honestly say I do have a limit to the amount of spice I am willing to try. I am not one to seek excessive risk and pain, just a kick to my taste buds.

Why We Love the Pain of Spicy Food
Eating hot chili peppers allows us to court danger without risk, activating areas of the brain related to both pleasure and pain.

By JOHN MCQUAID
Dec. 31, 2014 2:10 p.m. ET

As winter settles in and temperatures plunge, people turn to food and drink to provide a little warmth and comfort. In recent years, an unconventional type of warmth has elbowed its way onto more menus: the bite of chili peppers, whether from the red jalapeños of Sriracha sauce, dolloped on tacos or Vietnamese noodles, or from the dried ancho or cayenne peppers that give a bracing kick to Mayan hot chocolate.

But the chili sensation isn’t just warm: It hurts! It is a form of pain and irritation. There’s no obvious biological reason why humans should tolerate it, let alone seek it out and enjoy it. For centuries, humans have eagerly consumed capsaicin—the molecule that generates the heat sensation—even though nature seems to have created it to repel us.

Like our affection for a hint of bitterness in cuisine, our love of spicy heat is the result of conditioning. The chili sensation mimics that of physical heat, which has been a constant element of flavor since the invention of the cooking fire: We have evolved to like hot food. The chili sensation also resembles that of cold, which is unpleasant to the skin but pleasurable in drinks and ice cream, probably because we have developed an association between cooling off and the slaking of thirst. But there’s more to it than that.

Paul Rozin, a professor of psychology at the University of Pennsylvania, became interested in our taste for heat in the 1970s, when he began to wonder why certain cultures favor highly spicy foods. He traveled to a village in Oaxaca, in southern Mexico, to investigate, focusing on the differences between humans and animals. The residents there ate a diet heavy in chili-spiced food. Had their pigs and dogs also picked up a taste for it?

“I asked people in the village if they knew of any animals that liked hot pepper,” Dr. Rozin said in an interview. “They thought that was hilariously funny. They said: No animals like hot pepper!” He tested that observation, giving pigs and dogs there a choice between an unspicy cheese cracker and one laced with hot sauce. They would eat both snacks, but they always chose the mild cracker first.

Next, Dr. Rozin tried to condition rats to like chilies. If he could get them to choose spicy snacks over bland ones, it would show that the presence of heat in cuisine was probably a straightforward matter of adaptation. He fed one group of rats a peppery diet from birth; another group had chili gradually added to its meals. Both groups continued to prefer nonspicy food. He spiked pepper-free food with a compound to make the rats sick, so they would later find it disgusting—but they still chose it over chili-laced food. He induced a vitamin-B deficiency in some rats, causing various heart, lung and muscular problems, then nursed them back to health with chili-flavored food: This reduced but didn’t eliminate their aversion to heat.

In the end, only rats whose capsaicin-sensing ability had been destroyed truly lost their aversion to it. Dr. Rozin came to believe that something unique to humanity, some hidden dynamic in culture or psychology, was responsible for our love of chili’s burn. For some reason apparently unrelated to survival, humans condition themselves to make an aversion gratifying.

Not long after, Dr. Rozin compared the tolerances of a group of Americans with limited heat in their diets to the Mexican villagers’ tastes. He fed each group corn snacks flavored with differing amounts of chili pepper, asking them to rank when the taste became optimal and when it became unbearable.

Predictably, the Mexicans tolerated heat better than the Americans. But for both groups, the difference between “just right” and “ouch” was razor-thin. “The hotness level they liked the most was just below the level of unbearable pain,” Dr. Rozin said. “So that led me to think that the pain itself was involved: They were pushing the limits, and that was part of the phenomenon.”

In the human brain, sensations of pleasure and aversion closely overlap. They both rely on nerves in the brainstem, indicating their ancient origins as reflexes. They both tap into the brain’s system of dopamine neurons, which shapes motivation. They activate similar higher-level cortical areas that influence perceptions and consciousness.

Anatomy also suggests that these two systems interact closely: In several brain structures, neurons responding to pain and pleasure lie close together, forming gradients from positive to negative. A lot of this cross talk takes place close to hedonic hot spots—areas that respond to endorphins released during stress, boosting pleasure.

The love of heat was nothing more than these two systems of pleasure and pain working together, Dr. Rozin concluded. Superhot tasters court danger and pain without risk, then feel relief when it ends. “People also come to like the fear and arousal produced by rides on roller coasters, parachute jumping, or horror movies,” he wrote in the journal Motivation and Emotion—as well as crying at sad movies and jumping into freezing water. “These ‘benignly masochistic’ activities, along with chili preference, seem to be uniquely human.” Eating hot peppers may literally be a form of masochism, an intentional soliciting of danger.

Dr. Rozin’s theory suggests that flavor has an unexpected emotional component: relief. A 2011 study led by Siri Leknes, a cognitive neuroscientist then at Oxford University, looked at the relationship of pleasure and relief to see if they were, in essence, the same. Dr. Leknes gave 18 volunteers two tasks while their brains were scanned: one pleasant, one unpleasant.

In the first task, they were asked to imagine a series of pleasurable experiences, including consuming their favorite meal or smelling a fresh sea breeze. In the other, they were given a visual signal that pain was coming, followed by a five-second burst of 120-degree heat from a device attached to their left arms—enough to be quite painful but not enough to cause a burn.

The scans showed that relief and pleasure were intertwined, overlapping in one area of the frontal cortex where perceptions and judgments form, and in another near the hedonic hot spots. As emotions, their intensity depended on many factors, including one’s attitude toward life. Volunteers who scored higher on a pessimism scale got a stronger surge of relief than did optimists, perhaps because they weren’t expecting the pain to end.

The world’s hottest chili, according to the Guinness World Records, is the Carolina Reaper, developed a few years ago by Ed Currie. His website features videos of people eating the peppers, and they are studies in torture. As one man tries a bite, his eyes open with surprise, then his chair tips back and he falls on the floor. Another sweats up a storm and appears to be suffering terribly, but presses on until he has eaten the whole thing.

Watching these, it’s clear that whatever enjoyment might be derived from savoring chili flavors, true satisfaction comes only in the aftermath: the relief at having endured, and survived.

—Adapted from Mr. McQuaid’s “Tasty: The Art and Science of What We Eat,” to be published on Jan. 13 by Scribner.

Does anyone read the Bible literally?

In the WSJ last week, there was an opinion piece regarding the question of biblical literalism. The author presents both a hypothetical and a real scenario involving a non-literal reading of the Bible.

Until this month, John Candide had taught religion at Calvin College, affiliated with the Christian Reformed Church, for 25 years. But his “fall from grace,” as the website InsideHigherEd put it, came after he “wrote about challenges science poses to a literal reading of the gospels.” For that offense, he has agreed to leave his tenured teaching job.

As Mr. Candide explains, he recently became troubled by conflicts between science and literal readings of gospel accounts of Jesus’ resurrection: “The more I read, the more I talked with biologists, the more it became clear to me: science tells us that when people die, they stay dead.” He adds that he continues to believe in the importance of the Bible.

In their joint statement, Mr. Candide and Calvin College said that they agreed to part ways because of tensions raised by his scholarship and a desire not to create “harm and distraction.” Despite this peaceful resolution, his departure raises questions about freedom of scholarship at the college.

Or not—I invented John Candide. The actual story at InsideHigherEd, from which I have borrowed liberally above, was a bit different. The alleged offense involved challenges posed by science to a literal reading of Genesis, not of Jesus’ resurrection. And the professor in question is named John Schneider.

As he wrote in an academic journal earlier this year, Mr. Schneider has concluded that human ancestry can’t be traced to a single couple, the Adam and Eve of the Genesis account. Moreover, he believes—on the basis of science, he says—that the very notion of a fall from a primal state of beatitude must be false.

Let’s compare the fictitious case of Prof. Candide with the real Prof. Schneider. Clearly the gospel accounts of Jesus’ resurrection are at odds with what science tells us about the everyday workings of the world. Faced with this conflict, some Christians have resorted to damage control, saying that of course Jesus didn’t rise from the dead—that’s just silly. What would happen to our credibility if we kept insisting on that? No, the real meaning of the resurrection story is that Jesus lives on in the hearts of all those who follow his example. And so on.

The vast majority of Christians—including those who teach at Calvin College—continue to believe otherwise. Yes, they say, Jesus triumphed over sin and death. But they don’t suppose that Jesus’ resurrection renders the scientific understanding of the world irrelevant. On the contrary: It’s precisely in contrast to the ordinary that the resurrection stands out.

In short, if our imaginary Prof. Candide decided one day that he could no longer affirm the reality of the resurrection, it would seem unremarkable that he and the college should part.

But what about Prof. Schneider? There is a salient difference between Genesis and the gospels. For all their disagreement over the details, orthodox Christians broadly agree about how to read the gospels. But there is no such consensus about how to read Genesis. The range of sharply differing views was outlined in the cover story of the June 2011 issue of Christianity Today, “The Search for the Historical Adam.”

What is at stake in these disputes is not a choice between following biblical authority on the one hand or science on the other, as the matter is often misleadingly framed. Rather, we see rival theological commitments, rival understandings of how to read Genesis.

Undergirding Young Earth Creationism—the belief that the Earth was created only a few thousand years ago—is an unswerving commitment to a certain way of reading scripture, not a disdain for science. A different approach (for example, John Walton’s “The Lost World of Genesis One”) seeks to recover the ancient worldview implicit in the Genesis account of creation, a perspective from which the measurable age of the Earth, however vast, is not relevant. Critical to debates over “the historical Adam” are theological motifs such as Christ as “the second Adam.” These lose their meaning, many evangelicals argue, if Genesis isn’t read literally.

But an alarm should sound whenever the word “literal” is used in this context, whether as a badge of pride (“I just believe in reading the Bible literally”) or as a hint that low-browed fundamentalists are lurking nearby. No one—no one—reads the Bible literally. But some readers are more attentive, more faithful, more imaginative and more persuasive than others.

From a Jewish angle, the question of literalism runs along similar lines. Remembering the responses to Natan Slifkin’s non-literal reading of the creation story in The Challenge of Creation, I think that speculating on Genesis 1 in a non-creationist manner in a Yeshiva would be roughly equivalent to a Christian offering a non-literal reading of the Gospels. I recall that some in the synagogue I attended were bothered upon hearing Slifkin speak about how evolution does not equate with creation and how reading Genesis 1 through the eyes of evolution creates a falsehood. And of course, the questions about the historical validity of the Torah, from start to finish, began. Yet even in certain more moderate Orthodox institutions, non-literal readings are acceptable.

You have to be crazy to rule a country

In last weekend’s WSJ review, there was an adapted piece on the value of mental illness, specifically depression, in helping define great leaders. It seems that mild depression gives people greater clarity to see the world as it truly is, as well as a greater ability to adapt in the midst of crisis. I should note that not all depressed leaders are great. A story I have always found remarkable concerns Joseph Stalin at the beginning of the Nazi invasion of the Soviet Union in 1941: it is reported that Stalin was in such a state of shock for weeks after the invasion that he was not even running the nation during that time.

When times are good and the ship of state only needs to sail straight, mentally healthy people function well as political leaders. But in times of crisis and tumult, those who are mentally abnormal, even ill, become the greatest leaders. We might call this the Inverse Law of Sanity.

Consider Neville Chamberlain. Before the Second World War, he was a highly respected businessman from Birmingham, a popular mayor and an esteemed chancellor of the exchequer. He was charming, sober, smart—sane.

Winston Churchill, by contrast, rose to prominence during the Boer War and the First World War. Temperamental, cranky, talkative, bombastic—he bothered many people. During the “wilderness” years of the 1930s, while the suave Chamberlain got all the plaudits, Churchill’s own party rejected him.

When not irritably manic in his temperament, Churchill experienced recurrent severe depressive episodes, during many of which he was suicidal. Even into his later years, he would complain about his “black dog” and avoid ledges and railway platforms, for fear of an impulsive jump. “All it takes is an instant,” he said.

Abraham Lincoln famously had many depressive episodes, once even needing a suicide watch, and was treated for melancholy by physicians. Mental illness has touched even saintly icons like Mahatma Gandhi and Martin Luther King Jr., both of whom made suicide attempts in adolescence and had at least three severe depressive episodes in adulthood.

Aristotle was the first to point out the link between madness and genius, including not just poets and artists but also political leaders. I would argue that the Inverse Law of Sanity also applies to more ordinary endeavors. In business, for instance, the sanest of CEOs may be just right during prosperous times, allowing the past to predict the future. But during a period of change, a different kind of leader—quirky, odd, even mentally ill—is more likely to see business opportunities that others cannot imagine…

“Normal” nondepressed persons have what psychologists call “positive illusion”—that is, they possess a mildly high self-regard, a slightly inflated sense of how much they control the world around them.

Mildly depressed people, by contrast, tend to see the world more clearly, more as it is. In one classic study, subjects pressed a button and observed whether it turned on a green light, which was actually controlled by the researchers. Those who had no depressive symptoms consistently overestimated their control over the light; those who had some depressive symptoms realized they had little control…

Depression also has been found to correlate with high degrees of empathy, a greater concern for how others think and feel. In one study, severely depressed patients had much higher scores on the standard measures of empathy than did a control group of college students; the more depressed they were, the higher their empathy scores. This was the case even when patients were not currently depressed but had experienced depression in the past. Depression seems to prepare the mind for a long-term habit of appreciating others’ point of view.