What is too much for online sharing?

The New Republic has an article titled “Undersharing on Facebook Has Become a Bigger Crime Than Oversharing.”  I am confounded that an argument can be made that withholding one’s thoughts causes more problems than sharing them.  While I do think people need to share their thoughts in order to teach and support others in similar life situations, I also advocate for discretion.  Not everyone should share, nor should every topic be deemed worthy of online sharing, to say nothing of the long-term ramifications of online commenting and posting.  However, I do agree with the author’s sentiment that there are situations when public outcry needs to be made through any medium available.  Social media has created a double-edged sword regarding what is deemed private vs. public.

Undersharing on Facebook Has Become a Bigger Crime Than Oversharing
By Phoebe Maltz Bovy

Do you complain about your marriage on Facebook? If you don’t, according to Hannah Seligson, you should start. In a recent New York Times piece, Seligson argued that by refusing to broadcast marital spats on social media—by “undersharing about our spouses”—couples are inflicting harm on their unmarried friends. Posting only “the occasional burst of good news” gives a false idea of marriage to your unmarried friends (who evidently don’t have access to movies, novels, or the offline versions of any of their Facebook friends). This creates a vicious cycle: Miserable unions that look blissful online inspire other ill-suited couples to tie the knot.

The predominant complaint on social media used to be against oversharing, from banal complaints about your subway ride to reckless murder accusations. But in the past year or so, the balance has shifted: Whereas we used to question the ethics of sharing, now we question the ethics of withholding. It’s not enough to vent to some offline friends. Anyone who has a private story that might benefit others owes that story to the general public, and whatever is holding someone back from telling all—concerns about reputation, perhaps—starts to look selfish. Social-media silence, once viewed as admirable discretion or a sign that someone’s got too much of a real (that is, offline) life for such trivialities, now seems suspect. Whether it’s that your child has a learning disability or that your wife’s lost interest in sex, the world has a right to know.

That goes for politics, too. While some view partisan status updates as irritating, others see them as a requirement for good citizenship. Here, too, the pro-sharing side is winning out, as became apparent last summer and fall in the debate about why white people weren’t talking about Ferguson on Facebook. In her much-shared essay, Janee Woods argued that her white friends were too busy sharing cat videos and mourning celebrities to remark upon Michael Brown’s death. Woods suggested that the only possible explanation for this silence was apathy or, at least, inadequate concern:

There’s a real fear of saying the wrong thing even if the intention is pure, of being alienated socially and economically from other white people for standing in solidarity with black people, or of putting one’s self in harm’s way, whether the harm be physical or psychological. I’m not saying those aren’t valid fears but I am challenging white people to consider carefully whether failing to speak out or act because of those fears is justified when white silence and inaction mean the oppression and death of black people.

In The Guardian, Heather Barmore acknowledged that white people posting about race are bound to have their privilege checked, but asked them to get over it and post anyway: “[R]acism is real—and the first step to having a practical conversation is to admit that it’s real and that it can be terrifying.” In early December, Vocativ’s Abigail Tracy called out the borough of Staten Island for its “deafening silence” on the Eric Garner grand jury decision, as judged by hashtags on Twitter and Instagram. (The nostalgic “#TBT” evidently won out over “#BlackLivesMatter.”)

A lack of stand-taking on social media can easily be interpreted as callous indifference to the world. It’s now assumed that people not only should but do share everything on social media, so an absence of posts and tweets about certain topics suggests that someone either isn’t thinking about them or is thinking something unspeakable, at least in their social circle.

The fight against parental overshare (a battle I still support) looks like a lost cause. As the very concept of privacy becomes a relic, the notion that people might be selectively open about certain information starts to look quaint. It shouldn’t. A culture of socially-enforced sharing will, I suspect, screw over those with less social-media savvy. They’ll earnestly share all, only to become pariahs among their friends—or nationally, if Gawker picks up the story. And I have yet to see any evidence that the sharing of family members’ secrets makes the world a better place. There are legal reasons divorcing couples should avoid posting too much information; reticence may be less about self-promotion-by-omission and more about a desire not to have posts brought up in court. Those who do post play-by-plays of their marriage’s demise inspire not admiration but bafflement.

There’s a far stronger case to be made for speaking out about current events. While I’d rather not see individuals judged for what they omit from their preferred social media platform, these are effective awareness-raising tools, and a little general social pressure to use them as such doesn’t hurt. Woods and Barmore have a point: If what’s holding you back from calling out injustice is a fear of getting too few likes, perhaps the time has come to accept that status updates are, if not the entirety of political engagement, a start. The same white people who feel queasy if they don’t speak out when someone says something racist at a dinner party would do well to feel the same guilty conscience in their social-media interactions. That doesn’t mean you need to live-tweet your delayed flight, but speak up when the moment warrants it. You might be more severely judged for what you don’t say than what you do.
Phoebe Maltz Bovy is a writer living in New Jersey.

Inherent psychological biases

The article below presents an interesting experiment in assumption and bias.

A Magician’s Best Trick: Revealing a Basic Human Bias
An encounter with a magician reveals a lesson: Think critically about whether you’re only intermittently thinking critically

By ROBERT SAPOLSKY
Dec. 31, 2014 12:01 p.m. ET

My family and I recently watched a magician perform. He was not of the sleight-of-hand ilk but, instead, had a stunning ability to psychologically manipulate his audience into doing and thinking what he wanted: con man as performance artist. He was amazing.

Afterward, we had the good fortune of talking about neuroscience and psychology with him. In the process, he offered a demonstration of a small building-block of his craft.

The Magician gave a dime to a volunteer, my son, instructing him to hide it in one of his hands behind his back. The Magician would then guess which hand held the dime. He claimed that if you’re attuned to subtleties in people’s behavior, you can guess correctly at way above chance levels.

Not that day, though. The Magician wasn’t doing so hot. After a string of wrong guesses, he mumbled to my son, “Hmm, you’re hard to read.” More rounds, and The Magician was only running 50-50.

They traded roles. And my son turned out to be really good at this game. The Magician looked impressed. After a stretch of correct guesses, he asked: “Did you play rock/paper/scissors when you were a kid?”

“Yes,” my son said.

“Were you good at it?”

“I suppose so.”

“Ah, makes sense, there are similar skills involved.”

Another string of successful guesses; we were agog. The Magician, looking mighty alert, asked: “So, are you trying to imagine what I’m thinking? Or are you focusing on my facial expressions? Or on something I’m doing with my hands?”

The near perfect streak continued; we were flabbergasted. Finally, another guess by my son—“I’m guessing it’s in your right hand.” The Magician opened his right hand displaying a dime. And then opened his left hand, which contained…another dime.

We dissolved with laughter, seeing what dupes we were. Ohhh, he had dimes in both hands the whole time. We started imagining cons built on manipulating a mark into believing that he has an otherworldly skill at something unexpected, and then somehow exploiting that false belief.

The guy had played us every step of the way. First, there was his “poor” performance at guessing—hey, we concluded, this guy puts on his pants one leg at a time. Then he complimented my son with “Hmm, you’re hard to read.” Next, The Magician gave a plausible explanation for my son’s success: “The experience with rock/paper/scissors, of course.” Finally, as my son’s run continued, The Magician indicated it was now a given that my son was virtuosic at this game: The point now was to analyze how he was doing it. Hook, line and sinker.

Something was painfully obvious. If my son had had a string of failures, with the hand containing no dime, we would have instantly used Critical Thinking 101, saying to the magician: “Hey, open your other hand, let’s make sure both hands aren’t empty.”

But faced with this string of successes, it never occurred to us to say, “Let’s make sure you don’t also have a dime in that other hand.” Instead, I had been thinking: “Wow, my son turns out to be the Chosen One of dime-guessing; my wife and I now have the heavy responsibility of ensuring that he only uses his gift for good; he’ll usher in world peace with his ability; he’ll…”

No wonder I’m embarrassed.

It’s what psychologists call “confirmation bias”: remembering information that supports your opinion better than information doing the opposite; testing things in a way that can only support, rather than negate, your hypothesis; and—the variant we fell for—being less skeptical about outcomes we like than we would about less-pleasing results.

Confirmation bias infests diplomacy, politics, finance and everyday life. This experience offered some wonderful lessons: Think critically about whether you’re only intermittently thinking critically; beware of Ponzis bearing gifts; always examine the mouth and the other hand of a gift horse.

Why we eat spicy foods

I guess I now have a better understanding of my enjoyment of spicy food and my regular use of hot sauce.  The WSJ has a piece in its weekend review called “Why We Love the Pain of Spicy Food.”  That said, I can honestly say I have a limit to the amount of spice I am willing to try.  I am not one to seek excessive risk and pain, just a kick to my taste buds.

Why We Love the Pain of Spicy Food
Eating hot chili peppers allows us to court danger without risk, activating areas of the brain related to both pleasure and pain.

By JOHN MCQUAID
Dec. 31, 2014 2:10 p.m. ET

As winter settles in and temperatures plunge, people turn to food and drink to provide a little warmth and comfort. In recent years, an unconventional type of warmth has elbowed its way onto more menus: the bite of chili peppers, whether from the red jalapeños of Sriracha sauce, dolloped on tacos or Vietnamese noodles, or from the dried ancho or cayenne peppers that give a bracing kick to Mayan hot chocolate.

But the chili sensation isn’t just warm: It hurts! It is a form of pain and irritation. There’s no obvious biological reason why humans should tolerate it, let alone seek it out and enjoy it. For centuries, humans have eagerly consumed capsaicin—the molecule that generates the heat sensation—even though nature seems to have created it to repel us.

Like our affection for a hint of bitterness in cuisine, our love of spicy heat is the result of conditioning. The chili sensation mimics that of physical heat, which has been a constant element of flavor since the invention of the cooking fire: We have evolved to like hot food. The chili sensation also resembles that of cold, which is unpleasant to the skin but pleasurable in drinks and ice cream, probably because we have developed an association between cooling off and the slaking of thirst. But there’s more to it than that.

Paul Rozin, a professor of psychology at the University of Pennsylvania, became interested in our taste for heat in the 1970s, when he began to wonder why certain cultures favor highly spicy foods. He traveled to a village in Oaxaca, in southern Mexico, to investigate, focusing on the differences between humans and animals. The residents there ate a diet heavy in chili-spiced food. Had their pigs and dogs also picked up a taste for it?

“I asked people in the village if they knew of any animals that liked hot pepper,” Dr. Rozin said in an interview. “They thought that was hilariously funny. They said: No animals like hot pepper!” He tested that observation, giving pigs and dogs there a choice between an unspicy cheese cracker and one laced with hot sauce. They would eat both snacks, but they always chose the mild cracker first.

Next, Dr. Rozin tried to condition rats to like chilies. If he could get them to choose spicy snacks over bland ones, it would show that the presence of heat in cuisine was probably a straightforward matter of adaptation. He fed one group of rats a peppery diet from birth; another group had chili gradually added to its meals. Both groups continued to prefer nonspicy food. He spiked pepper-free food with a compound to make the rats sick, so they would later find it disgusting—but they still chose it over chili-laced food. He induced a vitamin-B deficiency in some rats, causing various heart, lung and muscular problems, then nursed them back to health with chili-flavored food: This reduced but didn’t eliminate their aversion to heat.

In the end, only rats whose capsaicin-sensing ability had been destroyed truly lost their aversion to it. Dr. Rozin came to believe that something unique to humanity, some hidden dynamic in culture or psychology, was responsible for our love of chili’s burn. For some reason apparently unrelated to survival, humans condition themselves to make an aversion gratifying.

Not long after, Dr. Rozin compared the tolerances of a group of Americans with limited heat in their diets to the Mexican villagers’ tastes. He fed each group corn snacks flavored with differing amounts of chili pepper, asking them to rank when the taste became optimal and when it became unbearable.

Predictably, the Mexicans tolerated heat better than the Americans. But for both groups, the difference between “just right” and “ouch” was razor-thin. “The hotness level they liked the most was just below the level of unbearable pain,” Dr. Rozin said. “So that led me to think that the pain itself was involved: They were pushing the limits, and that was part of the phenomenon.”

In the human brain, sensations of pleasure and aversion closely overlap. They both rely on nerves in the brainstem, indicating their ancient origins as reflexes. They both tap into the brain’s system of dopamine neurons, which shapes motivation. They activate similar higher-level cortical areas that influence perceptions and consciousness.

Anatomy also suggests that these two systems interact closely: In several brain structures, neurons responding to pain and pleasure lie close together, forming gradients from positive to negative. A lot of this cross talk takes place close to hedonic hot spots—areas that respond to endorphins released during stress, boosting pleasure.

The love of heat was nothing more than these two systems of pleasure and pain working together, Dr. Rozin concluded. Superhot tasters court danger and pain without risk, then feel relief when it ends. “People also come to like the fear and arousal produced by rides on roller coasters, parachute jumping, or horror movies,” he wrote in the journal Motivation and Emotion—as well as crying at sad movies and jumping into freezing water. “These ‘benignly masochistic’ activities, along with chili preference, seem to be uniquely human.” Eating hot peppers may literally be a form of masochism, an intentional soliciting of danger.

Dr. Rozin’s theory suggests that flavor has an unexpected emotional component: relief. A 2011 study led by Siri Leknes, a cognitive neuroscientist then at Oxford University, looked at the relationship of pleasure and relief to see if they were, in essence, the same. Dr. Leknes gave 18 volunteers two tasks while their brains were scanned: one pleasant, one unpleasant.

In the first task, they were asked to imagine a series of pleasurable experiences, including consuming their favorite meal or smelling a fresh sea breeze. In the other, they were given a visual signal that pain was coming, followed by a five-second burst of 120-degree heat from a device attached to their left arms—enough to be quite painful but not enough to cause a burn.

The scans showed that relief and pleasure were intertwined, overlapping in one area of the frontal cortex where perceptions and judgments form, and in another near the hedonic hot spots. As emotions, their intensity depended on many factors, including one’s attitude toward life. Volunteers who scored higher on a pessimism scale got a stronger surge of relief than did optimists, perhaps because they weren’t expecting the pain to end.

The world’s hottest chili, according to the Guinness World Records, is the Carolina Reaper, developed a few years ago by Ed Currie. His website features videos of people eating the peppers, and they are studies in torture. As one man tries a bite, his eyes open with surprise, then his chair tips back and he falls on the floor. Another sweats up a storm and appears to be suffering terribly, but presses on until he has eaten the whole thing.

Watching these, it’s clear that whatever enjoyment might be derived from savoring chili flavors, true satisfaction comes only in the aftermath: the relief at having endured, and survived.

—Adapted from Mr. McQuaid’s “Tasty: The Art and Science of What We Eat,” to be published on Jan. 13 by Scribner.