by Drake Baer
March 20, 2017

from ThriveGlobal Website

New research says intellectually humble people are better at judging information than know-it-alls.

Have you ever noticed how we say that beliefs are something you "hold"?

 

According to new research (Cognitive and Interpersonal Features of Intellectual Humility) out of Duke University, the grip you keep on your ideas shapes the way you approach the world. How tightly (or not) you cling to your own opinions is called "intellectual humility" (IH), and it could have a big effect on the decisions you make.

 

People high in IH are less committed to being right about everything and care more about what new information is out in the world, while people low in IH want to believe what they already think.

 

If you agree with the statement used in the Personality and Social Psychology Bulletin study, "I question my own opinions, positions, and viewpoints because they could be wrong," then you may score highly on IH.

Call it informational sensitivity:

"People differ to the degree that they have antennae on in regard to evidence quality," lead author Mark Leary, professor of psychology and neuroscience, tell us.

The ramifications are broad, Leary says, since needing to be right all the time and ignoring evidence that conflicts with your opinion can create all sorts of problems, whether in relationships, business, or politics.

 

Here's one politically relevant example from the paper:

Participants read an anecdote about a politician changing his mind after he learned more about an issue, and those low in IH were more likely to say that he was flip-flopping than (wisely) re-evaluating his opinion based on new evidence.

To Leary, the results speak poorly of the American ideal of "sticking to your guns," since that's the opposite of the open and eager-mindedness that characterizes IH.

 

Self-worth and identity issues might be in the mix, too.

"You almost get the sense that intellectually humble people keep their ego out of their intellectual decisions," Leary says.

With IH, it's less about being right than about seeing what's right.

 

Troublingly, other research (Powerful People Think Differently about Their Thoughts) indicates that powerful people take their thoughts more seriously, suggesting that IH might go down the higher you climb in social structures.

 

IH may arise from other, more fundamental factors of personality, too.

 

In one experiment, Leary found that people high in IH were also high in openness to experience, a core personality trait measuring curiosity, and in need for cognition, or how much you enjoy thinking.

 

Leary reasons that IH arises from those drives working together, since desiring new ideas and chewing them over has a way of getting you accustomed to changing your mind.

 

As another experiment indicated, IH also depends on your ability to spot facts. To assess this, the researchers recruited 400 people online for a critical-thinking task: evaluating the merits of flossing.

 

After reporting how regularly they flossed, participants read one of two essays advocating the practice: one relied on strong, scientific arguments citing dental experts, the other on weak, anecdotal arguments from ordinary people.

 

The results:

People high in IH rated the strong argument much more highly. And, more hopefully for dental (and social!) progress, the low-frequency flossers were indeed more likely to change their minds after reading the stronger essay, so long as they were high in IH.

The question I really wanted to ask Leary was outside the scope of the study, namely:

  • Is there a way to get people to be higher in IH?

  • Is there a magic wand of humility that you can wave?

After telling me that's a question for another paper, he did offer a tip from his lectures.

 

When he's teaching this stuff, he likes to check his students' intellectual privilege with a couple of well-placed questions.

"Probabilistically," he likes to say, "wouldn't it be strange if your views were always the right ones? Wouldn't it be odd if everything you believe is true?"

Powerful People Think Differently about Their Thoughts

by Drake Baer
October 27, 2016

from NYMag Website

Being in power does, in a very real sense, go to people's heads...

 

Psychologists have found that when people are made to feel powerful, they believe more in the things they're thinking.

 

This leads to a bunch of wacky, seemingly contradictory behaviors, as Ohio State Ph.D. candidate Geoff Durso explained to Science of Us in an email:

Feeling more powerful may make you kinder and more assertive, yet also more dishonest (The Ergonomics of Dishonesty).

This is explained by the "self-validation theory of judgment," he says, which basically means that when you feel powerful, your thoughts get magnified.

 

They feel more right than they would if you felt powerless.

"So, when placed in a situation where one is primed to think aggressively (e.g., a competition), greater feelings of power should translate into more aggressive, competitive behavior (thus seeming to lend credibility to the idea that 'power corrupts')," Durso explained.

 

"But when placed in a situation where someone has the goal to act generously (e.g., when considering a charitable donation), the self-validation perspective predicts that greater feelings of power should now translate into more helpful, pro-social behavior."

And this is where things get even weirder:

If powerful people don't know what to do, they really don't know what to do.

For a study (From Power to Inaction - Ambivalence Gives Pause to the Powerful) published this month in Psychological Science, Durso and his colleagues recruited 129 and 197 college students for two separate experiments.

 

Participants were given different descriptions of an employee named Bob, some with all positive attributes (like that Bob beat his earnings goals), some with all negative (e.g., Bob stole his colleague's mug from the kitchen), and some with an even split.

 

Then, the participants completed a writing task in which they recalled an experience from their lives that made them feel powerful or powerless, which framed the decision to come.

 

They were also asked how conflicted they felt about Bob's future, and in one study, they were asked to decide whether to fire or promote Bob with the click of a mouse. Of the participants who were given ambivalent information about his behavior, the powerful took 16 percent longer to make a decision than the less powerful.

 

Just as power made people kinder or more dishonest when they were primed for it in the other experiments, it also made them think longer about conflicting information.

 

It's a finding with corollaries that are easy to spot out in the wild.

 

Barack Obama, who largely makes 'good' decisions, has spoken about the burden of power and conflicting information.

 

In an interview with U.S. News in 2009, Obama said that one of the difficulties of his job was that if a problem were to have a clear solution, it wouldn't land on his desk; the buck wouldn't stop there.

 

When asked about difficult economic decisions, he said that there are always going to be probabilities involved.

"You're never 100 percent certain that the course of action you're choosing is going to work," he said.

 

"What you can have confidence in is that the probability of it working is higher than the other options available to you. But that still leaves some uncertainty, which I think can be stressful, and that's part of the reason why it's so important to be willing to constantly re-evaluate decisions based on new information."

Same with George W. Bush, as the research team notes in their press release:

Though he described himself as "the decider," the president said that he would not be rushed into making a decision about whether to add or withdraw forces from Iraq, and was, despite his decider-ness, given to delays.

It's all evidence that the more powerful you feel, the higher the stakes are, even in your head...