Fantasy Football - Footballguys Forums


How facts backfire: Facts don’t necessarily change our minds.

How Facts Backfire

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

It’s unclear what is driving the behavior — it could range from simple defensiveness, to people working harder to defend their initial beliefs — but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”

It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.

Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.

And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”

In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts — inference, intuition, and so forth — to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by political falsehoods.

Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”

Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States — that’s easy. Getting him to register shame? That isn’t.
Fascinating...
 
One thing that happens a lot, though, is debate over "facts" themselves. Even a question as simple as "is what is being presented as a fact actually a fact?" gets contested. Even further: Is it accurate? Is it complete? Is it relevant?

 
A lot of this was covered in a very interesting book called "How We Decide."

Basically, people usually make snap judgments without first weighing all the evidence or considering all the facts. Then, once they've decided, they ignore all evidence that contradicts their opinion and search out evidence that supports it.

Once I read that, I pretty much retired from political threads. Why bother when I know I'm going to ignore info (true or not) presented by the other side and they're going to do the same with me?

But I would like to think that knowing about this tendency has helped me try to gather and consider factual evidence a little more before making snap judgments.

 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
:goodposting: The level of debate in this country should be a major worry for all involved. As people get more scared, whether it's rational or not, they cling to things that help strengthen their own sense of self. Mostly those things are the things they believe for whatever reason. No level of proof will change their minds in most instances, because a changing of the mind is actually a loss of their self. Over the last couple of decades we have been building stronger and stronger ideologies and egos. It matters not anymore who is right; it's just who has the most money to convince the most people. Which will be the undoing of this country - it won't be right wing or left wing policies per se but the use of those policies to support an ego/self. The worst thing that can happen, especially on the political front, is the loss of people being able to work together who have different opinions, because in the end there is no clear cut answer to most political problems.
 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
:goodposting: This. Perception is reality. That realization contributed significantly to me getting out of science. Why bust my hump to further human knowledge when most people not only don't care, but are openly hostile towards things as basic as vaccination and evolution?
 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
Quote: "This. Perception is reality. ..."
As a former researcher I can agree with this. It's hard to let go of the fact that most people not only don't care, but will indeed be hostile to it.
 
There's not enough partisan bickering in this thread. I'd like to highlight the following passages from the article:

Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

...

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.
:goodposting:
 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
Quote: "The level of debate in this country should be a major worry for all involved. ..."
I disagree with the notion that it's gotten any worse over the last 10, 20, 30, 50, or 100 years. I think it's pretty much the same as it's always been. Our perception that "it's changed" is, IMHO, a result of us individually growing older and realizing what's been true all along. As an idealistic 20-year-old, I was going to change the world by figuring out the secrets of nature and using that knowledge to benefit mankind. I eventually realized most people just don't give a #### what the facts are; they're going to do what they please regardless of data or evidence. If you're making decisions on evidence and information, you should be coming out ahead far more frequently than the average bear, who simply rummages through the trash and picks out what he likes.
 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
Quote: "The level of debate in this country should be a major worry for all involved. ..."
Quote: "I disagree with the notion that it's gotten any worse over the last 10, 20, 30, 50, or 100 years. ..."
What's changed, in my opinion, is the internet. Now everyone has an opinion that is correct on nearly every subject.
 
Taking the political angle:

The founding of this country was based on the premise that human nature doesn't change.

More recently, however, we have forces that believe we can change human nature through scientific learning, which leads to social engineering.

Compare and contrast the two belief systems in light of the findings contained in this study.

 
Quote: "Taking the political angle: The founding of this country was based on the premise that human nature doesn't change. ..."
You should watch "The Century of the Self," a BBC documentary. After the war, the US government hired the best psychologists in the world to indeed change the nature of the citizens of the USA, to ensure that they would be placated by little things like shopping.
 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
Quote: "The level of debate in this country should be a major worry for all involved. ..."
Quote: "I disagree with the notion that it's gotten any worse over the last 10, 20, 30, 50, or 100 years. ..."
Quote: "What's changed in my opinion though is the internet. ..."
Fair point.
 
I would say it's way worse today than it was 20, 50 or 100 years ago, simply because the nation and its government have more power. With that power come agendas. Those agendas are powerful enough to affect everybody's lives in a significant way. This allows people to foster an emotional attachment to a particular agenda because it affects them in a real way. People have way too much of a vested interest in the "facts" to objectively evaluate them.

 
A great example of this phenomenon is the issue of global warming. Conservatives generally oppose restrictions on free enterprise, while liberals are generally okay with them if, in their view, they serve the greater good. These opposing philosophies were set in stone long before the issue of global warming ever arose. Now along comes global warming, which is either happening or it isn't: you wouldn't think that most issues of scientific fact would be open to too much debate (and they're mostly not; except for the refusal among certain religious types to accept evolution or the age of the Earth, this is really one of the very few scientific issues in which the facts themselves are contested).

Yet because it appears that any "solution" to global warming is going to involve government restrictions on free enterprise, the two philosophies immediately take sides. Go look at any one of the threads here where AGW is debated, and you'll see that almost all the conservatives argue that it doesn't exist, while almost all the liberals argue that it does. How were these positions formed, exactly?

 
I think part of what's going on in that article is that no one believes the news media presents facts anymore. So people presented with information from the news media are more likely to take that information as a lie than as a fact.

 
This is why there is no market for objective news media. People don't watch the news to learn "facts" in an objective sense, but rather to have their pre-conceived notions validated.

 
Quote: "A lot of this was covered in a very interesting book called 'How We Decide' ..."
Quote: "The level of debate in this country should be a major worry for all involved. ..."
Quote: "I disagree with the notion that it's gotten any worse over the last 10, 20, 30, 50, or 100 years. ..."
Quote: "What's changed in my opinion though is the internet. ..."
Quote: "Fair point."
Also the 24 hr. news channels that position opinion shows as news. The media lives off of partisan bickering, as those that have the strongest beliefs are the ones that bring the most consistent ratings.
 
Quote: "A great example of this phenomenon is the issue of global warming. ..."
Or, a great example is the climate religion's complete dismissal of the facts presented by the climategate emails. :thumbup:
 
This reminds me of this fantastic thread from a few months ago.
 
Watching the morning news shows, and this term came to mind:

Confirmation Bias:

Confirmation bias (also called confirmatory bias or myside bias) is the tendency of people to favor information that confirms their beliefs or hypotheses. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).

A series of experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people's conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way.

Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political and organizational contexts.
http://en.wikipedia.org/wiki/Confirmation_bias

It seems to me that what might be going on right now with the Libya emails is that the Obama administration was really committed to, and internally proud of, the idea that it had dealt terrorism a blow by acting on years of work by military intelligence to find and kill bin Laden, no matter where he was. And they further felt that they had a rock-solid campaign theme based on this.

The administration glossed over the actual analysis and intelligence, not purposefully, but because they immediately leapt to fill in the blanks by assuming the video behind the protests in Egypt was also the source of the Benghazi attacks. This also fit with their liberal mindset, which recoiled at what they themselves felt was an offensive video.

The GOP today seems to be doing the same thing: suspicion of the Obama administration leads them to leap to the idea that there must have been a full-blown conspiracy, since a potential motive existed in propping up an important campaign theme, when in fact there was just pure negligence and perhaps some problematic political involvement by NSA intelligence personnel charged with providing or informing talking points.

It also seems to happen a lot here, and nationally, in these kinds of debates where people just seem to talk past each other.

 
