backfire-effect (Link Bibliography)

“backfire-effect” links:

  1. http://richarddagan.com/framing/Nyhan-Reifler2010.pdf

  2. http://journals.sagepub.com/doi/full/10.1177/2053168017716547

  3. https://par.nsf.gov/servlets/purl/10013134

  4. https://pdfs.semanticscholar.org/07f9/b072c228f3842684b2deb1c2d6e1dbbbc570.pdf

  5. http://rkellygarrett.com/wp-content/uploads/2014/05/Garrett-et-al.-Undermining-Corrective-Effects1.pdf

  6. http://www.djflynn.org/wp-content/uploads/2016/08/elusive-backfire-effect-wood-porter.pdf

  7. http://www.dartmouth.edu/~nyhan/opening-political-mind.pdf

  8. 2007-bullock-excerpt.pdf

  9. 2007-bullock.pdf: John G. Bullock (2007-06-01; statistics/bayes):

    This dissertation consists of three papers. The first is about the effects of party cues on policy attitudes and candidate preferences. The second is about the resilience of false political beliefs. The third is about Bayesian updating of public opinion. Substantively, what unites them is my interest in partisanship and public opinion. Normatively, they all spring from my interest in the quality of citizens’ thinking about politics. Methodologically, they are bound by my conviction that we gain purchase on interesting empirical questions by doing things differently: first, by bringing more experiments to fields still dominated by cross-sectional survey research; second, by using experiments unlike the ones that have gone before.

    1. Part 1: It is widely believed that party cues affect political attitudes. But their effects have rarely been demonstrated, and most demonstrations rely on questionable inferences about cue-taking behavior. I use data from three experiments on representative national samples to show that party cues affect even the extremely well-informed and that their effects are, as Downs predicted, decreasing in the amount of policy-relevant information that people have. But the effects are often smaller than we imagine and much smaller than the ones caused by changes in policy-relevant information. Partisans tend to perceive themselves as much less influenced by cues than members of the other party—a finding with troubling implications for those who subscribe to deliberative theories of democracy.
    2. Part 2: The widely noted tendency of people to resist challenges to their political beliefs can usually be explained by the poverty of those challenges: they are easily avoided, often ambiguous, and almost always easily dismissed as irrelevant, biased, or uninformed. It is natural to hope that stronger challenges will be more successful. In a trio of experiments that draw on real-world cases of misinformation, I instill false political beliefs and then challenge them in ways that are unambiguous and nearly impossible to avoid or dismiss for the conventional reasons. The success of these challenges proves highly contingent on party identification.
    3. Part 3: Political scientists are increasingly interested in using Bayes’ Theorem to evaluate citizens’ thinking about politics. But there is widespread uncertainty about why the Theorem should be considered a normative standard for rational information processing and whether models based on it can accommodate ordinary features of political cognition including partisan bias, attitude polarization, and enduring disagreement. I clarify these points with reference to the best-known Bayesian updating model and several little-known but more realistic alternatives. I show that the Theorem is more accommodating than many suppose—but that, precisely because it is so accommodating, it is far from an ideal standard for rational information processing.
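    The claim that Bayes’ Theorem can accommodate polarization and enduring disagreement can be illustrated with a minimal sketch (not from the dissertation; the numbers are hypothetical): two agents share a prior on a hypothesis H and update on the same evidence E via Bayes’ Theorem, but hold different likelihoods P(E|H), so their posteriors move in opposite directions.

    ```python
    def update(prior, p_e_given_h, p_e_given_not_h):
        """One step of Bayes' Theorem for a binary hypothesis H:
        P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
        p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
        return p_e_given_h * prior / p_e

    # Both agents start at the same prior on H...
    agent_a = agent_b = 0.5
    # ...but interpret the same evidence E differently (different likelihoods):
    agent_a = update(agent_a, p_e_given_h=0.8, p_e_given_not_h=0.3)  # reads E as supporting H
    agent_b = update(agent_b, p_e_given_h=0.3, p_e_given_not_h=0.8)  # reads E as undermining H

    print(round(agent_a, 3), round(agent_b, 3))  # 0.727 0.273: identical priors, shared evidence, divergent posteriors
    ```

    Both updates are fully Bayesian; the divergence comes entirely from the likelihoods, which is one sense in which the Theorem is “more accommodating than many suppose.”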
  10. http://papers.nips.cc/paper/3725-bayesian-belief-polarization.pdf

  11. https://www.lesswrong.com/posts/tSgcorrgBnrCH8nL3/don-t-revere-the-bearer-of-good-info

  12. 1977-ross.pdf

  13. 2012-tinsley.pdf: Catherine H. Tinsley, Robin L. Dillon, Matthew A. Cronin (2012-04-18; statistics/bias):

    In the aftermath of many natural and man-made disasters, people often wonder why those affected were underprepared, especially when the disaster was the result of known or regularly occurring hazards (e.g., hurricanes). We study one contributing factor: prior near-miss experiences. Near misses are events that have some nontrivial probability of ending in disaster but, by chance, do not. We demonstrate that when near misses are interpreted as disasters that did not occur, people illegitimately underestimate the danger of subsequent hazardous situations and make riskier decisions (e.g., choosing not to engage in mitigation activities for the potential hazard). On the other hand, if near misses can be recognized and interpreted as disasters that almost happened, this will counter the basic “near-miss” effect and encourage more mitigation. We illustrate the robustness of this pattern across populations with varying levels of real expertise with hazards and different hazard contexts (household evacuation for a hurricane, Caribbean cruises during hurricane season, and deep-water oil drilling). We conclude with ideas to help people manage and communicate about risk.

    [Keywords: near miss; risk; decision making; natural disasters; organizational hazards; hurricanes; oil spills.]

  14. http://www.exmormon.org/whylft18.htm

  15. https://web.archive.org/web/20131029232552/http://www.theliteraryreview.org/WordPress/tlr-poetry/

  16. http://abstrusegoose.com/537