Sunday, August 18, 2019

The Return of Rationalism in the Age of Fake News


Setting up the Context: Farewell to Reason?

Social psychology and behavioral economics have been greatly influenced by the work of Daniel Kahneman and Amos Tversky (d. 1996), who did pioneering empirical research on rational decision making from the 1970s to the late 1990s. In study after study, they found that the "rational utility maximizer" presupposed by classical economic models has no counterpart in real life. Cognitive biases are embedded in a non-conscious process of making decisions quickly, and slow, deliberate thinking is much less prevalent than most psychologists and economists had previously thought. Much of this is summarized in Kahneman's 2011 book, Thinking, Fast and Slow https://books.google.com/bo... . For now, the main point is that models of rational agency were deeply undercut, even discredited, by Kahneman, Tversky, and associated figures like Richard Thaler, who built on these foundations. By the 1990s, studies at several universities using real-time PET brain scans and game-playing reinforced the view that decision making is predominantly emotional and that people do not know what their best interests are, and so cannot be "utility maximizers," i.e., rational goal-directed agents (Goldman, 2004). Subsequent brain-scan studies support the hypothesis that most decisions are not even made consciously (e.g., https://www.sciencedaily.co... ). All of this has trickled down to mass media reports on bias, motivated reasoning, and other concepts marshaled to explain seemingly irrational and counterproductive behaviors. The image of persons as "rational animals" has taken quite a beating.

Still, Kahneman and others have pointed out that it is possible to slow down and think deliberately with practice, even learning to avoid some of the automatic biases he and others have identified as widespread phenomena. The question thus arises: If we were to make decisions (how to vote, what cause to support) and judgments (what to believe and why) consciously and somewhat rationally, would we be able to identify our own interests and detect faulty arguments in politics and mass media, for example? Questions like these have no clear-cut answers, but many moral psychologists and affective scientists have promoted the idea that, paradoxically, the more we think about morally and politically loaded issues consciously and deliberately, the stronger our biases and prejudices become. This paradoxical outcome (if it is real) is thought to be the product of "motivated reasoning," whereby we evaluate evidence and facts mainly in terms of arguments that support what we already believe or desire (see Haidt, Kahan, Sperber, et al.). Basically, "reason" is largely superfluous; it functions more or less to rationalize our beliefs and deeds. Whether hypothesized in terms of evolutionary theory (Dan Sperber https://www.psychologytoday... ), policy studies (Dan Kahan https://papers.ssrn.com/sol... ), or moral psychology (Jonathan Haidt https://www.nytimes.com/201... ) -- the result is a kind of "farewell to reason" which I have never endorsed.

On this channel and elsewhere, I've argued (against the current grain) that learning and practicing critical thinking skills and learning civics in an emotionally engaging setting would go a long way toward building up "rationality muscles" that have long atrophied in the age of click-of-the-mouse news and communications generally. Elsewhere I have argued that rationality and emotion are most certainly entangled in complex ways, but that emotions can themselves be more or less appropriate to various situations based on rational considerations. We certainly act as if that is true when, for example, we tell others that their anger is "irrational" or disproportionate to the perceived offense (as in cases like road rage). We act, that is, as if emotions already include rational considerations, which I think is correct (see http://changingminds.org/ex... ). I will expand on that thought in a future post on the philosophy of emotions. For now, the point is that when, for example, Jonathan Haidt claims that passions rule over reason, there is an assumption that these are competing rather than interconnected aspects of mental life. Can emotions be educated, enriched, or broadened? Can people become less prone to snap judgments (less impatient) and cultivate greater *curiosity*, that pivotal intellectual emotion that, along with *wonder*, is associated with both philosophy and science? Is the life of the mind doomed to the narrow precincts of motivated reasoning and rationalization? I do not think so.
Therefore, I was delighted to come across the following NYT article and the detailed research article in the journal Cognition that it summarizes. Taking on the motivated reasoning account, the authors review an impressive body of empirical evidence supporting the view that reasoning skills are positively correlated with the ability to differentiate fake from real news, regardless of political ideology or moral background beliefs. Links to the NY Times piece and the academic research article it summarizes are provided below.
______________________________________________________________________________


Why Do People Fall for Fake News?

by Gordon Pennycook and David Rand

What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can be done about it? These questions have become more urgent in recent years, not least because of revelations about the Russian campaign to influence the 2016 United States presidential election by disseminating propaganda through social media platforms. In general, our political culture seems to be increasingly populated by people who espouse outlandish or demonstrably false claims that often align with their political ideology.

The good news is that psychologists and other social scientists are working hard to understand what prevents people from seeing through propaganda. The bad news is that there is not yet a consensus on the answer. Much of the debate among researchers falls into two opposing camps. One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.

However, recent research suggests a silver lining to the dispute: Both camps appear to be capturing an aspect of the problem. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll be better able to design policy solutions to help combat the problem.

The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.
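[Editor's note: In statistical terms, the signature of this finding is an interaction: the effect of ideology on belief grows with measured ability. Below is a minimal simulation of that pattern; the variable names and effect sizes are invented assumptions, and nothing here comes from Kahan's actual data.]

```python
# Simulated illustration of "polarization grows with ability" (an
# ideology-by-literacy interaction). All numbers are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
literacy = rng.uniform(0.0, 1.0, n)      # 0 = low, 1 = high science literacy
conservative = rng.integers(0, 2, n)     # 0 = liberal, 1 = conservative
# Liberals' concern rises with literacy; conservatives' concern falls with it.
concern = (0.5 - 1.0 * conservative) * literacy + rng.normal(0.0, 0.5, n)

for label, mask in [("low literacy", literacy < 0.5),
                    ("high literacy", literacy >= 0.5)]:
    gap = (concern[mask & (conservative == 0)].mean()
           - concern[mask & (conservative == 1)].mean())
    print(f"{label}: liberal-conservative gap = {gap:.2f}")
# The gap is wider in the high-literacy group: more ability, more polarization.
```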

The implications here are profound: Reasoning can exacerbate the problem, not provide the solution, when it comes to partisan disputes over facts. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them. It seemed as if people who were ideologically inclined to believe a given falsehood worked so hard to come up with reasons that the correction was wrong that they came to believe the falsehood even more strongly.

But this “rationalization” account, though compelling in some contexts, does not strike us as the most natural or most common explanation of the human weakness for misinformation. We believe that people often just don’t think critically enough about the information they encounter.

A great deal of research in cognitive psychology has shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”). This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.

To test this possibility, we recently ran a set of studies in which participants of various political persuasions indicated whether they believed a series of news stories. We showed them real headlines taken from social media, some of which were true and some of which were false. We gauged whether our participants would engage in reasoning or “go with their gut” by having them complete something called the cognitive reflection test, a test widely used in psychology and behavioral economics. It consists of questions with intuitively compelling but incorrect answers, which can be easily shown to be wrong with a modicum of reasoning. (For example: “If you’re running a race and you pass the person in second place, what place are you in?” If you’re not thinking you might say “first place,” when of course the answer is second place.)
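[Editor's note: For concreteness, here is a minimal sketch of how responses to a test like this might be scored, together with a simple headline-discernment measure of the kind the authors describe next. The item texts, field names, and scoring scheme are illustrative assumptions, not the authors' actual materials.]

```python
# Hypothetical scoring for a cognitive-reflection-style test, plus a simple
# "discernment" measure (belief in true headlines minus belief in false ones).
CRT_ITEMS = {
    # item id: the reflective (correct) answer
    "race_position": "second place",
    "bat_and_ball": "5 cents",
}

def crt_score(responses: dict) -> int:
    """Count reflective (correct) answers; higher = more deliberate thinking."""
    return sum(1 for item, answer in responses.items()
               if answer == CRT_ITEMS.get(item))

def discernment(believed: dict, is_true: dict) -> int:
    """Believed-true headlines minus believed-false headlines."""
    return (sum(1 for h, b in believed.items() if b and is_true[h])
            - sum(1 for h, b in believed.items() if b and not is_true[h]))

# Example: one participant's (entirely made-up) data
print(crt_score({"race_position": "second place", "bat_and_ball": "10 cents"}))  # 1
print(discernment({"hl1": True, "hl2": False, "hl3": True},
                  {"hl1": True, "hl2": True, "hl3": False}))  # 1 - 1 = 0
```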

We found that people who engaged in more reflective reasoning were better at telling true from false, regardless of whether the headlines aligned with their political views. (We controlled for demographic factors such as level of education as well as political leaning.) In follow-up studies yet to be published, we have shown that this finding was replicated using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events.
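[Editor's note: A rough sketch of the kind of analysis just described, regressing discernment on reflectiveness while adjusting for demographics, might look like the following. The data are simulated and the variable names and coefficients are assumptions; this is not the authors' code or dataset.]

```python
# Simulated data: does CRT score predict discernment after controlling for
# education and party? (All names and effect sizes are illustrative.)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "crt": rng.integers(0, 4, n),            # 0-3 reflective answers
    "education": rng.integers(1, 6, n),      # ordinal education level
    "party": rng.choice(["dem", "rep"], n),  # political leaning
})
# Simulate the reported pattern: discernment rises with CRT for both parties.
df["discernment"] = 0.8 * df["crt"] + 0.1 * df["education"] + rng.normal(0, 1, n)

model = smf.ols("discernment ~ crt + education + C(party)", data=df).fit()
print(model.params)  # a positive `crt` coefficient mirrors the reported result
```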

Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provides evidence that even in highly political contexts, people are not as irrational as the rationalization camp contends. Recent studies have shown, for instance, that correcting partisan misperceptions does not backfire most of the time — contrary to the results of Professors Nyhan and Reifler described above — but instead leads to more accurate beliefs. We are not arguing that findings such as Professor Kahan’s that support the rationalization theory are unreliable. Our argument is that cases in which our reasoning goes awry — which are surprising and attention-grabbing — seem to be exceptions rather than the rule. Reason is not always, or even typically, held captive by our partisan biases. In many and perhaps most cases, it seems, reason does promote the formation of accurate beliefs.

This is not just an academic debate; it has real implications for public policy. Our research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically. You’re not doomed to be unreasonable, even in highly politicized times. Just remember that this is also true of people you disagree with.
_________________________________________________________________________________


References/Related Readings

- Jonathan Haidt: The Righteous Mind: Why Good People Are Divided by Politics and Religion; Pantheon Books, 2012

- Dan Kahan: Director of the Cultural Cognition Project at Yale; website: http://www.culturalcognitio...

- Daniel Kahneman: Thinking, Fast and Slow; Farrar, Straus and Giroux, 2011

- Richard Thaler: Quasi-Rational Economics; Russell Sage Foundation, 1994

- Gordon Pennycook and David G. Rand: "Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning"; Cognition, June 2018; PDF here: https://static1.squarespace...

- Gordon Pennycook and David G. Rand: "Why Do People Fall for Fake News?"; NY Times, Jan. 19, 2019

- Hugo Mercier and Dan Sperber: The Enigma of Reason; Harvard University Press, 2017

- Recommended critical thinking series (detecting fallacies and biases): https://www.khanacademy.org...


The 7-minute video above, from the PBS NewsHour, frames some of the main issues under discussion and shows current efforts in education to combat the plague of fake news.
