CDZ Cleaving to untestable beliefs and a PSA: Other irrational modes of argument

usmbguest5318
Jan 1, 2017
It's become easy to test and establish facts -- whether in physics, biology, economics, geology, psychology, or policy -- so one would expect bias and polarization to have been defeated by now. Yet they have not. Indeed, the population of people on whom facts have so little effect seems only to grow. For example:

A: “There was a scientific study that showed vaccines cause autism”
B: “Actually, the researcher in that study lost his medical license, and overwhelming research since then has shown no link between vaccines and autism.”
A: “Well, regardless, it's still my personal right as a parent to make decisions for my child.”

The debate starts with one person uttering a testable factual statement; then, upon being shown that one or more of their pivotal premises is untrue -- when the truth becomes inconvenient to their ends -- the person takes flight from facts. Over the past year of my time on USMB, that type of exchange -- one that doesn't happen in my "real world" life [1] -- has become all too familiar to me.

"The Psychological Advantage of Unfalsifiability: The Appeal of Untestable Religious and Political Ideologies," (PAU) examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people just dispute the validity of specific facts; however, the researchers found that sometimes people go one step further and, as in the opening example, reframe an issue in untestable ways.


Consider the issue of same-sex marriage. Certain facts could be relevant to forming a normative stance on its legality -- for example, data showing that children raised by same-sex parents are worse off than, or just as well-off as, children raised by opposite-sex parents. But what if those facts contradict one's views?

The PAU researchers presented study participants -- both supporters and opponents of same-sex marriage -- with (supposed) scientific facts that bolstered or disputed their position. When the facts opposed their views, participants on both sides of the issue were more likely to state that same-sex marriage isn't actually about facts, asserting instead that it's more a question of moral opinion. In contrast, when the facts were on their side, they more often stated that their opinions were fact-based and much less about morals. In other words, the researchers observed participants transcend mere denial of particular facts: they observed the debaters deny the very relevance of facts themselves.

The PAU researchers conducted a similar test wherein participants were given articles that were either critical of religion or neutral regarding religion. In response to the critical article, believers who were especially high in religiosity turned more often to untestable “blind faith” arguments as reasons for their beliefs than to arguments founded on factual evidence; yet in responding to the neutral article, the same believers less often devolved into emotional lines of argument.

What those behaviors show is that when people's beliefs are threatened, they often take flight to a cognitive "safe space" where facts do not matter. In scientific terms, their beliefs become less “falsifiable” because they can no longer be tested scientifically for verification or refutation. [2] For instance, some people dispute government policies by arguing they are ineffective. Yet if facts suggest that the policies work, the same folks oppose them "on principle," as it were.

Recognizing as much leads one to ask: When people are made to see their important beliefs as relatively less rather than more testable, does it increase polarization and commitment to desired beliefs? Two experiments in the PAU study militate for the answer being "yes."

In one experiment, researchers told half the participants that much of Obama's policy performance was empirically testable; they did not so inform the other half. Participants then rated Obama's performance in five domains. Comparing Obama's supporters and opponents, the researchers found that the reminder of testability reduced the average polarization in assessments of Obama's performance by about 40%.

To further test the hypothesis that people anneal their desired beliefs when the beliefs are devoid of facts, the researchers looked at a sample of participants that varied from highly to moderately religious. They found that when highly religious participants were told that God's existence will always be untestable, they reported stronger desirable religious beliefs afterward (such as the belief that God was looking out for them), relative to when they were told that one day science might be able to investigate God's existence.

Together these findings show that, for some people and topics, when testable facts are less a part of the discussion, people cleave more tightly to the beliefs they wish to hold. These results bear similarities to the many studies finding that when facts are fuzzier, people tend to exaggerate desired beliefs. [3]

Taken together, the results of the study show that bias is a cognitive distemper for which facts and education provide the pull needed for realignment. The researchers found that upon injecting facts into the conversation, the symptoms of bias became less severe. But, unfortunately, the study also showed that facts can only do so much. To avoid accepting sound conclusions they dislike, individuals deliberately run from facts, erect "ideological Faraday cages," and brandish other tools found in their deep, belief-protecting, bias-confirming toolbox.


"Ideological Faraday Cages" and Other Implements:
The PAU researchers focused on the general approach by which, and circumstances under which, folks flee to preserve their biases; however, recognizing that someone has resorted to emotionalism isn't enough. So, in the spirit of the findings in the paper discussed above, this is the PSA part of my OP. Accordingly, below is a short list of tactics [4] -- nearly all of them deflection/derailment tactics -- that I most often see used on USMB. [5]
  • Relative privation -- This is the explicit or tacit "such and such is better/worse" line of argument. (The thing opposed is always worse, but the speaker/writer may phrase the argument in terms of their position being better rather than the opposing position being worse.) On USMB, I've seen many posters carry this line to its extreme by arguing, in effect, that only the most extreme instances or states of being matter. Relative privation is sometimes called the "appeal to worse problems."
  • Composition fallacy -- A part is so; therefore the whole is so (e.g., every brick in the wall is light, so the wall is light).
  • Fallacy of division -- The whole's qualities are shared by all its parts (e.g., the team is excellent, so every player on it is excellent).
  • Ad Hominem
    • Tu quoque -- This may be the most common of the ad hominem fallacies one encounters in political discourse. Essentially, it's the "s/he did/didn't do it too" justification. (A good video for this one is here.)
    • Impugn the source (Genetic fallacy) -- If tu quoque isn't the most common form of (fallacious) ad hominem argument one encounters, this one probably is. This is the most widely recognized personal argument. It attacks the source of the information rather than the information itself.
    • Ex concessis -- This is your "guilt by association" and/or "you're a hypocrite" argument. It says a proponent's current argument/position fails because the proponent formerly associated with persons or ideas diametrically different from the ones s/he currently propones. The short of what makes this tactic fail is that whatever the hell one associated with in the past has absolutely no bearing on the quality of one's current argument. While these arguments aren't necessarily themselves conspiracy theories, they often form material parts of, or all of, the foundation for conspiracy theories.
  • Equivocation -- This is essentially twisting people's words. It's a key tool for stand-up comics, and when used thus, it's fine; but in a serious discussion (i.e., the "other guy" is being serious, regardless of whether "you" want to be), it's fallacious, and users of it look stupid. (They may still get a laugh, but it'll more often be a laugh-at than a laugh-with.)
  • "If" (Conditional) Arguments -- (watch the video)
  • Non sequitur -- This is really many tactics, but all of them one way or another involve a respondent going off-topic. Non sequitur arguments range from subtle to flat-out "ain't got a goddamned thing to do with the topic being discussed." Some common forms of non sequitur are below. Folks often enough confuse them, but outside a philosophy classroom, whether one puts the right label on them matters not.
    • Straw Man -- This is attacking something -- a position, a stance -- a proponent didn't assert. "Straw men" are often created by respondents so they can avoid addressing the topic under discussion or to avoid answering a question they know will lead to the shredding of their position. The straw man fallacy features oversimplification, exaggeration, distortion, or ridicule of someone’s argument in order to make the argument easier to attack, i.e., violating the principle of charity.
    • Ignoratio elenchi (Red herring) -- Creating a distraction from the argument, typically with some sentiment that sounds relevant but isn't quite on-topic. This tactic is common when someone doesn't like the current topic and wants to detour into something else instead, something easier or safer to address.

      In the red herring, the arguer begins with a claim that needs support. You expect the arguer is going to offer support for that claim, but that doesn’t happen. Instead, the original claim is just lost. It’s left behind: the arguer jumps on board another train of thought on another topic, and rides it at some length (usually several sentences). THEN the arguer just stops, as if to say “I’ve proved it now.” Sometimes the arguer actually states that his original claim has, after the digression, been proved.

      In the red herring, there is no distortion of the original topic -- the original argument is just lost -- whereas some semblance of it remains in a straw man.

    • Missing the Point -- This entails drawing a conclusion that (1) simply doesn’t follow AND (2) no other fallacy captures the error more precisely. "Missing the point" is a more "sophisticated" form of equivocation in that rather than "spinning" a word, it "spins"/ignores a key idea, theme or contextual element of a discussion. In missing the point, there is no distortion or exaggeration (that would be the straw man). There is also no lengthy segue into an entirely new subject (that would be a red herring).
  • Argumentum ad ignorantiam -- It's not proven/disproven; therefore it is not/it is the case. This tactic seems strong, but it's not. I'm sure I don't have to mention what topics are most often confounded using this tactic...the "Russia" matter, climate change, poverty, crime and gun misuse/violence [2], and God's existence are probably the big ones these days. [6]
  • False Dichotomy -- Setting up as an "either-or" question/scenario something that is not at all binary.
  • Causal Fallacies -- For example, post hoc ergo propter hoc: B followed A; therefore A caused B.
  • Argumentum ad misericordiam ("Think of the baby" or Appeal to Pity) -- Appeals to the compassion and emotional sensitivity of others when these factors are not strictly relevant to the argument. Appeals to pity often appear as emotional manipulation. [7]
  • Bandwagon, Status and Popularity -- This is one of those tactics that says more about the folks using it and falling for it than about the topic/thing it is used to support.

To close, I will note that it's important to call out the "shielding tactics" one encounters [8]; but most importantly, after one has done so, one must say no more. There is no refuting unsound arguments other than by identifying what makes them unsound. To do anything more than that is to allow the fact-denying party to draw the argument down the road of irrationality, or, as some might say, "go down the rabbit hole," "bark up the wrong tree," or "go on a snipe hunt." [9] Lastly, don't go telling folks their arguments are fallacious unless you know what you're talking about. I included links to content that explains the structure of the tactics above so that when you set out to reject fallacious/emotional arguments, you'll know what to look for and can test the argument you're confronted with against a template.

For a more complete exposition and categorization of common fallacy types, see Recognizing Microstructural Fallacies in Argumentation and Public Advocacy.



Notes:
  1. I wonder how many posts will go by before someone remarks about same-sex marriage, vaccines and/or autism instead of about the actual thread topic. I think simply being clueless about the topic of discussion has something to do with why debates/conversations go the way of the hypothetical "vaccine" exchange I composed.
  2. Readers seeking a "real world" illustration of this behavior pattern need only look at the "Dickey Amendment." Other examples include Conservatives and O-care, and Liberals and the 2007 "Surge."
  3. Those studies are cited in the PAU paper.
  4. Make no mistake, these fallacies need not be presented discretely. They can be combined in myriad ways amongst themselves and/or with other fallacies not explicitly noted above.
  5. As noted earlier, people in my circles don't much invoke lines of argument that depend on fallacious reasoning, perhaps because in those circles nearly everyone beyond the age of ~22 recognizes the most commonly used fallacies, and quite a few that are less common.
  6. One of the most famous cases of argumentum ad ignorantiam is the McCarthy hearings of the early 1950s. In a series of televised hearings, Senator Joseph R. McCarthy slanderously accused many innocent people of being Communists in a witch-hunt atmosphere in which unfounded but highly damaging accusations were often made. McCarthy would show up with a bulging briefcase full of files on accused individuals. But in many cases little or no real evidence was presented, and a person was deemed guilty on the grounds that there was nothing in McCarthy's files to disprove the accused's Communist sympathies.

    It is important not to confuse ad ignorantiam arguments with ex silentio arguments, which are a form of legitimate negative argument. (Poor debaters and/or poorly informed parties generally should eschew negative arguments.) For example, it has been argued that the Romans did not award medals posthumously, by citing negative evidence of such posthumous decorations. Tombstone inscriptions and written records show no evidence of decorations ever having been given to soldiers who died in battle. On the other hand, there are many cases of surviving soldiers who were given decorations. So we could argue, on negative grounds, that if such a practice had existed, it would probably have been reflected in some way in the existing evidence of the giving of awards. But since there is not a single known instance of such an award having been given, then, on the basis of an ex silentio argument, we can conclude it plausible that the Romans did not award medals posthumously.
    (Source -- Anyone who cares about making strong arguments should read the short paper linked here.)
  7. Professionals of many stripes -- attorneys, doctors, consultants, economists, business executives/managers, generals, and jurists in particular -- must often parry this appeal because rationally sound decisions often enough will adversely affect someone.
  8. Whether one correctly classifies a fallacy doesn't matter outside of a classroom. Whether one aptly recognizes it and its irrationality in terms of another's argument is all that matters.
  9. FWIW, one of the best tools for developing the strength of one's own arguments is putting in the work needed to present, as strongly and soundly as possible, the counterargument to one's own stance. (See also: Dialog Theory for Critical Argumentation.) Of course, doing this is only possible for folks who care about truth; ideologues are incapable of it, probably because long before arriving at a debate, they've already wrapped themselves in their "lead-lined" cocoon.