
Appealing to Authority: From Decision Making to Disinformation

Debates: you either love them or you hate them. For some, they can be interesting discussions that provoke thought in fields ranging from philosophy to politics to the sciences and everything in between. For others, though, they are outlets for pretentious people to engage in pretentious fights about pretentious topics. Among the more pretentious aspects of debate is the calling out of fallacies.

To someone who has no idea what these debaters are talking about, it can be confusing to hear the words “strawman,” “red herring,” or “ad hominem” over and over. These terms all fall under the category of fallacies, which Merriam-Webster defines as arguments using false logic. By using false logic, debaters can back up their claims with evidence that, while convincing, is actually deceptive. And by understanding these fallacies and calling them out, debaters can avoid losing an argument to them – essentially filtering out disinformation. Understanding how fallacies like these work can point toward key cognitive processes that we rely on every day to decide what we deem true.

One of these fallacies is what is called argumentum ad verecundiam. On top of being a mouthful, it describes an argument based on an appeal to a false authority. We appeal to authority for information all the time, so one might ask what makes this kind of argument fallacious. It is a fallacy because the expertise of the authority figure in question does not translate to expertise in the field of the argument. It is like citing an expert painter to make an argument about roadwork: sure, their fields may have some potential overlap, but they are different and require different structures of knowledge.

This is a Pallas’s cat. Unlike a domestic cat, it is not a source of authority regarding the intricacies of climbing cat trees, nor will it be reliable on topics concerning the theory and practice of litterbox management. (Photo by Radovan Zierik for Pexels)

How does this fallacy relate to our cognitive process of decision making? To look into this, we have to take a slight tangent to mention Solomon Asch’s study on conformity. This study, commonly known as the Asch conformity study, is well known in psychology – you might have heard of it if you took an introductory psychology class. In the experiment, a participant was placed in a group with six to eight other people. These people appeared to be other participants, but were actually confederates, meaning they were in on the experiment. The group was tasked with judging which of three comparison lines matched a standard line in length. The correct answer was clearly visible; however, in some trials, the confederates all gave the same incorrect judgement. In line with its name, the study used these trials to determine whether individuals would conform to the opinion of everyone else even when they knew they were right. Asch found that 33% of the time, individuals conformed to the majority, while the other 67% of the time, they responded with what they knew to be correct. A replication of the experiment in Kuwait found similar results, suggesting that the effect is not tied to one social context – it points instead toward a more general cognitive process.

If you’re familiar with this experiment from a psychology class, you might’ve heard that it was about social conformity – how we change our opinions under social pressure – but that is not exactly true, especially considering that 67% of responses were non-conforming. With these results, Asch actually points toward a broader phenomenon: shared reality. When the participants were confronted with the majority’s incorrect opinion, they were brought to a fork in their cognitive road: do they value their own thinking more, or the thoughts of others? What used to require little thought now becomes a matter demanding real attention. This is a major focus of the field of cognitive psychology: controlled and automatic processes. Basically, the cognitive processes we undergo sit somewhere on a spectrum based on how quick and effortful they are, as well as the cognitive resources they take up. Automatic processes are quick and require little effort, essentially occurring subconsciously, while controlled processes are time consuming and require an active, conscious effort.

As the participants of this study are forced to reconcile the difference between their own thinking and the judgements of the individuals around them, what used to be an automatic process of decision making now requires more effort: the more controlled process of judging between sources takes over.

Rather than being a matter of avoiding seclusion by sticking with the crowd, participants are tasked with making sense of a scenario in which their reality differs drastically from the reality of others. This is especially striking because the experiment is set up to measure an objective reality: there is no subjectivity in determining whether two lines are the same length. And this drastic setup drives home the point that our processes of decision making rely both on our own reality and on the reality of others. Most of the time, the two are in agreement, but when they diverge, we are faced with a choice as to which source we rely on more.

Although this may have seemed like a tangent, we can apply this concept to our understanding of how fallacies, especially argumentum ad verecundiam, work. Instead of relying on our own thoughts and empirically collected data, we often rely on shortcuts (here is an interesting article that delves deeper into cognitive shortcuts and how they relate to the bandwagon effect). One of the easiest of these shortcuts is accepting the word of an authority. And it’s not a bad shortcut, since, in most cases, authority figures are reputable sources. However, similar to the divergence between individual and group judgements in Asch’s study, here there is a divergence between one person’s argument (i.e., their own thoughts) and the opinions of those they rely on for their knowledge. This can sometimes cause cognitive dissonance, but when the authority is legitimate, people usually resolve it by changing their beliefs to accept what the authority is saying. That is essentially how science works: new evidence comes out from people we can trust, and we change and update our current understanding accordingly.

Of course empiricism is the foundation of science, but authority-based knowledge plays a considerable role, despite how things ‘should’ work. Brett Holt, a researcher in education, argues that even appeals to genuine authorities (experts in their own field) are sometimes relied on too heavily in research. He gives the example of Dr. Theophilus Painter, whose research pointed to humans having 24 pairs of chromosomes. Even though other researchers found what we now hold as correct – that humans have 23 pairs – the academic and lay communities held on to Dr. Painter’s count for years because he was an established authority.

Considering that wholehearted appeals to genuine authorities can cause these issues, it goes without saying that appeals to people who aren’t genuine authorities can be even more problematic. Most of the time, people will not catch these fallacies, because accepting the judgements of authorities is an automatic part of our daily decision making. But by understanding that these fallacies exist, debaters seem better able to resist their influence on arguments. Because we aren’t all debaters (thank God), we don’t have to memorize those long lists detailing all kinds of fallacies, but what we can do is pay attention to the judgements of authorities and experts, especially when they are being used to promote certain viewpoints. If an expert is cited to make an argument, maybe look into that figure’s actual expertise. Overall, just hold a healthy skepticism toward experts – it takes time and effort to work against a process that occurs automatically, but it is doable when you’re in a situation where you need to check the facts.

References:

Amir, T. (1984). The Asch conformity effect: A study in Kuwait. Social Behavior and Personality, 12(2), 187-190. https://doi.org/10.2224/sbp.1984.12.2.187

Friend, R., Rafferty, Y., & Bramel, D. (1990). A puzzling misinterpretation of the Asch ‘conformity’ study. European Journal of Social Psychology, 20, 29-44. https://doi.org/10.1002/ejsp.2420200104

Holt, B.J. (2021). Appealing to false authority: How the accreditation process encroaches on academic freedom. International Journal for Cross-Disciplinary Subjects in Education, 12(3), 4521-4528. https://doi.org/10.20533/ijcdse.2042.6364.2021.0553

Jackson, C.S. (2020). Deception, lies, and manipulation in cyberspace: Critical thinking as a cognitive hacking countermeasure. The Infragard Journal, 3(1), 14-22. 

Additional resources if you’re interested:

https://www.verywellmind.com/the-asch-conformity-experiments-2794996

https://philosophy.lander.edu/logic/authority.html
