PRAGMATICS AND TECHNIQUES OF COMMUNICATION. THEORY OF BOUNDED RATIONALITY Conf.dr. Daniela Dumitru Philosophical Counselling and Consultancy M.A. Overview Course objectives
Open a discussion about the psychological background of reasoning and its errors. Introduce the concepts of disposition to reason, bounded rationality, the two thinking systems, and heuristics (Tversky & Kahneman, 1974; Kahneman, 2011). Practicing & applying new knowledge. Testing CT and inserting it into education. Selective bibliography
Eysenck, M., Keane, M. (2015), Cognitive Psychology, Psychology Press, Chapter 13. Kahneman, D. (2012), Thinking, Fast and Slow, Penguin Books, Parts I to IV. Facione, P. A. (1990), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction ("The Delphi Report"), prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC ED 315 423, California Academic Press. Tversky, A., Kahneman, D. (1974), Judgement under Uncertainty: Heuristics and Biases, Science, vol. 185.
Semiotics. According to semiotics, the study of language is threefold. Syntax: the study of the signs and of the formal rules of the language; main outcome: rules for obtaining well-formed formulae. Semantics: the study of the interpretation of the signs of the language; main results: meaning, truth, and other related concepts. Pragmatics: the study of the use of the language by its speakers; main results: understanding, persuasion, and other rhetorical concepts. Argumentation belongs to pragmatics. Aristotle and argumentation theory. The structure of Rhetoric I & II is determined by two tripartite divisions. The first division consists in the
distinction among the three means of persuasion: The speech can produce persuasion either through the character of the speaker, the emotional state of the listener, or the argument (logos) itself. The second tripartite division concerns the three species of public speech. The speech that takes place in the assembly is
defined as the deliberative species. In this rhetorical species, the speaker either advises the audience to do something or warns against doing something. The audience has to judge things that are going to happen in the future, and they have to decide whether these future events are good or bad for the polis, whether they will cause advantage or harm. The speech that takes place before a court is
defined as the judicial species. The speaker either accuses somebody or defends herself or someone else. Naturally, this kind of speech treats things that happened in the past. The third species, the epideictic speech, praises or blames somebody, trying to describe the things or deeds of the person in question as honorable or shameful. In the first two species the listener has to decide in favor of one of two opposite sides; in the third, she does not aim at such a decision. Forms of thinking
Problem solving: cognitive activity that involves moving from the recognition that there is a problem through a series of steps to the solution. Most other forms of thinking involve some problem solving. Decision making: selecting one out of a number of presented options or possibilities, with the decision having personal consequences. Judgement: a component of decision making that involves calculating the likelihood of various possible events; the emphasis is on accuracy.
Deductive reasoning: deciding what conclusions follow necessarily, provided that various statements are assumed to be true; a form of reasoning that is supposed to be based on logic. Inductive reasoning: deciding whether certain statements or hypotheses are true on the basis of the available information; it is used by scientists and detectives but is not guaranteed to produce valid conclusions. Informal reasoning: evaluating the strength of arguments by taking account of one's knowledge and experience. Judgement and decision making. Judgement researchers address the question "How do people integrate multiple, incomplete, and sometimes conflicting cues to infer what is happening in the external world?". In contrast, decision-making researchers address the question "How do people choose what action to take to achieve labile [changeable], sometimes conflicting goals in an uncertain world?" (Hastie, 2001). We assess judgement in terms of accuracy and decision making in terms of consequences. Dealing with probability. Reverend Thomas Bayes. Bayes' theorem combines the relative probabilities of two hypotheses before data are obtained (the prior odds) with the probabilities of obtaining the observed data under each hypothesis (the likelihood ratio) to yield the posterior odds. The taxi-cab problem. A taxi-cab was involved in a hit-and-run accident one night. Two cab companies, the Green and the Blue, operate in the city. You are given the following data: (a) 85% of the cabs in the city are Green and 15% are Blue, and (b) in court, a witness identified the cab as a Blue cab.
However, the court tested the witness's ability to identify cabs under appropriate visibility conditions. When presented with a series of cabs, half of which were Blue and half Green, the witness made the correct identification in 80% of the cases and was wrong in 20% of the cases. What is the probability that the cab involved in the accident was Blue rather than Green? Bayesian theorem. If we take the hypothesis that the cab was Blue as HA and that it was Green as HB, the prior probabilities are 0.15 for HA and 0.85 for HB. The probability of the witness identifying the cab as Blue when it was Blue, i.e., p(D/HA), is .80, and the probability of the witness stating the cab was Blue when it was Green, i.e., p(D/HB), is .20. The posterior odds are 12:17, so the probability that the taxi-cab was Blue is 41%, leaving a 59% probability that it was Green. Why are we so poor with probabilities?
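The taxi-cab arithmetic can be checked directly; a minimal sketch in Python, using only the numbers given in the problem:

```python
# Bayes' theorem for the taxi-cab problem.
# Hypotheses: the cab was Blue (HA) or Green (HB).
prior_blue = 0.15               # base rate of Blue cabs
prior_green = 0.85              # base rate of Green cabs
p_says_blue_given_blue = 0.80   # witness correct on Blue cabs
p_says_blue_given_green = 0.20  # witness mistakes Green for Blue

# Posterior odds = prior odds * likelihood ratio
odds_blue = (prior_blue * p_says_blue_given_blue) / (prior_green * p_says_blue_given_green)
p_blue = odds_blue / (1 + odds_blue)

print(f"odds Blue:Green = {prior_blue * p_says_blue_given_blue:.2f}:{prior_green * p_says_blue_given_green:.2f}")
print(f"P(Blue | witness says Blue) = {p_blue:.2f}")  # 0.41
```

The striking point is that despite an 80%-reliable witness, the low base rate of Blue cabs keeps the posterior probability below one half.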
Our environment is full of very different types of information and stimuli. Because we have limited resources for processing information and limited time, we have to select only the information we need from all the available data. So we use shortcuts! They are called heuristics. Heuristics. Heuristics, a kind of rule of thumb, are our allies in reasoning and decision making: they are cognitively undemanding, examine fewer pieces of information or cues, and are applied rapidly, saving us effort. But heuristics also leave us liable to errors and biases, even though they greatly lighten the cognitive load of making decisions.
Daniel Kahneman and Amos Tversky are the most prominent figures exploring the area of human judgment. Kahneman and Tversky elaborated their own theory, based on extensive research, around the concept of bounded rationality previously described by Herbert Simon (our decision making is bounded by environmental constraints, e.g., information costs, and by cognitive constraints, e.g., limited attention). They identified three general-purpose heuristics (availability, representativeness, and anchoring and adjustment) that form the basis of many intuitive judgments under uncertainty. Two Families of Cognitive Operations
Cognitive processes can be viewed as pertaining to two main families: intuition and reason. This hypothesis has been accepted under the name of dual-process theories (Kahneman, 2002) having two systems: 1 and 2 (so labelled by Stanovich and West, 2002). System 1
System 1 is more primitive than System 2: automatic, immediate, and involving less effort. It produces intuitive answers, permanently preserving and updating our mental model of the world and maintaining its sense of normality. Its operations are often emotionally charged, not open to introspection, and difficult to control. It is the system most prone to mental shortcuts, heuristics and biases.
System 2. On the other hand, System 2 is more analytical, governed by rules and more controlled. Kahneman stated that the operations of System 2 are slower, serial [one at a time] and effortful. It has the ability to monitor and correct the intuitive answers, and it has some capacity to control the memories brought to mind. Evans (2007) put forward the concept of dual parallel processing, with both systems operating simultaneously. System 1 is not necessarily less capable. A fascinating demonstration of the intelligence of System 1 is the ability of chess masters to perceive the strength
or weakness of chess positions instantly. For those experts, pattern matching has replaced effortful serial processing. Similarly, prolonged cultural exposure eventually produces a facility for social judgments. The roles of the two systems depend on features of the task and of the individual, including the time available for deliberation, the respondent's mood, intelligence, and exposure to statistical thinking (Kahneman, 2002). Heuristics. Availability Heuristic: individuals judge the frequency of an event or the likelihood of its occurrence by the ease with which instances or associations come to mind. Examples!
List 1. List 2. Representativeness Heuristic. Representativeness is the assumption that an object or an individual belongs to a specific category because it is representative (typical) of that category. Prototype. Example: the following description was selected at random from 100 descriptions of lawyers (70) and engineers (30). Jack is a 45-year-old man. He is married and has four children. He is generally conservative, careful and ambitious. He shows no interest in political and social issues and spends most of his free time on his many hobbies, which include home carpentry, sailing and numerical puzzles. Estimate, from 1 to 100%, the probability that Jack belongs to each group.
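Bayes' rule shows why the 70:30 base rate should dominate the answer. As a sketch (the 3:1 likelihood ratio below is invented for illustration; the original study reports no such number):

```python
# Base rates matter: 30 engineers, 70 lawyers in the sample.
p_engineer, p_lawyer = 0.30, 0.70

# Hypothetical assumption: suppose the description is 3x as likely
# to fit an engineer as a lawyer (a made-up likelihood ratio).
likelihood_ratio = 3.0

odds_engineer = (p_engineer * likelihood_ratio) / (p_lawyer * 1.0)
p = odds_engineer / (1 + odds_engineer)
print(f"P(engineer | description) = {p:.2f}")  # ~0.56, not ~0.90
```

Even a description that seems strongly "engineer-like" only nudges the probability slightly past one half; subjects who answer 80-90% are ignoring the base rate.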
At the core of the representativeness heuristic lies our poor understanding of base rates and our tendency to exaggerate the coherence of what we perceive. We tend to misjudge the correct probability that a certain situation will occur; to improve this skill, which is essential for effective judgment and sound decision making, we should pay attention to base rates. Anchoring and Adjustment
Tversky and Kahneman (1974, p. 1128) defined anchoring as a process in which "people make estimates by starting from an initial value that is adjusted to yield a final answer [and] . . . adjustments are typically insufficient". It is related to availability: people regulate their estimates by taking into consideration certain reference cues called end-anchors. When assessing the attractiveness of a gamble, the anchor most consonant with the response mode is the gamble attribute: people anchor on the monetary outcome of the gamble and make adjustments from there. Another example is uninformative numerical anchoring. Is the percentage of African countries in the United Nations more or less than 10%? Can you offer an estimate?
All anchoring procedures involve the display of an anchor. The focus is on numeric anchors that are uninformative but salient to the subjects who make a decision. The multiplication problem. Please estimate: 1x2x3x4x5x6x7x8
8x7x6x5x4x3x2x1 The traits of anchors, judgements and targets. Attention to the Anchor: anchoring experiments consist of a two-step procedure, an initial comparison task followed by a numeric evaluation of the target. This procedure ensures that subjects consider the anchor and relate it to the target.
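Both orderings of the multiplication problem above have, of course, the same product; the anchoring effect lies in the estimates, which start from the first few factors and are adjusted insufficiently (subjects who see the ascending sequence reportedly give much lower estimates than those who see the descending one, and both groups fall far short of the true value):

```python
from math import prod

# The two presentations of the same product.
ascending = prod(range(1, 9))     # 1*2*3*4*5*6*7*8
descending = prod(range(8, 0, -1))  # 8*7*6*5*4*3*2*1

print(ascending, descending)  # 40320 40320
```

Typical intuitive estimates are in the hundreds or low thousands, two orders of magnitude below 40,320.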
Anchor-Target Compatibility: the anchor is an arbitrary number on the same scale as the target response. Extreme Anchors: studies have found that anchoring occurs even when the anchors are extreme or represent far-fetched answers to the question. Strack and Mussweiler (1997), for instance, asked subjects to estimate the year Einstein first visited the United States after considering anchors of 1215 or 1992. Interestingly, these highly implausible anchors produced anchoring effects just as large as more plausible anchors did.
Awareness: given how pervasive anchoring is, experimenters also considered it necessary to ask subjects whether they were aware of the anchor's influence. Wilson (1996) asked participants about the anchor's power of influence; there was a moderate and significant correlation between reported awareness and the size of the anchoring effect. However, even though most of the subjects answered that they had not been influenced by the anchor, they still showed an anchoring effect. Causes of anchoring
Anchor; retrieve and select information; integrate information; express the information. Framing effect
Decisions can be influenced by situational aspects irrelevant to a good decision. We tend to prefer options representing a small but certain gain over a larger but uncertain gain, unless the uncertain gain is either much greater or only slightly less than certain. Loss aversion. Example of framing effect
Imagine the outbreak of an Asian disease expected to kill 600 people. Two main solutions were proposed: one would save 200 people for certain, while the other offered a 1 in 3 probability that 600 people would be saved and a 2 in 3 probability that none would be saved. Full-fat milk. The Asian disease: 72% of the respondents chose the first solution, although both solutions represent an expected saving of 200 lives. Affect heuristic. Using one's emotional responses to make rapid judgements or decisions. Give examples. Judgement theories
The support theory (Tversky & Koehler, 1994) is based on the availability heuristic. It says that an event appears more or less likely depending on how it is described. E.g., you may assume that the probability that you will die on your next summer holiday is very low; however, it may seem more likely if you are asked "What is the probability that you will die on your next holiday from a disease, a car accident, a plane crash, or from any other cause?". This happens for the following reasons: a more specific description, providing more details, can highlight facets that might have been overlooked had they not been mentioned; and memory limits have to be taken into account, e.g., people cannot always recollect all the data that might help them formulate the most accurate answer (Eysenck and Keane, 2015).
Fast and Frugal Heuristics. Proposed by Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group in Simple Heuristics That Make Us Smart (1999). We are born with an adaptive toolbox which contains the recognition heuristic, heuristics that rely on one good reason (such as the take-the-best heuristic), and heuristics that rely on the wisdom of others (such as the imitate-the-majority heuristic).
These heuristics are called ecologically rational because they are adapted to the environment. The recognition heuristic entails assigning a higher value to the recognized object of a given pair of objects.
One of the most prominent fast-and-frugal heuristics is the take-the-best strategy, which has the following elements: a search rule (search for cues, such as recognizing a city name), a stopping rule (stop after finding a cue that discriminates between the options) and a decision rule (choose the option favored by that cue). In other words, we choose only the most important criterion for us. For instance, when choosing a new car, the main factor could be the look of the car, the safety it offers, or the speed it can reach.
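The three rules can be sketched in a few lines. This is a minimal illustration, not Gigerenzer's implementation; the cue names and their ordering by validity are invented, and a higher cue value is assumed to favor an option:

```python
# Take-the-best: consult cues in order of validity; the first cue
# that discriminates between the two options decides the choice.

def take_the_best(option_a, option_b, cues_by_validity):
    """Return the label of the option favored by the first discriminating cue."""
    for cue in cues_by_validity:          # search rule: best cue first
        a, b = option_a[cue], option_b[cue]
        if a != b:                        # stopping rule: cue discriminates
            return "A" if a > b else "B"  # decision rule
    return "tie"                          # no cue discriminates

# Hypothetical example: choosing a car on safety first, then looks, then speed.
car_a = {"safety": 1, "looks": 0, "speed": 1}
car_b = {"safety": 1, "looks": 1, "speed": 0}
print(take_the_best(car_a, car_b, ["safety", "looks", "speed"]))  # "B"
```

Safety ties, so the search moves to the next cue; "looks" discriminates and the search stops there, ignoring speed entirely: one good reason decides.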
Biases. Cognitive biases are systematic errors of judgment correlated with heuristics, for which they serve as signatures. Some biases can be considered adaptive, since in some cases they produce effective actions or fast decisions when speed is crucial. Others can bring about fallacious judgment, perceptual distortion, or irrational interpretation. More than 300 cognitive biases have been identified by now. The sunk-cost effect. This fallacy is the tendency to continue an endeavor once an investment in money, effort, or time has been made, e.g., justifying staying in a relationship with an abusive partner. It is based on our aversion to loss. Conjunction fallacy. Kahneman and Tversky named the conjunction fallacy the failure to use elementary notions of logic, assessing a conjunction of two events as more probable than one of the events alone. Linda
is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations. Assess the probability of each statement. A. Linda is a teacher in elementary school. B. Linda works in a bookstore and takes yoga classes. C. Linda is active in the feminist movement. D. Linda is a psychiatric social worker. E. Linda is a member of the League of Women Voters. F. Linda is a bank teller. G. Linda is an insurance salesperson. H. Linda is a bank teller and is active in the feminist movement.
It is based on the representativeness heuristic. Naturalistic decision making. The main criticism of the model of decision making studied in the lab was that it lacks the general atmosphere of this process in a natural environment: ill-formulated problems, the decision maker's mental state, time pressure, and the high stakes entailed by a decision for which one can be held accountable. A new scientific field of study has appeared under the name of naturalistic decision making (Galotti, 2002). Gambler's fallacy and hot hand. The gambler's fallacy is the erroneous conviction that the likelihood of a random event (e.g., winning or losing at gambling or in a particular sport) is influenced by a prior random event or situation. The gambler's fallacy is related to the representativeness heuristic, but the direction of its action is the opposite: considering the pattern of past occurrences, one predicts that this pattern is about to change, due to an undue reliance on cognitive System 1. Overusing System 1, we disregard the fact that each bet has the same probability of succeeding or failing as the previous ones. On the other hand, the hot hand
fallacy has the opposite effect: it produces the conviction that an initial lucky event will trigger further successful events. This fallacy refers to the cognitive illusion generated by the outstanding performance of a certain player. Group decision making
Good and beneficial for creativity, but beware of groupthink. Groupthink is characterized by superficial decisions made due to several factors (Janis, 1971): closed-mindedness (the group does not accept alternative suggestions), rationalization (complex explanations are concocted to defend the group's decisions, even by altering reality if necessary), squelching of dissent (members who do not agree with the majority are ignored, reprimanded or excluded), formation of a mindguard for the group (one member declares himself in charge of the rules and makes sure the other members observe them), feeling invulnerable (the group has absolute confidence in its decisions, owing to the members' trust in one another and in the information they have), and feeling unanimous (the members of the group believe that everybody agrees with them).
The conditions enumerated by Janis (1971) as resulting in groupthink are: an authoritarian leader inside or outside the group, the group being isolated or homogeneous, and time or risk pressuring the group. Thus, anxiety is an important factor that can lead to groupthink. Overconfidence. Deliberation over various surmises and the formation of belief are essential elements of human thought. The evaluation of evidence and the estimation of confidence have long been debated by philosophers and statisticians, and studied experimentally by psychologists and decision researchers.
A valid evaluation of the degree of confidence in a given inference presupposes the synthesis of various types of evidence. For a large category of problems, a dissimilarity between the strength of the evidence and its weight is possible. For example, Dale Griffin (2002) showed how this model of thinking is applied when assessing a letter of recommendation for a graduate student written by a former teacher: (1) How positive or warm is the letter? and (2) How credible or knowledgeable is the writer? The first question refers to the strength or extremeness of the evidence, whereas the second question refers to its weight or credence. The
distinction between the strength of evidence and its weight is closely related to the distinction between the size of an effect (e.g., a difference between two means) and its reliability (e.g., the standard error of the difference).
People do not take both strength and weight into consideration. On the contrary, intuitive judgments are overly influenced by the degree to which the available evidence is representative of the hypothesis in question (Kahneman, 2002). Returning to the letter of recommendation example, people first focus on the warmth of the recommendation and only then consider the knowledge or reputation of the writer. The overall conclusion was that people attend to the strength of the evidence and afterwards make some adjustment for its weight. Hindsight Bias. Psychologists (Fischhoff, Wasserman and Lempert) have studied how people re-evaluate outcomes after having initially decided something. The conclusion reached was that the human mind is limited in reconstructing past states of knowledge or beliefs that have since been altered. When a new perspective
regarding a certain event or situation is embraced, we immediately become unable to recollect the perspective we held before. Baruch Fischhoff was the first psychologist to study the effects of this universal "I-knew-it-all-along" bias, which can affect virtually anyone. Nixon's visit to China. The category most affected is that of people in the service of others (physicians, financial advisers, leaders, social workers, politicians), as we have a propensity to blame decision makers when a sensible decision does not have a favorable outcome, and not to appreciate a good decision until it proves successful. Experience does not reduce this effect. Illusory Correlation. "Illusory correlation" was coined by Chapman and Chapman (1967) to describe the human mind's propensity to overestimate the relationship
between two groups as a result of the contrasting and peculiar information presented. Most likely due to the representativeness heuristic, we may posit correlations, and even cause-effect relationships, between our own preconceptions (which we use to form stereotypes) and a given situation, person, attribute, etc. As a result, the traits that best serve our purpose of confirming our biased beliefs are recollected more easily than the ones that go against our partisan judgment. Draw-a-Person tests. According to the experimenters, the false correlation apparently emerges between the diagnosis and certain features of the drawing. For instance, subjects suggested that people diagnosed with paranoia tend to draw figures with large eyes more than do people with other diagnoses, which in fact is not the case. Nonetheless, when the subjects were primed to recognize a correlation between a large-eyed drawing and the diagnosis of paranoia, they showed the effect.
Part II. Theory of Argumentation. There are many definitions of critical thinking. The following is a selection of theories that I have considered relevant. The modern concept of critical thinking was outlined by the philosopher, psychologist and educator John Dewey. The author used the term reflective thinking to describe this concept as the active, attentive and persistent consideration of an opinion or any form of knowledge in the light of the grounds that support it and the conclusions to which it tends. Edward Glaser is an author who followed the conceptual line opened by Dewey. Critical thinking is an attitude of being willing to consider, in a reflective manner, the problems one faces. This attitude is accompanied by knowledge of the methods of logical inquiry and judgment and
certain skills in applying these methods. This means that possessing certain reasoning methods and knowing logic is not enough: you also have to be willing to use them in everyday situations. In 1989, Robert Ennis, one of the most important contributors to the development of this domain, proposed what is probably the most used definition today: critical thinking is "reasonable and reflective thinking focusing on deciding what to believe or do" (Norris and Ennis, 1989).
Richard Paul (1993) added another feature to the definition of critical thinking: metacognition. It is the way of thinking about any subject, content, or problem in which the thinker improves the quality of his or her thinking by skillfully taking charge of the structures inherent in thinking and imposing intellectual standards upon them. Alec Fisher (2001, p. 5) said that this definition drew attention to the fact that the only way to improve a person's critical thinking abilities is that person's conscious participation in the improvement process, having a model of good thinking, an ideal of thinking correctly, to which the person can constantly relate. In 1988, the Delphi Project started. It was financed by the American Philosophical Association and gathered an interdisciplinary team of specialists (philosophers, teachers, psychologists, sociologists, critical thinking
specialists, assessment specialists, an economist, a computer scientist, a zoologist and a physician). Its aim was to conceptualize critical thinking. The official title of this project was Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Here is the Delphi definition in full: critical thinking is "purposeful, self-regulatory judgment, which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based". The definition continues: critical thinking is "an essential tool of inquiry". The Delphi Project experts believe that the following set of cognitive skills represents
the main dimensions of critical thinking: analysis, interpretation, evaluation, inference, explanation and self-regulation. Disposition to think critically (the Delphi Report). After reaching a consensus through the Delphi method, the experts describe the dispositions or characteristics of a good critical thinker: he commits, and encourages others to commit, to critical judgment; he is capable of such judgments in a wide range of contexts and for a variety of purposes (The Delphi Report, Executive Summary, 1990, p. 12). The experts draw up a list of affective dispositions that define a good critical thinker, divided into two categories:
Affective dispositions regarding life and the quotidian in general: inquisitiveness with regard to a wide range of issues; concern to become and remain generally well-informed; alertness to opportunities to use critical thinking; trust in the processes of reasoned inquiry; confidence in one's own ability to reason; openness to different visions of the world; flexibility in considering alternatives and opinions;
understanding the opinions of other people; fair-mindedness in appraising reasoning; honesty in acknowledging one's own prejudices, stereotypes, egocentric or socio-centric tendencies; prudence in suspending, making or altering judgments;
willingness to reconsider and revise the points of view where correct, honest reflection justifies it.
Affective dispositions regarding themes, matters and specific questions: the desire for clarity in stating questions and considerations; the need for orderliness in approaching complex matters and subjects; diligence in seeking relevant information; reasonableness in selecting and applying criteria; care in focusing attention on the matter at hand; persistence in the face of difficulties that block the argumentation; the need for as much precision as possible, the degree being limited only by the type of subject or theme concerned.
61% of the Delphi experts considered the dispositions listed above an integral part of the concept of critical thinking. The consensus, however, meaning 83% of the experts, agreed that while a good critical thinker can be characterized by the dispositions described above, they are not defining features that individuate the concept. Critical thinking development. Cognitive values
Deanna Kuhn (1999, 2003, 2004, 2007, 2008) describes a stage model of the development of critical thinking. The author distinguishes four stages of critical thinking development that closely follow the Piagetian stages. The first stage, realism, corresponds to the preschool age and to the preoperational stage. In this stage, assertions are copies of external reality, knowledge comes from an external source and is certain, and critical thinking is unnecessary because there cannot be any dispute, since everybody sees the same thing. The second stage, absolutist, corresponds to the school age and the stage of concrete operations. This is the stage of accumulating sure facts: assertions are facts that are either correct or incorrect in their representation of reality, and knowledge comes from an external source and is certain, but not directly accessible, thus producing false beliefs. Critical thinking is a vehicle for comparing assertions about reality,
aiming to determine their truth or falsehood. The third stage, multiplism or relativism, corresponds to adolescence. Assertions are freely chosen opinions, seen as personal possessions and therefore not open to discussion. In this stage, knowledge is seen as generated by the human mind, and that is why it is uncertain. Adolescents fall into a whirl of doubt (Chandler, 2003, apud Kuhn, 2003), from which they may never get out. The conclusion that everybody is right in their own way, that from certain points of view some people, namely those they agree with rather than their opponents, are more right than others, and that they are free to believe whatever and in whomever they want, is a well-known picture of adolescence. In this picture, however, critical thinking is irrelevant. Adolescence marks the beginning of the development of formal operations, a development that continues until around the age of twenty.
By contrast, critical thinking development can continue with a fourth stage, called the evaluativist stage. This stretches over the entire period of young adulthood and is characterized by treating assertions as judgments that can be evaluated and compared using the criteria of rationality, evidence and alternatives. In the evaluativist stage, knowledge is seen as generated by the human mind, therefore uncertain, and susceptible to evaluation. It is perfectly acceptable that some opinions are better than others, meaning that some are better supported by proofs and arguments, and the justification of opinions must be more than pure personal preference. In the evaluativist stage, critical thinking is the vehicle that yields valid assertions and enhances understanding. D. Kuhn says that thinking critically is not the same as thinking in general.
Besides the generally human cognitive substrate, there is also a long and voluntarily cultivated cognitive component, which is not equivalent to thinking ability in general or, still less, to intelligence. What makes critical thinking an individual characteristic that a person can possess to a smaller or greater extent is the powerful social, contextual and educational dimension needed in order to say we are critical thinkers. In her book Education for Thinking, Deanna Kuhn presents the results of research conducted over three years and shows that there is a strong connection between the value a family attributes to knowledge and a child's school results. In other words, if the family thinks it is very important and positive to know, to possess knowledge, then the child is highly motivated and will make great efforts to gather as much knowledge as possible. Arguments
Logic is the science that evaluates arguments. An argument is a group of statements, one or more of which (the premises) are claimed to provide support for, or reasons to believe, one of the others (the conclusion). Indicators of the conclusion: therefore, accordingly, we may conclude, thus, implies that, etc. Indicators of the premises: since, as, because, for, given that, etc.
A statement is a sentence that is either true or false, typically a declarative sentence. An inference is the reasoning process expressed by an argument; in a loose sense, inference and argument are used interchangeably. A proposition is the meaning or information content of a statement. Because the same information may be expressed in different ways, the same proposition may be expressed by different statements.
We may recognize arguments by the fact that they are passages that try to prove something. The logical evaluation of arguments is quite difficult because it comes in degrees of being more or less defective. Placing arguments on a continuum that has solid arguments at one end and defective, sophistic arguments at the other is governed by several factors on which the quality of the argumentation depends and which the arguments must possess. They are grouped in four categories (Stoianovici):
1. Acceptability of the premises. Given that argumentation is carried out in the realm of opinion, the requirement that the starting premises be true is somewhat overstated. However, if the assertions used as premises are not supported by other assertions, they become less plausible. Acceptability is a vague concept that always raises the questions "acceptable to whom?" and "why is this accepted?". Therefore, the degree of acceptability of a set of premises is relative to the audience's knowledge and capacity of understanding. Certainly, some premises may seem acceptable to one audience and unacceptable to a more critical audience.
2. Acceptability of the logical connections between premises and conclusion. These connections need not be compelling, as they are in formal logic and demonstrative reasoning. If an argumentation contains valid deductive links, so much the better: an audience that accepts the premises can then refuse the conclusion only by contradicting itself.
3. Non-circularity. The premises on which the conclusion of the argument rests must not themselves be grounded in the very opinion to be established. A circular argument requires the audience to accept from the start the opinion for which we are building the argument.
4. Dialectical requirements. Argumentation is addressed to different audiences, which differ in their degree of information, their shared values, their capacity to follow logical connections of various degrees of complexity between ideas, the vocabulary they use, etc., so it must observe certain dialectical constraints. These aspects of argument evaluation concern the communication between participants.

Reasoning errors (fallacies)
A fallacy is a defect in an argument that involves something other than merely false premises. A fallacy can be committed in many ways: it can involve an error of reasoning, or it can create the illusion that a bad argument is good (or both). Both deductive and inductive arguments can contain fallacies. Formal fallacies can be identified by inspecting the form or structure of the argument (this type occurs most often in deductive arguments). Example: All tigers are animals. All mammals are animals. Therefore, all tigers are mammals. The argument has the following form: All A are B. All C are B. Therefore, all A are C. The argument is invalid, and we can even prove it by means of diagrams, although at first sight it seemed correct.
Informal fallacies are detectable by analyzing the content of the argument. These fallacies fall into five groups: fallacies of relevance, of weak induction, of presumption, of ambiguity and of grammatical analogy (the definitions and examples below are taken from the book A Concise Introduction to Logic by Patrick Hurley, 1994).
I. Fallacies of relevance occur in arguments whose premises are logically irrelevant to the conclusion. Nonetheless, the premises are psychologically relevant to the conclusion, because it
seems to be drawn from the presented premises.
Appeal to force (Argumentum ad Baculum): the speaker threatens the listener that something bad will happen if he does not accept the conclusion. Here's an example. The secretary to the boss: "I deserve a raise for next year. After all, you know how close I am to your wife, and of course you wouldn't want her to find out about everything that is going on between you and that cute client."
Appeal to pity (Argumentum ad Misericordiam): the speaker asks for the listener's pity in order for his conclusion to be accepted. For instance, the taxpayer to the judge: "Your Honor, I admit I declared I had 13 children even though I only have 2, and I realize it's wrong, but if you find me guilty of tax evasion my reputation will be ruined. I'll probably lose my job, my poor wife won't be able to have the surgery she needs, and my children will starve. Of course, I am innocent."
Appeal to the people (Argumentum ad Populum): exploits everybody's desire to be accepted, loved and acknowledged by others. There are two approaches. Direct: the speaker addresses a large number of people in order to induce a mass mentality (herd instinct), as in Adolf Hitler's speeches. People want to share the same euphoria and comradeship, so they accept a large number of
conclusions with growing effervescence. Indirect: the speaker does not address the crowd as a mass but only some individuals, playing on aspects of their relationship with the group. This is the source of the appeal to the majority, the appeal to vanity and the appeal to snobbery (all techniques used in advertising, for example).
Argument against the person (Argumentum ad Hominem). This fallacy always involves two people: one of them advances an argument, and the other answers by directing his attention to the person and not to the argument. The fallacy has three forms: abusive, circumstantial, and tu quoque (you too). In the abusive form, the speaker verbally attacks the interlocutor. In the circumstantial form, one of the speakers presents the other as predisposed to argue the way he did (he has vested interests). In tu quoque, one of the speakers presents the other as a hypocrite.
The accident fallacy occurs when a general rule is applied to a special case that the rule was not meant to cover. For instance: Freedom of speech is guaranteed by the constitution. Therefore, J.Q. Radical shouldn't have been arrested for instigating rebellion through his speech last week.
The straw man fallacy. The arguer distorts the opponent's argument and then attacks the distorted version, creating the impression that he has demolished the original argument. For example:
Mr. Goldberg argued against praying in public schools. Of course, Mr. Goldberg is a defender of atheism, but atheism is what they had in Russia. Atheism leads to the suppression of all religions and to the replacement of God with an omnipotent state. Is that the kind of state we want for our country? I find it hard to believe. Clearly, Mr. Goldberg's argument is nonsense.
Missing the point (Ignoratio Elenchi) illustrates a special form of irrelevance. This fallacy occurs when the premises of an argument support one particular conclusion, but a different conclusion, only vaguely related to the premises, is drawn instead. E.g.: Crimes like robbery and mugging have increased alarmingly. The conclusion is clear: we must reinstate the death penalty.
The red herring fallacy is similar to the previous one. It occurs when the arguer diverts the listeners' attention by changing the subject to a totally different problem. The arguer finishes either by drawing a conclusion about this different problem or by assuming that a conclusion has already been established. For example: Ecologists always speak about the dangers of nuclear energy. Unfortunately, electricity is dangerous no matter where it comes from. Every year, hundreds of people are accidentally electrocuted. Since most such accidents result from negligence, they could be avoided if people were more careful.
The red herring fallacy can be mistaken for the straw man fallacy because both lead the listener away from the subject. The confusion can be avoided by determining whether the arguer demolished a distorted argument (straw man) or simply changed the subject (red herring). Both the straw man and the red herring can also be mistaken for missing the point. To avoid this confusion, note that red herring and straw man generate a new set of premises, while missing the point does not: in red herring and straw man the conclusion drawn is relevant to the new premises from which it is drawn, whereas in ignoratio elenchi the conclusion is irrelevant to the premises from which it is drawn.

Weak induction fallacies
This type of reasoning error occurs when the connection between premises and conclusion is not strong enough to support the conclusion. These fallacies often involve emotional grounds for accepting the conclusion.
Appeal to authority (Argumentum ad Verecundiam): a version of the argument from authority that occurs when the cited authority is not trustworthy (has lost credibility). For instance: President Clinton declared he smoked
marijuana when he was a student at Oxford, but that he never inhaled. We cannot, however, draw the conclusion that President Clinton never got high on marijuana during those years.
Appeal to ignorance (Argumentum ad Ignorantiam). The premises state that nothing has been proved one way or the other, and then a definite conclusion is drawn. For instance: People have been trying for centuries to conclusively prove the claims of astrology, and nobody has yet succeeded. Therefore, we must conclude that astrology is nonsense. By contrast, the following example is not a fallacy: Nobody has seen Mr. Andrew drinking wine, beer or other strong drinks; probably Mr. Andrew is not a drinker. If Mr. Andrew were a drinker, someone would likely have seen him drinking, so the argument is inductively correct: the observed cases lead to a conclusion that is probable, though not necessary.
Hasty generalization (converse accident) is a fallacy affecting inductive generalizations. It occurs when the sample is not representative of the group, namely when the sample is either too small or has not been randomly selected. It is also called converse accident because it proceeds in the direction opposite to the accident fallacy: whereas the accident moves from the general to the particular, the converse accident moves from the particular to the general.
False cause. This fallacy occurs when the link between premises and conclusion depends on a causal connection that probably does not exist. For example: In the past two months, every time the cheerleaders wore blue ribbons in their hair, the basketball team was defeated. Therefore, to prevent future defeats, the cheerleaders should stop wearing blue ribbons. (This variant is also known as post hoc ergo propter hoc.)
Another example: Successful CEOs are paid salaries of up to 50,000 dollars. Therefore, to make sure that Popescu becomes a successful CEO, we should raise his salary to 50,000 dollars (the non causa pro causa variant). One of the most frequent forms is the oversimplified cause. For instance: There are more laws today than ever before, and there are more crimes than ever. Therefore, to diminish the number of crimes, we should eliminate laws.
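The post hoc pattern can be made vivid with a small simulation: two sequences generated independently of each other will still show chance associations. Below is a minimal sketch in Python; the scenario and numbers are illustrative assumptions echoing the blue-ribbon example, not data from any real team.

```python
import random

random.seed(7)

# Ribbon color and game outcome are generated independently,
# so by construction there is no causal link between them.
games = [(random.choice(["blue", "red"]), random.choice(["win", "loss"]))
         for _ in range(60)]

blue = [result for ribbon, result in games if ribbon == "blue"]
red = [result for ribbon, result in games if ribbon == "red"]

blue_loss_rate = blue.count("loss") / len(blue)
red_loss_rate = red.count("loss") / len(red)

# The two rates will usually differ somewhat by pure chance; reading a
# causal story into that difference is exactly post hoc ergo propter hoc.
print(f"loss rate with blue ribbons: {blue_loss_rate:.2f}")
print(f"loss rate with red ribbons:  {red_loss_rate:.2f}")
```

Any apparent "effect" of the ribbons here is noise, which is the point: observing that defeats followed blue ribbons does not by itself establish that the ribbons caused anything.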
The slippery slope is a variant of the false cause. It occurs when the conclusion of an argument rests on an alleged chain reaction that is unlikely to actually occur. For example: Urgent measures should be taken against pornography. If the production and sale of pornographic products continues, the number of sexual crimes, such as rape and incest, will increase. These in turn will erode the moral foundation of society, which will lead to an increase in crimes of all sorts, perhaps even to the complete dissolution of law and order and the collapse of society.
Weak analogy occurs when the analogy invoked is not strong enough to support the conclusion. For example: Popescu's new car is light blue, has leather upholstery and also has low fuel consumption. Ionescu's new car is also light blue and has leather upholstery as well. Therefore, it has low fuel consumption.
Presumption fallacies
Circular argument. It occurs when the arguer uses a certain phraseology that tends to hide the questionable nature of a key premise. For instance: The death penalty is justified for crimes such as homicide and kidnapping, because it is only right and legitimate for someone to be killed for having committed such inhumane crimes.
Complex question. It occurs when what is really two or more questions is asked as a single question, and a single answer is given to all of them. The question aims to trap the respondent into admitting something he would not admit otherwise. For example: "Have you stopped cheating in exams?" If someone answers "yes, I have", this means he cheated in exams in the past. If he answers "no, I haven't", it means he is still cheating. Another example: "Where did you hide the cookies you stole?" If he answers that he hid them under the bed, he obviously also admits stealing them; if he answers "nowhere", it implies that he stole them and ate them. Clearly, the first question above is in fact two questions: Have you ever cheated in exams? If you have, do you still cheat?
We must distinguish the complex question from the leading question. E.g.: "Tell us, on April 9th, did you see the defendant shooting the victim?", as opposed to "Tell us, what did you see on April 9th?" The first is a leading question: it suggests an answer but, unlike a complex question, it does not contain a fallacy.
False dichotomy (false bifurcation, or the either-or fallacy). It occurs when one of the premises is of the either-or type and presents two alternatives as exhaustive (as if no third alternative were possible), when in fact the premise hides additional alternatives. For example: Either you let me go to the Nightwish concert, or I'll feel miserable for the rest of my life. I know you don't want me to feel miserable for the rest of my life, so you are going to let me go to the Nightwish concert.
Cherry picking, or suppressing evidence. This fallacy occurs when the arguer ignores evidence that outweighs the evidence offered and that supports a different conclusion. For example: The used-car salesman to a buyer: "Madam, I have the right car for you. It was mostly kept in the garage, its engine was recently overhauled and it has a really low mileage." The lady bought the car, and after two months it fell apart. The salesman forgot to mention that when it was not in the garage, the car had crossed the entire country, so the odometer had rolled over and started again from zero.

Ambiguity fallacies
These occur when there is some form of ambiguity in the premises, in the conclusion or in both; they arise from the language used. As
we said before, when we discussed the criteria for evaluating arguments, one of them concerned language, the dialectical requirements: avoiding ambiguous, multivocal terms, and adjusting the language to the universe of discourse of the audience.
Equivocation occurs when the conclusion of an argument depends on one or more words being used, explicitly or implicitly, with two different meanings within the argument. For instance: We have a duty to do what is right. We have a right to talk about defending the innocent. Therefore, we have a duty to talk about defending the innocent. Blatant arguments such as this one are less likely to deceive, because the ambiguity is easily detectable. This type of fallacy is often encountered in political speech: politicians use words or phrases (such as "demilitarization") with two different meanings, depending on whether they speak to a group of weapons suppliers or to a group of anti-war campaigners. The intended meaning can be discovered only from the context.
Amphiboly occurs when the arguer misinterprets a statement that is ambiguous because of a structural flaw and draws a conclusion based on this faulty interpretation. For instance: Ion told Eugen that he had made a mistake. So at least Ion has the courage to admit his mistakes. Or: Professor Ionescu said he would give a lecture about heart diseases in the biology laboratory. We draw the conclusion that heart diseases have been occurring there lately. Amphiboly is distinguished from equivocation by two characteristics: amphiboly involves a structural defect in the sentence, while equivocation comes from the
ambiguity of one or more words. The second difference is that amphiboly is a misinterpretation of someone else's words, while equivocation is the creation of the arguer himself.

Fallacies of grammatical analogy
Arguments containing these fallacies are grammatically analogous to arguments that are correct in every respect. Because of this grammatical similarity, they seem correct, but they are not. They comprise:
Composition. It occurs when the conclusion of the argument depends on the erroneous transfer of an attribute from the parts to the whole: it is assumed that if the parts possess a quality, so does the whole containing them. For example: Mary likes cucumbers. She also likes chocolate ice cream. Therefore, Mary would love chocolate ice cream with slices of cucumber. Composition can be mistaken for hasty generalization. Here we must consider the distinction between distributive and collective predication. For example, "Flies are small" and "Flies are numerous" are two different kinds of sentence. The first has a distributive predication: the attribute of being small is assigned to every member of the class (a general sentence). In the second, the attribute belongs to the class as a whole; it is a collective predication: the class of flies is numerous. If the conclusion is a general sentence, the fallacy is a hasty generalization; if it is a sentence about a class, the fallacy is a composition.
Division is the opposite of composition. Whereas composition moves from the parts to the whole, division moves from the whole to the parts. The fallacy is produced when an attribute is erroneously transferred from the whole (or the class) to the parts (or the members). For example: Salt is a nontoxic compound. Therefore, its component elements, sodium and chlorine, are nontoxic. Division can be mistaken for the accident fallacy. To avoid the confusion, we must see whether the premises of the argument are general sentences (the attribute applies to each member) or sentences about a class: in the first case the fallacy is accident, in the second it is division.
We must mention that in real life, where argumentation situations are countless, a fallacy may belong to several categories at once. In fact, sorting arguments into these categories is not a purpose in itself. What matters is that fallacies are recognized and rejected as weak, and that the conclusions they support are recognized as unacceptable.
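As a closing illustration, the invalid form discussed under formal fallacies (All A are B; all C are B; therefore all A are C) can be refuted mechanically by exhibiting a counterexample: a model in which both premises are true and the conclusion is false. Below is a minimal sketch in Python using sets; the particular sets chosen (tigers, animals, birds) are illustrative assumptions.

```python
# Model the invalid form "All A are B. All C are B. Therefore, all A are C."
# with concrete sets: A = tigers, B = animals, C = birds.
B = {"tiger", "sparrow", "dog"}   # animals
A = {"tiger"}                     # tigers
C = {"sparrow"}                   # birds

def all_are(x, y):
    """'All x are y' is true when x is a subset of y."""
    return x <= y

premises_true = all_are(A, B) and all_are(C, B)   # both premises hold
conclusion_true = all_are(A, C)                   # but the conclusion fails

# True premises with a false conclusion: the form cannot be valid.
print(premises_true, conclusion_true)  # True False
```

A single such counterexample suffices to show the form invalid, even though the original instance ("all tigers are mammals") happens to have a true conclusion; that accidental truth is precisely why the argument seemed correct at first sight.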