The discussion that
precedes the list begins with an account of
the ways in which the term
"fallacy" is vague.
Attention then turns to the number of
competing and overlapping ways to classify
fallacies of argumentation.
For pedagogical purposes, researchers in the
field of fallacies disagree about the
following topics: which name of a fallacy is
more helpful to students' understanding;
whether some fallacies should be
de-emphasized in favor of others; and which
is the best taxonomy of the fallacies.
Researchers in the field are also deeply
divided about how to define the term
"fallacy," how to define certain fallacies,
and whether any general theory of fallacies
at all should be pursued if that theory's
goal is to provide necessary and sufficient
conditions for distinguishing between
fallacious and non-fallacious reasoning
generally. Analogously, there is doubt in
the field of ethics regarding whether
researchers should pursue the goal of
providing necessary and sufficient
conditions for distinguishing moral actions
from immoral ones.
1. Introduction
The first known systematic study of
fallacies was due to Aristotle in his De
Sophisticis Elenchis (Sophistical
Refutations), an appendix to the
Topics. He listed thirteen types.
After the Dark Ages, fallacies were again
studied systematically in Medieval
Europe. This is why so many fallacies
have Latin names. The third major period of
study of the fallacies began in the later
twentieth century due to renewed interest
from the disciplines of philosophy, logic,
communication studies, rhetoric, psychology,
and artificial intelligence.
The more
frequent the error within public discussion
and debate the more likely it is to have a
name. That is one reason why there is no
specific name for the fallacy of subtracting
five from thirteen and concluding that the
answer is seven, though the error is common
among elementary school children.
The
term "fallacy" is not a precise term. One
reason is that it is ambiguous. It can
refer either to (a) a kind of error in an
argument, (b) a kind of error in reasoning
(including arguments, definitions,
explanations, etc.), (c) a false belief, or
(d) the cause of any of the previous errors
including what are normally referred to as
"rhetorical techniques". Philosophers who
are researchers in fallacy theory prefer to
emphasize meaning (a), but their lead is
often not followed in textbooks and public
discussion.
Regarding meaning (d), ill
health, being a bigot, being hungry, being
stupid, and being hypercritical of our
enemies are all sources of error in
reasoning, so they could qualify as
fallacies of kind (d), but they are
not included in the list below. On
the other hand, wishful thinking,
stereotyping, being superstitious,
rationalizing, and having a poor sense of
proportion are sources of error and
are included in the list below,
though they wouldn't be included in a list
devoted only to faulty arguments. Thus there
is a certain arbitrariness to what appears
in lists such as this. What have been left
off the list below are the following
persuasive techniques commonly used to
influence others and to cause errors in
reasoning: apple polishing, exaggerating,
inappropriately assigning the burden of
proof, promising a proof without producing
it, using propaganda techniques, ridiculing,
being sarcastic, selecting terms with strong
negative or positive associations, using
innuendo, and weaseling. All of the
techniques are worth knowing about if one
wants to avoid the fallacies.
In
describing the fallacies below, the custom
is followed of not distinguishing between a
reasoner committing a fallacy and the
reasoning itself committing the fallacy,
though it would be more accurate to say that
a reasoner commits the fallacy and
the reasoning contains the fallacy.
In the list below, the examples are very
short. If they were long, the article would
be too long. Nevertheless real arguments
are often embedded within a very long
discussion. Richard Whately, one of the
greatest of the 19th century researchers
into informal logic, wisely said, "A very
long discussion is one of the most
effective veils of Fallacy; ...a Fallacy,
which when stated barely...would not deceive
a child, may deceive half the world if
diluted in a quarto volume."
2. Taxonomy of Fallacies
There are a number of competing and
overlapping ways to classify fallacies of
argumentation. They can be classified as
either formal or informal. A formal fallacy
can be detected by examining the logical
form of the reasoning, whereas an informal
fallacy depends upon the content of the
reasoning and possibly the purpose of the
reasoning. The list below contains very few
formal fallacies. Fallacies also can be
classified as deductive or inductive,
depending upon whether the fallacious
argument is deductive or inductive, that is,
assessed by deductive standards or inductive
standards. Deductive standards demand deductive validity, whereas inductive standards require only inductive strength, that is, that the premises make the conclusion more likely.
Similar
fallacies are often grouped together under a
common name. For example, fallacies of
relevance include fallacies that occur due
to reliance on an irrelevant reason. Ad hominem, appeal to pity, and affirming
the consequent are some of the fallacies
of relevance. Accent,
amphiboly and equivocation are
examples of fallacies of ambiguity. The
fallacies of illegitimate presumption
include begging the question, false dilemma, no true
Scotsman, complex question and suppressed
evidence.
3. Pedagogy
For pedagogical purposes, researchers in the
field of fallacies disagree about the
following topics: which name of a fallacy is
more helpful to students' understanding;
whether some fallacies should be
de-emphasized in favor of others; and which
is the best taxonomy of the fallacies.
Fallacy theory is criticized by some
teachers of informal reasoning for its
emphasis on poor reasoning rather than good.
Do colleges teach the Calculus by
emphasizing all the ways one can make
mathematical mistakes? The critics want
more emphasis on the forms of good arguments
and on the implicit rules that govern proper
discussion designed to resolve a difference
of opinion. But there has been little
systematic study of which emphasis is more
successful.
4. What is a fallacy?
Researchers disagree about how to define the
very term "fallacy". Focusing just on
fallacies in sense (a) above, namely
fallacies of argumentation, some researchers
define a fallacy as a kind of invalid
argument, meaning one that is deductively
invalid or that has very little inductive
strength. Because examples of false dilemma, inconsistent
premises, and begging the question are valid
arguments in this sense, this definition
misses some standard fallacies. Other
researchers define a fallacy as an argument
that is not good. Good arguments are then
defined as those that are deductively valid
or inductively strong, and that contain only
true, well-established premises, but are not
question-begging. A complaint about this
definition is that its requirement of truth
would improperly lead to calling too much
scientific reasoning fallacious; every time
a new scientific discovery caused scientists
to label a previously well-established claim
as false, all the scientists who used that
claim as a premise would become fallacious
reasoners. This consequence of the
definition is acceptable to some researchers
but not to others. Because informal
reasoning regularly deals with hypothetical
reasoning and with premises for which there
is great disagreement about whether they are
true or false, many researchers would relax
the requirement that every premise must be
true. One widely accepted definition
defines a fallacious argument as one that
either is deductively invalid or is
inductively very weak or contains an
unjustified premise or that ignores relevant
evidence that is available and that should
be known by the arguer. Finally, yet
another theory of fallacy says a fallacy is
a failure to provide adequate proof for a
belief, the failure being disguised to make
the proof look adequate.
Other
researchers recommend characterizing a
fallacy as a violation of the norms of good
reasoning, the rules of critical discussion,
dispute resolution, and adequate
communication. The difficulty with this
approach is that there is so much
disagreement about how to characterize these
norms. There is even controversy about
whether a fallacy can be committed only
during a dialogue, for this implies that a
stand-alone argument by one person cannot be
fallacious.
In addition, all the above
definitions are often augmented with some
remark to the effect that the fallacies are
likely to persuade many reasoners. It is
notoriously difficult to be very precise
about this vague and subjective notion of
being likely to persuade, and some
researchers in fallacy theory have therefore
recommended dropping the notion in favor of
"can be used to persuade."
Some
researchers complain that all the above
definitions of fallacy are too broad and do
not distinguish between mere blunders and
actual fallacies, the more serious errors.
Researchers in the field are deeply
divided, not only about how to define the
term "fallacy" and how to define some of the
individual fallacies, but also about whether
any general theory of fallacies at all
should be pursued if that theory's goal is
to provide necessary and sufficient
conditions for distinguishing between
fallacious and non-fallacious reasoning
generally. Analogously, there is doubt in
the field of ethics whether researchers
should pursue the goal of providing
necessary and sufficient conditions for
distinguishing moral actions from immoral
ones.
5. Other Controversies
In the field of rhetoric, the primary goal
is to persuade the audience. The audience
is not going to be persuaded by an otherwise
good argument with true premises unless they
believe those premises are true.
Philosophers tend to de-emphasize this
difference between rhetoric and informal
logic, and they concentrate on arguments
that should fail to convince the ideally
rational reasoner rather than on arguments
that are likely not to convince audiences
who hold certain background beliefs.
Advertising in magazines and on television
is designed to achieve visual persuasion.
And a hug, or the fanning of the aroma of freshly baked donuts out onto the sidewalk, is occasionally used for visceral
persuasion. There is some controversy among
researchers in informal logic as to whether
the reasoning involved in this nonverbal
persuasion can always be assessed properly
by the same standards that are used for
verbal reasoning.
6. Partial List of Fallacies
Consulting the list below will give a
general idea of the kind of error involved
in passages to which the fallacy name is
applied. However, simply applying the
fallacy name to a passage cannot substitute
for a detailed examination of the passage
and its context or circumstances because
there are many instances of reasoning to
which a fallacy name might seem to apply,
yet, on further examination, it is found
that in these circumstances the reasoning is
really not fallacious.
Abusive Ad
Hominem
See Ad Hominem.
Accent
The accent fallacy is a fallacy of
ambiguity due to the different ways a word
is emphasized or accented.
Example:
A member of Congress is asked by a reporter
if she is in favor of the President's new
missile defense system, and she responds,
"I'm in favor of a missile defense system
that effectively defends America."
With an emphasis on the word
"favor", this remark is likely to favor the
President's missile defense system. With an
emphasis, instead, on the words "effectively
defends", this remark is likely to be
against the President's missile defense
system. Aristotle's fallacy of accent
allowed only a shift in which syllable is
accented within a word.
Accident
We often arrive at a generalization but
don't or can't list all the exceptions.
When we reason with the generalization as if
it has no exceptions, we commit the fallacy
of accident. This fallacy is sometimes
called the fallacy of sweeping
generalization.
Example:
People should keep their promises, right? I
loaned Dwayne my knife, and he said he'd
return it. Now he is refusing to give it
back, but I need it right now to slash up my
neighbors' families. Dwayne isn't doing
right by me.
People should keep their promises, but there
are exceptions as in this case of the
psychopath who wants Dwayne to keep his
promise to return the knife.
Ad
Baculum
See Scare
Tactic and Appeal to Emotions (Fear).
Ad Consequentiam
See Appeal
to Consequence.
Ad
Crumenum
See Appeal to
Money.
Ad Hoc
Rescue
Psychologically, it is understandable that
you would try to rescue a cherished belief
from trouble. When faced with conflicting
data, you are likely to mention how the
conflict will disappear if some new
assumption is taken into account. However,
if there is no good reason to accept this
saving assumption other than that it works
to save your cherished belief, your rescue
is an ad hoc rescue.
Example:
Yolanda: If you take four of these
tablets of vitamin C every day, you will
never get a cold.
Juanita: I
tried that last year for several months, and
still got a cold.
Yolanda: Did
you take the tablets every day?
Juanita: Yes.
Yolanda:
Well, I'll bet you bought some bad tablets.
The burden of proof is definitely on
Yolanda's shoulders to prove that Juanita's
vitamin C tablets were probably "bad" --
that is, not really vitamin C. If Yolanda
can't do so, her attempt to rescue
her hypothesis (that vitamin C prevents
colds) is simply a dogmatic refusal to face
up to the possibility of being wrong.
Ad
Hominem
You commit this fallacy if you make an
irrelevant attack on the arguer and suggest
that this attack undermines the argument
itself. It is a form of the Genetic Fallacy.
Example:
What she says about Johannes Kepler's
astronomy of the 1600's must be just so much
garbage. Do you realize she's only fourteen
years old?
This attack may undermine the arguer's
credibility as a scientific authority, but
it does not undermine her reasoning. That
reasoning should stand or fall on the
scientific evidence, not on the arguer's age
or anything else about her personally.
If the fallacious reasoner points out
irrelevant circumstances that the arguer is in, the fallacy is a circumstantial ad
hominem. Tu Quoque and Two Wrongs Make a
Right are other types of the ad hominem
fallacy.
The major difficulty with
labeling a piece of reasoning as an ad
hominem fallacy is deciding whether the
personal attack is relevant. For example,
attacks on a person for their actually
immoral sexual conduct are irrelevant to the
quality of their mathematical reasoning, but
they are relevant to arguments promoting the
person for a leadership position in the
church. Unfortunately, many attacks are not
so easy to classify, such as an attack
pointing out that the candidate for church
leadership, while in the tenth grade,
intentionally tripped a fellow student and
broke his collar bone.
Ad
Ignorantiam
See Appeal to
Ignorance.
Ad
Misericordiam
See Appeal to
Emotions.
Ad
Novitatem
See Bandwagon.
Ad
Numerum
See Appeal
to the People.
Ad
Populum
See Appeal to the People.
Ad
Verecundiam
See Appeal to
Authority.
Affirming
the Consequent
If you have enough evidence to affirm the
consequent of a conditional and then suppose
that as a result you have sufficient reason
for affirming the antecedent, you commit the
fallacy of affirming the consequent. This
formal fallacy is often mistaken for modus
ponens, a valid
form of reasoning also using a conditional.
A conditional is an if-then statement; the
if-part is the antecedent, and the then-part
is the consequent.
Example:
If she's Brazilian, then she speaks
Portuguese. Hey, she does speak Portuguese.
So, she is Brazilian.
The two premises of this argument do make it
somewhat likely that the person is
Brazilian, provided the argument isn't
taking place in Portugal. However, if the
arguer believes that the premises definitely
establish that she is Brazilian, the arguer
is committing the fallacy. Some examples
of this fallacy are more difficult to detect
because sentences which don't have the
surface grammar of conditionals can be
analyzed as being conditionals, as in the
following three sentences.
Example:
Arguments are used to convince others.
Hugs often are used to convince others.
Therefore, hugs often are
arguments.
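The contrast with modus ponens can be made explicit in a schematic sketch; the letters P and Q below are placeholder symbols introduced here for illustration, with P standing for "she's Brazilian" and Q for "she speaks Portuguese" in the first example above.

\[ P \rightarrow Q,\quad P,\quad \therefore\ Q \qquad \text{(modus ponens: deductively valid)} \]
\[ P \rightarrow Q,\quad Q,\quad \therefore\ P \qquad \text{(affirming the consequent: deductively invalid)} \]

The two schemas differ only in whether the second premise affirms the antecedent or the consequent, which is why the fallacy is so easily mistaken for the valid form.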
Amphiboly
This is an error due to taking a
grammatically ambiguous phrase in two
different ways during the reasoning.
Example:
In a cartoon, two elephants are driving
their car down the road in India. They say,
"We've better not get out here," as they
pass a sign saying:
ELEPHANTS
PLEASE STAY IN YOUR CAR
Upon one grammatical construction of the
sign, the pronoun "YOUR" refers to the
elephants in the car, but on another
construction it refers to those humans who
are driving cars in the vicinity. Unlike
equivocation,
which is due to multiple meanings of a
phrase, amphiboly is due to syntactic
ambiguity, ambiguity caused by alternative
ways of taking the grammar.
Anecdotal
Evidence
If you discount evidence arrived at by
systematic search or by testing in favor of
a few firsthand stories, you are committing
the fallacy of overemphasizing anecdotal
evidence.
Example:
Yeah, I've read the health warnings on those
cigarette packs and I know about all that
health research, but my brother smokes, and
he says he's never been sick a day in his
life, so I know smoking can't really hurt
you.
Anthropomorphism
This is the error of projecting uniquely
human qualities onto something that isn't
human. Usually this occurs with projecting
the human qualities onto animals, but when
it is done to nonliving things, as in
calling the storm cruel, the pathetic fallacy is
created. Anthropomorphism is also, though less commonly, called the Disney Fallacy or the Walt Disney Fallacy.
Example:
My dog is wagging his tail and running
around me. Therefore, he knows that I love
him.
The fallacy would be averted if the speaker
had said "My dog is wagging his tail and
running around me. Therefore, he is happy
to see me." Animals are likely to have some
human emotions, but not the ability to
ascribe knowledge to other beings. Your dog
knows where it buried its bone, but not that
you also know where the bone is.
Appeal to
Authority
You appeal to authority if you back up your
reasoning by saying that it is supported by
what some authority says on the subject.
Most reasoning of this kind is not
fallacious. However, it is
fallacious whenever the authority appealed
to is not really an authority in this
subject, when the authority cannot be
trusted to tell the truth, when authorities
disagree on this subject (except for the
occasional lone wolf), when the reasoner
misquotes the authority, and so forth.
Although spotting a fallacious appeal to
authority often requires some background
knowledge about the subject or the
authority, in brief it can be said that it
is fallacious to accept the word of a
supposed authority when we should be
suspicious.
Example:
You can believe the moon is covered with
dust because the president of our
neighborhood association said so, and he
should know.
This is a fallacious appeal to authority
because, although the president is an
authority on many neighborhood matters, he
is no authority on the composition of the
moon. It would be better to appeal to some
astronomer or geologist. If you place too
much trust in expert opinion and overlook
any possibility that experts talking in
their own field of expertise make mistakes,
too, then you also commit the fallacy of
appeal to authority.
Example:
Of course she's guilty of the crime. The
police arrested her, didn't they? And
they're experts when it comes to
crime.
Appeal to
Emotions
You commit the fallacy of appeal to emotions when you accept someone's claim merely because their appeal arouses your feelings of anger, fear, grief, love, outrage, pity, pride, sexuality, sympathy, relief, and so forth. Example of
appeal to relief from grief:
[The speaker knows he is talking to an
aggrieved person whose house is worth much
more than $100,000.] You had a great job and
didn't deserve to lose it. I wish I could
help somehow. I do have one idea. Now your
family needs financial security even more.
You need cash. I can help you. Here is a
check for $100,000. Just sign this standard
sales agreement, and we can skip the
realtors and all the headaches they would
create at this critical time in your life.
There is nothing wrong with using emotions
when you argue, but it's a mistake to use
emotions as the key premises or as tools to
downplay relevant information.
Regarding the fallacy of appeal to pity, it is proper to
pity people who have had misfortunes, but if, as Max's history instructor, you accept his claim that he earned an A on
the history quiz because he broke his wrist
while playing in your college's last
basketball game, then you've committed the
fallacy of appeal
to pity. However, if you realize he
didn't earn the A, but nevertheless
you still give him an A, then you have not
committed the fallacy, but you may have
acted improperly.
Appeal to Force
See Scare
Tactic.
Appeal to
Ignorance
The fallacy of appeal to ignorance comes in
two forms: (1) Not knowing that a certain
statement is true is taken to be a proof
that it is false. (2) Not knowing that a
statement is false is taken to be a proof
that it is true. The fallacy occurs in
cases where absence of evidence is not good
enough evidence of absence. The fallacy
uses an unjustified attempt to shift the
burden of proof. The fallacy is also called
"Argument from Ignorance."
Example:
Nobody has ever proved to me there's a God,
so I know there is no God.
This kind of reasoning is generally
fallacious. It would be proper reasoning
only if the proof attempts were quite
thorough, and it were the case that if God
did exist, then there would be a
discoverable proof of this.
Appeal to the Masses
See Appeal to the People.
Appeal to Money
The fallacy of appeal to money uses the
error of supposing that, if something costs
a great deal of money, then it must be
better, or supposing that if someone has a
great deal of money, then they're a better
person in some way unrelated to having a
great deal of money. Similarly it's a
mistake to suppose that if something is
cheap it must be of inferior quality, or to
suppose that if someone is poor financially
then they're poor at something unrelated to
having money.
Example:
He's rich, so he should be the president of
our Parents and Teachers
Organization.
Appeal to the
People
If you suggest too strongly that someone's
claim or argument is correct simply because
it's what most everyone believes, then
you've committed the fallacy of appeal to
the people. Similarly, if you suggest too
strongly that someone's claim or argument is
mistaken simply because it's not what most
everyone believes, then you've also
committed the fallacy. Agreement with
popular opinion is not necessarily a
reliable sign of truth, and deviation from
popular opinion is not necessarily a
reliable sign of error, but if you assume it
is and do so with enthusiasm, then you're
guilty of committing this fallacy. It is
also called mob appeal, appeal to the
gallery, argument from popularity, and
argumentum ad populum. The 'too strongly'
is important in the description of the
fallacy because what most everyone believes
is, for that reason, somewhat likely to be
true, all things considered. However, the
fallacy occurs when this degree of support
is overestimated.
Example:
You should turn to channel 6. It's the most
watched channel this year.
This is fallacious because it implicitly accepts the questionable premise that the most-watched channel this year is, for that reason alone, the best channel for you.
Appeal to Pity
See Appeal to
Emotions.
Appeal to
Consequence
Arguing that a belief is false because
it implies something you'd rather not
believe. Also called Argumentum Ad
Consequentiam.
Example:
That can't be Senator Smith there in the
videotape going into her apartment. If it
were, he'd be a liar about not knowing her.
He's not the kind of man who would lie.
He's a member of my
congregation.
Smith may or may not be the person in that
videotape, but this kind of arguing should
not convince us that it's someone else in
the videotape.
Argument from
Outrage
See Appeal to
Emotions.
Argument from
Popularity
See Appeal
to the People.
Argumentum Ad
....
See Ad .... without the word
"Argumentum."
Avoiding the
Issue
A reasoner who is supposed to address an
issue but instead goes off on a tangent has
committed the fallacy of avoiding the issue.
Also called missing the point, straying off
the subject, digressing, and not sticking to
the issue.
Example:
A city
official is charged with corruption for
awarding contracts to his wife's consulting
firm. In speaking to a reporter about why
he is innocent, the city official talks only
about his wife's conservative wardrobe, the
family's lovable dog, and his own
accomplishments in supporting Little League
baseball.
However, the fallacy isn't committed
by a reasoner who says that some other issue
must first be settled and then continues by
talking about this other issue, provided the
reasoner is correct in claiming this
dependence of one issue on the other.
Avoiding the
Question
The fallacy of avoiding the question is a
type of fallacy of avoiding the issue that
occurs when the issue is how to answer some
question. The fallacy is committed when
someone's answer doesn't really respond to
the question asked.
Example:
Question: Would the Oakland Athletics
be in first place if they were to win
tomorrow's game? Answer: What
makes you think they'll ever win tomorrow's
game?
Bald
Man
See Line-Drawing.
Bandwagon
If you suggest that someone's claim is
correct simply because it's what most
everyone is coming to believe, then you're
committing the bandwagon fallacy. Get up
here with us on the wagon where the band is
playing, and go where we go, and don't think
too much about the reasons. The Latin term
for this fallacy of appeal to novelty is
Argumentum ad Novitatem.
Example:
[Advertisement] More and more people are
buying sports utility vehicles. Isn't it
time you bought one, too? [You commit the
fallacy if you buy the vehicle solely
because of this advertisement.]
Like its close cousin, the fallacy of appeal
to the people, the bandwagon fallacy needs
to be carefully distinguished from properly
defending a claim by pointing out that many
people have studied the claim and have come
to a reasoned conclusion that it is correct.
What most everyone believes is likely to
be true, all things considered, and if one
defends a claim on those grounds, this is
not a fallacious inference. What is
fallacious is to be swept up by the
excitement of a new idea or new fad and to
unquestioningly give it too high a degree of
your belief solely on the grounds of its new
popularity, perhaps thinking simply that
'new is better.'
Begging the
Question
A form of circular reasoning in which a
conclusion is derived from premises that
presuppose the conclusion. Normally, the
point of good reasoning is to start out at
one place and end up somewhere new, namely
having reached the goal of increasing the
degree of reasonable belief in the
conclusion. The point is to make progress,
but in cases of begging the question there
is no progress.
Example:
"Women have rights," said the Bullfighters
Association president. "But women shouldn't
fight bulls because a bullfighter is and
should be a man."
The president is saying basically that women
shouldn't fight bulls because women
shouldn't fight bulls. This reasoning isn't
making any progress toward determining
whether women should fight bulls.
Insofar as the conclusion of a deductively
valid argument is 'contained' in the
premises from which it is deduced, this
containing might seem to be a case of
presupposing, and thus any deductively valid
argument might seem to be begging the
question. It is still an open question
among logicians as to why some deductively
valid arguments are considered to be begging
the question and others are not. Some
logicians suggest that, in informal
reasoning with a deductively valid argument,
if the conclusion is psychologically new
insofar as the premises are concerned, then
the argument isn't an example of the
fallacy. Other logicians suggest that we
need to look instead to surrounding
circumstances, not to the psychology of the
reasoner, in order to assess the quality of
the argument. For example, we need to look
to the reasons that the reasoner used to
accept the premises. Was the premise
justified on the basis of accepting the
conclusion? A third group of logicians say
that, in deciding whether the fallacy is
committed, we need more. We must determine
whether any premise that is key to deducing
the conclusion is adopted rather blindly or
instead is a reasonable assumption made by
someone accepting their burden of proof.
The premise would here be termed reasonable
if the arguer could defend it independently
of accepting the conclusion that is at
issue.
Biased Sample
See Small
Sample.
Biased Statistics
This fallacy is committed whenever we use
biased statistics as if they are not biased.
When the error is with the size of the
sample, it is called small sample or biased
sample. When the error is with the
representativeness of the sample, it is
called unrepresentative sample. When some of
the statistical evidence is hidden or
overlooked, it is also called suppressed
evidence.
Example:
We talked to a random sample of the
management in our corporation and all their
subsidiaries, including the companies that
supply us with products, and they say they
are voting Republican. We predict a
Republican landslide in the national
election.
A random sample within that management group
is likely not to be very representative of
the nation's voters.
Bifurcation
See Black-or-White.
Black-or-White
The black-or-white fallacy is a false dilemma
fallacy that unfairly limits you to only two
choices.
Example:
Well, it's
time for a decision. Will you contribute $10
to our environmental fund, or are you on the
side of environmental destruction?
A proper challenge to this fallacy could be
to say, "I do want to prevent the
destruction of our environment, but I don't
want to give $10 to your fund. You are
placing me between a rock and a hard place."
The key to diagnosing the black-or-white
fallacy is to determine whether the limited
menu is fair or unfair. Simply saying,
"Will you contribute $10 or won't you?" is
not unfair.
Circular
Reasoning
Circular reasoning occurs when the reasoner
begins with what he or she is trying to end
up with. The most well known examples are
cases of the fallacy of begging the
question. However, if the circle is very
much larger, including a wide variety of
claims and a large set of related concepts,
then the circular reasoning can be
informative and so is not considered to be
fallacious. For example, a dictionary
contains a large circle of definitions that
use words which are defined in terms of
other words that are also defined in the
dictionary. Because the dictionary is so
informative, it is not considered as a whole
to be fallacious. However, a small circle
of definitions is considered to be
fallacious.
Example:
Definition: A couch is a sofa.
Definition: A sofa is a davenport.
Definition: A davenport is a
couch.
For additional difficulties in deciding
whether an argument is deficient because it
is circular, see Begging the Question.
Circumstantial Ad
Hominem
See Ad
Hominem.
Clouding the
Issue
See Smokescreen.
Common Belief
See Appeal
to the People and Traditional Wisdom.
Common Cause
This fallacy occurs during causal reasoning
when a causal connection between two kinds
of events is claimed when evidence is
available indicating that both are the
effect of a common cause.
Example:
Noting that the auto accident rate rises and
falls with the rate of use of windshield
wipers, one concludes that the use of wipers
is somehow causing auto accidents.
However, it's the rain that's the common
cause of both.
Common Practice
See Appeal
to the People and Traditional Wisdom.
Complex Question
You commit this fallacy when you frame a
question so that some controversial
presupposition is made by the wording of the
question.
Example:
[Reporter's question] Mr. President: Are
you going to continue your policy of wasting
taxpayer's money on missile
defense?
The question unfairly presumes the
controversial claim that the policy really
is a waste of money. The fallacy of complex
question is a form of begging the question.
Composition
The composition fallacy occurs when someone
mistakenly assumes that a characteristic of
some or all the individuals in a group is
also a characteristic of the group itself,
the group "composed" of those members. It
is the converse of the division fallacy.
Example:
Each human cell is
very lightweight, so a human being composed
of cells is also very lightweight.
Confirmation
Bias
The tendency to look only for evidence
in favor of one's controversial hypothesis
and not to look for disconfirming evidence,
or to pay insufficient attention to
it. This is the most common kind of Fallacy of
Selective Attention.
Example:
She loves me, and there are
so many ways that she has shown it.
When we signed the divorce papers in her
lawyer's office, she wore my favorite
color. When she slapped me at the bar
and called me a "handsome pig,"
she used the word "handsome" when
she didn't have to. When I called her
and she said never to call her again, she
first asked me how I was doing and whether
my life had changed. When I suggested
that we should have children in order to
keep our marriage together, she
laughed. If she can laugh with me and
want to know how I am doing and whether my
life has changed, and if she calls me
"handsome" and wears my favorite
color on special occasions, then I know she
really loves me.
Committing the fallacy of
confirmation bias is often a sign that one
has adopted some belief dogmatically and
isn't seriously setting about to confirm or
disconfirm the belief.
Consensus Gentium
Fallacy of argumentum consensus gentium
(argument from the consensus of the
nations). See Traditional Wisdom.
Consequence
See Appeal
to Consequence.
Converse Accident
If we reason by paying too much attention to
exceptions to the rule, and generalize on
the exceptions, we commit this fallacy.
This fallacy is the converse of the accident
fallacy. It is a kind of Hasty Generalization.
Example:
I've heard that turtles live longer than
tarantulas, but the one turtle I bought
lived only two days. I bought it at
Dowden's Pet Store. So, I think that
turtles bought from pet stores do not live
longer than tarantulas.
The original generalization is "Turtles live
longer than tarantulas." There are
exceptions, such as the turtle bought from
the pet store. Rather than seeing this for
what it is, namely an exception, the
reasoner places too much trust in this
exception and generalizes on it to produce
the faulty generalization that turtles
bought from pet stores do not live longer
than tarantulas.
Cover-up
See Suppressed Evidence.
Cum Hoc, Ergo Propter
Hoc
Latin for "with this, therefore because of
this." This is a false cause fallacy that doesn't
depend on time order (as does the post hoc
fallacy), but on any other chance
correlation of the supposed cause being in
the presence of the supposed effect.
Example:
Gypsies live near our low-yield cornfields.
So, gypsies are causing the low
yield.
Definist
The definist fallacy occurs when someone
unfairly defines a term so that a
controversial position is made easier to
defend. Same as the Persuasive
Definition.
Example:
During a controversy about the truth or
falsity of atheism, the fallacious reasoner
says, "Let's define 'atheist' as someone who
doesn't yet realize that God exists."
Denying the
Antecedent
You are committing a fallacy if you deny the
antecedent of a conditional and then suppose
that doing so is a sufficient reason for
denying the consequent. This formal fallacy
is often mistaken for modus tollens, a valid
form of argument using the conditional. A
conditional is an if-then statement; the
if-part is the antecedent, and the then-part
is the consequent.
Example:
If she were Brazilian, then she would know
that Brazil's official language is
Portuguese. She isn't Brazilian; she's from
London. So, she surely doesn't know this
about Brazil's language.
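Sketched schematically with placeholder letters, where P stands for "she is Brazilian" and Q for "she knows that Brazil's official language is Portuguese," the valid and invalid forms are:

\[ P \rightarrow Q,\quad \neg Q,\quad \therefore\ \neg P \qquad \text{(modus tollens: deductively valid)} \]
\[ P \rightarrow Q,\quad \neg P,\quad \therefore\ \neg Q \qquad \text{(denying the antecedent: deductively invalid)} \]

Denying the antecedent leaves open that Q may hold for some other reason; a Londoner could still know which language is Brazil's official language.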
Digression
See Avoiding
the Issue.
Distraction
See Smokescreen.
Division
Merely because a group as a whole has a
characteristic, it often doesn't follow that
individuals in the group have that
characteristic. If you suppose that it does
follow, when it doesn't, you commit the
fallacy of division. It is the converse of
the composition
fallacy.
Example:
Joshua's soccer team is the best in the
division because it had an undefeated season
and shared the division title, so Joshua,
who is their goalie, must be the best goalie
in the division.
Domino
See Slippery
Slope.
Double Standard
There are many situations in which you
should judge two things or people by the
same standard. If in one of those situations
you use different standards for the two, you
commit the fallacy of using a double
standard.
Example:
I know we will hire any man who gets over a
70 percent on the screening test for hiring
Post Office employees, but women should have
to get an 80 to be hired because they often
have to take care of their children.
This example is a fallacy if it can be
presumed that men and women should have to
meet the same standard for becoming a Post
Office employee.
Either/Or
See Black-or-White.
Equivocation
Equivocation is the illegitimate switching
of the meaning of a term during the
reasoning.
Example:
Those noisy people object to racism because
they believe it is discrimination. Yet even
they agree that it is OK to choose
carefully which tomatoes to buy in the
supermarket. They discriminate between the
over-ripe, the under-ripe, and the just
right. They discriminate between the TV
shows they don't want to watch and those
they do. So, what's all this fuss about
racism if they're willing to discriminate,
too?
The word "discrimination" changes its
meaning without warning in the passage.
Etymological
The etymological fallacy occurs whenever
someone falsely assumes that the meaning of
a word can be discovered from its etymology
or origins.
Example:
The word "vise" comes from the Latin "that
which winds", so it means anything that
winds. Since a hurricane winds around its
own eye, it is a vise.
Every and
All
The fallacy of every and all turns on errors
due to the order or scope of the quantifiers
'every' and 'all' and 'any'. This is a
version of the scope
fallacy.
Example:
Every action of ours has some final end. So,
there is some common final end to all our
actions.
In proposing this fallacious argument,
Aristotle believed the common end is the
supreme good, so he had a rather optimistic
outlook on the direction of history.
Exaggeration
When we overstate or overemphasize a point
that is a crucial step in a piece of
reasoning, then we are guilty of the fallacy
of exaggeration. This is a kind of
error called Lack of
Proportion.
Example:
She's practically admitted that she
intentionally yelled at that student while
on the playground in the fourth grade.
That's assault. Then she said nothing
when the teacher asked, "Who did
that?" That's lying, plain and
simple. Do you want to elect as
secretary of this society someone who is a
known liar prone to assault? Doing so
would be a disgrace to the Collie
Society.
When we exaggerate in order to make a
joke, though, we aren't guilty of the
fallacy.
Excluded Middle
See False
Dilemma or Black-or-White.
False Analogy
When reasoning by analogy, the fallacy
occurs when the analogy is irrelevant or
very weak or when there is a more relevant
disanalogy. See also Faulty Comparison.
Example:
The book Investing for Dummies
really helped me understand my finances
better. The book Chess for Dummies
was written by the same author, was
published by the same press, and costs about
the same amount, so it would probably help
me understand my finances as
well.
False Cause
Improperly concluding that one thing is a
cause of another. The Fallacy of Non Causa
Pro Causa is another name for this fallacy.
Its four principal kinds are the Post Hoc Fallacy, the
Fallacy of Cum Hoc, Ergo Propter Hoc, the Regression Fallacy,
and the Fallacy of Reversing Causation.
Example:
My psychic adviser says to expect bad things
when Mars is aligned with Jupiter. Tomorrow
Mars will be aligned with Jupiter. So, if a
dog were to bite me tomorrow, it would be
because of the alignment of Mars with
Jupiter.
False
Dichotomy
See False
Dilemma or Black-or-White.
False
Dilemma
A reasoner who unfairly presents too few
choices and then implies that a choice must
be made among this short menu of choices
commits the false dilemma fallacy, as does
the person who accepts this faulty
reasoning.
Example:
I want to go to Scotland from London. I
overheard McTaggart say there are two roads
to Scotland from London: the high road and
the low road. I expect the high road
is dangerous because it's through the hills.
But it's raining, so both roads are
probably slippery. I don't like either
choice, but I guess I should take the low
road.
This would be fine
reasoning if you were limited to only two
roads, but you've falsely gotten yourself
into a dilemma with such reasoning. There
are many other ways to get to Scotland.
Don't limit yourself to these two choices.
You can take other roads, or go by boat or
train or airplane. Think of the unpleasant
choice as a charging bull. By
demanding other choices beyond those on the
unfairly limited menu, you thereby "go
between the horns" of the dilemma, and
are not gored. For another example of
the fallacy, see Black-or-White.
Far-Fetched
Hypothesis
This is the fallacy of offering a bizarre
(far-fetched) hypothesis as the correct
explanation without first ruling out more
mundane explanations.
Example:
Look at that mutilated cow in the field, and
see that flattened grass. Aliens must have
landed in a flying saucer and savaged the
cow to learn more about the beings on our
planet.
Faulty Comparison
If you try to make a point about something
by comparison, and if you do so by comparing
it with the wrong thing, you commit the
fallacy of faulty comparison or the fallacy
of questionable analogy.
Example:
We gave half the members of the hiking club
Durell hiking boots and the other half
good-quality tennis shoes. After three
months of hiking, you can see for yourself
that Durell lasted longer. You, too, should
use Durell when you need hiking boots.
Shouldn't Durell hiking boots be compared
with other hiking boots, not with tennis
shoes?
Formal
Formal fallacies are all the cases or kinds
of reasoning that fail to be deductively
valid. Formal fallacies are also called
logical fallacies or invalidities.
Example:
Some cats are tigers. Some tigers are
animals. So, some cats are animals.
This might at first seem to be a good
argument, but actually it is fallacious
because it has the same logical form as the
following more obviously invalid
argument: Some women are
Americans. Some Americans are
men. So, some women are men.
Nearly all of the infinitely many types of invalid inference have no specific fallacy names.
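The shared form of the two arguments can be sketched with placeholder terms A, B, and C (cats, tigers, and animals in the first argument; women, Americans, and men in the second):

\[ \text{Some } A \text{ are } B.\quad \text{Some } B \text{ are } C.\quad \therefore\ \text{Some } A \text{ are } C. \]

Because the second argument fits this form while having true premises and a false conclusion, the form itself is invalid, and no argument is made good by that form alone.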
Four
Terms
The fallacy of four terms (quaternio
terminorum) occurs when four rather than
three categorical terms are used in a
standard-form syllogism.
Example:
All rivers have banks. All banks have
vaults. So, all rivers have
vaults.
The word "banks" occurs as two distinct
terms, namely river bank and financial bank,
so this example also is an equivocation.
Without an equivocation, the four term
fallacy is trivially invalid.
Gambler's
This fallacy occurs when the gambler falsely
assumes that the history of outcomes will
affect future outcomes.
Example:
I know this is a fair coin, but it has come
up heads five times in a row now, so tails
is due on the next toss.
The fallacious move was to conclude that the
probability of the next toss coming up tails
must be more than a half. The assumption
that it's a fair coin is important because,
if the coin comes up heads five times in a
row, one would otherwise become suspicious
that it's not a fair coin and therefore
properly conclude that heads is more likely on the next toss.
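Put in elementary probability terms, and assuming the tosses of a fair coin are independent, the history of outcomes does not change the chance of the next one:

\[ \Pr(\text{tails on toss 6} \mid \text{heads on tosses 1 through 5}) \;=\; \Pr(\text{tails on toss 6}) \;=\; \tfrac{1}{2}. \]

The fallacious conclusion that tails is "due" requires this conditional probability to be greater than one half, which independence rules out.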
Genetic
A critic commits the genetic fallacy if the
critic attempts to discredit or support a
claim or an argument because of its origin
(genesis) when such an appeal to origins is
irrelevant.
Example:
Whatever your reasons are for buying that
DVD, they've got to be ridiculous. You said
yourself that you got the idea for buying it
from last night's fortune cookie. Cookies
can't think!
Fortune cookies are not reliable sources of
information about what DVD to buy, but the
reasons the person is willing to give are
likely to be quite relevant and should be
listened to. The speaker is committing the
genetic fallacy by paying too much attention
to the genesis of the idea rather than to
the reasons offered for it. An ad hominem fallacy is one
kind of genetic fallacy, but the genetic
fallacy in our passage isn't an ad hominem.
If I learn that your plan for building
the shopping center next to the Johnson
estate originated with Johnson himself, who
is likely to profit from the deal, then my
pointing out to the planning commission the
origin of the deal would be relevant in
their assessing your plan. Because not all
appeals to origins are irrelevant, it
sometimes can be difficult to decide if the
fallacy has been committed. For example, if
Sigmund Freud shows that the genesis of a
person's belief in God is their desire for a
strong father figure, then does it follow
that their belief in God is misplaced, or
does this reasoning commit the genetic
fallacy?
Group Think
A reasoner commits the group think fallacy
if he or she substitutes pride of membership
in the group for reasons to support the
group's policy. If that's what our group
thinks, then that's good enough for me.
It's what I think, too. 'Blind' patriotism
is a rather nasty version of the fallacy.
Example:
We K-Mart employees
know that K-Mart brand items are better than
Wal-Mart brand items because, well, they
are from K-Mart, aren't they?
Guilt by
Association
Guilt by association is a version of the ad hominem fallacy in
which a person is said to be guilty of error
because of the group he or she associates
with.
Example:
Secretary of State Dean Acheson is soft on
communism as you can see by the fuzzy-headed
liberals who come to his White House
cocktail parties and the bleeding hearts of
his Democratic Party who call for
"moderation and constraint" against Soviet
terror.
Has any evidence been presented here that
Acheson's actions are inappropriate in
regard to communism? This sort of
reasoning is an example of McCarthyism, the
technique of smearing liberal Democrats that
was so effectively used by the late Senator
Joe McCarthy in the early 1950s. In fact,
Acheson was strongly anti-communist and the
architect of President Truman's firm policy
of containing Soviet power.
Hasty
Conclusion
See Jumping to Conclusions.
Hasty
Generalization
A hasty generalization is a fallacy of jumping to
conclusions in which the conclusion is a
generalization. See also Biased Statistics.
Example:
I've met two people in Nicaragua so far, and
they were both nice to me. So, all people I
will meet in Nicaragua will be nice to
me.
Heap
See Line-Drawing.
Hooded Man
This is an error in reasoning due to
confusing the knowing of a thing with the
knowing of it under all its various names or
descriptions.
Example:
You claim to know Socrates, but you must be
lying. You admitted you didn't know the
hooded man over there in the corner, but the
hooded man is Socrates.
Ignoratio Elenchi
See Irrelevant Conclusion.
Ignoring a Common
Cause
See Common
Cause.
Incomplete
Evidence
See Suppressed Evidence.
Inconsistency
The fallacy occurs when we accept an
inconsistent set of claims, that is, when we
accept a claim that logically conflicts with
other claims we hold.
Example:
I'm not racist.
Some of my best friends are white. But I
just don't think that white women love their
babies as much as our women do.
That last remark implies the speaker is
a racist, although the speaker doesn't
notice the inconsistency.
Intensional
The mistake of treating different
descriptions or names of the same object as
equivalent even in those contexts in which
the differences between them matter.
Reporting someone's beliefs or assertions or
making claims about necessity or possibility
can be such contexts. In these contexts,
replacing a description with another that
refers to the same object is not valid and
may turn a true sentence into a false one.
Example:
Michelle said she
wants to meet her new neighbor Stalnaker
tonight. But I happen to know Stalnaker is
a spy for North Korea, so Michelle said she
wants to meet a spy for North Korea tonight.
Michelle said no such thing.
The faulty reasoner illegitimately assumed
that what is true of a person under one
description will remain true when said of
that person under a second description even
in this context of indirect quotation. What
was true of the person when described as
"her new neighbor Stalnaker" is that Michelle said she wants to meet him, but it wasn't legitimate for me to assume this is true of the same person when he is described as "a spy for North Korea." Extensional contexts are those in which it is legitimate to substitute equals for equals with no worry. But any context in which this substitution of co-referring terms is illegitimate is called an intensional context. Intensional contexts are produced by quotation, modality, and intentionality (propositional attitudes). Intensionality is failure of extensionality, thus the name "intensional fallacy."
Invalid Inference
An argument can be assessed by deductive
standards to see if, were we to agree that
the premises are true, then the conclusion
would have to be true also. If the argument
cannot meet this standard, it is invalid.
Any invalid inference is a non sequitur. Also called a
formal fallacy. An argument is invalid only
if it is not an instance of any valid
argument form.
Example:
If it's raining, then there are clouds in
the sky. It's not raining. Therefore,
there are no clouds in the sky.
This invalid argument is an instance of denying the
antecedent.
Irrelevant
Conclusion
If an arguer argues for a certain conclusion
while falsely believing or suggesting that a
different conclusion is established, one for
which the first conclusion is irrelevant,
then the arguer commits the fallacy of
irrelevant conclusion.
Example:
In court, Thompson testifies that the
defendant is an honorable person, who
wouldn't harm a flea. The defense attorney
rises to say that Thompson's testimony shows
his client was not near the murder
scene.
The testimony of Thompson may be relevant to
a request for leniency, but it is irrelevant
to any claim about the defendant not being
near the murder scene.
Irrelevant Reason
This fallacy is a kind of non sequitur in which the
premises are wholly irrelevant to drawing
the conclusion.
Example:
Lao Tze Beer is the top selling beer in
Thailand. So, it will be the best beer for
Canadians.
Is-Ought
The is-ought fallacy occurs when a
conclusion expressing what ought to be so is
inferred from premises expressing only what
is so, in which it is supposed that no
implicit or explicit ought-premises are
needed. There is controversy in the
philosophical literature regarding whether
this type of inference is always fallacious.
Example:
He's torturing the cat.
So, he shouldn't
do that.
This argument clearly would not commit the
fallacy if there were an implicit premise
indicating that he is a person and persons
shouldn't torture other beings.
Jumping to
Conclusions
When we draw a conclusion without taking the
trouble to acquire all the relevant
evidence, we commit the fallacy of jumping
to conclusions, provided there was
sufficient time to assess that extra
evidence, and that the effort to get the
evidence isn't prohibitive.
Example:
This car is really cheap. I'll buy it.
Hold on. Before concluding that you should
buy it, you ought to have someone check its
operating condition, or else you should make
sure you get a guarantee about the car's
being in working order. And, if you stop to
think about it, there may be other factors
you should consider before making the
purchase. Are size or appearance or gas
mileage relevant?
Lack
of Proportion
Either exaggerating or
downplaying a point that is a crucial step
in a piece of reasoning is an example of the
Fallacy of Lack of Proportion. It's a
mistake of not adopting the proper
perspective. An extreme form of
downplaying occurs in the Fallacy of
Suppressed
Evidence.
Example:
Chandra just overheard the terrorists say
that they are about to plant the bomb in the
basement of the courthouse, after which
they'll drive to the airport and get
away. But they won't be taking along
their cat. The poor cat. The
first thing that Chandra and I should do is
to call the Humane Society and check the
"Cat Wanted" section of the local
newspapers to see if we can find a proper
home for the cat.
Line-Drawing
If we improperly reject a vague claim
because it's not as precise as we'd like,
then we commit the line-drawing fallacy.
Being vague is not being hopelessly vague.
Also called the Bald Man Fallacy, the
Fallacy of the Heap and the Sorites Fallacy.
Example:
Dwayne can never grow bald. Dwayne isn't
bald now. Don't you agree that if he loses
one hair, that won't make him go from not
bald to bald? And if he loses one hair
after that, then this one loss, too, won't
make him go from not bald to bald.
Therefore, no matter how much hair he loses,
he can't become bald.
Loaded Language
Loaded language is emotive terminology that
expresses value judgments. When used in
what appears to be an objective description,
the terminology unfortunately can cause the
listener to adopt those values when in fact
no good reason has been given for doing so.
Also called Prejudicial Language.
Example:
[News broadcast] In today's top stories,
Senator Smith carelessly cast the deciding
vote today to pass both the budget bill and
the trailer bill to fund yet another
excessive watchdog committee over coastal
development.
This broadcast is an editorial posing as a
news report.
Logical
See Formal.
Lying
A fallacy of reasoning that depends on
intentionally saying something that is known
to be false. If the lying occurs in an
argument's premise, then it is an example of
the fallacy of questionable premise.
Example:
Abraham Lincoln, Theodore Roosevelt, and
John Kennedy were assassinated.
They
were U.S. presidents.
Therefore, at
least three U.S. presidents have been
assassinated.
Theodore Roosevelt was shot, but he was never assassinated, so the first premise is false, and asserting it knowingly makes it a lie.
Many
Questions
See Complex
Question.
Misconditionalization
See Modal Fallacy.
Misleading
Vividness
When the fallacy of jumping to conclusions is
committed due to a special emphasis on an
anecdote or other piece of evidence, then
the fallacy of misleading vividness has
occurred.
Example:
Yes, I read the side of the cigarette pack
about smoking being harmful to your health.
That's the Surgeon General's opinion, him
and all his statistics. But let me tell you
about my uncle. Uncle Harry has smoked
cigarettes for forty years now and he's
never been sick a day in his life. He even
won a ski race at Lake Tahoe in his age
group last year. You should have seen him
zip down the mountain. He smoked a
cigarette during the award ceremony, and he
had a broad smile on his face. I was really
proud. I can still remember the cheering.
Cigarette smoking can't be as harmful as
people say.
The vivid anecdote is the story about Uncle
Harry. Too much emphasis is placed on it
and not enough on the statistics from the
Surgeon General.
Misrepresentation
If the misrepresentation occurs on purpose,
then it is an example of lying. If the
misrepresentation occurs during a debate in
which there is misrepresentation of the
opponent's claim, then it would be the cause
of a straw man
fallacy.
Missing the Point
See Irrelevant Conclusion.
Modal
This is the error of treating modal
conditionals as if the modality applies only
to the consequent of the conditional. "The"
modal fallacy is the most well known of the
infinitely many errors involving modal
concepts, concepts such as necessity,
possibility and so forth. A conditional is
an if-then proposition.
The consequent is the then-part, and the
antecedent is the if-part.
Example:
If a proposition is true, then it can not be
false. But if a proposition can not be
false, then it is not only true but
necessarily true. Therefore, if a
proposition is true, then it's necessarily
true.
The acceptable interpretation of the first premise requires the modality to apply to the entire conditional, in the sense that it really means "It's not possible that if a proposition is true, then it's false." However, the entire inference works only if the first premise is misconstrued as saying
"If a proposition is true, then it is
necessary that it's not false." To see
that the misconstrual is unacceptable, pick
a proposition such as "It's raining in
Detroit." Let's suppose it actually is
raining in Detroit. So, the antecedent of
the misconstrual is true, but the consequent
isn't, because it says "It is necessary that
'it's raining in Detroit' is not false."
This isn't necessary, is it?
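One standard way to symbolize the two readings (our own gloss, using the box '□' for "it is necessary that" and p for an arbitrary proposition) makes the difference in scope visible:
\[
\text{Acceptable reading:}\quad \Box(p \rightarrow \neg\neg p), \text{ equivalently } \neg\Diamond(p \wedge \neg p)
\]
\[
\text{Misconstrual:}\quad p \rightarrow \Box\neg\neg p, \text{ that is, } p \rightarrow \Box p
\]
The first is a harmless necessary truth; the second says that every true proposition is necessarily true, which is exactly the conclusion the fallacious argument was after.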
Monte Carlo
See Gambler's
Fallacy.
Name
Calling
See Ad Hominem.
Naturalistic
Although many philosophers argue that the
naturalistic fallacy is not a fallacy, and
although the term is not used entirely
consistently, the fallacy is most often
meant to apply to any attempt to define good
in naturalistic terms.
Example:
"Good" means any object of
desire.
The example is from Hobbes. The early Hume's
definition of "good" as "what I approve of"
also would be an example, but the definition
"what God approves of" would not because
this is defining good in supernaturalistic
terms, not naturalistic terms. Some
philosophers define the fallacy more broadly
as any attempt to define good in value-free
terms, in which case the previous definition
in terms of God would be an example of the
fallacy. Sometimes the fallacy is construed
as being committed whenever "good" is
defined in terms of anything else that is
supposed to be a more basic part or aspect
of good. Finally, on yet another
interpretation of the fallacy, it is said to
apply to any attempt to argue from premises
involving only naturalistic terms to a
conclusion involving values, in which case
the following would be an example:
Homosexual acts cannot produce
children, so they are immoral.
Neglecting a
Common Cause
See Common
Cause.
No Middle Ground
See False
Dilemma.
No
True Scotsman
This error is a kind of ad hoc rescue of
one's generalization in which the reasoner
re-characterizes the situation solely in
order to escape refutation of the
generalization.
Example:
Smith: All Scotsmen are loyal and
brave. Jones: But McDougal over
there is a Scotsman, and he was arrested by
his commanding officer for running from the
enemy.
Smith: Well, if that's
right, it just shows that McDougal wasn't a
TRUE Scotsman.
Non
Causa Pro Causa
This label is Latin for mistaking the
"non-cause for the cause." See False Cause.
Non
Sequitur
When a conclusion is supported only by
extremely weak reasons or by irrelevant
reasons, the argument is fallacious and is
said to be a non sequitur. However, we
usually apply the term only when we cannot
think of how to label the argument with a
more specific fallacy name. Any
deductively invalid inference is a non
sequitur.
Example:
Nuclear disarmament is a risk, but
everything in life involves a risk. Every
time you drive in a car you are taking a
risk. If you're willing to drive in a car,
you should be willing to have disarmament.
One-Sidedness
See Slanting and Suppressed
Evidence.
Oversimplification
You oversimplify when you cover up relevant
complexities or make a complicated problem
appear to be too much simpler than it really
is.
Example:
President Bush wants our country to trade
with Fidel Castro's Communist Cuba. I say
there should be a trade embargo against
Cuba. The issue in our election is Cuban
trade, and if you are against it, then you
should vote for me for president.
Whom to vote for should be decided by
considering quite a number of issues in
addition to Cuban trade. When an
oversimplification results in falsely
implying that a minor causal factor is the
major one, then the reasoning also commits
the false cause
fallacy.
Past
Practice
See Traditional Wisdom.
Pathetic
The pathetic fallacy is a mistaken belief
due to attributing peculiarly human
qualities to inanimate objects (but not to
animals). The fallacy is caused by
anthropomorphism.
Example:
Aargh, it won't start again. This old car
always breaks down on days when I have a job
interview. It must be afraid that if I get a
new job, then I'll be able to afford a
replacement, so it doesn't want me to get to
my interview on time.
Persuasive
Definition
Some people try to win their arguments by
getting you to accept their faulty
definition. If you buy into their
definition, they've practically persuaded
you already. Same as the Definist Fallacy.
Poisoning the well when presenting a definition would be an example of using a persuasive definition.
Example:
Let's define a Democrat as a leftist who
desires to overtax the corporations and
abolish freedom in the economic sphere.
Perfectionist
If you remark that a proposal or claim
should be rejected solely because it doesn't
solve the problem perfectly, in cases where
perfection isn't really required, then
you've committed the perfectionist fallacy.
Example:
You said hiring a house cleaner would
solve our cleaning problems because we both
have full-time jobs. Now, look what
happened. Every week she unplugs the
toaster oven and leaves it that way. I
should never have listened to you about
hiring a house cleaner.
Petitio Principii
See Begging
the Question.
Poisoning the
Well
Poisoning the well is a preemptive attack on
a person in order to discredit their
testimony or argument in advance of their
giving it. A person who thereby becomes
unreceptive to the testimony reasons
fallaciously and has become a victim of the
poisoner. This is a kind of ad hominem.
Example:
[Prosecuting attorney in court] When is the
defense attorney planning to call that
twice-convicted child molester, David
Barnington, to the stand? OK, I'll rephrase
that. When is the defense attorney planning
to call David Barnington to the
stand?
Post
Hoc
Suppose we notice that an event of kind A is
followed in time by an event of kind B, and
then hastily leap to the conclusion that A
caused B. If so, we commit the post hoc
fallacy. Correlations are often good
evidence of causal connection, so the
fallacy occurs only when the leap to the
causal conclusion is done 'hastily'. The
Latin term for the fallacy is post hoc, ergo
propter hoc ("After this, therefore because
of this"). It is a kind of false cause fallacy.
Example:
I ate in that Ethiopian restaurant three
days ago and now I've just gotten food
poisoning. The only other time I've eaten
in an Ethiopian restaurant I also got food
poisoning, but that time I got sick a week
later. My eating in those kinds of
restaurants is causing my food
poisoning.
Your background knowledge should tell you
this is unlikely because the effects of food
poisoning are felt soon after the food is
eaten. Before believing your illness was
caused by eating in an Ethiopian restaurant,
you'd need to rule out other possibilities,
such as your illness being caused by what
you ate a few hours before the onset of the
illness.
Prejudicial
Language
See Loaded
Language.
Questionable
Analogy
See False
Analogy.
Questionable
Cause
See False Cause.
Questionable
Premise
If you have sufficient background
information to know that a premise is
questionable or unlikely to be acceptable,
then you commit this fallacy if you accept
an argument based on that premise. This
broad category of fallacies of argumentation
includes appeal to authority, false dilemma, inconsistency, lying, stacking the deck, straw man, suppressed
evidence, and many others.
Quibbling
We quibble when we complain about a minor
point and falsely believe that this
complaint somehow undermines the main point.
To avoid this error, the logical reasoner
will not make a mountain out of a mole hill
nor take people too literally.
Example:
I've found typographical errors in your
poem, so the poem is neither inspired nor
perceptive.
Quoting out of
Context
If you quote someone, but select the
quotation so that essential context is not
available and therefore the person's views
are distorted, then you've quoted "out of
context." Quoting out of context in an
argument creates a straw man fallacy.
Example:
Smith: I've been reading about a
peculiar game in this article about
vegetarianism. When we play this game, we
lean out from a fourth-story window and drop
down strings containing "Free food" signs on
the end in order to hook unsuspecting
passers-by. It's really outrageous, isn't
it? Yet isn't that precisely what sports
fishermen do for entertainment from their
fishing boats? The article says it's time
we put an end to sport fishing.
Jones: Let me quote Smith for you. He
says "We...hook unsuspecting passers-by."
What sort of moral monster is this man
Smith?
Jones's selective quotation is fallacious
because it makes Smith appear to advocate
this immoral activity when the context makes
it clear that he doesn't.
Rationalization
We rationalize when we inauthentically offer
reasons to support our claim. We are
rationalizing when we give someone a reason
to justify our action even though we know
this reason is not really our own reason for
our action, usually because the offered
reason will sound better to the audience
than our actual reason.
Example:
"I bought the matzo bread from Kroger's
Supermarket because it is the cheapest brand
and I wanted to save money," says Alex [who
knows he bought the bread from Kroger's
Supermarket only because his girlfriend
works there].
Red
Herring
A red herring is a smelly fish that
would distract even a bloodhound. It is also
a digression that leads the reasoner off the
track of considering only relevant
information.
Example:
Will the new tax in Senate Bill 47 unfairly
hurt business? One of the provisions of the
bill is that the tax is higher for large
employers (fifty or more employees) as
opposed to small employers (six to
forty-nine employees). To decide on the
fairness of the bill, we must first
determine whether employees who work for
large employers have better working
conditions than employees who work for small
employers.
Bringing up the issue of working conditions
is the red herring.
Refutation by
Caricature
See Ad Hominem.
Regression
This fallacy occurs when regression to the
mean is mistaken for a sign of a causal
connection. Also called the Regressive
Fallacy. It is a kind of false cause fallacy.
Example:
You are investigating the average heights of
groups of Americans. You sample some people
from Chicago and determine their average
height. You have the figure for the mean
height of Americans and notice that your
Chicagoans have an average height that
differs from this mean. Your second sample
of the same size is from people from Miami.
When you find that this group's average
height is closer to the American mean height
[as it is very likely to be due to common
statistical regression to the mean], you
falsely conclude that there must be something causing Miamians rather than Chicagoans to be more like the average American.
There is most probably nothing causing Miamians to be more like the average American; rather, the sample averages are simply regressing toward the mean.
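A brief simulation can make the statistical point vivid. The sketch below is our own illustration, not part of the original entry, and the population figures are hypothetical. It draws two independent samples from the very same population and shows that, whenever the first sample happens to stray from the population mean, the second usually lands closer to it, with no cause peculiar to either city.

import random

random.seed(1)
POP_MEAN, POP_SD, N = 170.0, 10.0, 30      # hypothetical heights in centimeters

def sample_mean():
    # Average height of one sample of N people from the same population.
    return sum(random.gauss(POP_MEAN, POP_SD) for _ in range(N)) / N

closer = trials = 0
for _ in range(10000):
    first, second = sample_mean(), sample_mean()
    if abs(first - POP_MEAN) > 3:          # the first sample strayed noticeably
        trials += 1
        closer += abs(second - POP_MEAN) < abs(first - POP_MEAN)

print("Second sample was closer to the mean in",
      round(100 * closer / trials), "percent of", trials, "cases")

Nothing about the second group causes the movement toward the mean; it is simply what independent sampling from the same population tends to do.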
Reversing
Causation
Drawing an improper conclusion about
causation due to a causal assumption that
reverses cause and effect. A kind of false cause fallacy.
Example:
All the corporate officers of Miami
Electronics and Power have big boats. If
you're ever going to become an officer of
MEP, you'd better get a bigger boat.
The false assumption here is
that having a big boat helps cause you to be
an officer in MEP, whereas the reverse is
true. Being an officer causes you to have
the high income that enables you to purchase
a big boat.
Scapegoating
If you unfairly blame an unpopular person or
group of people for a problem, then you are
scapegoating. This is a kind of fallacy of
appeal to
emotions.
Example:
Augurs were official diviners of ancient
Rome. During the pre-Christian period, when
Christians were unpopular, an augur would
make a prediction for the emperor about,
say, whether a military attack would have a
successful outcome. If the prediction
failed to come true, the augur would not
admit failure but instead would blame nearby
Christians for their evil influence on his
divining powers. The elimination of these
Christians, the augur would claim, could
restore his divining powers and help the
emperor. By using this reasoning tactic, the
augur was scapegoating the Christians.
Scare Tactic
If you suppose that terrorizing your
opponent is giving him a reason for
believing that you are correct, you are
using a scare tactic and reasoning
fallaciously.
Example:
David: My father owns the department
store that gives your newspaper fifteen
percent of all its advertising revenue, so
I'm sure you won't want to publish any story
of my arrest for spray painting the college.
Newspaper editor: Yes, David, I
see your point. The story really isn't
newsworthy.
David has given the editor a financial
reason not to publish, but he has not given
a relevant reason why the story is not
newsworthy. David's tactics are scaring the
editor, but it's the editor who commits the
scare tactic fallacy, not David. David has
merely used a scare tactic. This fallacy's
name emphasizes the cause of the fallacy
rather than the error itself. See also the
related fallacy of appeal to emotions.
Scope
The scope fallacy is caused by improperly
changing or misrepresenting the scope of a
phrase.
Example:
Every concerned citizen who believes that
someone living in the US is a terrorist
should make a report to the authorities. But
Shelley told me herself that she believes
there are terrorists living in the US, yet
she hasn't made any reports. So, she must
not be a concerned citizen.
The first sentence has ambiguous scope. It
was probably originally meant in this sense:
Every concerned citizen who believes (of
someone that this person is living in the US
and is a terrorist) should make a report to
the authorities. But the speaker is clearly
taking the sentence in its other, less
plausible sense: Every concerned citizen who
believes (that there is someone or other
living in the US who is a terrorist) should
make a report to the authorities. Scope
fallacies usually are amphibolies.
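In quantifier notation (our gloss, not the entry's), the two readings differ in whether the existential quantifier lies inside or outside the belief operator B:
\[
\text{Intended reading:}\quad \exists x\, B(\text{$x$ lives in the US and $x$ is a terrorist})
\]
\[
\text{Speaker's reading:}\quad B\bigl(\exists x\,(\text{$x$ lives in the US and $x$ is a terrorist})\bigr)
\]
Shelley satisfies only the second, weaker condition, so on the intended reading the reporting duty does not apply to her.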
Secundum Quid
See Accident and Converse
Accident, two versions of the fallacy.
Selective
Attention
Improperly focusing attention on certain
things and ignoring others.
Example:
Father: Justine,
how was your school day today? Another
C on the history test like last time?
Justine: Dad, I got an A- on my
history test today. Isn't that
great? Only one student got an A.
Father: I see you weren't the one
with the A. And what about the math
quiz?
Justine: I think I did OK,
better than last time.
Father: If
you really did well, you'd be sure.
What I'm sure of is that today was a pretty
bad day for you.
The
pessimist who pays attention to all the bad
news and ignores the good news thereby
commits the fallacy of selective
attention. The remedy for this fallacy
is to pay attention to all the relevant
evidence. The most common examples of
selective attention are the fallacy of Suppressed
Evidence and the fallacy of Confirmation
Bias.
Self-Fulfilling
Prophecy
The fallacy occurs when the act of
prophesying will itself produce the effect
that is prophesied, but the reasoner doesn't
recognize this and believes the prophecy is
a significant insight.
Example:
A group of students are selected to be
interviewed individually by the teacher.
Each selected student is told that the
teacher has predicted they will do
significantly better in their future school
work. Actually, though, the teacher has no
special information about the students and
has picked the group at random. If the
students believe this prediction about
themselves, then, given human psychology, it
is likely that they will do better merely
because of the teacher's making the
prediction.
The prediction will fulfill itself, so to
speak, and the students commit the fallacy.
This fallacy can be dangerous in an
atmosphere of potential war between nations
when the leader of a nation predicts that
their nation will go to war against their
enemy. This prediction could very well
precipitate an enemy attack because the
enemy calculates that if war is inevitable
then it is to their military advantage not
to get caught by surprise.
Slanting
This error occurs when the issue is not
treated fairly because of misrepresenting
the evidence by, say, suppressing part of
it, or misconstruing some of it, or simply
lying. See the following fallacies: Lying, Misrepresentation,
Questionable
Premise, Quoting out of Context, Straw Man, Suppressed
Evidence.
Slippery Slope
Suppose someone claims that a first step (in
a chain of causes and effects, or a chain of
reasoning) will probably lead to a second
step that in turn will probably lead to
another step and so on until a final step
ends in trouble. If the likelihood of the
trouble occurring is exaggerated, the
slippery slope fallacy is committed.
Example:
Mom: Those look like bags under your
eyes. Are you getting enough sleep?
Jeff: I had a test and stayed up late
studying.
Mom: You didn't take
any drugs, did you?
Jeff: Just
caffeine in my coffee, like I always do.
Mom: Jeff! You know what happens
when people take drugs! Pretty soon the
caffeine won't be strong enough. Then you
will take something stronger, maybe
someone's diet pill. Then, something even
stronger. Eventually, you will be doing
cocaine. Then you will be a crack addict!
So, don't drink that coffee.
The form of a slippery slope fallacy looks
like this:
A leads to B.
B leads to C.
C
leads to D.
...
Z leads to HELL.
We don't want to go to HELL.
So,
don't take that first step A.
Think of the sequence A, B, C, D, ..., Z as
a sequence of closely stacked dominoes. The
key claim in the fallacy is that pushing
over the first one will start a chain
reaction of falling dominoes, each one
triggering the next. But the analyst asks
how likely is it really that pushing the
first will lead to the fall of the last?
For example, if A leads to B with a
probability of 80 percent, and B leads to C
with a probability of 80 percent, and C
leads to D with a probability of 80 percent,
is it likely that A will eventually lead to
D? No, not at all; there is about a 50-50
chance. The proper analysis of a slippery
slope argument depends on sensitivity to
such probabilistic calculations. Regarding
terminology, if the chain of reasoning A, B,
C, D, ..., Z is about causes, then the
fallacy is called the Domino Fallacy.
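The arithmetic behind that estimate is worth making explicit. Assuming, as a simplification, that the links are probabilistically independent, the chance that the whole chain goes through is the product of the link probabilities:
\[
P(A \text{ leads to } D) \approx 0.8 \times 0.8 \times 0.8 = 0.512
\]
On the same assumption, a twenty-five-link chain at 80 percent per link succeeds less than one percent of the time ($0.8^{25} \approx 0.004$), which is why long chains of individually plausible steps deserve so little confidence.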
Small
Sample
This is the fallacy of using too small a
sample. If the sample is too small to
provide a representative sample of the
population, and if we have the background
information to know that there is this
problem with sample size, yet we still
accept the generalization upon the sample
results, then we commit the fallacy. This fallacy is a version of the fallacy of hasty generalization, one that emphasizes statistical sampling techniques.
Example:
I've eaten in restaurants twice in my life,
and both times I've gotten sick. I've
learned one thing from these experiences:
restaurants make me sick.
How big a sample do you need to avoid the
fallacy? Relying on background knowledge
about a population's lack of diversity can
reduce the sample size needed for the
generalization. With a completely
homogeneous population, a sample of one is
large enough to be representative of the
population; if we've seen one electron,
we've seen them all. However, eating in one
restaurant is not like eating in any
restaurant, so far as getting sick is
concerned. We cannot place a specific number
on sample size below which the fallacy is
produced unless we know about homogeneity of
the population and the margin of error and
the confidence level.
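A standard statistical rule of thumb (not part of the entry itself, and assuming a simple random sample with a proportion near one half) relates the sample size n to the margin of error E at the 95 percent confidence level:
\[
E \approx \frac{1}{\sqrt{n}}
\]
So a sample of 100 yields a margin of roughly plus or minus 10 percentage points, and a sample of about 1,100 yields roughly plus or minus 3; with a sample of two restaurant visits, no useful margin of error can be claimed at all.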
Smear
Tactic
A smear tactic is an unfair
characterization either of the opponent or
the opponent's position or argument.
Smearing the opponent causes an ad hominem fallacy. Smearing
the opponent's argument causes a straw man fallacy.
Smokescreen
This fallacy occurs by offering too many
details in order either to obscure the point
or to cover up counter-evidence. In the
latter case it would be an example of the
fallacy of suppressed evidence. If you
produce a smokescreen by bringing up an
irrelevant issue, then you produce a red herring fallacy.
Sometimes called clouding the issue.
Example:
Senator, wait before you vote on Senate Bill
88. Do you realize that Delaware passed a
bill on the same subject in 1932, but it was
ruled unconstitutional for these twenty
reasons. Let me list them here.... Also,
before you vote on SB 88 you need to know
that .... And so on.
There is no recipe to follow in
distinguishing smokescreens from reasonable
appeals to caution and care.
Sorites
See Line-Drawing.
Special Pleading
Special pleading is a form of inconsistency
in which the reasoner doesn't apply his or
her principles consistently. It is the
fallacy of applying a general principle to
various situations but not applying it to a
special situation that interests the arguer
even though the general principle properly
applies to that special situation, too.
Example:
Everyone has a duty to help the police do
their job, no matter who the suspect is.
That is why we must support investigations
into corruption in the police department.
No person is above the law. Of course, if
the police come knocking on my door to ask
about my neighbors and the robberies in our
building, I know nothing. I'm not about to
rat on anybody.
In our example, the principle of helping the
police is applied to investigations of
police officers but not to one's neighbors.
Specificity
Drawing an overly specific conclusion from
the evidence. A kind of jumping to conclusions.
Example:
The trigonometry calculation came out to
35,005.6833 feet, so that's how wide the
cloud is up there.
Stacking the Deck
See Suppressed Evidence and Slanting.
Stereotyping
Using stereotypes as if they are accurate
generalizations for the whole group is an
error in reasoning. Stereotypes are general
beliefs we use to categorize people,
objects, and events; but these beliefs are
overstatements that shouldn't be taken
literally. For example, consider the
stereotype "SheÕs Mexican, so sheÕs going to
be late." This conveys a mistaken
impression of all Mexicans. On the other
hand, even though most Mexicans are
punctual, a German is more apt to be
punctual than a Mexican, and this fact is
said to be the "kernel of truth" in the
stereotype. The danger in our using
stereotypes is that speakers or listeners
will not realize that even the best
stereotypes are accurate only when taken
probabilistically. As a consequence, the
use of stereotypes can breed racism, sexism,
and other forms of bigotry.
Example:
German people aren't good at dancing our
sambas. She's German. So, she's not going
to be any good at dancing our
sambas.
This argument is deductively valid, but it's
unsound
because it rests on a false, stereotypical
premise. The grain of truth in the
stereotype is that the average German
doesn't dance sambas as well as the average
South American, but to overgeneralize and
presume that ALL Germans are poor samba
dancers compared to South Americans is a
mistake called "stereotyping."
Straw
Man
You commit the straw man fallacy whenever
you attribute an easily refuted position to
your opponent, one that the opponent
wouldn't endorse, and then proceed to attack
the easily refuted position believing you
have undermined the opponent's actual
position. If the misrepresentation is on
purpose, then the straw man fallacy is
caused by lying.
Example (a debate
before the city council):
Opponent: Because of the killing and
suffering of Indians that followed
Columbus's discovery of America, the City of
Berkeley should declare that Columbus Day
will no longer be observed in our city.
Speaker: This is ridiculous,
fellow members of the city council. It's
not true that everybody who ever came to
America from another country somehow
oppressed the Indians. I say we should
continue to observe Columbus Day, and vote
down this resolution that will make the City
of Berkeley the laughing stock of the
nation.
The speaker has twisted what his opponent
said; the opponent never said, nor even
indirectly suggested, that everybody who
ever came to America from another country
somehow oppressed the Indians.
Style
Over Substance
Unfortunately the style with which an
argument is presented is sometimes taken as
adding to the substance or strength of the
argument.
Example:
You've just been told by the salesperson
that the new Maytag is an excellent washing
machine because it has a double washing
cycle. If you were to notice that the
salesperson smiled at you and was well
dressed, this wouldn't add to the quality of
the original argument, but unfortunately it
does for those who are influenced by style
over substance, as most of us are.
Subjectivist
The subjectivist fallacy occurs when it is
mistakenly supposed that a good reason to
reject a claim is that truth on the matter
is relative to the person or group.
Example:
Justine has just given Jake her reasons for
believing that the Devil is an imaginary
evil person. Jake, not wanting to accept
her conclusion, responds with, "That's
perhaps true for you, but it's not true for
me."
Superstitious
Thinking
Reasoning deserves to be called
superstitious if it is based on reasons that
are well known to be unacceptable, usually
due to unreasonable fear of the unknown,
trust in magic, or an obviously false idea
of what can cause what. A belief produced
by superstitious reasoning is called a
superstition.
Example:
I never walk under ladders; it's bad luck.
It may be a good idea not to walk under
ladders, but a proper reason to believe this
is that workers on ladders occasionally drop
things, and that ladders might have dripping
wet paint that could damage your clothes. An
improper reason for not walking under
ladders is that it is bad luck to do so.
Suppressed
Evidence
Intentionally failing to use information
suspected of being relevant and significant
is committing the fallacy of suppressed
evidence. This fallacy usually occurs when
the information counts against one's own
conclusion. Perhaps the arguer is not
mentioning that experts have recently
objected to one of his premises. The
fallacy is a kind of fallacy of Selective
Attention.
Example:
Buying the Cray Mac 11 computer for our
company was the right thing to do. It meets
our company's needs; it runs the programs we
want it to run; it will be delivered
quickly; and it costs much less than what we
had budgeted.
This appears to be a good argument, but
you'd change your assessment of the argument
if you learned the speaker has intentionally
suppressed the relevant evidence that the
company's Cray Mac 11 was purchased from his
brother-in-law at a 30 percent higher price
than it could have been purchased elsewhere,
and if you learned that a recent unbiased
analysis of ten comparable computers placed
the Cray Mac 11 near the bottom of the list.
If the relevant information is not intentionally suppressed but rather inadvertently overlooked, the fallacy of
suppressed evidence also is said to occur,
although the fallacy's name is misleading in
this case.
Sweeping
Generalization
See Fallacy of
Accident.
Syllogistic
Syllogistic fallacies are kinds of invalid
categorical syllogisms. This list contains
the fallacy of undistributed middle and the
fallacy of four
terms, and a few others, though there are
a great many such formal
fallacies.
Tokenism
If you interpret a merely token gesture as
an adequate substitute for the real thing,
you've been taken in by tokenism.
Example:
How can you call our organization racist?
After all, our receptionist is African
American.
If you accept this line of reasoning, you
have been taken in by tokenism.
Traditional
Wisdom
If you say or imply that a practice must be
OK today simply because it has been the
apparently wise practice in the past, you
commit the fallacy of traditional wisdom.
Procedures that are being practiced and that
have a tradition of being practiced might or
might not be able to be given a good
justification, but merely saying that they
have been practiced in the past is not
always good enough, in which case the
fallacy is committed. Also called
argumentum consensus gentium when the
traditional wisdom is that of nations.
Example:
Of course we should buy IBM's computer
whenever we need new computers. We have been
buying IBM as far back as anyone can
remember.
The "of course" is the problem. The
traditional wisdom of IBM being the right
buy is some reason to buy IBM next time, but
it's not a good enough reason in a climate
of changing products, so the "of course"
indicates that the fallacy of traditional
wisdom has occurred.
Tu
Quoque
The fallacy of tu quoque is committed if we
conclude that someone's argument not to
perform some act must be faulty because the
arguer himself or herself has performed it.
Similarly, when we point out that the arguer doesn't practice what he preaches, we may thereby suppose that there must be an error in the preaching, but we are reasoning fallaciously and creating a tu quoque. This
is a kind of ad
hominem fallacy.
Example:
You say I shouldn't become an alcoholic
because it will hurt me and my family, yet
you yourself are an alcoholic, so your
argument can't be worth listening to.
Discovering that a speaker is a hypocrite is
a reason to be suspicious of the speaker's
reasoning, but it is not a sufficient reason
to discount it.
Two Wrongs Make a
Right
When you defend your wrong action as being
right because someone previously has acted
wrongly, you commit the fallacy called "two
wrongs make a right." This is a kind of ad hominem fallacy.
Example:
Oops, no paper this morning. Somebody in
our apartment building probably stole my
newspaper. So, that makes it OK for me to
steal one from my neighbor's doormat while
nobody is out here in the hallway.
Undistributed
Middle
In syllogistic logic, failing to distribute
the middle term over at least one of the
other terms is the fallacy of undistributed
middle. Also called the fallacy of
maldistributed middle.
Example:
All collies are animals.
All dogs are
animals.
Therefore, all collies are
dogs.
The middle term ('animals') is in the
predicate of both universal affirmative
premises and therefore is undistributed.
This formal fallacy has the logical form:
All C are A. All D are A. Therefore, All C
are D.
Unfalsifiability
This error in explanation occurs when the
explanation contains claims that are not
falsifiable, because there is no way to show
them to be true or false. For example, they
cannot be tested and cannot be deduced from
other well accepted claims.
Example:
He's lying because he's possessed by
demons.
This could be the correct explanation of his
lying, but since there's no way to collect
evidence for or against the claim that he's
possessed by demons, presenting it as the
correct explanation is an error.
Unrepresentative
Sample
If the means of collecting the sample from
the population are likely to produce a
sample that is unrepresentative of the
population, then a generalization upon the
sample data is an inference committing the
fallacy of unrepresentative sample. A kind
of hasty
generalization or biased statistics.
Example:
The two men in the matching green suits that
I met at the Star Trek Convention in Las
Vegas had a terrible fear of cats. I
remember their saying they were from
Delaware. I've never met anyone else from
Delaware, so I suppose everyone there has a
terrible fear of cats.
Most people's background information is
sufficient to tell them that people at this
sort of convention are unlikely to be
typical members of society. Large
samples can be unrepresentative, too.
Example:
We've polled over 400,000 Southern Baptists
and asked them whether the best religion in
the world is Southern Baptist. We have over
99% agreement, which proves our point about
which religion is best.
Sample size does not overcome sampling bias.
Untestability
See Unfalsifiability.
Weak
Analogy
See False
Analogy.
Willed Ignorance
I've got my mind made up, so don't confuse
me with the facts. This is usually a case
of the Traditional Wisdom Fallacy.
Example:
Of course she's made a mistake. We've
always had meat and potatoes for dinner, and
our ancestors have always had meat and
potatoes for dinner, and so nobody knows
what they're talking about when they start
saying meat and potatoes are bad for us.
Wishful Thinking
A reasoner who suggests that a claim is
true, or false, merely because he or she
strongly hopes it is, is committing the
fallacy of wishful thinking. Wishing
something is true is not a relevant reason
for claiming that it is actually true.
Example:
There's got to be an error here in the
history book. It says Thomas Jefferson had
slaves. He was our best president, and a
good president would never do such a thing.
That would be awful.
7.
References and Further
Reading
Research on
the fallacies of informal logic is regularly
published in the following journals:
Argumentation, Argumentation and
Advocacy, Informal Logic,
Philosophy and Rhetoric, and
Teaching Philosophy.