Knowing That P Without Believing That P
Blake Myers-Schulz, University of Wisconsin at Madison
Eric Schwitzgebel, University of California at Riverside
Abstract:
Most epistemologists hold that knowledge entails belief. However, proponents of this claim rarely offer a positive argument in support of it. Rather, they tend to treat the view as obvious and assert that there are no convincing counterexamples. We find this strategy to be problematic. We do not find the standard view obvious, and moreover, we think there are cases in which it is intuitively plausible that a subject knows some proposition P without – or at least without determinately – believing that P. Accordingly, we present five plausible examples of knowledge without (determinate) belief, and we present empirical evidence suggesting that our intuitions about these scenarios are not atypical.
Keywords: knowledge, belief, capacity, tendency, in-between belief, entailment thesis, intuitions, experimental philosophy, Colin Radford
Knowing That P Without Believing That P
NAT: [Impatiently.] Why, every one knows what Father looks for, man! The ship, of course.... Lost in a hurricane off the Celebes with all on board – three years ago!
HIGGINS: [Wonderingly.] Ah. [After a pause.] But your father still clings to a doubt –
NAT: There is no doubt for him or any one else to cling to. She was sighted bottom up, a complete wreck, by the whaler John Slocum.... He was the first to hear, naturally. Oh, he knows right enough, if that’s what you’re driving at. [He bends toward the doctor – intensely.] He knows, Doctor, he knows – but he won’t believe. He can’t – and keep living (O’Neill 1918, p. 181-182).
What is the relationship between knowledge and belief? The standard view in contemporary epistemology is that knowledge entails belief – or at least that propositional knowledge does. (Knowing how or knowing wh- might be a different matter.[1]) Necessarily, on the standard view, if one knows that P, one believes that P. This claim is only occasionally argued for; more often, it is treated simply as obvious. However, we the authors don’t find the claim obvious. We think that there are cases of determinate, propositional knowledge that either are not cases of belief or are, at most, “in-between” cases of belief in which the subject is on the vague border between believing and failing to believe. (On vagueness in belief attribution, see Schwitzgebel 2001a, 2002, 2010.) In this essay, we present five such cases along with empirical evidence that we are not alone in our unconventional intuitions about those cases. We conclude with some general reflections on the relationship between knowledge and belief.
1. The State of the Literature.
The standard contemporary analysis of knowledge runs as follows. A subject S knows a proposition P if and only if:
(i.) P;
(ii.) S believes that P;
(iii.) S is justified in believing that P;
and many philosophers would add some sort of further condition (iv.). Dispute tends to center on how to think about condition (iii) and what an additional condition (iv) might look like. Conditions (i) and (ii) are often treated as largely uncontroversial.[2]
Proponents of condition (ii) on knowledge – that is, of the view that (propositional) knowledge entails belief – might defend their view in one of two ways. They might present a general argument that shows that knowledge entails belief, or they might challenge those who would deny that knowledge entails belief to present a counterexample to the thesis – that is, a case of knowledge without belief – and then infer the truth of condition (ii) from the failure of opponents of that condition to present convincing counterexamples. The primary strategy in the literature has been the latter, which we will call the wait-for-counterexamples strategy (Cohen 1966; Armstrong 1969, 1973; Sorensen 1982; Dartnall 1986; Steup 2001/2006[3]). This strategy depends on the correct classification of hypothetical scenarios as cases of knowledge or belief: There must be no case that is intuitively, or properly, or in the judgment of a well-informed philosopher (here, it seems to us, the methodological assumptions and standards of success become a bit hazy), both a case of knowledge and not a case of belief.
The best-known putative counterexample to the view that knowledge entails belief is due to Colin Radford (1966).[4] Radford presents a scenario in which a student named Jean feels quite certain that he does not know any English history. But when Jean is asked to provide dates for certain events in English history, such as the death of Queen Elizabeth, he correctly answers many of the questions, though he feels like he is guessing. The correctness of his answers surprises Jean, and Jean concludes that he actually does know some English history (e.g., that Queen Elizabeth died in 1603). Radford finds it plausible to regard this as a case of knowledge without belief: Jean knew that Queen Elizabeth died in 1603 but did not believe that she died in 1603.
The standard response to Radford’s example is to deny that the case of Jean is a clear case of knowledge without belief (e.g., Lehrer 1968; Armstrong 1969, 1973). Armstrong thus argues:
I do not think that [Jean’s case] is one of those clear cases that can be used as a test of philosophical analysis. Rather, we must first develop a theory of the nature of knowledge and belief, basing it on securer evidence, and then see if our theory will accept Radford’s case (1969, p. 35-36).
To this objection, Radford replies that “perhaps it is a clear case” (1988, p. 499). Thus, we appear to have an intuition stalemate. (Or perhaps it’s not a stalemate, since Radford is in the minority? But mightn’t a philosophical minority be right?)
Armstrong proposes that we employ “securer evidence” to develop a theory of knowledge and belief, and then apply this theory to disputed scenarios such as Radford’s. While we agree with Armstrong’s general point that a more theoretical approach is needed, we disagree that scenarios like Radford’s must be set aside until we have an independent theory of knowledge and belief with which to judge such scenarios. Even if Radford’s example is not a clear case of knowledge without belief, it may nevertheless be of service to the construction of a theory of knowledge and belief. To illustrate, suppose that two nonequivalent theories – T1 and T2 – make the same predictions about all the “clear” (or uncontroversial) cases of knowledge and belief. Further suppose that according to T1, but not according to T2, one should expect to find certain types of unclear cases (e.g., cases in which it is not clear that S believes that P but clear that S knows that P). Unclear cases of the predicted type would thereby serve as evidence favoring T1 over T2.
Radford offers the following hypothesis for why the majority of philosophers appear to differ from him in their intuitions about the case of Jean:
perhaps the explanation is that this is not the kind of case which they had in mind when they learned, digested, and in their turn explained the classical analysis of knowledge in terms of justified true belief. A restricted diet of examples has fed their essentialism... (ibid.).
If Radford’s diagnosis is correct, we should expect ordinary people not trained in philosophy not to share the standard view. There should be cases – including cases like Jean’s – that non-philosophers will classify as knowledge but not as belief. This prediction can, of course, be empirically tested.
We believe that soliciting non-philosophers’ judgments about such cases is worthwhile not because we believe that philosophical disputes generally admit of resolution by appeal to the intuitions of non-philosophers. Rather, our aim is a modest one: We hope only to undermine the accusation that a view such as Radford’s – which maintains that, in certain cases, a person knows that P but does not believe that P – is clearly counterintuitive by showing that it is, at least, not unusual. We thus hope to force those who hold that knowledge entails belief to develop a more substantial argument for their view than the wait-for-counterexamples strategy. Putative counterexamples are available; people’s judgments about them are, we hope to show, divided; and thus a more theoretical approach to the question is necessary, perhaps one that can account for the divided judgments.
2. The Scenarios.
We designed five scenarios that we regard as plausible cases of knowledge without belief, and we presented these scenarios to students at the University of Wisconsin at Madison. The scenarios appear verbatim below. To be clear: We don’t expect that most readers of this article will judge these scenarios to be cases of knowledge without belief. The scenarios are not intended to be compelling to philosophers trained in – warped by? – the mainstream tradition in analytic epistemology. We suspected, however, that ordinary English-speaking undergraduates would tend to attribute knowledge and deny belief.[5] (We highlight that subjects were English-speaking students only to convey that our findings might be culturally specific, not to suggest any particular linguistic hypothesis.)
Each respondent received just one scenario, with just one question at the end of it, asking whether the protagonist knows, or alternatively believes, the proposition in question. The only difference between the belief and knowledge scenarios was the substitution of “believe” for “know” in the prompt question at the end of each scenario. Each version of each scenario was given to exactly thirty participants (campus passersby who were offered the opportunity to complete a five-minute questionnaire in exchange for a candy bar). The titles that appear before each scenario are for ease of reference and were not shown to respondents.
(1.) The unconfident examinee (modified from Radford 1966):
Kate spent many hours studying for her history exam. She’s now in class taking the exam. Everything’s going quite well, until she comes to the final question. It reads, “What year did Queen Elizabeth die?” Kate had reviewed this date many times. She had even recited the date to a friend just a few hours earlier. So, when Kate sees that this is the last question, she feels relieved. She confidently looks down at the blank space, waiting to recollect the answer. But before she can remember it, the teacher interrupts and announces, “Alright, the class session is almost over. You have one more minute to finalize your answers.” Kate’s demeanor suddenly changes. She glances up at the clock, now flustered and worried. “Oh, no. I can’t perform well under this kind of pressure.” Her grip tightens around her pencil. She strains to recall the answer, but nothing comes to her. She quickly loses confidence. “I suppose I’ll just have to guess the answer,” she says to herself. With a sigh of disappointment, she decides to write “1603” into the blank space. This was, in fact, the correct answer.
Did Kate know that Queen Elizabeth died in 1603?
yes no (circle one)
(2.) The absent-minded driver (modified from Schwitzgebel 2010):
Ben receives an email informing him of a bridge closure on his normal route to work. He becomes mildly annoyed and says to himself, “Now I’ll have to turn on Russell Street and go all the way down to Langdon Avenue.”

So, the next morning, Ben wakes up early and quickly gets ready for work. He makes it out of the house with plenty of time to make the drive. Pleased with the success of his early departure, he decides to listen to one of his favorite albums and enjoy the long drive. By the time Ben is approaching Russell Street, where he should turn, he is enthusiastically tapping his fingers to the music, not paying much attention to where he is going, and he drives right past Russell Street, continuing on his normal route to work. Thus it’s only a matter of time before Ben will reach the closed bridge and have to drive all the way back to Russell Street. Nevertheless, Ben just keeps on tapping his fingers to the music and continues to drive towards the closed bridge.
Does Ben know that the bridge is closed?
yes no (circle one)
(3.) The prejudiced professor (modified from Schwitzgebel 2010):
Juliet is a university professor. Unfortunately, she is also prejudiced against student athletes. In her classes, she calls more often on non-athletes than athletes, and she interprets the comments of the former more charitably. When two soccer players, Brett and Bernard, come to visit her in office hours, she treats them patronizingly, explaining the basic concepts of the course in a very rudimentary manner, failing to recognize the sophistication and intelligence behind their questions. They leave, and shortly after, two students with no involvement in school sports enter. Juliet immediately launches into a high-level discussion, generously assuming the students’ command of the elementary material. When Bernard writes the best essay in the course, revealing the intelligence that a neutral observer would have recognized in his previous remarks, Juliet is surprised. All of this is typical of her.
However, Juliet also repudiates all forms of prejudice. She openly affirms that students involved in athletics are just as capable as non-athletes. In fact, she has it on excellent authority that this is the case: Her chair just completed a study showing that the two groups perform equally well in their philosophy classes. Intrigued by this study, Juliet even reviews her own records and finds that, on average, the athletic students had actually performed better than the other students. But, in spite of all this, Juliet’s prejudice remains. She continues to treat her athletic students as if they are less intelligent than her other students.
Does Juliet know that her athletic students are as capable as her other students?
yes no (circle one)
(4.) The freaked-out movie-watcher:
Susan loves to watch old horror films. She finally convinces her friend Jamie to watch one with her. It’s an old horror film that Susan actually considers to be quite funny, due to its unrealistic plot. The film begins with a group of astronauts who discover alien life on another planet. The aliens look somewhat like bumblebees, but they are dark-green and about two feet in length. The astronauts capture one of these creatures and bring it back to Earth. Once they have it on Earth, it manages to escape and starts laying numerous eggs. The eggs need water to hatch, so the creature lays the eggs in sink faucets. Thus, whenever people turn on their sink faucet, hundreds of newly hatched alien creatures fly out and begin to attack them.
During one of these attack scenes, Susan notices that Jamie is a bit tense. Susan remarks, “This isn’t bothering you, is it? Come on, you should be laughing at this movie. Look how unrealistic it is.” Jamie responds, “Yes, of course it’s unrealistic. But it’s still scary. I just don’t like these types of movies. They frighten me. Can’t we just watch something else?” “Well, I suppose,” Susan says. Susan then turns off the movie, and they quickly get ready for a second trip to the movie store.
On the way out, Susan stops. “Hold on for a second. I’m thirsty. Let me grab a glass of water.” Susan walks over and begins to turn on the sink faucet. Suddenly, Jamie shouts, “No! Don’t do it!” The words come out of Jamie’s mouth before she even has time to consider what she’s saying. Jamie then looks over and sees that it’s only water coming out of the faucet.
Did Jamie know that only water would come out of the sink faucet?
yes no (circle one)
(5.) The self-deceived husband:
Tim’s wife Diane is cheating on him. For two years, Diane has been conducting a romantic affair with Mark, who is a colleague of hers at work. Over the past two years, Tim has seen frequent clues that Diane is cheating: unexpected credit card charges, late arrivals from work with weak and flustered explanations as to why, unexplained mysterious phone calls, etc. Diane even occasionally calls Tim “Mark”, and once Tim overheard her saying “I love you, Mark” on the telephone when Diane assumed Tim was not in the house. One night several months ago, Diane even confessed to him explicitly, saying anxiously in a quiet moment in bed, “Tim, you know that I have fallen in love with another man and have been cheating on you for a couple of years”. Tim loudly insisted that she was joking, just trying to get his goat because she was mad with him about some out-of-town travel he was doing – and Diane replied that, yes, she was of course just joking.
Despite all this evidence, Tim vehemently insists that his marriage is in good shape and that Diane would never even think of cheating on him. Perhaps, indeed, he says such things a little too vehemently. When Dan, a friend of Tim’s, gently points out to Tim some of the evidence of Diane’s affair, Tim dismisses Dan’s remarks as utter nonsense, saying to himself, “Dan is probably just jealous and wishes that his own marriage was as solid as Diane’s and mine”. When a woman whom Tim finds attractive starts flirting with him at work, Tim brushes her off, saying to himself that he could never do anything that might threaten his marriage. At the same time, however, when Diane comes home late, Tim finds himself much more anxious and bothered about it than he ever used to be, though he can’t quite put his finger on why. When he answers the phone and finds no one there, he sometimes finds himself wondering “could it be a lover of Diane’s?” and then, very quickly after that, “Ridiculous! Ridiculous! She would never cheat!” When he sees a credit card charge for an 8:00 pm dinner at a romantic restaurant, he finds himself with a visual image of Diane having a romantic dinner with a stranger – an image which he rejects as a horrible fantasy, but that he can’t quite put out of mind.
Does Tim know that Diane is cheating on him?
yes no (circle one)
We also created two control scenarios – one which we judged to be a clear case of both belief and knowledge (a man watches a tree fall over in his back yard, and participants were asked whether the man knows/believes that the tree fell over) and one which we judged to be a clear case of neither belief nor knowledge (a woman is about to receive a $20 late charge for a bill after her payment was lost in the mail, and participants were asked whether the woman knows/believes that she will be receiving this late charge). As another control condition, we created a false-P version of the unconfident examinee scenario (Kate writes “1613” instead of “1603”). Also, since “think” is often used in ordinary English to ascribe what philosophers would call beliefs, we asked “think” versions of the five main scenarios – identical to the above scenarios except that “think” replaced “know” in the prompt question.
Finally, we asked forty participants an abstract question about the possibility of knowledge without belief. Half of the participants received the following version of that question:
Some philosophers have argued that a person can’t know that something is true unless that person believes that it is true. Other philosophers have argued that it is possible to know that something is true without believing that it is true. Both sets of philosophers have portrayed their views as consistent with the common sense opinions of ordinary non-philosophers. So we want to know what you think. Can someone know that something is true without believing that it is true?

Please select one response by checking the box next to it:

[ ] Yes, someone can know that something is true without believing that it is true.

[ ] No, someone cannot know that something is true without believing that it is true.
The remaining twenty participants received essentially the same abstract question but with the order of the philosophical positions reversed (beginning “Some philosophers have argued that it is possible to know that something is true without believing that it is true”).
The exact wording of all materials is available online at ****.
3. Results.
Results for the five main scenarios as well as the yes-yes and no-no control scenarios are presented in Figure 1. Across the five main scenarios, 77% of respondents attributed knowledge and 41% attributed belief. These percentages are statistically significantly different from each other and, in both cases, from 50%.[6] Given the diversity of the scenarios, however, the aggregate percentages may be less meaningful than the spread for each scenario considered individually. While we did not expect that the knowledge-belief difference would achieve statistical significance for each scenario considered individually, it did so for three of the five scenarios: the unconfident examinee (87% vs. 37%), the prejudiced professor (63% vs. 23%), and the freaked-out movie-watcher (83% vs. 30%). The remaining two scenarios still showed a good spread of response in the predicted direction (67% vs. 50% for the absent-minded driver and 87% vs. 67% for the self-deceived husband).
FIGURE 1: Percentage of respondents attributing knowledge or belief to various scenarios. Error bars indicate one-proportion 95% confidence intervals. Stars indicate a statistically significant difference between the knowledge and belief responses at an alpha level of .05.
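For readers who wish to check the arithmetic, the tests reported above and in footnote 6 can be reproduced from the raw counts. The sketch below is illustrative only and is not our analysis code; it assumes a pooled-standard-error two-proportion z test, a null-hypothesis standard error for the one-proportion tests, and normal-approximation (Wald) intervals for the error bars in Figure 1, since the exact variants are not specified in the text, so the final decimal places may differ slightly from the reported values.

```python
import math

def normal_two_tailed_p(z):
    """Two-tailed p value for a standard normal z statistic."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-proportion z test with a pooled standard error (an assumption;
    the paper does not state which variant it used)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, normal_two_tailed_p(z)

def one_proportion_ztest(x, n, p0=0.5):
    """One-proportion z test against a fixed null proportion p0."""
    se = math.sqrt(p0 * (1 - p0) / n)
    z = (x / n - p0) / se
    return z, normal_two_tailed_p(z)

def wald_ci(x, n, z_crit=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = x / n
    half = z_crit * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Aggregate counts from footnote 6: 116/150 knowledge vs. 62/150 belief attributions.
print(two_proportion_ztest(116, 150, 62, 150))  # knowledge vs. belief, p < .001
print(one_proportion_ztest(116, 150))           # knowledge vs. 50%, p < .001
print(one_proportion_ztest(62, 150))            # belief vs. 50%; footnote 6 reports p = .04
print(wald_ci(116, 150), wald_ci(62, 150))      # error-bar-style intervals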
For three of the scenarios (absent-minded driver, prejudiced professor, freaked-out movie-watcher), the proportion answering “yes” to the “think” question (aggregate 39%) was similar to the proportion answering “yes” to the “believe” question in those same three scenarios (aggregate 34%). However, for the unconfident examinee, the proportion answering “yes” to the “think” question (77%) was more similar to the proportion attributing knowledge than to the proportion attributing belief, and for the self-deceived husband the proportion was intermediate (also, incidentally, 77%). We hypothesize that some respondents may have interpreted “think” in these scenarios as something like guess or suspect rather than believe.
In the false-P unconfident examinee scenario, 27% of respondents attributed belief, approximately the same percentage as attributed belief in the standard true-P version of that scenario.[7] On the abstract question about the possibility of knowledge without belief, respondents’ opinions were evenly split, with 21/40 (53%) asserting that someone can know that something is true without believing that it is true.
The pattern of results thus confirmed our expectations: A majority of respondents ascribed knowledge in our five scenarios, while only a minority ascribed belief. Although in one of the five scenarios (the self-deceived husband), the majority of respondents ascribed belief, that percentage (67%) was close enough to 50% and far enough from the percentage ascribing knowledge (87%) to harmonize with interpreting the case as a case of determinate knowledge and indeterminate, “in-betweenish” belief. We do not assert that a majority of respondents have intuitions in conformity with the view that knowledge does not entail belief, but only that a substantial proportion do, perhaps about half: In the abstract, opinion on the question divided evenly; and although we are hesitant to put much weight on responses to the abstract question (especially since, anecdotally, non-philosophers sometimes also deny the truth condition), the between-subjects spread on the three large-difference scenarios was also around 50%.
The control questions speak against various possible competing interpretations of the main results. The near-ceiling and near-floor responding on the yes-yes and no-no scenarios suggests that participants are willing to endorse “yes” or “no” to either question when the scenario clearly calls for it. The similar pattern of response to the “think” version of the question for the majority of the scenarios suggests that the overall results are not best explained by ordinary speakers’ using the term “believe” in a special way that is in tension with the more commonly used “think”. The results of the yes-yes control and the false-P control suggest that the pattern of responding on the main questions is not best explained by a pragmatically-driven unwillingness to ascribe belief when knowledge is also present.
We acknowledge that there are still other explanations that may be worth considering in future studies (e.g., subjects may be confusing belief with imagination or “alief”, or subjects may be assuming that someone who believes not-P does not also believe P[8]). However, the diversity of the scenarios creates explanatory challenges for those who would attempt to mount a unified debunking hypothesis (assuming a unified debunking hypothesis is preferable, which it may not be if the disparate elements in a disunified debunking hypothesis can be independently motivated): Some scenarios are broadly dispositional (prejudiced professor, self-deceived husband), while others are anchored to a particular moment of behavior (unconfident examinee, absent-minded driver, freaked-out movie-watcher) – and among the anchored scenarios one scenario involves something like a passing thought that P (unconfident examinee), one involves something like a passing thought that not-P (freaked-out movie-watcher), and one seems to involve no P-relevant passing thoughts at all (absent-minded driver).[9] The knowledge sources also vary: The unconfident examinee and absent-minded driver draw on testimony, the prejudiced professor on personal experience and empirical research, the freaked-out movie-watcher on direct observation plus common-sense induction, and the self-deceived husband on inference to the best explanation. In some scenarios, the protagonist would endorse P (absent-minded driver, freaked-out movie-watcher, prejudiced professor), while in one scenario the protagonist is at least momentarily explicitly doubtful about P (the unconfident examinee) and in still another the protagonist rejects P (the self-deceived husband). In fact, the two largely dispositional scenarios are in important respects mirror-images of each other: The prejudiced professor openly affirms P but does not otherwise behave or cognize in a very belief-that-P-ish manner, while the self-deceived husband openly rejects P but shows a fair bit of belief-that-P-ish behavior and cognition. One possible common strand through this diversity is this: Ordinary non-philosophers might often see belief as requiring more consistency in one’s behavioral and cognitive processes than does knowledge. If so, that would not appear to sit very comfortably with the claim that ordinary people intuitively regard knowledge as requiring belief.
4. The Capacity-Tendency Account.
Proponents of the traditional view may wonder what an account of knowledge that doesn’t require belief would look like. One potential attraction of the traditional view – that knowledge entails belief – is that its hypothesis about the relationship between knowledge and belief can be used in an attractively simple analysis of the nature of knowledge. Merely to reject this approach to knowledge, without having anything to replace it, may be unappealing. David Annis writes:
The problem is that philosophers who have attacked the entailment thesis have not offered an account of the relation of knowledge and belief which would explain our basic reaction. Well-entrenched tenets, be they scientific or not, are rarely rejected, even if they involve persistent anomalies, unless there is a competing alternative to fill the void (1977, p. 217).
Rightly so, as Thomas Kuhn (1962/1970) has emphasized.
Therefore, we think it worthwhile to briefly consider one alternative approach, which we will call the capacity-tendency account. Gilbert Ryle summarizes the view in The Concept of Mind:
Epistemologists are apt to perplex themselves and their readers over the distinction between knowledge and belief.... Part of this embarrassment is due to their supposing that “know” and “believe” signify occurrences, but even when it is seen that both are dispositional verbs, it has still to be seen that they are dispositional verbs of quite disparate types. “Know” is a capacity verb, and a capacity verb of that special sort that is used for signifying that the person described can bring things off, or get things right. “Believe”, on the other hand, is a tendency verb and one which does not connote that anything is brought off or got right (1949, p. 133-134).
Although Ryle does not explicitly carry this view to what seems its natural conclusion – that one can have the capacity without the tendency – others do: Joseph Margolis suggests that knowledge involves “one’s capacity to provide the right information in the right way” while belief involves “the likelihood that one would perform appropriately if one were asked to” (1973, p. 78), and he further notes, “Knowledge, it seems, is ascribable in the absence of corresponding thoughts and beliefs, on the condition of certain relevant skills rather than certain dispositions obtaining” (1973, p. 7); Robert K. Shope proposes to analyze knowledge “avoiding the belief/acceptance condition” (i.e., condition ii in Section 1 of this article) and instead adding a condition that links knowledge with a particular type of power or capacity (2002, p. 53-55). Just as, in the case of knowing how, one might have the capacity to juggle six balls (and thus know how to do it), without the tendency to succeed in most of one’s attempts, so likewise, we suggest, Juliet has the capacity to act on her well-grounded information that student athletes are equally capable even if she lacks the tendency to act on that information; Ben has the capacity to recall the bridge’s closure even if he tends to forget about the closure; somewhere in Tim’s secret heart, it seems, lies knowledge of his situation even if he does not allow that knowledge to penetrate most of his thought and behavior; similarly, perhaps, for Kate and Jamie, although the lack of the tendency in their cases may be fairly short-lived. Though we are generally leery of storage-and-retrieval metaphors for the mind (Schwitzgebel 2001a; McGeer and Schwitzgebel 2006; Hurlburt and Schwitzgebel 2011), it’s as though knowledge requires only having the information stored somewhere and available to be deployed to guide action, while belief requires some consistency in deploying the information (at least dispositionally or counterfactually).
We suggest that, if the capacity-tendency model is correct, knowledge-sufficient capacities are determinately present in the five cases at hand (at least if the cases are fleshed out in intuitively plausible ways: e.g., with the assumption that Kate answered “1603” due to the right kind of trace from earlier learning). Whether belief-sufficient tendencies are also present is less clear – the protagonists’ dispositions are, by design, divided; the cases might best be regarded as vague or “in-between” cases on a dispositional approach to belief (Schwitzgebel 2001a, 2002, 2010; also Price 1969; Margolis 1973). One can’t have a knowledge-sufficient capacity, perhaps, without at least a bit of the corresponding dispositional tendency. If our five cases are clear instances of knowledge and vague instances of belief, that would harmonize nicely with one aspect of our empirical results: The percentage of subjects attributing knowledge in the five scenarios was not too far from the percentage attributing knowledge in the yes-yes control scenario (77% vs. 90%), while there was a larger gap between the percentage attributing belief in the five scenarios and the percentage attributing belief in the no-no control scenario (41% vs. 0%). If scenarios of this sort are vague or in-between cases of belief, that could also explain Armstrong’s and others’ sense that they are not clear cases of knowledge without belief.
If philosophers regard it as prima facie obvious that knowledge entails belief and adopt a philosophical strategy of waiting for a clear counterexample before abandoning that view, they may take comfort in never finding such a counterexample. However, the empirical evidence just presented suggests that it is not prima facie obvious that all instances of knowledge are also instances of belief; and unclear cases might justifiably be regarded as an important type of evidence, rather than merely as cases to be dismissed in developing a philosophical theory – especially if there is a philosophical view that predicts the existence of unclear cases.[10]
References
Alexander, P.A., & Dochy, F.J.R.C. (1995). “Conceptions of Knowledge and Beliefs: A Comparison across Varying Cultural and Educational Communities,” American Educational Research Journal, 32, 413-442.
Alexander, P.A., Murphy, P.K., Guan, J., & Murphy, P.A. (1998). “How Students and Teachers in Singapore and the United States Conceptualize Knowledge and Beliefs: Positioning Learning Within Epistemological Frameworks,” Learning and Instruction, 8, 97-116.
Annis, D. (1969). “A Note on Lehrer’s Proof That Knowledge Entails Belief,” Analysis, 29, 207-208.
Annis, D. (1977). “Knowledge, Belief, and Rationality,” The Journal of Philosophy, 74, 217-225.
Armstrong, D.M. (1969). “Does Knowledge Entail Belief?” Proceedings of the Aristotelian Society, 70, 21-36.
Armstrong, D.M. (1973). Belief, Truth and Knowledge. Cambridge: Cambridge University Press.
Audi, R. (1998). Epistemology. London: Routledge.
Black, C. (1971). “Knowledge without Belief,” Analysis, 31, 152-158.
Boldrin, A., & Mason, L. (2009). “Distinguishing Between Knowledge and Beliefs: Students’ Epistemic Criteria for Differentiating,” Instructional Science, 37, 107-127.
Cohen, J. (1966). “More about Knowing and Feeling Sure,” Analysis, 27, 11-16.
Cohen, J. (1992). An Essay on Belief and Acceptance. New York: Clarendon Press.
Currie, G., & Ravenscroft, I. (2002). Recreative Minds. Oxford: Clarendon Press.
Dartnall, T. (1986). “Radford Revisited,” The Philosophical Quarterly, 36, 395-398.
Davidson, D. (1985). “Deception and Division,” in E. Lepore and B. McLaughlin (eds.), Actions and Events. New York: Basil Blackwell.
Dretske, F.I. (1981). Knowledge and the Flow of Information. Cambridge, MA: MIT Press.
Dretske, F.I. (1985/1994). “Précis of Knowledge and the Flow of Information,” in H. Kornblith (ed.), Naturalizing Epistemology, 2nd ed. Cambridge, MA: MIT Press.
Feldman, R. (2003). Epistemology. Upper Saddle River: Prentice Hall (Foundations of Philosophy Series).
Gendler, T.S. (2008). “Alief and Belief,” Journal of Philosophy, 105, 634-663.
Hamlyn, D.W. (1970). Theory of Knowledge. Garden City: Anchor Books.
Harker, J.E. (1980). “A Note on Believing That One Knows and Lehrer’s Proof That Knowledge Entails Belief,” Philosophical Studies, 37, 321-324.
Hurlburt, R.T., & Schwitzgebel, E. (2011). “Presuppositions and Background Assumptions,” Journal of Consciousness Studies, 18 (1), 206-233.
Kuhn, T.S. (1962/1970). The Structure of Scientific Revolutions, 2nd ed. Chicago: University of Chicago Press.
Lehrer, K. (1968). “Belief and Knowledge,” The Philosophical Review, 77, 491-499.
Lehrer, K. (1989). “Knowledge Reconsidered,” in M. Clay & K. Lehrer (eds.), Knowledge and Skepticism. Boulder, CO: Westview Press.
Lewis, D. (1996). “Elusive Knowledge,” Australasian Journal of Philosophy, 74, 549-567.
Maggioni, L., Riconscente, M.M., & Alexander, P.A. (2006). “Perceptions of Knowledge and Beliefs among Undergraduate Students in Italy and the United States,” Learning and Instruction, 16, 467-491.
Margolis, J. (1973). Knowledge and Existence. New York: Oxford University Press.
McGeer, V., & Schwitzgebel, E. (2006). “Disorder in the Representational Warehouse,” Child Development, 77, 1557-1562.
O’Neill, E. (1918). Where the Cross Is Made, reprinted in The Plays of Eugene O’Neill, vol. 12. New York: Charles Scribner’s Sons (1935).
Price, H.H. (1969). Belief. London: Allen & Unwin.
Radford, C. (1966). “Knowledge – By Examples,” Analysis, 27, 1-11.
Radford, C. (1988). “Radford Revisiting,” The Philosophical Quarterly, 38, 496-499.
Ring, M. (1977). “Knowledge: The Cessation of Belief,” American Philosophical Quarterly, 14, 51-59.
Ryle, G. (1949). The Concept of Mind. London: Hutchinson.
Schwitzgebel, E. (2001a). “In-Between Believing,” Philosophical Quarterly, 51, 76-82.
Schwitzgebel, E. (2001b). “On Containers and Content, with a Cautionary Note to Philosophers of Mind,” Unpublished MS, available at http://faculty.ucr.edu/~eschwitz/SchwitzAbs/Containers.htm.
Schwitzgebel, E. (2002). “A Phenomenal, Dispositional Account of Belief,” Noûs, 36, 249-275.
Schwitzgebel, E. (2010). “Acting Contrary to Our Professed Beliefs, or the Gulf Between Occurrent Judgment and Dispositional Belief,” Pacific Philosophical Quarterly, 91, 531-553.
Shope, R.K. (2002). “Conditions and Analyses of Knowing,” in P.K. Moser (ed.), The Oxford Handbook of Epistemology (pp. 25-70). Oxford: Oxford University Press.
Sorensen, R. (1982). “Knowing, Believing, and Guessing,” Analysis, 42, 212-213.
Stanley, J., & Williamson, T. (2001). “Knowing How,” Journal of Philosophy, 98, 411-444.
Stanley, J. (2010). “Knowing (How),” Noûs, doi: 10.1111/j.1468-0068.2010.00758.x.
Steup, M. (2001/2006). “The Analysis of Knowledge,” Stanford Encyclopedia of Philosophy (Winter 2010 edition). http://plato.stanford.edu/entries/knowledge-analysis/.
Walton, K.L. (1978). “Fearing Fictions,” Journal of Philosophy, 75, 5-27.
Williams, B. (1970). “Deciding to Believe,” reprinted in B. Williams, Problems of the Self (pp. 136-151). Cambridge: Cambridge University Press (1973).
Williams, M. (2001). Problems of Knowledge. Oxford: Oxford University Press.
Williamson, T. (2000). Knowledge and Its Limits. Oxford: Oxford University Press.
Woozley, A.D. (1953). “Knowing and Not Knowing,” Proceedings of the Aristotelian Society, 53, 151-172.
[1] See Price 1969; Hamlyn 1970; though see Stanley and Williamson 2001; Stanley 2010.
[2] Recent textbooks and review articles that summarize the literature in this way include Audi 1998; Steup 2001/2006; Williams 2001; Feldman 2003.
[3] Lehrer 1968 might appear to be an important exception. However, his positive theoretical argument turns on a premise (premise 3 in Section III) that begs the question against the relevant opponents’ views; thus, the force of his article, like most others’, rests primarily on his ability to undercut his opponents’ putative counterexamples. (See Annis 1969; Black 1971; Harker 1980.)
[4] Others who deny that propositional knowledge entails belief include Woozley 1953; Williams 1970; Black 1971; Margolis 1973; Annis 1977; Ring 1977; Harker 1980; Lewis 1996 (in passing); Shope 2002; Schwitzgebel 2010. Williamson 2000 argues that belief is not conceptually prior to knowledge but nonetheless asserts that knowledge entails belief. We set to one side views on which the necessary attitude in condition ii is “acceptance” rather than belief, as in Lehrer 1989 and Cohen 1992. Some reliabilists, such as Dretske (1981, 1985/1994), regard knowledge as possible without a lot of the cognitive apparatus that one might think necessary for “justification”, but still insist on the necessity of belief.
[5] One empirical precedent for our expectation is a small literature in educational psychology examining students’ opinions about the relationship between knowledge and belief, when asked in the abstract: Alexander and Dochy 1995; Alexander et al. 1998; Maggioni, Riconscente, and Alexander 2006; Boldrin and Mason 2009.
[6] Knowledge vs. belief: two-tailed two-proportion z test (116/150 vs. 62/150), p < .001. Knowledge vs. 50%: two-tailed one-proportion z test (116/150 vs. 50%), p < .001. Belief vs. 50%: two-tailed one-proportion z test (62/150 vs. 50%), p = .04.
[7] Two-tailed two-proportion z test (8/30 vs. 11/30), p = .20.
[8] See, e.g., Walton 1978; Davidson 1985; Currie and Ravenscroft 2002; Gendler 2008.
[9] In the two temporally anchored scenarios involving a passing P-relevant thought (unconfident examinee and freaked-out movie-watcher), the passing P-relevant thought corresponds to the scenario’s target moment. Given that these two scenarios end with some brief material that advances the narrative past this target moment, we formulated the prompt question in the past tense in order to highlight the moment at which the question is targeted.
[10] For helpful comments and discussion, we thank Dave Chalmers, Jeremy Fantl, David Hunter, Joshua Knobe, Jon Kvanvig, Al Mele, Mark Phelan, Gualtiero Piccinini, Jonathan Schaffer, Larry Shapiro, Declan Smithies, Jason Stanley, Mike Titelbaum, Peter Vranas, Jonathan Weinberg, Timothy Williamson, the audience at the 2009 Eastern Division meeting of the American Philosophical Association, and readers of the following blogs: Brains, Certain Doubts, Experimental Philosophy, and the Splintered Mind. Mark Phelan independently replicated our results for all versions of the unconfident examinee scenario (know, believe, think, and false-P), finding similar results except for somewhat fewer yesses in the “think” version (50%) – a result that is no worse and perhaps better for our experimental hypothesis.