I used to think I cared about having knowledge. These days, I’m pretty sure I don’t care about it at all: what I care about really is having justified true beliefs. Should my justified true beliefs turn out to be knowledge, fine and dandy, but knowledge isn’t something I care about.
Justified true beliefs ≠ knowledge. Such was shown by Edmund Gettier in 1963. In a pathbreaking three-page paper, Gettier gave two counterexamples to the claim that knowledge = justified true belief, counterexamples that basically everyone agrees were decisive.
Gettier’s original examples (aka ‘Gettier Cases’) were weird and complicated, so here’s a Gettier Case I’ve given before:
[I]magine that, below this paragraph, you perceive an alluring pink “subscribe” button, which beckons in your dreams and whispers softly in your nightmares.
With the enthusiasm of a roided woodpecker, you button-mash the glowing rectangle and form the belief that—for the rest of your life, when you’re all alone and your kids have stopped communicating—your inbox will never be barren, because you will always get emails from me.
As it happens, your belief is true: Going Awol is a blog that cares, and I will always send you emails, even if you might not want them. Nevertheless, your belief was true by accident. Up above you (don’t look or a fairy dies), there’s a Going Awol fanatic named Oak who likes to project Going Awol “subscribe” buttons onto people’s phones as part of an unauthorised marketing campaign. As luck would have it, her projection perfectly overlapped with the real thing, meaning you’d have seen—or seemed to have seen—the button no matter what.
Here, you have a justified true belief (my inbox will never be barren) that—intuitively—doesn’t count as knowledge. Your belief was true and justified, but the connection between its truth and its justification was a matter of luck, a product of happenstance.
Still, I don’t think you should care about that. Suppose you care—as I do—that (a) your beliefs are true, and (b) you didn’t arrive at them in a blameworthy or criticisable way. In this case, both values are attained: your belief is true, and you didn’t fail in your epistemic obligations. Given this, what’s there left to care about? True, you arrived at your true belief by luck: but what’s wrong with getting lucky? Getting lucky is great, I love getting lucky!
Granted, you don’t want to intentionally leave your beliefs up to luck. You want to do the best you can. But if you do the best you can, and your justified belief ends up being true by luck… well, lucky you! Who cares that you don’t have knowledge!
Maybe you’ll object as follows: “suppose, in the example above, a malicious Substack genie hacks the website to make it so that when I press the subscribe button (the real one, hidden under the hologram of the one I think I’m pressing), it subscribes me to “Bentham’s Newsletter” (*shudders, dies*) instead of Going Awol. In that case, you’d form the justified true belief that your inbox will never be barren, but be wrong about whose newsletters will be gracing it (or cursing it, as the case may be). Surely you’d care about not having knowledge then...”
Admittedly, you would care in this scenario. Indeed, if you subscribed to “Bentham’s Newsletter”, you might even think about killing yourself. But what you’d care about wouldn’t be the fact that you don’t have knowledge: what you’d care about is that (a) you have to delete a bunch of typo-ridden emails from some Dunning-Kruger patient from Michigan, and (b) you had a false belief about a closely related proposition: namely, about who you’d be getting emails from. The fact that, in addition, your justified true belief that your inbox will never be barren didn’t amount to knowledge is immaterial.
Knowledge is #overrated, send tweet. Justified true beliefs are where it’s at.
Update: thanks to a certain Yale epistemologist and arch-Going Awol subscriber, it has come to Lady Whistledown’s attention that Mark Kaplan once defended the knowledge-is-lame view via a different argument in “It’s Not What You Know That Counts”.
Epistemic status: sub-zero. Someone please tell me why I’m wrong about this…
I subscribed to Bentham's Bulldog, and it's ruined my life. None of my friends reply to me anymore. Everything smells bad. I haven't had an erection in 3 weeks.
Here's my best try at arguing against:
If aether theory accurately predicts the behavior of light & gravity, then any belief "light will go from [x] to [y] successfully" or "celestial body [a] will do [b]" (because of aether) is justified & true—but we're not gonna have a great time trying to do space travel or discover any more things about the universe. Having a *good, true model for why something happens* (knowledge) isn't important to the truth of this particular belief—but it is important to inform all the next beliefs that I could/should hold.
To your analogy: if I accidentally subscribed to Bentham's Newsletter, I might go back and try to subscribe to you some other way. (And, again, hold the JTB that my inbox will be filled.) But if the Substack genie is still Substack genie-ing, I would accidentally subscribe to Bentham again and then think about killing myself again... The knowledge that a genie is getting in the way is very clearly important for all my future actions.