Tuesday, March 1, 2016

Another reason Gangadean should be a skeptic

We generally believe things for reasons. Sometimes we believe things for good or appropriate reasons. In such circumstances, philosophers say that we form beliefs on the basis of justifying reasons. As a first pass, justifying reasons are the sorts of reasons that make a belief rational to have. What makes them justifying reasons? Or what makes a belief rational to have? It's hard to say with very much precision, but many philosophers think it has to do with the fact that such reasons make the belief in question more likely to be true. It's not obvious, though, what exactly it means for a reason to make a belief more likely to be true. The nature of the relationship between reasons and beliefs is anything but precise. I'll ignore such difficulties, although the Gangadeanian, who is after certainty about basic things, shouldn't. Even if a reason is justifying only if it guarantees the truth of a belief, it isn't entirely clear what it means for a reason or bit of evidence to guarantee the truth of something. At any rate, if you believe that it is snowing outside because you look outside and see that it is snowing, then (part of) your reason for believing it is snowing outside is that you see that it is so. In such a case, it's generally thought that you have a justifying reason for your belief, i.e., you are justified in having your belief.

Other times we believe things for poor reasons or the wrong kind of reasons or perhaps even for no reason at all.  In such situations a person believes things for reasons that are not justifying. We are prey to a long list of cognitive biases. We are strapped for time and mental resources and form beliefs through heuristics, as a result of habit, and any number of automatic responses that aren't within our "direct" control. Further, our cognitive processes are the product of our upbringing, the social groups we belong to, the ideas we encounter regularly, and the like. To add insult to injury, sometimes we take something to be evidence or a good reason for believing something, but we are mistaken. Just because a reason appears to you to be a good reason to believe a proposition, doesn't ensure that this is so.

When we form beliefs on the basis of bad reasons, we do so on the basis of reasons that are not justifying reasons. They don't result in a justified belief. If I believe that God exists solely on the basis of the fact that I was raised to believe it, then according to philosophical orthodoxy, I'm not justified in believing it. That is to say, my belief is not justified. It's hard to say precisely why it's epistemically bad for me to form beliefs on the basis of cognitive biases, or on the basis of what seems most familiar to me, but roughly, in such situations my reasons for believing P don't seem to make P more likely to be true. Of course, my belief might turn out true, but it wouldn't be knowledge, because believing something just because it's familiar or because you've been told it is at least sometimes irrational. Again, figuring out just why it is irrational and unjustified is no simple task. Suppose I go to a palm reader who tells me that I will get a flat tire today. I believe that I will get a flat tire today as a result. Suppose further that it turns out that I get a flat tire. My belief is thus true. But it doesn't seem rational--it doesn't seem like I should believe that I will get a flat tire on the basis of the palm reader's saying so. But my belief is true, which means forming a belief on the basis of what the palm reader said proved reliable on this given occasion. It got me a true belief!

Having beliefs formed on the basis of good reasons, that is, having justified beliefs, has primarily been framed in terms of the kinds of reasons which produce knowledge or at least put one in a position to know. We want knowledge and we can't have knowledge without justified beliefs. I'm not saying that that's the only reason philosophers care to theorize about epistemic justification, for there might be value in figuring out what it takes to have a reasonable/justified belief in and of itself, but the apparent connection to knowledge is certainly one of the main motivations behind the historical pursuits of the nature of epistemic justification. We want to know what knowledge is, and we think it has something to do with justification; that's a reason to think and talk about the nature of epistemic justification.

All of what I've said so far seems compatible with Gangadean's views. He cares to talk about knowledge and he thinks it has much to do with the nature of justification. And he thinks people sometimes have justified beliefs and sometimes don't---and that this amounts to believing things for the right vs. the wrong kinds of reasons. It's just that when it comes to whether a person knows something or not, he adds a more stringent criterion: to have a justifying reason for believing P is to believe P for reasons that guarantee the truth of P. This is why, for Gangadean, I can't know that it's snowing outside: my reasons for believing that it is snowing outside are not perfect--it's at least possible that my perceptual faculties are failing me.

Finally, accepting or believing or even knowing certain reasons is not the same as believing another proposition P for those very reasons. I can believe that it is snowing outside and I can believe that it is no more than 32 degrees (at least up in the clouds--I'll drop this qualification, but nothing significant hinges on it). The former is evidence of the latter. That is to say, if I were to believe that it was no more than 32 degrees outside on the basis of my belief that it is snowing outside, then I would have a justifying reason (except of course for the Gangadeanian) for believing that it was no more than 32 degrees outside. But suppose I don't make the connection because I'm simply ignorant about the relationship between freezing temperatures and snow (indeed there are such persons!). Instead, suppose that I come to believe that it is no more than 32 degrees outside for the following reason. I looked online, but due to being distracted and tired, I accidentally typed the wrong zip code into the weather channel website. As a result, I pulled up the weather report for a place very far from my current location. But I'm totally unaware of this because I'm not paying very close attention. But suppose it just so happens that it is 28 degrees in this other town according to the weather report. Suppose then that my reason for believing it is no more than 32 degrees outside at my location is that I looked at the wrong weather forecast (though again, I'm unaware that it's the wrong forecast). Now, I also believe it's snowing outside based on what I plainly see, but I haven't made the connection that snow and below-freezing temperatures are intimately associated. To be sure, my belief about the temperature outside is not the result of my believing that it is snowing outside, even though I believe both propositions. In this case, I have a belief that could serve as a justifying reason for my belief concerning the current temperature at my location.
But I don't believe the latter for that reason. Instead I believe it for a bad reason i.e., the wrong weather report. So having a justifying reason for believing P is not the same as believing P for that justifying reason. I can believe P, believe a reason that could be a justifying reason for P, and yet not believe P for that justifying reason. The thing to say then is that I believe P, have a justifying reason for believing P, but my belief in P is not justified.

Now here's an interesting question. How does one know that one believes something for the right kinds of reasons? For instance, if you take a certain belief I have, I can probably give you justifying reasons for it. That is to say, I can list off a number of reasons why I think the belief is rational to have. But that is not necessarily the same as giving you the reasons for which I have the belief. As I've mentioned, these come apart. So it's one thing to know a set of reasons which make a belief justified/rational/reasonable and it's quite another to have the belief in question in virtue of or on the basis of those reasons.

The more general question is this: how do I ever know the very reasons which lead me to believe something? Or how do I know the very reasons, on the basis of which, I come to believe some proposition? And the answer seems to be that I know such things, at some level, immediately. I introspect about why I believe something, and I find myself thinking about certain reasons which give me the feel that the belief in question is true. Of course, the fact that in thinking about why I believe that P, some reasons come to mind isn't proof that I believe P for those very reasons. But I take it that additionally there is a kind of feel or sense of being convinced or compelled to believe P on the basis of those reasons. Thus I infer that I believe P for those same reasons. Moreover, what is absent is anything like a deductive argument which has as its conclusion, "I believe P on the basis of reason R". The inference I might make from the "feel" just mentioned and the association of the reasons that come to mind when I ask myself why I believe something is not one of entailment. Maybe I could look at my behaviors--i.e., what I am inclined to cite as my reasons for believing something when asked--but this too depends on my introspection and what I am immediately aware of as a result. Moreover, behaviors are at best an inductive and fallible means of getting at what is going on in the "head".

The moral: insofar as I can know why I believe something, (i.e, the reasons in virtue of which I believe something), I do so via introspection, induction, and ultimately intuition. I think about why I believe that this summer will be a warmer one, and what comes to mind are other beliefs that seem to bear a certain evidential relationship to the former and which have the immediate feel of making the former compelling. For instance, I think about the article I just read about the weather patterns which indicate a warmer than usual summer in my town and these considerations come with a sort of tug towards believing that it will be warmer this summer. The point is, it's possible that I am wrong when I try and determine what my reasons are for believing something--indeed it's not only possible, it's actual.

The problem, for Gangadean, begins to take form when we consider that none of the above provides an infallible means by which we can know (with certainty) why we believe something. It's at least possible that I believe something for one set of reasons, but my introspection, induction and the like are misleading me into thinking that I believe them for another. The skeptic that Gangadean demands we always answer will press this point. Based on my conversations with Gangadean and Anderson, I suspect they will "answer" the skeptic by contending that we can know (with certainty) everything that is going on in our minds. That is, whenever we deliberate about what is going on in our "heads", Gangadean has the view that we can never be wrong. That is to say, he believes that we have infallible and comprehensive access to what goes on in our minds. Actually, I'm not entirely sure about the comprehensive part, but I'm adding it because if we didn't also have exhaustive knowledge concerning our mental lives, then that could generate another skeptical problem for the Gangadean camp. If there is stuff happening in our minds which we can't be aware of, then it's possible that I merely think that I believe P for one set of reasons, while there is another set of reasons which overrides those reasons and takes their place, but of which I'm entirely unaware. That would be incompatible with the thesis that I can know, with certainty, such things as "I believe P for reasons R", and so ultimately incompatible with the thesis that I have infallible access to the stuff going on in my mind. Anyway, even if he denies that we have comprehensive knowledge of our mental states, there's just no way for him to prove that we have even infallible access to our mental states (he'll likely try a transcendental argument, which I'll get to in a second).

Now remember that Gangadean thinks knowledge of P requires certainty regarding P. Further, he thinks that being certain of P is to be able to show that the opposite of P is not possible. This is why deduction plays such a central role in his theories. It's the only way to show that the opposite of some proposition is logically impossible. But again, what deductive argument can Gangadean offer which concludes with, "I believe P for reasons R"? (Note this is a schema, which means you can substitute in any proposition for 'P' and any reason for 'R'.)

So what's the upshot here? If all I have said is right, then according to Gangadean's commitments (including his definition of knowledge), we can't ever know the reasons why we believe something. And if we can't know that, then we can't know whether or not we are justified in believing something. I might be able to recite certain reasons why belief in God is rational, but I can't rule out the possibility that I don't believe that God exists for those reasons. That means I can't know whether or not I'm justified in believing that God exists, which means I can't know that I know that God exists. But this is a problem for the internalist like Gangadean, who thinks he's got a better theory of knowledge than an externalist because he thinks we ought to be able to know that we know something. This is one of the main appeals of internalist theories of knowledge/justification. But it turns out there's no way for Gangadean to know that he knows that God exists, because he can't know what his reasons are for believing that God exists---at least he can't be sure. This means it's possible, from Gangadean's own point of view, that he doesn't know that God exists and any number of other "basic things."

Now if Gangadean wants to contend that we can just know (with certainty) everything that goes on in the mind, so that we can, by introspection, know the very reasons why we believe something, then he needs to argue for this. I suspect he will try to use a transcendental argument here. But it will fail. It will fail for the very reasons that all of his other transcendental arguments fail.

Each of Gangadean's transcendental arguments has the same flaws. He essentially begs the question against his interlocutors. He smuggles in some dubious, unproven principle about how we need clarity at the basic level to argue, think, speak, and have meaning. And then such a principle is used to argue that we must have clarity at the basic level. But he never gives anybody a single reason to think that we need certainty/clarity of basic things in order for there to be meaning, or for the possibility of intelligible thought, talk and discourse. That is, he's out of arguments for proving the smuggled-in principles. He just says it is so and his people just accept it. So I suspect that a transcendental argument concerning knowledge of the reasons why we believe things would go the same way. And it would falter the same way. He will posit a principle like, "if we can't know the very reasons why we believe things, then arguing to convince people by giving them reasons to believe things is pointless." Perhaps he will suggest that arguing in the manner I've just described presupposes that my audience will consider the reasons I present and then be able to change their views based on those reasons. But if they can't even know the reasons why they believe things, then they could not know that they have been convinced by my reasons to adopt a different position. So what's the point of arguing?

But as I've said, the same issues arise. I highlight three.

1) Presupposing P for some undertaking like arguing doesn't prove that Not-P is impossible, and so doesn't prove P is true. So even if Gangadean is right that the arguer presupposes that a person can know, with certainty, why they believe things, it doesn't follow that it's necessarily true that we can know with certainty why we believe things.

2) More importantly, the argument equivocates on the word 'know'. It's Gangadean's bizarre position that knowledge requires certainty, not mine. But we can simply reject his definition of knowledge and there is no threat of skepticism and no threat of self-referential absurdity. I know why I believe something--I know the reasons I believe something via intuition and fallible introspection and induction. I don't need certainty because knowledge of P doesn't require clarity or certainty regarding P. It's a fantasy that Gangadean is peddling. So it makes perfect sense why I argue to convince people. I believe that they too can know why they believe things. But if we take Gangadean at his word that knowledge requires certainty, then it's true, we can't even know why we believe things. If we can't know why we believe things, then for all we know, we might believe things for bad reasons, reasons which can't justify our beliefs. This shows that Gangadean's internalism leads to skepticism. What it doesn't show is that one ought to (by way of consistency) be a skeptic if one rejects Gangadean's definition/analysis of knowledge.

3) Lastly, one can simply reject the smuggled in principle. Again this generalizes because Gangadean's transcendental arguments follow a pattern. There's always an unproven principle or rule that clarity or certainty is necessary for something or other. Just like you can outright reject the idea that without clarity at the basic level, thought and talk are meaningless, you can reject the idea that clarity (concerning the reasons why we believe things) is required in order to make sense of the practice of presenting reasons to change someone's mind. You don't have to know (with certainty) the reasons why you believe something in order for me to consistently present you with reasons to believe something else for instance. At the very least we can demand that Gangadean offer proof of such a rule or principle.