The Meaning of the Universe

by Louis Lopez




Knowledge and Free Choice


Book II



Part 1







© 2021 by Louis Lopez
All rights reserved. It is allowed to reproduce and distribute copies of this book PROVIDED (1) that it is copied exactly as found here without any alterations to the wording and (2) that no more than $20 be charged for each copy.






Table of Contents (Part 1)


Part 1




Preface

Appearance and Reality
The Issues Ahead

1 The Problem of Certainty

No Room for Error
The Power of the Skeptics
The Tyranny of Certainty

2 A Better Definition of Knowledge

A New Approach
Gettier Counterexamples
A New Standard

3 Knowledge of the External World

Descartes' Thought Experiment
Dealing with Doubt
Berkeley's Views
Kant's Explanations
The Reliability of the Senses
The Causal Theory of Perception
The Permanence of Objects
Physical Explanations

4 The Road to Solipsism

The Implications of Solipsism
The Absurdity of Solipsism
Other Minds
Multiple Pea Brains
The Viability of Solipsism
A Lesson from Solipsism

5 The Sources of Knowledge

The Senses in History
The Aid of Reason
Mental Abilities
Root of the Dispute
Innate Knowledge
Extraction of Innate Ideas
Cultural and Epochal Differences
Innate Abilities
The Indications of Evolution

6 Kant's Synthetic a Priori

Analytic and Synthetic
The Ding an Sich
The Mind's Constructions
Different Perspectives
The Categories
How Are Synthetic a Priori Propositions Possible?
Color as Analytic
Color as a Posteriori
Restriction on a Posteriori Statements
Status of the Contingent a Priori
Nature of the Synthetic a Priori
The Analytic/Synthetic Distinction

7 Natural Inclinations

Predispositions and Teachings
Innate Belief in God
The Religious Motives Behind Rationalism
Observational Approach

8 Investigating Intuition

Testing Intuition
Unconscious Inference
Standards for Measuring Success
Senses and Intuition
Need for Justification
Surrogate Justification
Learning Space and Time
Intuition and Mysticism

9 Two More Skeptical Doubts

Induction
Hume's Exaggerations
Induction is Not Deduction
Support for Induction
The Self and the Soul
Evidence for a Soul
Personality
Personal Identity--Composition
Memory as the Criterion
Some Unforeseen Consequences
Body as the Criterion
Fate of the Soul

10 Weighing and Verifying

A Method of Reaching Conclusions
Verifiability and Bias
An Additional Principle



Preface




This second book in the trilogy The Meaning of the Universe will examine the issues of knowledge and free will from a philosophical perspective. The first book, The Predominance of the Physical World, dealt with the evidence and arguments for the existence of a spiritual substance in the universe and found them deficient. It should be clear to anyone open to considering all the evidence that the universe is entirely physical. The study of knowledge from a philosophical perspective, as opposed to a psychological or historical one, is known as epistemology.

Episteme is the Greek word for knowledge. Epistemology and free will have been topics of great interest probably for as long as philosophy has existed. There is much curiosity about the two areas, and they also have a bearing on other important philosophical issues. Numerous claims are made about the universe, the world, science, metaphysics, and human nature. It is only natural to ask of those making the claims: how do you know? It has not been easy for philosophers to provide a clear picture of what we can be said to know or how we come to have knowledge. The debate has continued for centuries and has even involved arcane questions such as how we know that objects or other minds exist. Although understanding knowledge is a difficult task, it is an especially pressing endeavor for most philosophers because they have aimed at precision in their doctrines.

Philosophy sprang up because of a desire to have a clear and accurate understanding of phenomena in the world. Philosophy was clearly intended to be a pursuit different from something such as poetry, which thrives on ambiguity, allusion, vague suggestion, and the like. Plato and other philosophers have held poetry in low regard, probably because of this elusive nature. Due to their desire to propose true and correct doctrines, philosophers saw the need to uphold high standards of knowledge. There was consequently a perceived need to perform a close study of knowledge and formulate epistemological doctrines.

Knowledge is also an important topic to consider in metaphysical systems. As in philosophy in general, those proposing a metaphysical system make numerous claims, and it becomes important to understand how those claims are known. Since an attempt is being made in these three books to formulate a metaphysical system, that is an additional reason for dealing with epistemology.

The Issues Ahead

Of particular interest in complete metaphysical systems is the understanding of the motivating force or scheme behind the universe. Usually this is thought of as God, but there can be other designations. Georg W.F. Hegel and some of his contemporaries developed the pantheistic idea of the Absolute and became known as absolute idealists. It is often said that metaphysics is the study of the difference between appearance and reality. Trying to understand the motivating idea or ideas that make the universe run is related to the search for the reality behind the appearance.

Knowing the reality behind the appearance was a vexing question for Descartes, and it gained special prominence after George Berkeley and Immanuel Kant. The effort to understand the nature of what lies beyond our own consciousness, or gaining "knowledge of the external world," occupied philosophers for more than a century after Kant. Another well-known epistemological debate has been the empiricism-rationalism controversy, in which the rationalist claim is that humans gain knowledge purely through reason, while empiricism holds that we gain knowledge primarily through the senses. It is not as much an issue as it was long ago but is still pressed by some. Only a few relevant issues that fall under epistemology will be analyzed here.

A second big idea usually treated in a metaphysical system is that of morality. A thorough formulation and understanding of moral rules is needed for humans to live in social harmony. It is also often thought that God expects humans to observe prescribed moral standards. Punishment awaits those who transgress.

Closely related to morality is the question of free will. If people are to follow moral rules, they have to be able to understand what they are expected to follow. This is not as simple as some seem to expect. There are the mentally deficient who have trouble understanding even the simplest rules. How much can be expected of them? In the case of criminal or religious punishment, are they to be held as accountable as everyone else? Religious teachings have hardly delved into what responsibility the mentally deficient have for their actions.

To what extent should a person’s upbringing in childhood and cultural environment be examined before meting out punishment? Courts cannot delve deeply into those considerations because they do not have the time or resources. Can those who suffer from akrasia, or weakness of the will, be held responsible to the same degree as those who appear to be steadfast in their ability to carry out their decisions? With regard to free will, an explanation of the extent to which humans can make free choices and take free actions is necessary. The subject of morality will be postponed to Book III.

Philosophers sometimes like to discuss in depth a number of issues that can be dealt with briefly. Some of those issues are in fact only tangential. The analysis here will be kept as simple as possible with the idea in mind that any more detail is unnecessary and can also create undue confusion. Philosophers have come close to solving the determinism-free will problem. Hopefully the ideas presented here furnish a solution.



1 The Problem of Certainty




Humans have been trying to understand knowledge for millennia. Socrates tried to come up with a definition of knowledge, as recounted in Plato's dialogue "Theaetetus."(Plato, Dialogues) The discussion was inconclusive.(D.W. Hamlyn, "History of Epistemology," The Encyclopedia of Philosophy, Paul Edwards, ed. 1967.) The dialogue examined the definition of knowledge as "justified true belief." It was clear that for someone to have knowledge of a fact, they first had to have an opinion or belief in the fact. There would have to be more than belief, however.

One cannot say that one has knowledge of a fact after coming to believe it arbitrarily or without foundation. A person cannot predict that three inches of snow will fall next July 15 in Peoria, Illinois, based merely on a personal belief that it is going to happen. There have to be grounds for the belief, a justification. The supposed definition does not elucidate what is required to justify a belief. It is assumed that it is to be a well-founded justification. Someone could not justify a claim by simply saying, "My mother told me, and she has always been very honest."

The third part of the requirement is that knowledge be of something that is true. This is not surprising, since one would expect alleged knowledge to be true. For example, the false statement "leaves do not grow on trees" could not qualify as knowledge. The requirement of truth seems to make perfect sense. For this reason, philosophers have not spent much time thinking about the "true" component of the standard philosophical definition of knowledge. More attention has been given to the "belief" requirement and more still to justification. Justified true belief was for the most part accepted as the proper definition of knowledge from the time of Socrates until 1963, when Edmund Gettier published a short article that put it into question. The problem--to be pointed out here--is in that least examined part--"true."
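The traditional analysis can be stated compactly in the notation of epistemic logic. The symbols below are my own gloss, not part of the historical texts: a subject S knows a proposition p just in case S believes p, S is justified in believing p, and p is true.

```latex
% A compact formalization of the "justified true belief" analysis.
% Notation (assumed for illustration, not drawn from the original texts):
%   K_S(p): S knows p
%   B_S(p): S believes p
%   J_S(p): S is justified in believing p
\[
  K_S(p) \;\equiv\; B_S(p) \,\wedge\, J_S(p) \,\wedge\, p
\]
% Gettier's counterexamples target the sufficiency of the right-hand
% side; the argument of this chapter targets the bare truth conjunct p.
```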

Suppose that I live alone and am the only one who drives my car. If I say, "I know my car is in the garage," and it turns out not to be there, it would be declared that I did not know that the car was in the garage. It was false that the car was in the garage, so how could I have known a falsehood?

No one else ever drives the car. I said I knew my car was in the garage because I did not hear the garage door go up, did not hear the car start up and be driven out, and was not told recently by anyone that they were taking it out. I completely forgot that a month ago my nephew and his employee had agreed to come some morning this week to install a new garage door. My nephew had a key to the garage door. It was they who quietly unlocked the garage door, lifted it by hand, and then pushed the car out without starting the engine. They started to work at 6 in the morning and did not want to wake me. Given all this, I had strong reason to believe my car was still in the garage. Many people aware of the evidence I had at the time would have agreed that I "knew" my car was in the garage.

The example illustrates that defining knowledge as justified true belief regarding a statement implicitly makes knowledge equivalent to certainty. Under that approach, a person can have abundant evidence of a fact and everyone else assessing the evidence can agree that the fact is established. Yet if it turns out that the fact was not true for some unusual reason, it could not be claimed that the person had knowledge of the fact before its falsity was discovered.

Given the evidence I had before 6 in the morning, it would actually seem strange for me to state that I knew that the car was not in the garage, or that I did not know whether it was in the garage. So it seems that one should be able to assume one has knowledge of highly justified claims, especially if to disbelieve them would be absurd. Given these kinds of scenarios, there should be a more flexible definition of knowledge, even one that would in some cases leave room for saying that someone had knowledge of a proposition that turned out to be false. Such a definition would be more in conformance with the everyday world and with the approach taken in science. Scientists have come to accept that the knowledge they possess today is based on the presently available evidence. Nevertheless, the evidence may turn out to be erroneous, or at least incomplete, after additional facts are discovered. Scientific investigators are not dogmatic, as some of their detractors have alleged.

There are many facts that in everyday life we reasonably assume we know but upon closer consideration find are not certain. For instance, you see a friend, Mike, go into a restaurant. You have been wanting to talk to him for a long time. Your mother is with you, and you tell her you need to talk to Mike, so you both enter the restaurant knowing that Mike is in there.

Once inside you don't see him anywhere. You look everywhere you can, but Mike is nowhere to be found. You approach a waiter and describe Mike to him. He informs you that he saw a man of that description who came in the front door but very soon left through the back door. You and your mother are surprised. You both know for sure that Mike was in the restaurant. It is very unusual for anyone to go into a restaurant only to leave immediately unless they were the owner or an employee, and Mike was neither.

No Room for Error

Instances like these can happen. You are certain of a particular fact, but for some unusual reason, it turns out not to be the case. You look at a plant and are sure that it is blue. The two friends with you agree. It is almost nighttime, so the light is not very strong. The next day you notice under sunlight that the plant is green.

Sensory deceptions happen at times. Plato claimed that the senses were unreliable by pointing out that errors like these could be made. While it is true that sensory errors can be made, Plato did not acknowledge that the great majority of the time the senses furnish accurate reports that provide sound information.

The vast majority of the time this information is true and remains true. It gives us knowledge. As in the restaurant case, people have every reason to believe they possess knowledge. Occasionally that belief turns out to be in error, but such instances are few. There can be a very sound basis for assuming that a person has knowledge even if it is liable to later be corrected or defeated, i.e., defeasible. When the chances are very small that the fact believed to be known is not true, it should be acceptable to count it as knowledge. Knowledge may be open to possible falsification, yet for all practical purposes be accurate.

Under the "true" requirement, it is always demanded that the facts under consideration be true, without exception. There is no room for error. This is so even when the facts turn out not to be true in spite of every indication, before the discovery of their falsity, that they were. Under this approach, knowledge must always be certain knowledge. Yet it has always been clear, among philosophers as well as nonphilosophers, that there is a distinction between knowledge and certainty. The latter is not simply a synonym for knowledge. There is a reason the word is used in particular instances. Certainty is a special kind of knowledge--one that is beyond doubt, unerring, unquestionable.

The inability to distinguish between knowledge and certainty has been a grave problem for philosophy for centuries. One could even say that it has paralyzed philosophy. Some of the problems in epistemology do not seem worthy of much consideration to nonphilosophers, and even philosophers point out that they do not consider them seriously in their everyday lives. Yet philosophers have continued to toil to find answers for them. One way in which philosophers have dealt with those epistemological problems is by trying to meet the challenges of the avowed skeptic.

The Power of the Skeptics

There have been dedicated skeptics since ancient times trying to show not simply that knowledge is very hard to attain but that it can never be reached. These skeptics have developed arguments against every proposition. The arguments have not been intended to show that specific claims are false. That is because awareness that something is false is itself knowledge. Instead, what these skeptics have tried to show is that it is not possible to claim knowledge of any kind. One can only have opinions, but the comfort of knowledge can never be found.

One of these skeptics was the Greek Pyrrho of Elis, who lived around 300 B.C.E. Pyrrhonism was founded on his teachings, which were later recorded by Sextus Empiricus. In the 1600s, Montaigne and Descartes responded to that ancient skepticism. The record of the skeptics has been a good one. It is hard to show convincingly how to attain knowledge, and no one seems to have come up with a satisfactory criterion for attaining it.

Philosophers have pointed out that part of the reason that definitive answers in epistemology have not been formulated is that too much attention has been paid to the skeptic. Presumably, if the skeptic were simply ignored, there would be much greater clarity in epistemology. There is some merit to that idea in regard to the extreme skeptic who demands the most punctilious standards.

However, that approach may not be satisfactory because an important feature of philosophy is skepticism. Philosophy was born when certain people decided not to accept the conventional explanations unquestioningly. They became skeptical after noticing that the accepted explanations were questionable. Ever since, philosophers have refused to take claims on faith but have instead chosen to examine supposed facts carefully. It has been said that every philosopher has an inner skeptic.

The Tyranny of Certainty

Philosophy is supposed to be based on careful observation of fact and painstaking rational analysis. This emphasis on exactitude naturally leads to a drive for strict knowledge that tends further toward a constant quest for certainty. This drive to exactitude has made it very hard to come up with sound answers in philosophy, and in particular in epistemology. Philosophers have tried in countless ways to find certainty, but it has continually eluded them. It could be said that the root of the aggravation has not been in trying to answer the sensible skeptic but in trying to establish certainty where it is not to be had.

As a result of this endless search for certainty, philosophers have still not been able to establish that there exists an external world, i.e., that any of the objects you observe every day truly exist. Relatedly, no basis of certainty has been brought forth for the belief that other minds exist. You can only be certain that your own mind exists. All the other animals that appear to have minds may not truly have them; perhaps they are only programmed zombies that seem to have minds.

Two other uncertainty problems have concerned induction and the self. David Hume argued that induction was not a reliable method for obtaining knowledge. Induction proceeds from the observation of recurring particular instances to the general conclusion that such instances will continue to occur in the future. For instance, the observation that trees have lost their leaves every fall for as long as anyone can remember leads to the conclusion that they will lose their leaves next fall. It could be said that this general fact was a matter of obvious, common knowledge. Hume claimed that one could not be certain that trees would once again lose their leaves next fall even though it had always happened before. He was right, but that should not mean that we cannot say that we know it will happen. Hume also observed that he could not identify what constituted his self. The observation still holds sway among philosophers.

It is the alleged need for certainty that is the stumbling block to claiming knowledge in all these types of problems. Is certainty necessary? The hardcore skeptics allege that it is. They further imply that the situation leaves us in grave ignorance and confusion in numerous cases. The result is that one can hardly claim knowledge of anything. Is this truly the case? Is there no way out of this dilemma? This book will try to show there is sound knowledge through a new understanding of it and of the intractable problems mentioned above.



2 A Better Definition of Knowledge




A typical definition of knowledge as found in a well-known dictionary is "acquaintance with facts . . ." or "awareness, as of fact or circumstance" or "familiarity or conversance."(Random House Unabridged Dictionary, 2nd ed. (New York: Random House, 1993).) There is no requirement of certainty or infallibility in "acquaintance" with facts. "Awareness" and "familiarity" sound downright weak with regard to the degree of acquaintance with the facts. There is no mention that the facts must be guaranteed to be true. This is a far cry from "justified true belief." Even Plato did not adopt that definition, if we go by the "Theaetetus." It is only later philosophers who came to accept that characterization.

Since 1963, that has not been the case. It was in that year that Edmund Gettier published a very short article presenting two examples in which the three conditions of justified true belief of a proposition are met and yet, intuitively, the person still fails to know the proposition. After that, the old philosophical definition lost its widespread acceptance. Philosophers have tried to modify or add conditions to the original three in order to come up with an acceptable definition, but no proposal has gained a consensus. I believe that both of Gettier's examples failed to show there was sufficient justification of the facts in question, which violates the first requirement of the definition. For now, "justified true belief" is worth discussing further.

A New Approach

Given the problems that arise from requiring that knowledge always involve true propositions, a more workable and realistic philosophical definition of knowledge is desirable. It could be one that modifies or eliminates the requirement that a proposition always be true. In certain instances, it would sound odd to say that someone had known something that later turned out not to be true. Yet this is in reality what happens with many statements we make. We can feel strongly that we are correct in making even the most obvious statement but cannot give a complete guarantee that it is true. We have all been surprised from time to time to find we were mistaken even when we were absolutely confident that something was true.

Here are two more examples. You know there is an old brown house at 441 Green Street. You have passed by it many times since you were a child. If anybody were to ask, you would swear that it was there. Lo and behold, without your being aware of it, the new owner tore it down last month to build a new house on the lot.

For millennia, people observed that the sun moved across the sky and the stars likewise moved at night, but everyone could also swear that the earth did not move. There was no doubt in any of this. No one imagined that it could be moving through space, much less spinning on an axis. Even now it is hard to understand how the earth moves while spinning without our feeling it.

When people want to assert that some fact is undoubtedly true, they use the word "certain." Knowledge should leave room for error; it should be defeasible. This would allow for the acceptance of fallibility in claiming knowledge. This is a move away from stubborn arrogance and dogmatism in thought, a salutary approach. It is unfortunate that it was not adopted long ago in human history.

To allow for considering that there was knowledge in spite of the later discovery of error requires a modification of the traditional definition. It could be changed to simply "justified belief." However, that would seem to allow for too low a standard. "Justified" could excuse even a flimsy amount of support for the proposed knowledge. Before proceeding to try to formulate a new philosophical definition of knowledge, this would be a good place to discuss the Gettier counterexamples.

Gettier Counterexamples

In the first case presented by Gettier,(Edmund Gettier, "Is Justified True Belief Knowledge?" Paul Moser and Arnold vander Nat, eds., Human Knowledge, 2nd ed. (New York: Oxford University Press, 1995) 273, reprinted from 23 Analysis no. 6 (1963) 121.) Smith and Jones have applied for a job. Smith has evidence that

(a) Jones is the man who will get the job, and Jones has ten coins in his pocket.

Gettier pointed out that Smith's evidence for this "might be" [Gettier's words] that the president of the company assured Smith that Jones would in the end be selected. Further, Smith counted that Jones had ten coins in his pocket just ten minutes before. Gettier claimed that from this it can be inferred that

(b) The man who will get the job has ten coins in his pocket.

Unknown to Smith, it turns out that he is the one who gets the job, and he also has ten coins in his pocket. Gettier concluded that Smith has a justified true belief in (b) but does not know (b), because he based his belief on his counting of the coins in Jones's pocket and not his own.

The first problem is that Smith is not sufficiently justified in believing even (a). There are two pieces of evidence in (a). The second piece, that Jones had ten coins in his pocket, is well founded. The first piece, that the president assured Smith that Jones was going to be given the job, is another matter.

It would seem that Smith could have considered some questions about the accuracy of the assertion of the president. Was it just a loose opinion of the president, or did he have a sound basis for it? Did the president even get involved in personnel decisions, or did the personnel department handle them, perhaps through a committee? Could the president have a motive to lie to Smith about the decision, especially since Smith was a competing candidate for the position? There may have been a reason that the president did not want Smith to know that Smith himself would get the job, so he lied to him. Given these unanswered questions, Smith should not have placed unquestioning reliance on the president's word.

He should have investigated the matter even further and taken at least one other step before feeling his belief was justified. Since he was also a candidate for the job, he should have taken the time to see how many coins he had in his own pocket. You would think that if he felt any real desire to know who was to get the job, he would have done this. If he had found ten coins in his own pocket, that would have put the belief that Jones was to get the job into even greater doubt.

Gettier mentioned that Smith is "clearly justified" in his belief that (b) is true, but this is open to question unless we are going to accept weak standards of justification. The counterexample depends on changing the information that Smith has in (a) to the statement in (b). Statement (b) about a "man" goes beyond the information in (a) and so is too general. It leaves the door open to include other possible hidden candidates besides Jones. It could turn out that the man who gets the job is a man who had belatedly applied for the job and was chosen at the last minute, unbeknownst to Smith. That man could also have ten coins in his pocket, a common occurrence. Or it could be that there were even more applicants that Smith did not know about. He should have investigated further before coming to any conclusion.

Since Smith was not adequately justified in believing the first part of (a), which claimed that Jones would get the job, a fortiori he was not justified in believing (b). On this basis, he did not know (b). Furthermore, he was not justified in believing (b) because he should have noticed a red flag warning of possible error in the change from a supposition about Jones to the more general one involving an unknown "man."

It is a mistake to claim that the scenario presented by Gettier is a valid counterexample to the "justified true belief" criterion of knowledge. Specifically, the use of the inference leaves the door open for error, as in fact happened when Smith actually got the job. There is simply not enough justification in believing that an unspecified "man" will get the job.

Only one part of Gettier's second case needs to be examined in order to see the fault in it. I will not review the entire counterexample. The defective part is the "strong evidence" that Gettier alleged Smith has for knowing that Jones owns a Ford. The evidence is "that Jones has at all times in the past within Smith's memory owned a car, and always a Ford, and that Jones has just offered Smith a ride while driving a Ford."(Id. 274) We know that appearances can be deceiving. It could be that the Ford that Jones is now driving is one that he leased. Perhaps Jones is a person who believes in always leasing a car, and it has always been Fords. Or it could be a car that belongs to his wife or his father or his sister. This could have been the case all along. Smith needs more information to claim he has strong evidence, such as asking Jones if he owns the present car. Gettier was wrong in alleging that Smith was "completely justified."(Id.)

There has to be a higher standard for there to be a claim to sufficient justification. Philosophers lodged objections against the two Gettier counterexamples. One idea presented against them was that the alleged knowledge was justified by false evidence. Other philosophers have come up with counterexamples that supposedly corrected those criticisms.(See, e.g., Richard Feldman, "An Alleged Defect in Gettier Counterexamples," id. 274, reprinted from Australasian Journal of Philosophy 52, no. 1 (1974) 68-69.) I have not made an examination of the literature, but I suspect that any allegedly foolproof Gettier-style counterexamples depend on scenarios that rest on facts that are not sufficiently justified.

A New Standard

It should now be clear that a revised standard is needed. Based on the problems just observed in the Gettier counterexamples, it would be good to strengthen the requirement for what counts as proper justification of a belief. There is also the idea of making it more acceptable that knowledge be fallible and defeasible rather than having to meet the exceedingly high standard of certainty. Leaving room for error is consistent with what happens in real life. The focus would be less on what is ultimately true and more on whether the person was justified in having a belief that could count as knowledge.

This less certainty-demanding approach would be compatible with the approach that science takes to knowledge. Scientists have come to realize that their methods of acquiring knowledge do not involve setting down truths and then defending them unbendingly against all criticism. Honest and diligent scientists continue to question and test theories, including their own.

Scientists have seen theories shown to be erroneous after more data has been gathered, and they have realized that in science the gathering of information is a never-ending process. The discovery of new facts can render established theories obsolete or at least require their modification. Scientists are not afraid to see if existing knowledge can be falsified. Philosopher Karl Popper called this openness to testing and refutation falsifiability, and scientists appear to operate in tune with his characterization.

There are several formulations that are good candidates for this new characterization of knowledge. A focus on modifying the strictness of "true" produces formulations like (1) justified defeasibly true belief and (2) justified very probably true belief. These statements allow for something less than certainty, yet ensure that there is an approximation close to the truth. Such formulations would cover most situations in which humans claim knowledge and would allow people to make claims of knowledge without possessing certainty.

There was also the concern about there being sufficient justification. In this regard, there should be a qualifier of "justified." Something like "highly justified" or "well justified" would probably fill the bill. A qualifier such as one of these would ensure that there would be a high standard of evidence in support of any claim to knowledge. "Completely justified" could be another candidate, but there may be some problems with it. First of all, it is hard to see what "complete" would mean. It could be so demanding a standard that it would require an endless process of justification. It could be so exacting that it would practically require that a declaration be categorically true, which would in effect bring back the demand for certainty. The standard of completeness would simply be too high.

"Highly justified" would appear to be the best phrase. It sets a high standard for what one should believe without demanding so much that it simply becomes an alternative justification for certainty. A consequence of requiring more than simply "justified" is that it demands greater evidence and thus brings a greater probability of attaining the truth. Nevertheless, it is only a probability and not a certainty.

Given this, it seems redundant to include the requirement of "true" in any manner in the characterization of knowledge. Previously the alternative "justified very probably true belief" was suggested. If that is used along with the alternative of "highly justified," the full characterization comes out as "highly justified very probably true belief." This is acceptable, but "very probably true" seems unnecessary because the requirement of high justification should ensure that the level of truth attained would be a very probably true one.

The best characterization would then simply be "highly justified belief." Under this new definition, the Gettier scenarios fail as counterexamples because, as pointed out before, they are not highly justified. This new definition raises the standard for what is to be accepted as justification, and this is for the best. It seems that too many times people believe they have knowledge of some fact, and yet they have too little justification.



3 Knowledge of the External World




You see a table. You reach out, touch it, and feel its hardness. You see the chair next to it. It looks perfectly strong and capable of supporting your weight, so you sit down on it. Through all this you give no thought to whether the table and chair and all the other objects you observe have an existence of their own. It seems absurd to have any doubt about the matter. You are sure that only certain schizophrenic people would doubt that the objects observed every day are actual, solid physical objects. A world external to you undoubtedly exists.

Everyone else would have the same opinion and be perplexed that anyone could doubt that there are solid objects all around us. Yet philosophers have paid great attention to this doubt for centuries and debated it extensively. To this date, there has been no final resolution of the problem.

Book I discussed the contention by some that the physical universe does not exist. In philosophy this position is called idealism. I prefer to call it ideaism because to use the word "idealism" for the concept confuses it with the usual meaning, which is the holding of lofty ideals. Also ideaism gives a better sense that it is a belief that mental ideas are all that exists. The following discussion will involve the skeptical possibility that objects may not exist and not the outright assertion that they do not. It is an epistemological approach and not an ontological one like the previous one. It can be called epistemological ideaism.

Descartes' Thought Experiment

The controversy started when Rene Descartes tried to set knowledge on a foundation of certainty.(Descartes' fullest account of his search for certainty can be found in his Meditations on First Philosophy reprinted in The Rationalists (New York: Dolphin Books, 1960).) This was partly an attempt to respond to Pyrrhonian skepticism. Descartes decided that the best way to try to attain certainty was to engage in systematic doubt of everything. He wanted to take nothing for granted and decided he would doubt that the common objects around him could be true objects. He even went so far as to assume he had "no hands, no eyes, no flesh."(Rene Descartes, Meditations on First Philosophy reprinted in Paul Moser and Arnold vander Nat, eds. Human Knowledge, 2nd ed. (New York: Oxford University Press, 1995) 113, 115.)

Descartes observed that while having dreams people do not notice that they are in a dream but fully believe that they are experiencing ordinary consciousness. He judged that in a similar fashion people in a state of ordinary consciousness could be mistaken about the reality of the objects around them. Instead, the objects could exist completely in their imagination. Descartes did not actually believe that he was in a dream but only pretended it as a device for testing what he could claim to know. Still he wanted to justify that he could believe that he was not in a dream but was actually observing true objects in ordinary consciousness.

The only way he could find to do this was by positing that there was a God who was benevolent and so would not deceive him. It was hard to see how Descartes could doubt even the existence of ordinary objects and yet so readily accept the existence of God who is never seen. He came to this belief on the basis of believing in his consciousness and on the single idea of a Perfect and Infinite Being.(Bernard Williams, "Descartes, Rene," The Encyclopedia of Philosophy.) It is further open to question how he could claim to know the attributes of God, in particular, the one of benevolence. Nor could he establish that this benevolence was so absolute that God would never engage in deceit.

Descartes' contentions were questioned by some of his contemporaries, and he could never satisfactorily establish his allegations supporting a belief in objects. Descartes had set out to defeat skepticism but instead left the topic with greater doubts. No one before him had seriously discussed doubt about objects, though Augustine had touched on the problem tangentially.

Dealing with Doubt

Skeptics in the 1st century B.C.E. claimed that one only knows sensations and nothing about the objects that might represent them.(D.W. Hamlyn, "Epistemology, History of," The Encyclopedia of Philosophy.) They did not deny that the objects existed. In 1690, John Locke published his Essay Concerning Human Understanding in which he tried, like Descartes, to set down what humans could count as certain knowledge. He was an admirer of Descartes and wrote about 50 years after him. Locke wanted to ascertain what we know about objects, or bodies, as he called them. He pointed out that the mind does not immediately come into contact with an object with which it is confronted but instead perceives an idea produced by the object. Specifically, the object possesses qualities that give it the power to produce an idea in the mind.

These qualities he divided into primary and secondary ones. Locke considered the primary qualities to be inherent in the object and inseparable from it. These primary qualities included solidity, extension, shape, motion, and quantity. The secondary qualities did not reside in the objects themselves but were considered by Locke to be merely powers in the objects to produce certain effects such as color, sound, and taste.

Locke took the primary qualities to show that objects exist because the primary qualities resemble them. Secondary qualities supposedly did not play a part in establishing that the object existed. Another way of looking at the question is that physical objects standing apart from our minds have to cause the primary ideas that we have of them. The view is plausible and conforms well to common sense. Primary qualities are inherent in each object and show that the object must have its own independent existence.

The issue seemed to be settled until George Berkeley, an Anglican bishop, pointed out in 1710 that the primary qualities cannot be considered to be as fixed as had been assumed. For instance, the size or the shape of an object can vary depending on the distance away or the angle of observation. Berkeley held that primary qualities are just sensations, no different from the case of secondary qualities. They cannot be used as proof that there are physical objects causing them. All that can be said when we observe objects is that we experience sensations. For all we know, it is all in our mind--even the primary qualities.(George Berkeley, A Treatise Concerning the Principles of Human Knowledge, Introduction 6-12, reprinted in The Works of George Berkeley, A.C. Fraser, ed. (Oxford: Clarendon Press, 1901).)

Berkeley took on the skeptical problem related to objects in an external world because he was concerned that Locke had unintentionally unleashed skeptical doubts with his discussion of how our perceptual ideas could be separate from external objects. Berkeley feared that this would undermine religious faith. It backfired. Despite his best efforts, all Berkeley succeeded in doing was to add to the reasons for being skeptical.

For one thing, he argued that there was no good reason to believe in physical objects at all. David Hume found Berkeley's arguments unconvincing but did not attack them; he liked the skeptical approach and came up with his own reasons for skepticism about causation and induction.

Berkeley's Views

Here are some of the claims of Berkeley.

(1) The primary qualities found in objects like solidity, extension, and size do not give us sufficient justification to believe that the objects have independent existence.

(2) All we can claim is that we experience sensations that appear to be objects.

(3) There is not enough evidence of independent physical objects.

(4) Sensations are produced in human minds by the mind of God by his perceiving everything that we perceive.

The first three points form a rejection of the causal theory of perception that simply says that the sensations we experience are caused by physical objects actually present in the world apart from us. There are exceptions such as dreams and hallucinations. It is not necessary to study Berkeley in order to question the causal theory of perception. Many started to question the causal theory after considering Descartes' questions of how he could be sure there were physical objects. Ever since, individuals have not been able to prove with absolute certainty that they are not living in a complete dream experiencing only sensations of imaginary objects.

Item number (4) is open to serious question. How could Berkeley question the existence of something as simple as physical objects but not ask why God had concocted a system of perception in which he allows only humans to be aware of objects? Are other animals not aware of objects? The explanation that we are aware of perceptions only because God is directly involved in producing all those perceptions is simply not convincing. It is more open to skepticism than belief about the physical objects that were put into question in the first place. This is apart from any skepticism about the idea of God that might be felt on other grounds. Idealists, who hold that the physical world does not exist or is secondary to the spiritual world (I call them ideaists), were happy with Berkeley casting doubt on our belief in physical objects but did not examine the full implications.

It is puzzling that Berkeley also held that if an object was not being perceived by any mind, presumably it could not exist. In Latin, this thought was expressed as esse est percipi, to be is to be perceived. If you walk into a room and see a table, it exists, but once you leave the room it ceases to exist. If you walk back into the room, the table exists again. After you leave the room the second time, the table ceases to exist again.

To solve this problem of disappearing objects, Berkeley came up with a deus ex machina--God. Berkeley had God perceiving objects so that they could continue to exist even if no human was perceiving them. It was an understandably unconvincing explanation. You would think God would have many other things with which to occupy himself. He could make it so much easier on himself by simply creating physical objects that would cause the perceptions he wanted perceiving minds to have.

Then there is the question of why it is necessary for God to produce physical objects to be perceived. He could skip the physical objects and implant images of objects in our minds. That would be much simpler. If this were the case, we would be right back to only perceiving imaginary objects. This sounds very much like ideaism. God could coordinate everyone's perceptions perfectly so that there would be smooth social interaction.

It has to be asked how Berkeley knew the actions that God is supposedly taking. Specifically, how did Berkeley come to know that God is perceiving everything? Does God have to perceive everything in the universe at all times? How did Berkeley find out these details? Did God personally inform him? There was no mention of this perception system in the Bible.

Berkeley could be wrong about God perceiving all the objects in order to keep them in existence. If he is wrong, his scheme also puts into question whether other minds exist, although he did not intend to challenge that assumption. Here is why the existence of other minds cannot be known under Berkeley's scheme. Every indication is that minds are always contained in physical objects known as brains. Brains furthermore are contained in bodies that are also physical objects.

Remember that according to Berkeley physical objects may not exist. Therefore, since brains may not exist, their corresponding minds may not exist. If this is so, then all those minds are possibly only inventions of the mind perceiving them and nothing else. They cannot be known to be real. Taking your mind as the perceiving mind, you cannot know that the minds of your siblings or friends or anyone else are real. It may just be your solitary mind imagining that you are interacting with other minds.

Refer to Book I, The Predominance of the Physical World. It argued that the mind is only a traditional and metaphorical way of talking about the brain. The mind does not have an independent existence from the brain. If one believes that minds exist in a nonphysical state, there are problems associated with assuming that. One has to explain how minds are able to exist nonphysically. Thinking beings have always possessed brains to accomplish the task. It could be assumed that minds are made of air or something like it, but even air is physical.

There could be more discussion here about the views of Berkeley, but that will not be necessary. As Hume said of them in 1748, "they admit of no answer and produce no conviction."(David Hume, An Enquiry Concerning Human Understanding (Bobbs-Merrill, 1955) Section XII, Part I, 163, footnote.) Hume correctly judged that while Berkeley's ideas were ingenious and presented interesting skeptical points, they only brought the confusion that comes from skepticism. Anyone interested in delving into Berkeley can look for any of several of his works and books written about him.

Kant's Explanations

Immanuel Kant found the entire situation appalling. He could not believe that by the late 1700s no one had been able to show that objects actually existed. He came up with what he thought was a solution. He was not very modest in calling his approach a Copernican revolution in philosophy. Kant claimed that experience was not simply a matter of sensations being felt by the mind based on physical objects. Rather he thought the mind imposed certain conditions on sensory experience. As an example, space and time were not conditions that were supplied by sensation but were instead imposed on experience by the mind.

Kant referred to the sensations that appear in experience as phenomena. He further claimed that behind them were noumena, or things-in-themselves. He posited that noumena could not be known directly, but they did exist. Kant's claims about perception of the external world were widely accepted by philosophers of the generations that followed him. It is puzzling why his ideas gained such wide approval. One possible explanation was that philosophers simply wanted to avoid skepticism about the perception of objects and were seduced into accepting Kant's explanations.

There are questions that arise from Kant's explanations. It is not clear how he could be so confident that the mind imposes categories on experience like space, time, substance, and cause. Most people would say that these categories involve properties of space and of the objects themselves. It would seem that the categories are very much independent from the human mind. It would seem that categories like space and time were present in the universe long before the human mind appeared.

Kant's claim that the human mind imposes categories on what is experienced seems to put him on the same page as Berkeley who claimed that for an object to exist it had to be perceived by a mind. Kant seemed closer to Berkeley than he thought in giving the mind such an important role in fashioning perception. While both Berkeley and Kant have been considered ideaists, they have still been seen as being far apart. While Berkeley has been called a subjective ideaist, Kant has been counted an objective ideaist.

Was Kant's approach all that objective? His claim for the role of the mind lent support to Berkeley's belief that objects have no independent existence of their own rather than refuting it. After all, if the mind has such a powerful role in the fashioning of perception, what is to say it does not simply furnish us all the aspects of perception and that there is no objective world--Kant's noumena--beyond it?

Neither did Kant's reference to noumena as being behind all the appearances do much to avoid Berkeley's conclusions. Kant in effect postulated the existence of noumena as things somewhere in the background but offered no grounds for assuring they are actually there. He did not get around the unavoidable arguments that Berkeley posed for feeling doubt that objects exist. Objects were left very much in the same questionable state where Berkeley and Hume had left them. Whatever Kant accomplished with his ideas on the mind, he did not furnish a solid basis for belief in physical objects.

Kant's idea that the mind provides categories by which humans perceive is very unclear. It is puzzling how so many philosophers who pride themselves on putting their trust in only the most carefully formulated beliefs came to accept Kant's epistemological views so readily. (The same observation can be made of how so many philosophers put such great importance on what Ludwig Wittgenstein had to say on a number of philosophical issues even if it is not very clear what it was that he said. It seems that even philosophers are not free from following fads.)

For many years, I had strong doubts about how Kant's solution to the physical object problem made any sense. I wondered whether I was simply unable to grasp the point where philosophers had been able to do so before me. I was delighted when in 1997 I discovered an article by Anthony Quinton, a philosopher of high stature well versed in Kantian thought. He expressed serious misgivings about Kant's ideas of how the mind supposedly imposes structure on the world. Like me, Quinton questioned whether he was failing to see the theory of the highly regarded Kant. "Am I making a complete fool of myself? Have I fundamentally misinterpreted his meaning?"(Anthony Quinton, "The Trouble with Kant," Philosophy, Jan. 1997: 5-23.)

Philosophers should have the courage of Quinton to question the established doctrines of earlier respected philosophers whenever it is called for. It would also make an interesting psychological study to investigate how so many hard-thinking, logical philosophers could keep accepting questionable ideas for so long.

Philosophers in the 20th century tried to explain the knowledge acquired through perception. This was attempted through different programs like phenomenalism, logical atomism, and logical positivism. No definitive answer has yet been reached. That need not remain so. Even if it is not possible to reach a place of absolute certainty with regard to physical objects, it should be possible to arrive at a sound explanation supporting their existence.

It may not be possible to devise a completely certain proof. The problem appears to be a consequence of our individuality. Specifically, we each have an individual, private consciousness that is always separated from all other consciousnesses. Each consciousness can never experience any perceptions but its own and so can never check to see what other consciousnesses are truly experiencing. So any individual consciousness can never be certain that all the perceptions it experiences are not simply self-generated.

The Reliability of the Senses

Apart from dreams and hallucinations, skeptics base much of their doubt about the perception of objects on a few examples in which perception turns out to be unreliable. Examples are the way in which a stick partly immersed in water can appear bent, the way in which a grove of trees at dusk can appear gray while under the midday sun it looks green, and the way in which a white sheet of paper can appear to be the same color as the color of the light that is shined on it.

While it is true that odd perceptions occur that are not in accordance with what is ultimately true, the fact is that with a few exceptions our senses provide us with perceptions that are accurate and give us useful information. For instance, our visual judgment of the distance of an object from us is generally accurate. If someone pointed to a pond and told us to walk to it, we would have a good idea how far it would be. When we sit down to eat, we can reach for our utensils and dishes without a problem. With our visual sense, we can readily learn to distinguish and remember people, places, objects, and to perform numerous tasks.

The sense of hearing allows creatures to distinguish a countless number of sounds. This fine ability to detect and identify different sounds can be very useful, even lifesaving. Identifying the warning sound of a rattlesnake or a hungry tiger can be very useful. Being able to discriminate the various sounds that others produce with their mouths when they speak to us allows us to understand their thoughts and wishes.

The sense of taste allows us to enjoy food and to avoid poisonous plants that often have an unpleasant taste. Smell goes along with taste and also aids in avoiding danger. The odor of smoke has helped many a person avoid being suffocated or protect their property.

There are aberrations but they are minor. The detractors of the senses who gleaned putatively great philosophical lessons from the occasional distortions got carried away. While it is true that human judgment is greatly prone to going astray, the errors of the senses are not the greatest concern. The failures in emotional stability and reasoning are much more important.

Plato was the first prominent philosopher to disparage the senses, and it is clear that he had the intention to elevate reason to a much superior position. The philosophical battle between the senses and reason was entirely off the mark. There need be no contest between the two. It is more of a matter of cooperation to furnish information. One would have thought that empiricists, those who believe we get our information primarily through the senses, would have been more willing to faithfully defend the senses than they have. Berkeley and Hume appear to have held their skeptical agenda in higher regard than their empiricist one.

The Causal Theory of Perception

The causal theory of perception is what almost all people accept automatically to be the case. It is unquestioningly accepted that there exist physical objects that cause the sensations that are perceived and that these perceptions must resemble the respective objects that cause them, at least to some extent. The perception skeptic goes against all this in holding that it is not certain that any of the supposed objects exist. Yet s/he does not come up with any satisfactory alternative explanation. Berkeley's alternative approach is an example. It was contrived and thus unsatisfactory.

The causal theory of perception should not be disregarded purely on the basis that the senses occasionally report inaccurate information. After all, in a number of instances, the problem is not with the condition of the senses but with the physical phenomena involved. For instance, in the case of pink light making a white sheet of paper look pink, there is nothing wrong with the eyes. It is simply that electromagnetic radiation from the pink section of the color spectrum stimulates the eyes whether it comes directly from the pink source of light or is reflected off a sheet of white paper. The senses seem more than adequate for the task of properly perceiving physical objects.

Philosophers have often gone against the grain of public opinion because they have noticed that too often public opinion is reached unquestioningly and even on prejudice. This habitual reaction against "the vulgar," as Hume referred to the general public, can be overdone and the present question of physical objects is an example. Upon closer examination, the causal theory of perception is not as hard to support as skeptics claim. The argument that Berkeley presented that all we can know when we are confronted with an object is sensations in the mind representing the supposed object could have been countered with the following argument.

The Permanence of Objects

Sensations that we ordinarily conclude are only imaginary and not caused by any object are usually fleeting and change from one occasion to another. This is especially true in the case of illusions and hallucinations. By contrast, (1) physical objects appear very stable. They look the same for very long periods of time unless altered in some way, such as a person taking an ax to them. Perceptions that represent physical objects are of a much greater quality than hallucinations, which may give every appearance of being solid but nevertheless are not.

Another indication that there may actually be independent physical objects in the world is that (2) a detailed description has been compiled of how animals perceive those objects visually. First, light must be present to be reflected off objects and into the eye. Without light, nothing can be seen. A person with two perfectly good eyes cannot see anything in a completely dark room.

The light that is reflected off an object enters the eye through the cornea, the front of the outer layer of the eye. The rest of the outer layer is the white of the eye, or sclera. Unlike the cornea, it is not transparent. Light next passes through the watery aqueous humor that along with the cornea refracts light. These two act as a front lens of the eye. Light rays then enter the pupil, a small black hole in the middle of the front of the eye. It is surrounded by the iris, a muscle that contracts or dilates depending on how much light is needed by the eye. The lens of the eye then becomes involved. It is flexible and controlled by ciliary muscles that can make it thicker or thinner in order to accurately focus on objects, depending on whether they are near or far. Next, the light rays pass the jelly-like vitreous humor, which makes up a large part of the interior of the eyeball.

The light then lands on the retina in the rear, which consists of several layers of nerve cells, numbering about 120 million. The vast majority of them (114 million) are called rods and are highly sensitive to low intensities of light. Animals with poor night vision often lack normal function in their rods. The other 6 million photoreceptors are the shorter cones that are involved in detecting red, green, and blue light, which together facilitate the viewing of all other colors. When light falls on the rods and cones, it causes chemical reactions that then send electrochemical signals along the optic nerve to the brain.(The Human Body (Chicago: World Book Press, 1984) 36.)

The visual information the eyes receive about the objects they see is transmitted in every detail to the visual cortex at the back of the brain. An image of each object observed in front of the eye is cast on the retina by the process just described involving the refraction of light. From the retina to the visual cortex, the information is sent through a code that is produced electrochemically.

Each pathway from each eye is called an optic nerve but in reality consists of several nerves. The nerves from each eye meet at the optic chiasma at the front of the brain. Some of them cross over at the optic pathway or tract of the other eye. This partial crossing of nerves ensures that signals from the right side of each retina arrive at the right visual cortex while visual information from the left side of each retina goes to the left visual cortex. The two optic tracts continue on each side of the brain and pass through the lateral geniculate bodies that coordinate visual information with information from other parts of the brain. This includes information from other senses such as touch and hearing. At the visual cortex, myriad signals are interpreted and coordinated almost instantly.(Id. 37)

This seems like a very complex system for the handling of visual images. Ophthalmologists and optometrists learn these details to properly treat our eyes. The operations involved in sensing and interpreting other stimuli like sound and taste are also complicated although not as elaborate. Then there is the task of coordinating them all in the brain. All this would seem unnecessary if all that we experienced was mere sensations in the mind. Instead, every indication is that our body's sensory systems function in a manner to detect objects that rest outside the body. The indispensability of light for being able to see objects by itself says a lot.

Another point in favor of the idea that visual stimuli come from outside objects involves the retina. (3) Objects observed by the eye form images on the retina. However, this is not the case in dreams, whether in sleep or daydreaming. There the images being observed do not originate in the retina; they have nothing to do with it. All the images originate in the brain.

Physical Explanations

If our perception of physical objects only involved sensations within our bodies, it would only need to involve the brain. Why would it have to be any more complicated than that? Why would the body have these intricate instruments known as eyes if they did not perform any function? There would be no need for surgeons and other doctors to attend to their defects and diseases. It would be very strange that medical doctors have been able to come up with physical explanations for sensory problems and with mechanical fixes for them.

This all flies in the face of Berkeley's conjecture that there may be no physical objects. It even calls into question Kant's claim that there is a wide divide between phenomena and the supposed noumena behind them. All the extensive medical explanations and descriptions of the interaction between perceivers and objects tend to indicate that there are only objects, the phenomena. Noumena are superfluous. In any case, Berkeley apparently did not believe his arguments himself. Because skepticism threatened religion, he was trying to subvert skepticism, not elevate it. The foregoing considerations demonstrate that there are actual physical objects that cause our sensations.



4 The Road to Solipsism




The second major argument against any serious claim that physical objects cannot be known to exist takes a reductio ad absurdum approach. Reductio ad absurdum arguments have been around at least since the advent of systematic geometry and have been used in other branches of mathematics as well as in general discourse. The method is to take a statement that is questioned and follow the implications that arise from it. If those implications lead to an absurd result, it is taken as a proof that the statement in question is not true. In the case of the external world, the statement in question is that all objects perceived are merely sensations rather than physically existing things.

The first step is to make clear that when it is claimed that objects are merely sensations, that means all objects. It includes all the natural ones like trees, rocks, mountains, blades of grass, insects, birds, people, and so on. It includes all human-made artifacts such as buildings and everything in them, such as furniture.

Under the statement in question, other humans do not exist physically. They, like everything else in your field of vision and orbit of sense, are mere sensations in your mind. They perform certain actions like speaking to you, but the sounds they make while speaking are not produced by vocal cords. You learned in school that animals produce vocal sound through muscular contractions that cause their vocal cords to vibrate in specific ways. That is all a myth under the implications of ideaism (idealism), the doctrine that everything is spiritual. Your sensations are all in your mind, not the result of any physical action. Your entire conscious life is a dream.

The Implications of Solipsism

Imagine that the mind exists in a completely disembodied form. One may try to insist that minds can exist in an ethereal and disembodied state without the need for any associated brain. This is hard to maintain once you delve into its meaning. Nothing close to a satisfactory description or explanation of disembodiment has ever been accomplished. Especially now that there has been so much neurological research with advanced technological instruments concerning the actual physical brain, it appears that mental activity has to take place in a physical container.(See Book I, Chapters 3, 4, 10 for a discussion of this.) A story in an internet Stanford newsletter in 2021 reported brain studies that could detect intention before the taking of action. The method of measuring was called the "brain activity decoding technique."

For now at least, follow this argument with your brain as the only object in the universe. What you have been assuming to be objects does not exist. That covers everything from humans to viruses. It is universally found that minds exist inside of living beings. However, if there are no living beings, there can be no minds. In turn, if there are no minds, there can be no persons.

For the purpose of constructing this reductio ad absurdum argument, assume that you are an ideaist who does not believe that matter exists. This has to include the brains of all the animal and insect species as well as their bodies. If you are going to follow the consequences of believing the possibility that the physical universe does not exist, you have to realize that you may be alone in the universe--completely alone. There are no objects and no minds except yours. This is the state that has been given the name of solipsism.

You cannot consult with anyone else about what they opine about your solipsism because they are only a creation of your mind. You cannot rely on their opinion. You may as well talk to yourself or some mural of a person on a wall. Nor can you talk to anyone else about the mutual solipsism you both experience. Bertrand Russell joked about a woman who professed to be a solipsist and was surprised that everyone else was not also a solipsist.(C.D. Rollins, "Solipsism," The Encyclopedia of Philosophy.)

You are not simply alone in the sense of being the only person that exists, but you are the only thing in the universe whether you are matter or spirit. You know this because, as Descartes discovered, you cannot doubt your own existence because you are obviously conscious. You are the universe. You may well have created yourself for all you know although you may not remember performing that amazing feat. It would certainly be an astounding accomplishment that is impossible to imagine. Someone else may have created you, but you cannot remember who did it or how it happened or where they are now.

There are a number of implications, both personal and general, that arise from your solipsism. Your mother, your father, and all the rest of your family have never been real. Their existence, their actions, their demonstrations of affection for each other and for you have all been a product of your imagination. You may have developed some joyful and rewarding relationships with humans and pets. You will remember and cherish those friendships for the rest of your life but realize there has been no one there--only you.

Any praise or prizes or awards you have received have likewise been false as far as there being anyone there who has truly been impressed by your deeds. The accolades you received were probably well deserved and you believe that. You worked hard to receive the praise. While that may be true, it has all taken place in your solitary dream life.

On the other side of the ledger, consider this. All the people who have ever hated you, who have expressed contempt toward you, who have tried to hurt you in any way never existed. What they did to you may have caused you anguish and turmoil, but rest assured that they were just imaginary people in your mind. If you are in any way thinking of taking revenge, you need not bother with that a minute further. There is no one there in reality against whom to retaliate. Once you consider that your aggravation was brought on by a person concocted purely by your imagination, your pain can be erased--hopefully.

Likewise, all the people you have insulted, either intentionally or just carelessly, have not felt any of the pain that you thought they had. You need not feel guilt or think of apologizing or making amends. No one was there to receive the brunt of your insensitive actions.

Any failures or shortcomings you may have are not important, at least in so far as anyone else is aware of them. All those tireless efforts to eliminate or at least diminish your faults have not been necessary. You could have remained content to simply remain as yourself, faults and all. After all, those faults were part of your identity. All that concern with what you eat was for nought. Your body is not physically real so what you weigh makes no difference. The same is true for any other aspects of your body you don't like--blemishes, wrinkles, and the like. There is no one real out there to notice any of that.

What does your solipsism mean in connection with history? All those historical events, even the monumental ones, never took place. Choose whatever event you wish. The Bronze Age, the Battle of Thermopylae, the death of Socrates, the life of Epicurus, the crucifixion of Jesus, the Ming dynasty, the voyages of Columbus, the Little Ice Age, Montaigne's essays, the atomic bomb, the career of Elvis Presley, the destruction of the World Trade Center in 2001--those and countless other events never happened. Somehow they all appear in your narrative of what has occurred. It is hard to understand how they got there. It has all been reported and discussed in historical documents, books, textbooks, periodicals, films, and other publications, but they never took place. Everything that is recorded as having happened, even minor events in your hometown, is all in your mind.

The same has to be said for all the other areas of knowledge. All the findings of psychology, anthropology, and sociology about the actions and motivations of people refer to nothing but what was in your imagination. The many explanations of the biological sciences pointing to the intricacies of living things and their interactions with each other and with nature are all intellectual concoctions that were produced by some mysterious process but which refer to nothing outside your mind. These facts and claims only refer to the imaginary events and beings in your dream and mostly make sense. They are usually in coordination with each other. That helps make the whole narrative believable.

The laws and theories of the physical sciences seem to be sound, but there are no objects to which they refer. The formulas of chemistry, the various descriptions of the earth found in geology, the numerous equations of physics, and many more alleged physical facts are only concoctions of your imagination. There is no science of physical objects.

It is remarkable that if you test them, these facts will turn out to be consistently true. For instance, if you take a thermometer and measure the temperature at which water freezes, you will find it is 32° Fahrenheit, with some variation depending upon the exact surrounding conditions. If you drop various objects and measure their acceleration, you will find it is consistently about 32 feet per second per second. Of course, you can explain this by pointing out that it simply reflects the remarkably consistent imaginative design of your dreams.

All the discussions that scholars have undertaken with regard to areas of knowledge like literature, music, painting, and sculpture have also only been part of your dream. For that matter, the artistic works themselves, whether you are familiar with them or not, do not exist. All the books and periodicals that have ever been written on any subject, even those written by highly respected writers, have just been images or references in your mind. All the libraries filled with books and papers written by people who supposedly toiled mightily to communicate with others on numerous subjects have amazingly never existed.

Stepping back into your personal realm, you realize that you have always had a tremendous amount of freedom that you did not believe you had. All this time you could have freely done exactly what you pleased without any thought for anyone else. You could have insulted other characters in your dream, hurt them physically or emotionally, even been outright cruel and sadistic, and it would have affected no one but imaginary characters. You could have ignored and abandoned those to whom you have a responsibility like young children or old parents.

If you have been dutiful and devoted until now, you can cease such silly and useless behavior. If you get the impulse to kick a person hard on the shin, you will be hurting no one. You can even commit vile crimes against persons and institutions, and no one will be worse for it. Grand theft, aggravated assault, bank robbery, even murder are all options for you that will cause no harm.

Now there is a cautionary note with regard to harming others. Even if everything is just in your dream, there is no guarantee those characters in it will not retaliate, either individually or as a society. After you snatch a woman's purse or hit someone over the head with a baseball bat, they might retaliate in some way, including having you lawfully put in jail.

You may assume that this won't happen because you will insist that they can do nothing to you since they are only imaginary characters in your mind. Your pleadings may be of no avail. Your history with these characters is that they think and act every bit as if they believe they are equal to you in existential terms. They may defer to you at times because they perceive you are more intelligent or knowledgeable or forceful than they are but not because you have any special status of existence apart from them. Even if you have special status and are the only truly existing being, it may be that you do not have control over the characters. Instead, they can exercise total control over you even if they are only characters in your dream. In this respect, it does you no good that you are the only true being.

If you commit a serious crime and are sent to prison for several years, your agony will be the same whether everyone exists physically or you are in a solipsistic dream. It is advisable that you not commit any crimes in order to test whether you are alone. You may want to run a test by committing a minor offense against others such as punching them lightly to gauge their reaction. Their reaction will likely be one of displeasure with you just like it has always been before. Tell them then that it doesn't matter because they don't exist. See what their reaction to that will be.

The Absurdity of Solipsism

By now you have probably realized that the implications of solipsism do not make it a ready alternative to the ordinary conception of the physical existence of objects, including persons. It is not so easy to simply assume, as solipsists have, that physical objects do not exist. You have to follow the logical consequences and make full note of the implications of disbelieving the physical world. You can't just stop at assuming the physical world does not exist and easily maintain your other beliefs about how persons exist, are fully conscious, and are equal to you in states of consciousness. If you are going to be truly skeptical about the existence of physical objects, you have to be doubtful about almost everything else, and then follow the implications of your solipsistic state. The investigation winds up in absurdity.

Here are the steps in the argument against believing that physical objects (the external world) do not exist:

(1) Assume that physical objects do not exist.

(2) There then has to be the belief that living beings besides you do not exist since they consist of objects.

(3) If there is belief that living beings do not exist, there has to be the corresponding belief that conscious minds besides yours do not exist since they are always found in living beings.

(4) This is solipsism, according to which you are the only thing in the universe.

(5) Solipsism leads to absurd results regarding the explanations of what you experience.

(6) That objects exist is a much more probable explanation of what you experience.

(7) Therefore, the initial assumption that objects do not exist is false.
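For readers who like to see the skeleton of such reasoning laid bare, the chain of steps above can be compressed into a short formal sketch. The proposition names below are invented for illustration; the point is only that steps (1) through (5), combined with the rejection of absurdity, deliver step (7) through a chain of implications capped by modus tollens. In Lean:

```lean
-- Illustrative propositions (names are ours, not the author's):
variable (NoObjects NoOtherBeings NoOtherMinds Solipsism Absurd : Prop)

-- Steps (2)-(5) are taken as implications; ¬Absurd rejects the absurd result.
-- The conclusion ¬NoObjects is step (7): objects do exist.
theorem objects_exist
    (h2 : NoObjects → NoOtherBeings)
    (h3 : NoOtherBeings → NoOtherMinds)
    (h4 : NoOtherMinds → Solipsism)
    (h5 : Solipsism → Absurd)
    (h6 : ¬Absurd) : ¬NoObjects :=
  fun h1 => h6 (h5 (h4 (h3 (h2 h1))))
```

The proof simply assumes (1), follows the implications to absurdity, and discharges the assumption, which is exactly the reductio form described in the text.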

This argument defeats solipsism and any form of ideaism that puts into doubt the existence of objects. Even the claim that objects are significantly different from what they appear to be runs afoul of this argument. This is because scientific explanations of how things work depend to a great extent on things being much like what they appear to be. Ideaism gained much of its acceptability in the 1800s by making it seem plausible that there is something problematic with the perception of objects.

Other Minds

This argument against solipsism can be considered as containing within it a reductio ad absurdum subargument against the proposition that no other minds exist but yours. In this situation, all objects are taken to exist physically, including all animals. It is only minds that are doubted by the observer. The problem of other minds received attention in the early 20th century, especially after Ludwig Wittgenstein took a special interest in it.

Examine this mind argument. Assume that all objects exist in space including your own body. Assume that there are no other minds in the universe. All other sentient beings including humans may seem to have conscious minds, but it is only an illusion. In reality they are like robots or zombies programmed to go through their motions. They have no self-consciousness.

Unlike the solipsism scenario, all geological and historical events have physically taken place just as recorded. However, humans at no point controlled the events since they had no minds. They were like characters on a movie screen. Somehow events turned out the way they did, but it was not due to any control by the minds of any sentient beings.

No mind was involved in scientific discoveries, inventions, or artistic creations. They just happened in your imagination. The only mind that has existed since the beginning of the universe is yours. That is the only one for which there is indubitable proof since you can see that it is functioning. All the others are open to doubt. As much as it appears that others have minds that aid them in controlling their actions and making decisions, it is just an illusion on which you cannot rely.

This line of thinking again leads to absurdity as in the case of solipsism. Belief in other minds turns out to be much more plausible than the alternative, which requires believing that every human who ever lived on this earth, except you, has been a mere automaton. Countless intellectual accomplishments just happened to appear in your imagination. It also becomes necessary to believe that all those who seem to have interacted on a meaningful basis with you are in effect cardboard characters. You are the only being in the entire universe that possesses a mind.

There have been philosophers who have criticized the attempt to show that other minds exist by showing an analogy between the way your mind works and those of others. According to these critics, like Norman Malcolm,(Norman Malcolm, "Knowledge of Other Minds," The Philosophy of Mind, V.C. Chappell, ed. (Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1962).) such an inference is unacceptable because the only valid way to prove another mind exists is to step into the privileged position of that other mind, and that is impossible.

These critics go too far. They are grasping for absolute certainty, which we saw was unrealistic. It certainly seems likely that pointing to similar behavior and habits of thinking would indicate that others have minds even if they exhibit different psychological characteristics. Some people are shy, while others are outgoing, and so on. However, the differences in mental behavior are far outweighed by the similarities. Then there is the inference by analogy that other creatures like monkeys and elephants have minds. Nevertheless, if an inference based on an argument from analogy is not acceptable, then the reductio ad absurdum argument should be. If it is bolstered with the argument from analogy, then the proofs together appear unassailable.

Multiple Pea Brains

When I followed the preceding argument, I had to believe that I existed in some physical form. I have always tended to think in a physicalist manner. I accepted that maybe my body did not exist, but I still had to believe that my mind, which was conscious and always perceiving, had to have a container. I made that container a very economical one--a green pea--a pea brain, to be exact. That pea was in effect my brain, which was all that was necessary to produce all the mental activity that I observed. That pea was all that existed, so it was also the universe. I imagined it sitting there by itself in the middle of totally dark infinite space. In your case, the pea would be you.

Here is another possibility that can be imagined, although it departs from the ideaist desire to get away from matter. It is that each person you observe exists, along with her/is mind. You are not alone. Each person, including yourself, exists as a pea that is the producer of all her/is activity. All that is found in dark space is rows and rows of green peas, one for each person who has ever existed. The peas are completely stationary and motionless. They do not have to be watered or tended in any way. All the activity that takes place in any particular mind occurs in its corresponding pea. If two or more minds interact, they remain stationed in their respective peas. They communicate through brain waves, which are something like radio waves in that they travel through space. (Perhaps there exists an oversized pea that is the mind of God.)

The arrangement shows a very tidy economy. There is no need to construct mountains, rivers, skyscrapers, foxes, raccoons, or any of the rest of the vast number of objects that have been seen in this world throughout its existence. All that is necessary is that the peas be able to imagine them and that their various imaginations be properly coordinated so that there is a harmony of perception. All that has happened in the past, all the momentous events that have been recorded by history, have taken place in these peas. Every bit of agony and suffering that has been felt by sentient beings through the ages has been fully borne by them. The only thing that has been different from what we now believe is that there has been no involvement of other matter whatsoever. It has just been internal images experienced in coordination by peas.

The Viability of Solipsism

Solipsism retains the slightest viability because there is still the minutest possibility that you are in a continuous dream in which nothing exists but your mind having that dream. In spite of how absurd that has been shown to be by the prior arguments and how difficult it is to believe it is true, you cannot step outside yourself to prove that other things exist. You may feel uncomfortable that you are the only entity in existence and may not wish it so, but it remains the slightest possibility. All the human accomplishments and historical events and personages never took place other than in your mind. It remains a possibility because it is irrefutable even if there is tremendous evidence against it. Note that this can only be a possibility for those who demand absolute certainty.

A Lesson from Solipsism

Is there anything for those of us who cannot accept solipsism to learn from the situation? Yes, and that is epistemological modesty. The fact that we cannot ultimately say with absolute certainty that objects or other persons with minds exist--something as fundamental as that--shows that we should be very modest in claiming that we know anything with unwavering certainty.

Another point to observe once again involves the insistence by some skeptics that we cannot know a proposition unless we know it with absolute certainty. Based on this position, we cannot even be sure that the objects (including people) around us are real. Given the implausibility of this position, we should adopt the more reasonable position of highly justified knowledge.

Too often we claim to know facts beyond a doubt, even without considering if there is adequate evidence for our position. If we are honest and observant, we notice as we mature that we turn out to be mistaken more often than we would care to admit. This serves to make us more modest about what we claim to know. If that is not enough, we can reflect on how we cannot even claim complete certainty about the objects we perceive. Unfortunately, those who are in most need of appreciating these lessons are the ones who most likely fail to see the value in them.



5 The Sources of Knowledge




When trying to narrow down the sources from which we gain knowledge, it does not take long to notice that much of it is obtained through the senses. Vision and hearing provide the most information, in that order, and the other three senses follow. Pain, both bodily and psychological, is an interesting case. It is not catalogued as one of the five senses but nevertheless would seem to count as a sensory phenomenon. The same can be said of pleasure.

In the sciences it is necessary to gather information through the senses either directly or indirectly. Sciences like physics, chemistry, and astronomy employ mathematics to analyze the information first gathered by the senses. The calculations always have to conform to the data gathered by the senses. An example can be found in string theory, which has tried to come up with a comprehensive explanation of subatomic physics. Many papers have been written elaborating and extending the theory. Nevertheless, physicists such as Nobel Prize winner Steven Weinberg have noted that all that theorizing and calculating will be for naught if there is no confirmation through observation and experiment in the actual physical world.

The Senses in History

This has not always been the view. In the early days of philosophy, the senses were seen in a much dimmer light. Parmenides and Plato distrusted the reliability of the senses due to the erroneous reports they sometimes provide. Sight can indicate there is a pond of water on the desert floor 1,000 feet ahead when in fact it is only a mirage. The ears can report the sound of a baby crying when it is actually a bird chirping. Parmenides thought that those shortcomings of the senses indicated that the objects we observe every day are not separate and distinct as seems apparent. Instead, all reality is one.

Plato did not go as far but definitely agreed that the senses were not trustworthy. He therefore came up with an alternative--the assumption that there were what he labeled Forms. These were the ideal versions of all the objects found on earth. The objects on earth were imperfect copies of the Forms. All the chairs on earth were imperfect representations of the perfect Form of a chair. The location of these perfect Forms has been called Platonic Heaven, though it cannot be found by anyone. This place also holds Forms of abstractions such as justice and beauty.

According to Plato, to attain true knowledge is to become acquainted with the Forms by means of reason. The senses with their unreliability could never provide adequate knowledge. On the other hand, an individual could not expect to attain the proper knowledge merely through the use of her/is own reason. One had to undergo the proper philosophical training in order to reach the desired level of enlightenment. This is where professional philosophers of the time came in. They developed a mission to promote the use of reason and to do so in a very favorable light as compared to the lowly senses. A school such as Plato's Academy was supposedly a good place to receive the necessary training.

If they had reflected on it a little, they would have seen that mistakes in reasoning are frequently made. Even in carrying out simple arithmetical calculations, people often make mistakes. Surely even Plato and the best of his students would have realized that even very intelligent and knowledgeable persons can make mental mistakes and reach erroneous conclusions.

Pythagoras was another Greek philosopher who apparently believed that great truths could be discovered through the sole use of reason with the aid of mathematics. For a long time, reason remained in the highest regard. It was not until around 1690 that John Locke spurred a spirited debate about the epistemological value of the senses. The great majority of today's philosophers would agree that the senses carry the heaviest load in attaining knowledge.

The view that the primary source of knowledge is the senses is called empiricism. The view that some of our knowledge is gained solely by reason is known as rationalism. The rationalist means of obtaining this knowledge also includes innate (inborn) ideas as well as some form of intuition. There are not many adherents of rationalism today.

The first declared empiricist was Epicurus.(D.W. Hamlyn, "Empiricism," The Encyclopedia of Philosophy.) He was in agreement with his predecessor Democritus, who gave the senses their due regard in fragment 125 of his writings: "Wretched mind, do you take your evidence from us and then overthrow us? Our overthrow is your own downfall."("Rationalism," The Oxford Companion to Philosophy, Ted Honderich, ed. (Oxford: Oxford University Press, 1995) 741.) Thomas Aquinas also believed that the senses were a source of knowledge,(Hamlyn) but it was not until Locke that empiricism was given full attention.

The Aid of Reason

One can adopt empiricism and still accept that reason may bring knowledge to some limited extent. This has been called the empirical use of reason.(Moser and vander Nat, 20.) The rationalist would not disagree with using reason in this way. S/he would want to go further. The task then is to investigate what further part, if any, reason plays in attaining knowledge. The basic question presented is: does reason provide any original knowledge independent of that provided by the senses? If so, an additional related question arises: is that knowledge about the world?

We spend much time thinking in a variety of situations. Often that thinking involves reflection on knowledge we already possess acquired through the senses. After assessing the situation, it may be that we need to make a decision. That will involve reasoning. Coming to a decision will not involve the discovery of any new facts. It will only involve choosing a course of action.

An example of this process of thinking can be found in looking for an efficient way to cool your home in the summer. Let's say you live in a house that has a very efficient central air conditioner, but it uses a lot of electricity. You want to supplement the air conditioner so that it won't have to turn on as much. You are always concerned with saving energy and gather information about alternatives. You can install more insulation in the house. You can use fans in different rooms. There are a number of comparisons on cost and location that have to be made.

The gathering of this information mostly involves the senses. Even in the case in which you rely on the hearsay information collected by other persons, you are relying indirectly on their senses. For instance, the technician who tested a fan at the factory to determine the number of watts at which it should be rated had to use her/is senses in the process. You in turn use your senses when you read on the back of the fan the maximum watts for which it is rated.

After you collect all the information, at least some knowledge is gained through reason. This happens in comparing the energy consumption of the different fans. A small desktop fan consumes 14 watts of electricity, while one on a floor stand consumes 37 watts. While the knowledge of those two facts was gained by the senses, it could be said that the knowledge that the floor fan consumes 23 more watts of electricity than the desktop version is gained by reason after the senses provided the initial wattage values. It could be said that the values were basic facts and the difference a derived fact.
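The distinction between basic and derived facts can be put in a few lines of code. This is only an illustrative sketch (the variable names are ours): the wattage values stand in for facts supplied by the senses, and the difference stands in for a fact produced by reasoning over them.

```python
# Basic facts: supplied by the senses (reading the labels on the fans).
basic_facts = {"desktop_fan_watts": 14, "floor_stand_fan_watts": 37}

# Derived fact: not observed directly, but computed from the basic facts.
derived_difference = (
    basic_facts["floor_stand_fan_watts"] - basic_facts["desktop_fan_watts"]
)

print(derived_difference)  # 23
```

The computation adds no new observation; it only extracts what the basic facts jointly imply, which is the "empirical use of reason" discussed later in the chapter.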

Then there is the final decision on what approach would be the most suitable. It is purely a matter of weighing all the related facts. It would seem that making this decision and the others leading to it would be a matter of using reason and not the senses.

Another example involves a process that takes place every day around the world. This is the holding of a court trial. The factfinder (either a judge or a jury) takes in all the evidence. Once all the evidence is received in the courtroom, the factfinder withdraws to deliberate on it. After that point, it is improper to receive any new evidence. The only task left for the factfinder is to reflect on the evidence and reach a verdict. The procedure draws a very definite line between the receiving of the evidence and the deliberation over it. The factfinder may infer certain facts from the ones actually presented. These would be derived facts reached by reasoning. The entire deliberation would be a process involving reason.

A quick third example involves Albert Einstein's formulation of the special theory of relativity. Einstein did not conduct any physical experiments to gather facts. Instead, he relied on the experimental results of other physicists. One notable experiment was the Michelson-Morley experiment.

There was one kind of experiment in which Einstein did engage, and that was a gedanken (thought) experiment. In his mind, he imagined how objects would travel under different conditions. In this way, he gained insight into the questions involved, something no one had managed before, and came up with the special theory. He also availed himself of mathematics. Again, when Einstein worked to come up with his general theory of relativity, he used only reasoning and mathematics. He was a theoretical physicist, and like all theoretical scientists his task was to come up with explanations for the data that had previously been gathered by other scientists. In turn, after a theory has been devised by the theoretician, it has to be confirmed by the existing data and by any new data that may later be discovered.

It cannot be said that Einstein did not provide new knowledge about physics when he recognized how the physical world worked and provided explanatory statements about it. Those before him had not been able to accomplish it even though they had the same evidence at their command. Einstein produced new knowledge through his insight. He brought out knowledge in addition to that which was previously known empirically.

It cannot be said that the only facts that can count as knowledge are the sensory ones. They may provide a foundation for all knowledge, but the insights that can be gained by reason from them should also count as knowledge. Those insights may be derived and dependent but are gained through mental effort, which can be intense. Reason does not merely rearrange the facts that are gathered through the senses but instead can provide valuable new insight. It is the empirical use of reason. Empiricists have conceded this much: some knowledge is gained through reason, even if it is ultimately dependent on the facts gathered by sensory means.

Mental Abilities

It seems that Locke, the first modern empiricist, was able and willing to make the concession. While he did postulate a tabula rasa (blank tablet or slate) to illustrate that humans were not born with any specific items of knowledge in their brain, he did agree that they came into the world with certain mental abilities, which he called reflection.(John Locke, An Essay Concerning Human Understanding, Book II, Chapter I, reprinted in Moser and vander Nat, 126, 127.) He clearly counted mental abilities including reasoning as one of two ways to gain knowledge--the other being sensation. Most empiricists have agreed with Locke's characterization.

There have not been many rationalists who have pushed the position that knowledge only comes through reason. Gottfried Leibniz was a contemporary of Locke and a prominent rationalist. He responded to Locke's ideas in one of his essays.(Gottfried Leibniz, "New Essays of the Human Understanding (Introduction)," The Monadology and Other Philosophical Writings, Robert Latta, ed. (Oxford: Clarendon Press, 1948) 357, reprinted in Moser and vander Nat, 146.) He admitted that "the senses . . . are necessary for all our actual acquiring of knowledge."(Id. 147)

For Leibniz, even sensory knowledge had to be supported by nonsensory pure knowledge, what he called "the natural light." He also described it as the "light born within us." It was supposed to be what gave us the ability to recognize universal and necessary truths, something that the senses were incapable of doing. Leibniz made sure to point out that it was a privilege humans had above the brutes.(Letter to Queen Sophie Charlotte of Prussia in 1702, reprinted in Leibniz Selections, Philip Wiener, ed. (New York: Charles Scribner's Sons, 1951) 355-364.)

On the rational side of knowledge acquisition, Leibniz seemed hesitant. He pointed out that there was some type of knowledge originally found in humans from birth: "[T]he soul originally contains the principles of several notions and doctrines . . . "(Moser and vander Nat 147) However, he did not commit to any specific propositions that were known by the soul. It appeared that this inborn knowledge could not be specific. Instead, Leibniz declared that "ideas and truths are innate in us, as natural inclinations, dispositions, habits or powers . . ." (Id. 149) Now that sounds very much like the mental abilities--not propositions--that Locke was willing to agree contribute to knowledge. Even Leibniz seemed to say that the mental abilities with which we first appear on earth simply contribute to the knowledge gained through the senses. Accordingly, there are no specific items of information placed in us innately. What then could be at the bottom of the difference between the empiricists and rationalists?

Root of the Dispute

The root of the dispute has been that the rationalists have wanted to say more than that we are born with mental abilities that enable us to glean knowledge from basic facts through the forming of concepts, making of inferences, and drawing of conclusions. Even though they have been vague on how it works, the rationalists have ostensibly claimed that each person is born with innate knowledge or at least innate concepts. This innate knowledge could presumably be expressed in specific propositions. Usually these were ethical propositions such as "killing is wrong" or "stealing is bad." It is true that the rationalists of Locke's day claimed far more for the human soul than those that came later. Rationalists gradually became more moderate.

There are not many philosophers today who declare themselves rationalists, but there are still de facto rationalists. They may not even realize that is what they are. Many of them are just members of the general public, including religionists. A de facto rationalist is really anyone who believes humans are capable of gaining knowledge of specific propositions independently of the senses. This is not reason that provides insights into known sensory knowledge. It involves the nonempirical use of reason. It includes innate knowledge but can also involve other means of attaining knowledge such as intuition. It often involves claims in metaphysics and ethics. Given these parameters, there will probably always be rationalists. Even among philosophers, those who defend religion tend to use rationalist assumptions.

Rationalists claim the human brain is able to comprehend mathematics and logic without aid from the senses. Leibniz expressed this view but tempered it thus, " . . . although without the senses it would never have come into our heads to think of them."(Id. 147) On this point, empiricists agree.

Innate Knowledge

Where the rationalists went beyond empiricism was in claiming that some knowledge about the world is gained apart from the senses. Such knowledge is called a priori--prior to knowledge learned through experience, i.e. the senses. This innate knowledge was not about trivial matters. The understanding was that significant facts involving metaphysics and ethics were innate knowledge. According to the rationalists, all humans know from birth that God exists, that there is an afterlife, or that it is desirable to help those in need.

One of the criticisms of innate ideas that Locke presented was that people do not seem to be aware that they have them. This was especially true of idiots and children. No child starts spouting out propositions known innately soon after it begins to talk. Leibniz answered that people often lack awareness of habits, memories, and incidental perceptions(Id. 149) but can be prodded to remember.

A reply to Leibniz can point out that perceptions such as the amount of light in a room or the sound of leaves fluttered by the wind outside are simple ones. Once our attention is called to them, the awareness becomes immediate. The recovery of individual memories may take a little more time, but it is also usually a simple process. Once a memory is recovered, it is usually easy to confirm that it is correct. It is not so easy to confirm that we knew certain more complex ideas innately.

The idea that there is pre-existing knowledge that can be drawn out of people supposedly originated with Socrates. In Plato's dialogue "Meno," Socrates elicited some answers from a slave boy about the size of some squares he drew and the length of their lines. The boy readily answered all the questions Socrates asked him, and his answers turned out to be correct. The questions were elementary, and it is very plausible that the correct answers could have been readily given by anyone with some intelligence.

On the basis of these few questions--in comparison to all the ones that could have been asked about geometry--Socrates claimed that the boy was merely recollecting what he already knew. Socrates insisted that a teacher does not really teach but only elicits from the student what s/he already knows from birth. From there Socrates--without further justification--jumped to some additional conclusions: that knowledge is contained in the soul, it always continues to exist in the soul, and the soul must be immortal. Did he really think through the evidence for the nature of the soul, or was he swayed by wishful feeling?

Extraction of Innate Ideas

The question that has to be asked is this: how reliable are the recovered facts that we reputedly knew at birth? How much confidence can be placed in claims that certain metaphysical or ethical propositions have been remembered as opposed to their simply being learned at a later date in life? Actually there have not been many people who have claimed to remember specific propositions from birth, and accordingly not many tests of what people purportedly know. What about claims by different people that they remember propositions that are in conflict with each other?

This brings up what is perhaps the best test of whether certain propositions are innate knowledge. The rationalist claim that people possess innate knowledge has to mean that those principles are universal. I am not aware of many rationalist discussions about specific innate ideas. A plausible candidate for an innate universal idea would be the complete agreement of all humankind that killing or stealing is wrong.

It has since become clear that there are what have come to be called psychopaths--people who kill a number of humans (and probably other animals) without restraint. They seem to kill without remorse; many appear to enjoy it. Apparently these cold-blooded murderers would not agree with the rest of humanity that killing is wrong. There are also those who feel it is wrong but can't help acting out of compulsion.

Then there are the different opinions on what excuses should be allowed for killing. There are those called pacifists who do not believe in killing even in a just war in which their country is first unjustifiably attacked. On the other side of the spectrum are those who like to see their country go to war over the slightest pretext and readily rationalize why their country is engaging in hostilities. Some of them become soldiers of the type who seem to enjoy war.

There are those who favor the exception for killing those who kill others, i.e. capital punishment. There are those who favor capital punishment in situations other than murder such as treason, rape, and adultery. Some of these same death penalty proponents oppose legal abortion. There are those opposed to capital punishment under any circumstances, even for those who kill numerous people and would kill again if given the chance.

Stealing is an act that is condemned around the world. Yet there are those who favor it, mainly among the class of thieves themselves. Many criminals are sociopaths who have little feeling for others. They see no wrong in it. A father could even be a professional thief who teaches and encourages his sons to become enterprising and skillful burglars. Social scientists, as well as lay people using careful observation, have concluded that children's values and conduct are strongly influenced by the example and environment around them, especially in the first five years. Such a family does not appear to have any innate idea that stealing is wrong.

If rationalist innate ideas existed, you would expect the science of genetics to have uncovered them. Study of genetics indicates that heredity helps determine abilities and characteristics. Genetics has not shown that all children are born with the belief that stealing is wrong.

Return to the claim that those innate ideas are not immediately apparent to children but have to be coaxed out Socrates-style. That whole endeavor is even more problematic. The awakening of innate ideas through the alleged relearning process should produce ideas that are crystal clear to all. Yet even after this alleged relearning period, there remain many disagreements and controversies over right and wrong.

The question also arises as to why it takes so long to recognize those innate ideas. Why doesn't it happen swiftly? Shouldn't every five-year-old recognize them? Mathematics and logic are clearly the subjects of study that meet the rationalist ideal of truths that can be laid out solely by the use of reason. They need not refer to anything in the actual world. Their propositions are universal. This is not true in connection with the contents of the world, where mountains vary in appearance and different vegetation is found depending on the location on the globe. Given that, it would be expected that these subjects could very readily be remembered innately by everyone.

As a result, it would be expected that high school students would love to take subjects like algebra and calculus. They would all be very good at them. However, much the opposite is true. Even the better students have to work hard to master these subjects. Many other students--perhaps the majority--manage to gain some understanding of them but at the cost of enduring an unpleasant experience. Some who are bright in nonmathematical areas never become proficient at math. Few of the students in later years clearly remember the axioms or theorems they learned. Given this repeated experience that humans have had with such a reason-oriented area as mathematics, it becomes very hard to believe that knowledge of it or anything else is innate.

Cultural and Epochal Differences

Then there are the cultural and epochal differences that have been keenly recognized since Leibniz's time. It turns out certain attitudes that Leibniz might have believed were innately implanted are not universal. One cultural difference between his home in Germany and other parts of the world--in his time and in current times--is in the standards of public dress.

In Europe around 1700, people went to public places fully dressed. Men wore long-sleeved shirts and pants, sometimes with stockings. Women's dresses extended far down and did not show bare backs or much of the arms. No woman would have been seen wearing pants or shorts. Shoes would have been worn, not flip-flop sandals. Soldiers wore ornate, bright, and attractive uniforms even in combat; there were no coarse uniforms--fatigues--for training and battle.

If Leibniz had traveled abroad, he would have witnessed a great difference. In Arabia and China, he would have seen both men and women dressed fully in robes and sandals. In the Americas, he would have found moccasins distinctly different from footwear in other global regions, and during the summer months dress would have been scant, as was also the case in Africa.

If Leibniz could visit today, he would be surprised at the skimpy attire worn in the summertime in most places around the world. Then there would be the nude beaches. In 2009, the city of Ashland, Oregon allowed people to go nude in public. One woman in her 20s started to ride her bicycle totally nude, much to the delight of many of the men in the town.

Rationalists might respond to these observations by saying that tastes in dress are not significant enough for there to exist any innate ideas or principles about them in the human brain. Actually a moral connection has long been made to choice in wearing clothes. Through the 20th century as women's skirt hemlines rose, there was no shortage of critics who thought the trend toward more revealing clothes was not moral. The criticism was mostly directed at women. The critics warned it was a threat to the fabric of society. They might have had a point given later indications such as the great rise in the divorce rate and the number of babies born to unwed mothers with little means of support.

Two more examples that point to differences in moral attitudes involve homosexuality and the treatment of animals. For most of history, homosexuality has been regarded as abhorrent. There were certainly no laws that protected homosexuals. Religions derived from Judaism held the practice in contempt based on several passages in the Judaic scriptures. The harshest was Leviticus 20:13, which states, "If a man lies with a male as he lies with a woman, both of them have committed an abomination. They shall surely be put to death . . . " In contrast, the ancient Greeks openly practiced homosexuality. Although disapproval of homosexuality remains throughout the world, there has been much greater tolerance since around 1980.

Another practice that has changed through the ages has been the treatment of animals. Before 1900, there was little public concern for their welfare. Any care for them was for human benefit since cows, pigs, and chickens provided food on the table. Horses, mules, and oxen were a good source of transportation and hard labor. These animals could still have cruel pain inflicted on them at the whim of the owner. Slaves, serfs, and children could also suffer cruelty and assorted indignities without any legal consequence.

Cats were the target of special cruelty for centuries beginning in the Dark Ages in Christian Europe. They were systematically destroyed. It has been estimated that if they had been allowed to live during the time of the Black Plague and other plagues, they might have helped in preventing the spread of these diseases. They would have kept down the population of the rats that carried the fleas that carried the microbes that brought the plagues. It seems the Christians of the times felt an aversion toward cats because they associated them with witches. The superstition was probably fueled by the practice among old women living alone of keeping cats as companions. The practice turned out to be unfortunate for both the cats and the women who liked them.

In the 20th century, groups formed that began to provide shelter and protection for animals. Much money is raised to support these efforts, and new crimes were created for abusing animals. In 2007 a top quarterback of the National Football League was caught running a business that sponsored violent dog fights and was sentenced to 23 months in prison.

Given so many different orientations, there can be no such thing as innate knowledge. The supposition that humans come to this world preprogrammed with innate knowledge expressed in specific propositions is only the product of wishful feelings and inflated estimates of human powers.

If innate knowledge existed, there is no reason why the rationalists could not point out, in the areas of metaphysics and ethics, specific propositions that are implanted at birth, such as "there is a punishment after death" or "hell consists of the eternal endurance of pain by fire." Ethical propositions could be ones like "abortion is always immoral" or "bearing a child out of wedlock is immoral" or "addiction is always wrong" or "slavery is wrong." Indeed, if the latter maxim had been implanted in people, it could have saved much grief through the ages.

Innate Abilities

Some rationalists have apparently conceded that there is no such thing as innate knowledge but have fallen back on a belief in innate concepts. As in the case of innate knowledge, this idea is vague. A concept is not one simple idea. It is a generalized idea of a class of objects. It is not clear that we come equipped with concepts of triangles, rhombi, dodecahedrons, parallelepipeds, surds, complex numbers, and transfinite numbers. Do we come with set theory implanted in our brains and all the concepts involved in it? It is even harder to comprehend which nonmathematical concepts are innate, especially those involving metaphysics and ethics, which are so important to rationalists.

The claim of innate concepts is also flawed. It is much more plausible that children learn their first as well as later concepts through observation, experience, memory, and the ability to form concepts. The rationalist agenda gained some new life after linguist Noam Chomsky received attention in the 1960s. He posited that all humans are born with some innate knowledge in the form of "universal grammar": a set of interactive grammatical principles. Based on this, they allegedly go on to learn the specific languages to which they are introduced. Chomsky was not clear how humans become aware of these specific principles of grammar. It does not seem that these principles pop into our heads and that from there we go on to learn languages armed with them. As in the general belief in innate knowledge, the ideas behind the innate knowledge of language are unclear and hard to pin down.

As in the general case, the whole idea of linguistic innateness trades on confusing knowledge with ability. There is no doubt that humans come with the ability to learn, understand, and use language--even with the capability of being inventive with it. This should not imply that we come with any specific language or grammar already imprinted in our brain.(Chomsky first presented his formal approach to grammar in Syntactic Structures, published in 1957, and later developed the idea of a universal grammar found in all brains.) Instead, we come with eardrums, auditory canals, auditory nerves, tongues, vocal cords, neurons, and other bodily instruments necessary for language. Of course, whatever neurons and other physiological items are needed for memory and understanding are very important. Without memory, a child cannot even begin to build a vocabulary.

Another important ability a child must be able to develop is that of making logical inferences. This is important in both a linguistic as well as a general context. In language, it is helpful to be able to infer the meaning of a word one hears or reads for the first time. This can often be done from understanding and reflecting on the context in which the word was used. This involves the ability to use logic. The ability to make inferences also helps in a wide variety of other situations. This ability along with so many others is strengthened as a child gains experience. Brain and nervous system growth must also help. Neurological processes are more plausible as explanations of how language is learned than any specific innate principles that may already be spelled out in babies' brains.

There is one way in which some sort of innate structure in the body for learning language may possibly exist. This is if some mechanism were discovered in the brain that provided specific instructions to different parts of the brain and nervous system on what part to play in the learning of language. This would be akin to a computer program giving instructions to a computer on what action it is to take in response to different situations. The mechanism for language would be a similar set of instructions in the brain that enabled the brain processes necessary for the learning of language as well as all other types of learning. The enabling instructions would be the same in all people.
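The computer-program analogy above can be caricatured in a few lines of code. This is purely an illustrative toy under the chapter's own assumption: the "instructions" say how to learn, never what is true, and every name in the sketch is hypothetical, not a claim about actual neurology:

```python
# Hypothetical sketch of "enabling instructions": a general learning
# program that responds to situations, as opposed to a store of
# innate propositions. The same instructions would be found in everyone.
def enabling_instructions(situation):
    # Each response is a procedure for learning, not an item of knowledge.
    responses = {
        "hear_new_word": "store the sound pattern and note the context",
        "see_new_object": "form a concept from repeated exposures",
        "detect_pattern": "draw an inference from prior cases",
    }
    # Default instruction for unrecognized situations.
    return responses.get(situation, "attend and remember")

print(enabling_instructions("hear_new_word"))
```

Notice that nothing in the sketch encodes a proposition like "stealing is wrong"; it only enables the processes by which such beliefs might later be learned.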

Perhaps someday neurologists may be able to figure out the specific brain instructions that guide the required procedures. If that could be accomplished then, and only then, could there be any claim that there is anything innate that comes in any specific form. Nevertheless, this seems far removed from any rules involved in Chomsky's universal grammar. It would probably only involve a program with instructions for general learning with a few specific applications to linguistic learning. It would certainly be a far cry from the ideas and principles espoused by rationalists with respect to metaphysics and ethics.

The Indications of Evolution

Interesting questions come up with respect to a report that there is a baboon with a 2,000-word sign language vocabulary. Would the rationalists claim that she has an innate knowledge with respect to language? What are the indications that evolution favors one or the other of the two rival methods for attaining knowledge--empiricism and rationalism? Another way of putting it is this: does the evidence of how evolution has imparted to animals their adaptive ability to gain knowledge show that it has been through their senses or through reason?

Clearly, evolution has given animals the ability to gain knowledge almost exclusively through the senses. Many animals have very acute senses of sight or hearing or smell, certainly superior to humans. Evolution then has brought about development along the lines of empiricism. This is no big surprise. Even rationalists concede that humans themselves gain most of their knowledge through the senses. Of course, it is even more true of the other animals, who clearly depend heavily on their senses.

We have seen that nonhuman animals use reason, even if it is limited. Empirical reason plays a part in assessing empirical observations and arriving at subsequent conclusions. Furthermore, it would seem that a baboon would have to use reason to learn and use sign language. A dog or cat can weigh various reasons. It has been observed that octopuses can learn to open boxes, even locked ones. Evolution has then provided various animal species with the adaptive ability to use empirical reason. However, empirical reason has not been the root of the disagreement between empiricism and rationalism.

The problem has been that rationalists have held we have specific innate knowledge at birth. In this regard, nonhuman animals exhibit only a very limited use of reason and probably no innate knowledge. Evolution does not seem to have provided that. That also provides an indication that humans would not have arrived at rationalistic innate knowledge through evolution.

It is hard enough to prove that humans have innate knowledge and then much more difficult to show that nonhumans do. Innate ability and instinct are another matter. In connection with the more lofty knowledge the rationalists like to promote, it would seem that even they would be reluctant to claim that nonhuman animals acquire it at all, much less innately.

Before pronouncing that evolution lends no support for rationalism, there is one more avenue that has to be considered. While innate knowledge is the method that has been most commonly cited by rationalists for attaining knowledge, there is also intuition, however weak the proof for it may be. Their claims seem akin to those of intuition.

This brings up the general nonphilosophical claim that nonhuman animals sometimes manifest the ability of intuition. It is sometimes opined that the power of intuition appears at times to be stronger in nonhuman animals than in humans. This power can supposedly be observed in times of impending danger, such as storms that are approaching. Nonhumans show prescience in these situations while humans at the same time appear oblivious to natural conditions.

During the great Indian Ocean tsunami of 2004, it was reported that the humans on the island beaches that were hit never felt any clue before the arrival of the disaster. On the other hand, some nonhuman animals who were normally seen near the beach were nowhere around when the destructive tsunami made landfall. Allegedly, they had been seen moving collectively to higher ground a long time before the tsunami arrived. It was inferred that they were able to in some way "sense" that the wave was approaching. It has been said that nonhuman animals seem to have a "sixth sense" that they use in these situations and that humans do not have it.

The operative word here is "sense." Whenever scientists have tried to explain the phenomenon, they have talked in terms of the nonhumans having a physical sense or ability to predict what dramatic events were soon to occur. The sense helped them become physically aware of climatic conditions like atmospheric pressure that signaled a looming storm. This beneficial sense appears to be wholly consistent with an evolution that has imparted this ability to them. Innate knowledge is not even relevant here. The plausible explanations focus on a possible sixth sense and physical experiences, which in turn support empiricism.

Adequate ability is enough to explain the gaining of knowledge. It is not necessary to posit innate knowledge that would have to come in the form of specific propositions. Rationalists have confused themselves and others by seeing innate knowledge where there is only innate ability, including instinct.



6 Kant's Synthetic a Priori




Kant wanted to bolster the knowledge we gain through reason. Think of it as nonempirical reason or pure reason, as Kant referred to it. Reason here does not mean rational thinking or reasonableness but rather some special inborn knowledge. Kant was very important historically in pointing out the limitations of rationalism as it was promoted in his day. The rationalism of the time was based on Leibniz's ideas including that human minds came equipped with nonsensory knowledge that was universal and necessary. It was thus also indubitable and incontrovertible.

Kant understandably came to question whether the knowledge given by reason was always universal and necessary. His thinking was influenced by Hume's observation that there was nothing necessary and inherent whenever a cause was followed by an effect. Instead, all that could be affirmed was that the same effect was seen to follow a given cause. Kant feared that this apparent lack of certainty for causation was a threat to science. He decided to find a way to be sure that causation rested on solid ground so that science could be possible. Scholars had already been engaging in science, so it is puzzling why Kant thought it had to be made possible.

Kant was also concerned with morality. For people to be able to make sound moral choices, they had to have freedom of the will to properly make those choices. Yet Kant recognized that the findings of Isaac Newton about the always uniform motion of the planets and other objects presented a universe in which everything was inflexibly controlled by mechanical causation. This did not bode well for human free will. He felt there had to be a way to save a place for human free will for the sake of morality and, at the same time, a certain and mechanical causality for the sake of science.

This was a tall order, but he did not shrink from it. Instead, he built an elaborate (and confusing) scheme for solving the two problems. He was critical of the rationalism of his day. He identified metaphysics with rationalism, and another big question he posed was how metaphysics was possible.

Analytic and Synthetic

Before continuing, it would be beneficial to review the terms analytic and synthetic. A statement is analytic if you can determine its truth simply by analyzing its words. Just examining the definition of its terms should tell you its truth or falsity. The statement "all bodies are extended" is true because a body must always have extension by its very definition. It could not be otherwise. To say that the statement is not true is a contradiction since a body cannot be unextended by its very definition. Thus the sentence also states a necessary truth. This means it cannot be false. The denial of a necessary truth is a clear and unavoidable falsehood.

Mathematics and formal logic are analytic in nature since they are composed of analytic statements. Both subjects begin with certain definitions and axioms. The truths or theorems of these subjects are then built up according to strict rules of inference. Those truths are fully dependent on the initial assumptions. Different mathematical and logical systems can be formulated based on different initial assumptions. That all statements in mathematics and logic are analytic means that they do not depend on any object or condition found in the world and thus do not depend on discovery by the senses. A statement is analytic if its truth or falsity depends solely on the meaning of its terms.
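This picture of mathematics and logic as systems built from definitions, axioms, and rules of inference can be made concrete in a modern proof assistant. The following sketch in Lean (an illustration added here, not anything from the historical discussion) derives a conclusion from assumed premises using an inference rule alone, with no appeal to observation:

```lean
-- Given only the assumptions P and P → Q, the rule of inference
-- (modus ponens, rendered in Lean as function application) yields Q.
-- Nothing about the physical world is consulted; the theorem is
-- fully dependent on the initial assumptions.
example (P Q : Prop) (hP : P) (hPQ : P → Q) : Q :=
  hPQ hP
```

Changing the initial assumptions changes what can be derived, which is the sense in which different formal systems can be built on different starting points.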

Empiricists starting with Hume have accepted that there can be knowledge of analytic truths without dependence on the senses. However, they have at the same time denied that any analytic truths can give us knowledge about the world, i.e. about atoms, viruses, trees, hogs, mountains, and so on. All that kind of knowledge is called synthetic. Analytic and synthetic are terms that Kant coined and their use has continued to the present. Here again there has been agreement from the rationalists on analytic truths not being about the configuration of the world. Analytic truths can also be thought of as logical truths and synthetic ones as contingent truths, contingent on the state of things in the world.

Kant decided that science and metaphysics would be possible if synthetic a priori truths were possible. A priori means prior to experience; a posteriori means after experience. Statements that can be known apart from experience are a priori. Those that become known through experience are a posteriori. In this regard, ask yourself, in the following paragraphs, whether Kant sided with the rationalists in believing there could be a priori truths.

Kant asked whether synthetic a priori propositions were possible. He then assumed, with little further question, that they could be true and set out to show how they were possible in his best-known book, The Critique of Pure Reason. There are problems in Kant's epistemological scheme that put his program in serious doubt. Objections have been made that are damaging, if not fatal, to his scheme. Those criticisms are also a mark against the idea of inborn knowledge and in favor of sensory knowledge.

The Ding an Sich

In order for Kant's system to work, he had to posit a separation of things into their appearance and their reality. In taking this approach, Kant was actually resorting to an idea popular in metaphysics for many centuries. Certainly, it was a crucial part of Plato's metaphysics. In effect, Kant was engaging in metaphysics although he alleged that it was an open question whether metaphysics was even possible. According to Kant, what we observe through our senses is phenomena. Any single object is observed as a phenomenon. Behind that was the noumenon, or Ding an sich in German. That was the thing-in-itself that could never be observed. This applied to everything in the physical world.

Understand that this noumenon was not just the clumps of matter made up of particles, atoms, compounds, etc. that constitute objects. Everyone can agree that our human perception of a ball is probably different from what it actually is. Other creatures surely observe something different. To Kant and his followers, noumena were apparently more mysterious.

The question that arises from Kant's claim is how he came to know about the hidden world of noumena that was supposed to be unobservable. Perhaps Kant had supernatural powers unavailable to the rest of us by which he could know about the occult side of things, but he did not make any such claim. There does not seem to be a clear line of inference that he could have followed. If there were, it seems it would have been obvious to philosophers before him. The same puzzlement springs up with respect to the hidden reality that other metaphysicians like Plato claimed to have discovered.

Additional puzzlement arises with respect to the various abilities and compartments that Kant claimed to have discerned in the human mind. He did not derive much of it from the philosophers who preceded him, with the exception of the basic idea of categories, which came from Aristotle. Even then, Kant modified the categories and their significance. He did not mention consulting with either medical doctors or psychologists.

The fact is that all of the ideas and concepts that Kant came up with were a product of his own speculation. This, along with his preconceived intentions in studying the problems of metaphysics, science, and morals, leads to questions of reliability and bias. It was clear from all his writings that he was focused on making sure that morality was possible through freedom of the will. At one point he stated, "I have therefore found it necessary to deny knowledge, in order to make room for faith."(Immanuel Kant, The Critique of Pure Reason, 2nd ed., trans. N.K. Smith (London: 1923) 29.) He went on to write The Critique of Practical Reason, which concentrated on making morality work.

Kant's attitude does not seem to have been one of free inquiry to the extent that one would expect to see in a philosopher or other intellectual. He did not seem open enough to whatever the evidence his research might indicate regardless of how unwelcome the results. The convoluted and stilted layout of Kant's scheme with its unusual explanations and labels may be an indication that it had to be constructed in that way, given that he was so intent on establishing certain preconceived results. His bifurcation of reality into phenomena and noumena helped him in his quest to save morality.

Kant produced numerous new terms and labels for his system such as Transcendental Aesthetic, Transcendental Analytic, and Transcendental Dialectic. He certainly earned the label of jargonmeister. Psychologists were never impressed enough by Kant's scheme or its labels to adopt them in their theories. In spite of the turbidity of Kant's scheme and his language, he won many other followers and admirers in the years that followed. Perhaps it did not matter to them that they did not understand everything he said. Many, like Johann Fichte, liked his promotion of morality. Others, like Georg Hegel, selected some of his ideas to help build their own systems that were ultimately very different.

The Mind's Constructions

One of the most questionable and puzzling claims made by Kant was crucial to his system: the human mind made the objects it observed conform to it. The mind organized experience. Observation, according to Kant, did not take place as commonly thought, with the mind conforming to the fixed objects that it observed. For instance, Kant believed that the mind on its own furnished space and time, which he called a priori forms of sensibility. He apparently did not believe space was something out there apart from the observing person, something in which objects were found, something which allowed for the distances found between objects and for their measurement.

It is not known what Kant's opinion was on how nonhuman animals observed. It would seem they would also need to have the same mental powers. Or it could be that human minds created space by their perceiving it, which in turn allowed it to exist for nonhuman minds, plants, and everything else. This is reminiscent of Berkeley's "to be is to be perceived."

Consistent with Kant's thinking, space did not exist before minds appeared on the scene to create it. Yet, scientists have amassed extensive evidence that the universe came into existence about 9 billion years before the earth was formed and 13 billion years before any forms of complex life appeared. Perhaps Kant assumed that before that the mind of God observing everything provided for space, again as Berkeley had assumed. Kant did not mention this. If God's mind did hold up space, then it should not be necessary for individual human minds to create space in their perception.

With respect to time, Kant was closer to being correct in believing that it is a creation of the human mind. (I do not know of any discoveries of nonhuman animals keeping time, but that is a possibility. For instance, chimpanzees could put pebbles down in a line each time the sun sets.) Time is dependent upon motion. It takes a mind to construct an instrument, a chronometer, that itself constitutes a regular motion to keep track of all the other motions. In addition, at least one mind has to be present and able to keep a record of when the motions take place. For instance, the mind can measure motions relative to one rotation of the earth on its axis (one day) or one revolution of the earth around the sun (one year). Kant would probably not have seen time as simply as that. Instead, he seemed to see space and time as much more powerful and mysterious. They separate us from the noumena, the things-in-themselves.

An interesting situation presents itself in the case of time-keeping trees. Remember that some trees keep a record of annual time, as demonstrated by the circles found in their trunks when they are cut horizontally. It is said that each circle, called an annual ring, represents a year that passed during the life of the tree. Apparently, trees on earth showed these records even before minds appeared. Minds were not necessary for the making of these time records. It could be claimed that time was being kept without mind.

But is that true? The trees were just living from year to year. They did not intentionally keep the annual time. They were not even aware of it nor was anything else. It took a human mind to discover the phenomenon and to understand it as a record of time.

Different Perspectives

You spot a boulder on the side of the road. Most of its color is a light tan, with some lighter areas that are almost white and some areas that are a darker tan. It is almost in the shape of a cube with slightly rounded corners, except that one corner is especially sharp. You are confident from past experience that other people would observe and describe it much the way you do, although there might be a few persons who would see minor differences.

One major difference can be found that is related to color vision. There is a small but significant percentage of people who are "color blind"; they don't see all of the colors that others do. With respect to sound, there is a minority that is tone deaf.

Then there are the differences with other creatures. For instance, it is said that dogs are all naturally color blind. What about other animals or insects? It is credible that a cockroach standing twenty feet from the boulder would see something different from a human. An alien from another galaxy might well see something that differs significantly.

In spite of these differences, it is plausible to think that there is something that is basic and constant behind all the different perspectives. Locke pointed this out. The images that different creatures observe are relative to their own given perspective. No single one of them can be chosen as being the correct one, not even the human one. However, it can be speculated that there may be something behind the various images that is not variable and does not depend on how it looks to any one individual or species. That something hidden can be thought of as the noumenon.

Yet you would think it is nothing more than the aggregate of atoms and molecules that comprise the object. For someone reading Kant for the first time, it can seem that this is what he meant and nothing more. That is the way I at first understood what he was saying. But he meant much more, and it is more far-reaching. From him, it seems more profound, even mystical.

The Categories

Beyond space and time, the categories were another important foundation block of Kant's system. They were a modification of Aristotle's set of categories. To Aristotle, they were simply occurrences to which the mind adapted itself. Again, as in the case of space and time, Kant posited that the categories preexisted in the human mind. They were not formed in the mind as the result of experience but were present a priori, before experience. They were also not analytic because they were not derived from analyzing the words in a definition. They were statements about the physical world and were labeled synthetic because they were not analytic. They were synthetic a priori. The mind imposed certain characteristics such as space and the categories onto the world.

Kant has not been associated with the doctrine of innate ideas. Yet in his saying that humans possess certain concepts such as the categories without the benefit of experience, it would seem that he was referring to something innate. It has always been considered that Leibniz was an innatist, but Kant has not usually been given that label. Nevertheless, his classification of these concepts would seem to place him in the family of innatists.

Kant set out twelve categories that he thought had to be known a priori. They were fundamental to all knowledge and thus, according to him, could not become known through the experience furnished by the senses. They had to be instantiated in the mind from the very beginning. Kant was still under the sway of the rationalists with their distrust of the senses. On his view, the meaning of the categories was equally apparent to all, which made things much easier. After all, if you surmise that the categories may be a product of experience, you have to concede that people with different experience may come up with a different understanding of the categories.

To decide whether the categories really have to be known a priori--prior to experience rather than as a result of it--only three need to be examined: plurality, substance, and causality. To do this, it would be helpful to ask how young children might come to learn about these categories.

Plurality. According to Kant we are born being able to understand that there are multiple objects around us. Is that necessarily true? Take the case of a child in a crib who is first learning about things around her. At times there is only one human in her line of vision (perhaps one of her parents) while at other times there are two (maybe both parents), or there can at times be more than two (siblings, grandparents, or visitors). Eventually she may be given a toy for her amusement. Then at later dates, she may gradually be given several more toys. Say that the family has three pets: one dog and two cockatiels. Sometimes the child sees one pet, at times two, and at other times all three.

As this child grows up, she will encounter numerous other examples of the difference between one and many. This is just like the experience of all other children. Eventually when she learns a language and develops enough mental maturity, she will become able to articulate the meaning and difference between singularity and plurality. There is no need to suppose that the two concepts are so fundamental and difficult to grasp and apply that humans have to be born with them in their mind. Experience gradually teaches the young child.

Then there is the observation that other animals probably have a sense of the difference between singularity and plurality. A dog can readily distinguish between taking one bone and three bones and would very likely choose the latter, even if he does not eat them all at once. The extra two bones can be saved for later. It is doubtful that animals have an a priori sense of plurality. They probably gain the sense from experience. It is hard to know precisely how human children learn about the plurality of objects. There is a good chance that they gain it through experience, but it may never be possible to understand exactly how it occurs. Therefore Kant had no right to conclude that they know it a priori.

Substance. In looking at Kant's reasons for alleging that substance was known a priori, it may be helpful to review the following points he made in The Critique of Pure Reason. Numbers in the statement are added for ease of reference to the three points:

[1] If we remove from our empirical concept of a body, one by one, every feature in it which is [merely] empirical, the color, the hardness or softness, the weight, even the impenetrability, there still remains the space which the body (now entirely vanished) occupied, and this cannot be removed. [2] Again if we remove from our empirical concept of any object, corporeal or incorporeal, all properties which experience has taught us, we yet cannot take away that property through which the object is thought as substance or as inhering in a substance (although the concept of substance is more determinate than that of an object in general). [3] Owing, therefore, to the necessity with which this concept of substance forces itself upon us, we have no option save to admit that it has its seat in our faculty of a priori knowledge.(Immanuel Kant, The Critique of Pure Reason, Norman Kemp Smith, trans. (New York: St. Martin's Press, 1969) reprinted in Jack Crumley II, Readings in Epistemology (Mountain View, California: Mayfield Publishing, 1999) 528, 530.)

The first observation to be made about Kant's comment (1) is his reference to the "empirical concept" of a body, which ultimately sounds very far removed from what scientists and many philosophers would consider it to be. This is because he apparently was thinking of his idea that behind the "empirical concept" is the thing-in-itself. He had to think that it was possible to peel away, one by one, the properties of a body and yet have something left after eliminating the weight.

It seems that he thought that having space left where the body had been located showed that the substance that comprised the body was still there. How could that be? There was nothing left of the body. Kant was correct in saying that the space "cannot be removed," but certainly another body could be moved into that same space. This happens all the time. That is because there is nothing left of the substance of the first body. He further concluded that the concept of substance was a necessary one. He must have assumed that because substance is so ubiquitous. It was not shown by his thought experiment.

It is hard to see what Kant meant when he talked of removing the body's weight and impenetrability since those are very much required in speaking about a body or object. Substance is almost synonymous with weight. Substance is considered to be matter or material that constitutes a body. A person of substance is thought to be one who is solid and therefore dependable. Therefore, to say you are removing the weight is to say you are removing the body. You can’t have substance without a body. Substance is not a mysterious entity. It is accurate to think of it as a clump of atoms and molecules whose form and composition depends on the kind of object it comprises and its properties.

Kant's comment (2) states with respect to properties what comment (1) states with respect to features. (The distinction between features and properties is unclear.) It cannot be correct if we consider weight as a property. We cannot take away the property of weight and have anything left. Even a subatomic particle such as a lepton has weight. If an object has no weight, it is nothing. Only intangible items like beauty or honor can be considered to exist without having any literal weight. Kant was not referring to intangible concepts. Kant's idea of weightless substance indicates that to him noumena were something nonphysical, something akin to spirit.

What is puzzling in comment (3) is Kant's quick conclusion that because substance is so universal in objects it can only be known outside of experience, i.e. solely through the mind, a priori. For Kant, it could not be that substance is universal simply because it comprises the ubiquitous objects that we perceive repeatedly in our experience, existing out there in space. In spite of Kant's belief that he held a progressive view with respect to rationalism, he still clung to the old view that only the mind was capable of grasping important and universal insights. For him, it could not be that the senses in concert with the mind could come to know universal truths.

Knowledge of substance could come about this way. After many experiences touching, holding, and beholding objects, a young child becomes amply familiar with certain properties such as weight, impenetrability, and solidity. As years go by and the child talks to parents, siblings, and others, it becomes aware that they have the same experience of objects through making comparisons with them.

People probably have a good idea of it at a very early age, let's say at three. After all, by that age, they have touched numerous objects including toys and food to give them a good idea of their solidity and impenetrability. Some have had heavier objects hit them. Unfortunately, they may have fallen on a hard floor, felt pain, and thus experienced the solidity of objects in an unpleasant way.

It eventually becomes clear to the child--what every adult takes for granted--that objects are composed of different amounts of matter that determine their particular mass or weight. Later in science class, the child learns that it is believed that matter is composed of atoms of different sizes that determine the different elements. Afterwards, the adult, now with years of experience at observing a multitude of objects in a variety of ways, takes it for granted that all objects are made of substance even if they exhibit a number of different consistencies.

The concept of substance has become an indelible truth that no one would dare to seriously contradict. It is something of a necessity that forces itself upon us, as Kant put it, yet that certainty is reached through repeated experience and through intersubjective comparison with the observations of other people. The consistency of the appearance and nature of objects is also evidence that they consist of substance. There need be no assumption that the concept of substance is provided to the mind a priori.

Surely no rationalist would claim that a baby would be able to articulate her/is understanding of a priori concepts like substance. The baby would not even be able to explain its fears. Different fears, desires, and other emotions may be innate and may bring about certain familiar behavior, but that does not indicate in any way that there would also be inborn, articulable, specific ideas about substance and the like. In any case fear and other emotions cannot count as ideas and are not one of Kant's categories.

The 12 categories that Kant set out should be articulable by anyone who allegedly has knowledge of them. It seems fundamental that if someone has knowledge of a fact, including a priori knowledge, they should be able to explain a little about what it is they know. In other words, they should be able to articulate their knowledge. The categories that Kant set out were not simple impressions that one could have. For instance, they were not something like having a large tiger growl in one's face. Babies might understandably cringe from that, and rationalists could then claim there was knowledge involved in the fear of the tiger.

The reaction of the baby to fear is not knowledge. The categories are more complex than that and demand more of an intellectual understanding and therefore an articulation. Such an understanding and articulation cannot come about until a child learns a language, and even then it has to reach a sophisticated level.

That level of linguistic knowledge would probably not appear until at least the age of three. At that point, a child could conceivably discuss and demonstrate that s/he understood categories like possibility, necessity, plurality, and substance, but her/is knowledge would not need to be a priori. By then, s/he would have passed through much experience in the world and could well have become acquainted with such concepts in that manner. Very likely the child would have parents or tutors who would have introduced and taught such concepts. Certainly, rationalists could not insist that such knowledge had been attained prior to experience. There is ample reason for believing that knowledge of the features of the world, including even the categories, is attained by children through their experience, i.e. a posteriori.

Causality. A very important category for Kant was that of causality. "Everything that happens has its cause," he said, and correctly pointed out that it was not an analytic proposition since the truth of the statement could not be analyzed out of the statement itself in any way.(Id. 531) However, he further insisted that, as in the case of substance, the statement was not merely a truth but a necessary truth. Upon reflection, it is difficult to see the necessity in it. It is hard to see how Kant could have been so confident in that belief. He did not explore the possibility that it was not necessary.

A necessary truth is one whose contradiction cannot possibly be true. The contradiction of "everything that happens has its cause" is "some events happen without a cause." That contradiction need not be false. Some events may occasionally occur that do not have a cause; there is no guarantee against that happening. Scientists would be surprised if it happened, but nothing rules it out. In Book I, biologist Ernst Mayr and geologist Robert Hazen mentioned that random events have happened in the development of the world. Quantum events appear to happen randomly in the subatomic realm. The proposition is then not a necessary truth, since it appears that some events can happen without a cause.

One can think of instances in which it appeared that an event did not have a cause. Examples could be a darting meteor in the night sky, an unexplained fire, or a surprising new allergy in an adult. A veterinarian stated that pyelonephritis (a kidney disease) can be difficult to diagnose and speculated that for this reason it could be a spontaneous disease. There is no doubt that some people have wondered whether illnesses, which have suddenly appeared wholly without expectation and without explanation, may have been spontaneous.

A woman comes down with lung cancer, yet it is hard to discern a cause. She never smoked, had always lived in a small town that was not subjected to air pollution, had never worked in any occupation or location that posed a risk for contracting lung cancer, did not have relatives who had become ill with cancer of any kind, had not lived or worked in a building where anyone smoked inside. It would still be assumed by doctors and others that there must have been some cause for cells in her lungs to start multiplying uncontrollably as they do in cancer. Might there not be a few cases in which cells simply start to multiply and become cancer cells?

In cancer, there is the phenomenon of spontaneous remission. This was recognized many years ago. Spontaneous remission is the sudden, unexplained disappearance of cancer after the disease was underway. Doctors have not been able to come up with an explanation for it, but it appears to occur on rare occasions. Perhaps just as there is spontaneous remission, there can also be spontaneous occurrence.

There are scientific laws that have been taken to be incontrovertible, but they are instead well-founded assumptions that are very likely never to be contradicted. Examples of such laws are (1) matter-energy cannot be created or destroyed and (2) nothing can go faster than the speed of light. These are true in view of the many experiments that have been performed and the confirmation of other scientific facts consistent with them. Yet, there is nothing that says that the universe could not change significantly along with some of its physical laws. It is highly unlikely that this would ever happen, and there is nothing to be feared. Nevertheless, it is useful in philosophy to discuss these possibilities because doing so helps gauge whether the maxim is a necessity. It is not. Its contradiction is not an absurdity but only a remote possibility.

Say that on October 10, 1512, a hydrogen atom appeared in the space near the planet Saturn spontaneously without cause. Who can guarantee that did not happen? If not then, at some other time and place. Then there are the many mysterious events that people have claimed have taken place throughout history--cases that have "defied scientific explanation." No doubt almost all of these cases have had some explanation, even if it was one that was not yet known to humans. Nevertheless, it cannot be guaranteed that there might not have been at least some of these events that were spontaneous occurrences. For the maxim to be proven wrong, all that is needed is for it to be possible for one causeless event to take place.

Then there is the uncaused cause that many people, such as Jews and Christians, believe in, and that is God. They are not clear on what was the cause of God. They usually proceed on the quiet assumption that he has always existed and do not question it. However, if God always existed, then there was no cause for his existence and that violates Kant's maxim that "everything that happens has its cause." So, this is another example by which, at least according to some theists, the maxim does not hold.

Kant was wrong in his claim that "everything that happens has its cause" is a necessary truth. Kant assumed the maxim was infallible because it is so pervasive throughout nature as well as in everyday events. Even if it were necessary, that would not prove it was a priori because it would not mean that it was known prior to experience.

Not all people are even aware of the claim that "everything that happens has its cause." Even those who are aware of the maxim probably do not know how certain it is. If pressed to give an opinion, people would vary. Some would no doubt think it was a matter of repeated observation. The differences of opinion would certainly count against the proposition being a priori.

Causality is a matter of human experience repeated many times over. It is a well-founded assumption that it always takes place without fail. Yet no one has ever pointed out what prevents it from failing. Hume was right. Causality is constant conjunction that we observe repeatedly, but nothing can be found that proves it a necessity. Kant never made any argument that refuted Hume. Those who believed Kant over Hume on causality simply wanted to believe, feeling greater security that there is an unbending causality.

How Are Synthetic A Priori Propositions Possible?

Review of types of statements:

Analytic - truth determined by analysis of their terms

Synthetic - truth not determined by analysis of their terms

A priori - truth not determined by the evidence of experience

A posteriori - truth determined by the evidence of experience

Synthetic a priori propositions are possible because of the way Kant characterized them. First, he drew the class of analytic propositions too narrowly. Consequently, synthetic propositions were given too wide a range. It was not that he did anything devious; he surely thought his characterization of what is an analytic statement was accurate. It was not: it left out too many statements that depend for their truth on the definitions of their terms.

One good example of this is one that Kant liked to discuss: 7 + 5 = 12. He claimed that this was a synthetic a priori proposition. He could do this because he defined an analytic statement as one in which the predicate could be analyzed out of the subject. In other words, the truth of the sentence is a result of the predicate being contained in the meaning of the subject. Examples are "bachelors are unmarried males" and "squares are rectangular." The first sentence is true because an analysis of the word "bachelor" reveals it to mean an unmarried male. The second sentence can be analyzed, and it will be seen that its truth also depends on the definition of its subject, "squares," which are equal-sided rectangles.

Since the truth of an analytic statement depends on the correct application of a definition, it is also always a necessary truth. There is no way that an analytic statement can be false, because it depends solely on the correct use of a definition or definitions. Its contradiction is always false because it is a misapplication of the definition. As Kant said, "The common principle of all analytic judgments is the law of contradiction."(Immanuel Kant, Prolegomena to Any Future Metaphysics (Indianapolis: The Bobbs-Merrill Company, 1950) 14.) Sentences like the two above have also been classified as a priori because the information they provide is not based on experience but is solely a matter of the meanings of the words in them. Such sentences are then counted as analytic a priori.

Kant's method of analysis leaves out too many sentences that should be counted as analytic. This is why he could say that the statement "7 + 5 = 12" is not analytic. He pointed out:

The concept of twelve is by no means thought by merely thinking of the combination of seven and five; and analyse this possible sum as we may, we shall not discover twelve in the concept.(Id. 16)

However, if one takes a less restrictive approach to the meaning of analytic, there is a different result. Moser and vander Nat proposed this alternative definition: analytically true propositions are true solely in virtue of the meanings of their constituent terms. The authors went on to say, "Such propositions cannot be denied without inconsistency."(Moser and vander Nat, 12) A definition like that requires analyzing the definitions of all the words in the sentence, not merely what can be revealed from an examination of its subject.

Given this more flexible approach, it becomes clear that "7 + 5 = 12" is analytic. The truth of the statement becomes evident once the definitions of all the terms are understood, specifically the definitions of arithmetic. These include not only the definitions of the terms but also the rules involved. The rules are set and inflexible, allowing no deviation: adding two numbers can only produce one specific third number. "7 + 5 = 12" is just one of the countless examples of those rules in action.

Using Kant's method of testing for analyticity, it cannot be determined from the subject "7 + 5" alone that the predicate "12" yields an analytic statement. But examining the definitions of the terms and of the rules of arithmetic, since it is obviously an arithmetical statement, reveals clearly that it is true. It is analytic because nothing else need be done: to determine its correctness, all that is necessary is to look at the pertinent definitions; no empirical observations need be made. Furthermore, to deny its truth is patently wrong, and its contradiction is clearly false.
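The point can be illustrated mechanically. In a proof assistant such as Lean, the equation is accepted purely because both sides compute to the same value from the definitions of the numerals and of addition; no appeal to experience enters. (A minimal sketch, assuming Lean 4's built-in natural-number arithmetic.)

```lean
-- "7 + 5 = 12" follows from the definitions of the numerals and of
-- addition alone. `rfl` (reflexivity) succeeds because both sides
-- reduce to the same value by definition -- no observation is needed.
example : 7 + 5 = 12 := rfl
```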

Color as Analytic

A set of propositions that rationalists like to claim is not analytic involves color. One such proposition is "everything that is red is colored." Roderick Chisholm discussed these propositions.(Roderick M. Chisholm, Theory of Knowledge (Englewood Cliffs, N.J.: Prentice-Hall, 1966) 88-89.) He claimed that the subject could not be analyzed out in the style of Kant, i.e. in the way it can be done with the undisputedly analytic sentence "everything that is square is rectangular." There the subject "square" can be replaced by the conjunctive term "equilateral and rectangular": the conjunct "equilateral" is not synonymous with "square," but the conjunct "rectangular" is the same as the predicate "rectangular." According to Chisholm, it is not possible to "analyze out" the sentence "everything that is red is colored" because no conjunctive term can be found to substitute for "red." Chisholm put even more restrictions than Kant on how one can show analyticity; for instance, he sometimes required the use of conjunctive and disjunctive phrases as substitutes for the subject.

"Everything that is red is colored" cries out for confirmation as an analytic statement. Anyone who understands what it says would say that it is obviously true from the meaning of the words. Then there is the test of any analytic statement of whether its contradiction is false. Try out a contradiction to "everything that is red is colored." “Nothing that is red is colored” is patently false. So are “something that is red is not colored” and “everything that is red is not colored.”

It would seem that one could use several substitute phrases that define red without their being conjunctive. One is "everything that is the color of a cherry is colored." Shortened, that would be "everything that is cherry colored is colored." Any other well-known red-colored item can be substituted for cherry, such as raspberry, ruby, tomato, blood, or fire truck. We were often taught the definition of red by being told to imagine examples like these.

Then there is the fact that colors call for ostensive definition: they can simply be pointed out to us. Mom or dad or someone else pointed to an example of red and declared the color was called "red." Perhaps at some point we were shown a chart with a variety of colors and told the name of each one.

With the use of ostensive definitions, analytic statements about color can be analyzed out, or at least a large number of them can be. One can break down "everything that is red is colored" by pointing to a cherry and stating, "Everything that is colored like that is colored." Try a contradiction of that statement on someone and see the kind of look you get.

Another definition, suggested by a dictionary, is "the appearance of the light produced by radiant energy with wavelengths approximately 630 to 750 nanometers." Substitute that for "red" and there can be no doubt that a color is at issue. Whichever method is used, it should be obvious that red is a color once we understand the definition of color. Red is defined as a color and so cannot be anything other than colored. The definition is not ambiguous, and the sentence "everything that is red is colored" has to be analytic.
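The definitional point can be made concrete with a small sketch. It is only an illustration under stated assumptions: "red" is modeled as the 630-750 nanometre band quoted above, and "colored" as falling within the visible spectrum (the 380 nm lower bound of visibility is an assumption not in the text). Once the bands are so defined, "everything that is red is colored" holds by definition alone:

```python
# Illustrative sketch: "colored" = within the visible spectrum;
# "red" = within the band the dictionary definition above gives.
# The 380 nm lower bound of visibility is an added assumption.

VISIBLE_NM = (380.0, 750.0)  # approximate visible spectrum
RED_NM = (630.0, 750.0)      # approximate red band, per the quoted definition

def is_colored(wavelength_nm: float) -> bool:
    """Colored = within the visible spectrum."""
    return VISIBLE_NM[0] <= wavelength_nm <= VISIBLE_NM[1]

def is_red(wavelength_nm: float) -> bool:
    """Red = within the red band, itself a sub-range of the visible."""
    return RED_NM[0] <= wavelength_nm <= RED_NM[1]

# Because the red band lies inside the visible range, "red implies
# colored" holds for every wavelength -- by definition, not observation.
assert all(is_colored(w) for w in range(0, 1001) if is_red(w))
```

The assertion never consults the world; it merely unfolds the two definitions, which is the sense in which the statement is analytic.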

Another type of statement that is claimed by rationalists as synthetic a priori is called a "disparate," a term coined by Leibniz. A well-known example is "nothing that is red is blue." Chisholm claimed that no one had been able to show it is analytic.

Here are two candidate statements that may show the example is analytic. The first is "nothing that is cherry colored is blue." Then there is "nothing that is cherry colored is daytime-sky colored." This is true as a matter of the definitions of the colors red and blue. Alternatively, ostensive definitions could be used as before: one could state, "Nothing that is this color is that color" while pointing to a cherry on "this" and to the daytime sky on "that." Of course, the same approach could be taken with any two different colors. These disparate color statements hold in virtue of the definitions of their terms. That should be all that is needed to establish that "nothing that is red is blue" is analytic.

Color as a Posteriori

Now examine the idea that "nothing that is red is blue" is based on experience and is therefore a posteriori. That would deny that it is a priori, as Kantians claim. When we are small children, we are taught the names of the different colors. In the following years, we come to realize that an object of a particular color is never seen to be of another color as well. We learn this through repeated observation of numerous instances in which no two colors appear to cover a surface at the same spot at the same time. We compare observations with other people and find that they agree. Eventually, we become convinced that it is a fact of nature that no two colors appear on the same surface at the same time.

All this knowledge is gained by experience. In other words, it is empirical, gained a posteriori. It may be that certain facts related to color are rigid and that it cannot be imagined how they could be otherwise. Nevertheless, those rigid facts are what is presented by nature to us and not a matter of language involving terms that someone has previously defined. Rationalists themselves admit that even definitions of items can be presented to us from experience.

The first statement examined above, "everything that is red is colored," turns out to be acceptable as a matter of definition. The second statement is different. "Nothing that is red is blue" depends more on the reality that is visually found in the world, i.e. that the colors red and blue are always found to be separate. It is much like the fact that nothing that is a tree is a rock. It is ultimately a matter of what the world presents, not a matter of definition.

Another indication that many color statements are a posteriori is the following objection: some things can be both red and blue. When that happens, the resulting color is purple. It is well known that other combinations of colors also produce different colors, and the colors produced have traditionally been given names, like purple. Still, why can't that color be thought of as both red and blue? If you paint a door by mixing red and blue paint, the ingredients of both paints are still present; they just mix in a way that comes out as purple to our vision.

An answer to that could be that it is undeniable that red and blue are no longer seen, only purple. Therefore, the door cannot be said to be both red and blue. A counteranswer could be that red and blue would still be found to be present if a chemical analysis were made.

Here is another way the sentence "nothing that is red is blue" could be true with respect to a painted door. Suppose the door were the color red before it was painted blue. Might not a chemical analysis show the coat of red paint is still present under the coat of blue paint, just not visible to the human eye? What about a door that is painted red on the left half and blue on the right half?

People blind from birth can never really learn what red or green or blue are. They can never have the experience of observing actual colors. The best they can do is associate certain color words with items they were informed possess those colors. For instance, they can be told that the daytime sky and the water in the ocean are blue, that grass and tree leaves are usually green, that the sun is yellow, and so on. However, there is no way for them to really "know" what those colors look like. On the other hand, everyone who has sight can actually see colors and truly know them. That means that true knowledge of colors (and certain other items) is gained only through experience, not through the analysis of statements. A sentence about color may well record an empirical observation that has to be classified as synthetic and a posteriori. The classification depends on the features of colors in the physical world.

The fact that colors are provided by the spectrum found in light further demonstrates that color is a physical phenomenon, with any questions to be settled by resort to the facts established by physics. The color statements that have been proposed as synthetic a priori are therefore not that; they are instead a posteriori.

It has been easy to mistake color statements for a priori ones because colors and their properties have always appeared so rigidly set. Their nature has been so inflexible that it has not seemed that we become acquainted with them through experience. Colors always maintain the same position in the spectrum of light, so they seem simply to have been initially defined rather than discovered in nature. Ironically, these color statements may qualify as analytic a posteriori: analytic because color definitions are so rigidly set, and a posteriori because they are ultimately known only through experience.

Restriction on a Posteriori Statements

In addition to color statements, there are other statements that have been promoted as synthetic a priori but that ultimately may be a posteriori. An unduly narrow restriction was put on what can qualify as an analytic statement. Likewise, an unwarranted restriction seems to have been put on what can qualify as an a posteriori statement, which is supposed to be one "learned or justified on the basis of experience (in the way empirical generalizations are)."(Moser and vander Nat 20)

Somehow it came to be accepted in philosophy and science that statements such as "every event has a cause" and "nothing comes from nothing" are not learned or justified on the basis of experience.(Id. 19-20) We saw above that this way of looking at them seems strange. In order to know enough to assert that either claim is true, one would have to rely on experience. To be entitled to assert that "every event has a cause," one would have to review every event that has ever taken place and guarantee that it has a cause.

Much the same conclusion has to be reached in regard to the proposition that "nothing comes from nothing." The statement is part of the statement proposed by scientists a long time ago: matter can be neither created nor destroyed. Physicists are highly certain that this is the case, but it cannot be said with total certainty. It could turn out that matter (and its alternative form, energy) could, in rare instances, be created or destroyed. Cosmology has not yet shown that the universe could not have begun by the creation of energy and matter from nothing. It is ultimately an empirical question.

There are two more alleged synthetic a priori propositions worth considering. There is good reason to believe them to be analytic if deductions from the definitions involved are allowed. The propositions are (1) anything that is colored is extended and (2) anything that is a cube has twelve edges.(Roderick Chisholm, "Reason and the A Priori," Philosophy, R. Chisholm, H. Feigl, W.K. Frankena, J. Passmore, eds. (Englewood Cliffs, N.J.: Prentice-Hall, 1964) 287, 306, reprinted in Crumley, 541, 549.) The definitions involved in these statements are not as easy to point out and use as those in the statements previously reviewed.

Statement (1), "anything that is colored is extended," has a very simple predicate, "extended." It is reminiscent of the old proposition "all bodies are extended" which philosophers have agreed is analytic. After all, it is inherent in the idea of a body that it is extended. It is contradictory to hold otherwise.

Looking at the subject portion of the statement, "anything that is colored," we may ask what can qualify to be colored. To be colored, an item must be a physical object, no matter how small. There are no intangibles that have color. No one would talk about things such as beauty, ambition, the unemployment rate, forgetfulness, or transportation as having color except in a figurative sense (as with the concept of "local color" in literature). One also deduces from the nature of color that it is a physical phenomenon that depends on light. The next step is to conclude that "anything that is colored is a body." This gives us premise (1) in the following logical deduction:

Premise (1) Anything that is colored is a body.

Premise (2) All bodies are extended.

Premise (3) Anything that is a body is extended. (This restates premise (2) in conditional form.)

Conclusion Anything that is colored is extended.
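The deduction above has the classical syllogistic form Barbara. Writing C for "is colored," B for "is a body," and E for "is extended," it can be set out in predicate logic (a standard rendering, not the author's own notation):

```latex
\begin{align*}
&\forall x\,(Cx \rightarrow Bx) && \text{premise (1): anything colored is a body}\\
&\forall x\,(Bx \rightarrow Ex) && \text{premises (2)/(3): any body is extended}\\
&\therefore\ \forall x\,(Cx \rightarrow Ex) && \text{conclusion: anything colored is extended}
\end{align*}
```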

The approach to establishing the statement as analytic is not as simple as examining clear definitions like those found in other analytic statements; more logical steps are involved here. Premise (2) is analytic based on the definition of a body. Premise (1) is the more problematic one. It can be set out more explicitly and established by the following syllogism:

(1a) Anything that is colored has to contain a surface on which the color is seen.

(1b) Anything that contains a surface is a body.

(1) Anything that is colored is a body.

It is clear that the conclusion "anything that is colored is a body" is established by a series of incontrovertible statements. It would appear that the contradiction of any of the premises is false. The conclusion can then be claimed to be analytic in the wider sense that is derived by a logical analysis that depends on premises that themselves are incontrovertible and dependent on definitions. The process is definitional and logical rather than empirical. Remember that formal logic is considered a priori.

Some empirical observation was involved in this process. Premise (1a) could be counted as an empirical observation in pointing out that color has to be found on a surface. This is admissible because philosophers have accepted that analytic propositions can ultimately depend on empirical observations. For instance, "gold is a yellow metal" depends on observers being acquainted with gold.

If anyone wants to disagree with this, then perhaps "anything that is colored is extended" and other statements like it should be considered a posteriori. There is something to be said for this approach since there is often a dependence on initial empirical observation in the establishment of both analytic and synthetic a priori statements.

Take the test that an analytic statement is one whose contradiction is false. The contradiction of "anything that is colored is extended" can be put in different forms. The one that would seem to have the best chance of being true is "there is something that is colored but not extended." This is hard to imagine and accept. How can something have no dimension, no length in any sense, and yet be able to show color? One can imagine a point not being extended, but in reality, if such a point exists in physical space, it has to have three dimensions, no matter how small; one just has to be close enough to discern them. The conclusion has to be that any contradiction of the statement must be false, which is added proof that the statement is analytic.

The other supposedly synthetic a priori statement is "anything that is a cube has twelve edges." It is consistent with Euler's Formula. It follows from the definition of a cube: a solid object made of six congruent square faces. Here is a proof.

(1) Take any two opposite squares of a cube.

(2) The combined number of edges of the two squares is eight.

(3) The only other edges on the cube are those between the four squares that connect the first two. The number of these edges is four.

(4) The total number of edges of the cube is then twelve. Q.E.D.
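The count agrees with Euler's formula for convex polyhedra, which relates the numbers of vertices V, edges E, and faces F. For the cube, with 8 vertices and 6 faces:

```latex
V - E + F = 2 \quad\Longrightarrow\quad 8 - 12 + 6 = 2.
```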

A cube always has the exact same shape, only the size varies depending on the length of the edges. It becomes obvious that a cube will always have twelve edges. How could a cube have eleven or thirteen or any number of edges other than twelve and still qualify as a cube under the definition given? The invariant property of twelve edges of a cube must follow logically from its definition.
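The invariance can also be checked computationally. A minimal sketch, offered only as an illustration: represent the cube's eight vertices as 0/1 coordinate triples and count the pairs that differ in exactly one coordinate, which are precisely the edges.

```python
from itertools import combinations, product

# The 8 vertices of a unit cube, as 0/1 coordinate triples.
vertices = list(product((0, 1), repeat=3))

# An edge joins two vertices that differ in exactly one coordinate.
edges = [(a, b) for a, b in combinations(vertices, 2)
         if sum(x != y for x, y in zip(a, b)) == 1]

print(len(vertices), len(edges))  # 8 vertices, 12 edges
```

Whatever the cube's size, the same count results, since the count depends only on the definition of the figure, not on any measurement.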

Actually, the solid figure need not be a cube in order to have twelve edges. It can be any solid figure with six four-sided faces, whether congruent or not; the edges can even be curved. The parallelepiped with its straight edges is a special case of such a figure, and the cube in turn is a special case of the parallelepiped. The cube statement is analytic. It is also a priori because it becomes understood as a truth through a process other than experience.

Objects in the physical world suggest the abstractions studied in mathematics. For instance, arithmetic was surely first suggested by the need to count, whether it be apples, pears, or people. Starting with the obvious, simple ideas, a more complex system was built. It was all an abstract edifice with the truths in the form of theorems, corollaries, etc. built on the simple initial assumptions. There was no empirical process involved except that items observed in the everyday world suggested the assumptions to be made. The method of proceeding after that is a purely logical process. There is not even a need that the specific entities of each mathematical study conform to anything that exists physically. It can be solely a study of abstractions. This is why a statement about a cube having twelve edges and Kant's example "7 + 5 = 12" can be considered analytic. They flow solely from the use of logic applied to the initial assumptions. They cannot be said to be synthetic. Instead, they are each part of an abstract mathematical system: solid geometry and arithmetic, respectively.

Kant could not call results in arithmetic analytic because he believed mathematics to be part of the real world rather than an abstraction produced by human minds. He believed that standard Euclidean geometry was intimately tied to the physical universe because he judged that space was measured solely by employing that geometry. In connection with long distances in space, astrophysicists in the 20th century found that non-Euclidean geometry, with its alternative axioms, was more suitable. Non-Euclidean geometry makes more room for dealing with curved lines, an aid in studying space.

Bertrand Russell pointed out, "The most important part of The Critique of Pure Reason is the doctrine of space and time."(Bertrand Russell, A History of Western Philosophy (New York: Simon and Schuster, 1945) 712.) Yet Russell seemed to have had reservations when he added, "To explain Kant's theory of space and time clearly is not easy, because the theory itself is not clear."(Id.) That Kant was mistaken in his estimate that arithmetic and geometry were synthetic does not inspire much confidence in his ideas on space and time, since they in turn rest on his ideas of arithmetic and geometry. Leibniz and Hume believed mathematics to be analytic, as have many other philosophers.

Status of the Contingent a Priori

In Naming and Necessity, Saul Kripke came up with a statement that he offered as an example of a contingent a priori truth. I believe that it was on the basis of this example that supporters of the synthetic a priori claimed it could also serve as an example of the synthetic a priori. That does seem like a plausible assumption.

Kripke's example was "Stick S is one metre long at time t0."(Saul Kripke, "A Priori Knowledge, Necessity, and Contingency," Naming and Necessity (Cambridge, Mass.: Harvard University Press, 1980) 53-56.) The statement is clearly a contingent one, since its contradiction, "stick S is not one metre long at time t0," could very well turn out to be true. It is not necessary, which is the opposite of contingent. To anyone who reads it as if S were an ordinary stick, it is clearly an a posteriori statement: it declares a fact that can only be determined by measurement, which in turn involves sensory observation, i.e. it cannot become known a priori. The statement is then contingent a posteriori. This would seem to invalidate Kripke's contention that it was a priori.

Kripke could maintain his position because his stick S was not just any one-metre stick. It was the stick in Paris that provided the standard for the length of a metre. Since the stick was being used as a standard, Kripke pointed out that its length of one metre would not be known through experience but would instead be known a priori: "[H]e knows this automatically, without further investigation . . ." Earlier, Kripke had pointed out that the phrase "the length of S at t0" (without mention of "metre") did not designate anything rigidly because the stick might have been longer or shorter "if various stresses and strains had been applied to it." According to Kripke, the statement "stick S is one metre long at t0" (the Paris standard stick) is different because it designates "one metre" rigidly. For this reason, he thought it must be known a priori.

The answer to Kripke is that making "one metre" the rigid designator of the length of S would not change the empirical (a posteriori) status of the statement. The length of stick S could still change due to stresses and strains taking place at time t0; the length of a metre would simply change with it. It would then be different from what it would otherwise have been, and different in comparison with another standard length, say the yard. That new difference would have to be assessed empirically, by measurement. The length could also change if some mischievous person were to pull a prank by hacking 10 millimetres off one end of the stick. The stick could still be kept as the standard of what is to be considered one metre, but that probably would not happen.

Note that the discrepancy would be noticed by empirical observation. If the decision were made to keep the stick as the metre standard, the metre would be significantly different from the previously known metre; conversion tables to yards, feet, etc. would have to be reworked. More probably, the decision would be made to find another stick to use as the standard, one as close as possible to the original length of the first. These would all be empirical (a posteriori) matters. (The metre was originally defined as one ten-millionth of the distance along a meridian from the equator to a pole.) These empirical factors seem to indicate that the statement is synthetic, not analytic.

For the sake of examining the question further, suppose now that we were to follow Kripke's assumption that the statement was known a priori. It would certainly seem that the statement could no longer be considered contingent. Once stick S was adopted as the standard for the length of a metre, it would be invariably fixed as that standard. Apparently, it would remain the standard even if it underwent contraction or expansion. The only way there could be any deviation from it would be if it were abandoned as the standard or if there were serious damage to it, like being shortened significantly by being sawed or getting broken.

Kripke himself talked about "fixing the reference." He did so when he claimed that the definition using stick S was not an attempt to "give the meaning" (his emphasis) of a metre but to fix the reference. The fact is that the length of stick S was henceforward always to be known rigidly as the length of a metre. Understand what Kripke did: he took the statement out of the realm of the contingent and made it a necessary truth. People may have created the definition, but it became a necessary truth once it was designated as a rigid standard. It became an artificially human-created necessity. Its contradiction then had to be false, since the statement was set as the permanently fixed standard.

Kripke tried to have it both ways. He tried to consider the standard flexible enough to count as being contingent. At the same time, he wanted it to be rigid enough to be a priori. It is just not possible to have a definition with such a dual nature.

If "stick S is one metre long at time t0" is taken as the official definition of one metre, it is made a necessary truth. If so, it is also analytic since its veracity can be ascertained on the basis of the definition of its own terms. Consider also that the contradiction of the statement is "stick S is not one metre long at time t0." This cannot be true if the original statement is to be taken as the official definition, which has to be fixed if it is to have any significance. It becomes clear then that Kripke did not come up with a fresh example of the synthetic a priori.

The Nature of the Synthetic a Priori

After all these years, no clear example of a statement that is synthetic a priori has been formulated. One would think that if there were such a thing it would have already been found. Beyond that, it is not clear why it is so important that there be synthetic a priori truths in connection with either commonly observed or scientific claims about the world. They have not refuted Hume's observation that causality is not guaranteed. It is another matter with regard to metaphysical and moral claims.

In regard to common claims, it seems unnecessary to be able to say that certain statements are synthetic a priori. For example, go back to: every event has a cause. Almost anyone would agree that it is a true statement, at least for all practical purposes. Except for the first event, if indeed there was one, it does seem that no event ever occurs without its being caused by at least one prior event. Try to come up with a clear instance of an event without any cause and you will be hard pressed, even though a spontaneous event remains a possibility.

Due to this possibility, the statement cannot be known a priori. It is ultimately an empirical generalization, one with overwhelming support behind it. This degree of certainty would seem to be sufficient for anyone who deems it important to separate a class of statements about the world that cannot ultimately be tested in any practical sense.

Here is a list of more statements that could be claimed to qualify as synthetic a priori.

1. An object cannot be in two places at the same time.

2. Two objects cannot be in exactly the same place at the same time.

3. Every line has one line parallel to it through a given point.

4. Between every two points there is a spatial separation.

5. Nothing can come from nothing.

6. Nothing can be utterly annihilated.

7. In every change there is something permanent.

8. Space and its objects extend in height, breadth, and width.(Moser and vander Nat 19.)

Some of them, such as (1), (2), (5), and (6), seem to be about the physical world and so are synthetic. (1) and (2) clearly appear to be true, but it is not clear why they should be considered a priori. It certainly seems that one agrees with them based upon reflection on one's past experience. Even if they appear to be incontrovertible, it may be that it could someday be shown that they were in error. For instance, take (2). Could it not someday be found that it is possible to intertwine the atoms and molecules of two objects in such a way that they retain their individual identity while occupying the same place at the same time? After all, is it not well known that there is a vast amount of space between the nucleus of an atom and its electrons? That may leave the space that is necessary for the feat of placing two objects in the same place at the same time.

In conclusion, the group of statements supposed to be synthetic a priori is composed mainly of synthetic a posteriori statements that are difficult to doubt. They could be called synthetic a posteriori obvious statements. Just because they seem obvious and hard to dispute, it does not mean they can be claimed to be known a priori. Their status may be that they are not known with certainty even though they appear certain. Statements (1), (2), (5), and (6) are examples of this.

One reason that it has been so easy to count these statements as a priori is the way that a priori is defined. All that is necessary for a statement to be classified as a priori is that it not be known by experience. That leaves too much leeway for inclusion in the a priori category, because a number of statements deceivingly appear not to be known by experience but ultimately are. Perhaps there should all along have been a category of the synthetic obvious alongside the synthetic a posteriori.

Then there are some of the putatively synthetic a priori statements that are actually analytic. Examples of these are (3), (8), and "7 + 5 = 12." They should count as analytic because analyzing the meaning of the terms of the particular statement reveals its truth even if analysis using Kant's method does not.

The fact that there appears to be no such thing as a synthetic a priori statement should not be surprising since it would make sense that our knowledge of the furniture of the world should only come through the senses. Why should we humans be privileged to gain knowledge in any other way? No one has argued that any animals besides humans can possess knowledge of a priori propositions. Humans along with other animals are products of almost the same evolutionary chain of events. The acquisition of knowledge through the senses is clearly what all animals have become capable of achieving. In the evolutionary process, there is no place for acquiring knowledge a priori.

The claim to a priori knowledge is only based on how unlikely it seems that a particular statement could be incorrect. That is, a priori statements are patently obvious ones that leave little room for argument that they are incorrect. Yet, there is nothing magical about the a priori.

It might be assumed that because a priori knowledge is not attained through the senses, there is something grand that can be learned through it. This is certainly what many earlier philosophers who distrusted the senses thought was the case. However, just because what is learned through the senses is open to error, it does not follow that what is learned otherwise is necessarily profound, or that it is profound by virtue of becoming known a priori. It may seem that way because some seemingly universal truths have been among the candidates for a priori acquaintance.

Examples are (5) and (6), "nothing comes from nothing" and "nothing can be utterly annihilated." The two can be combined into "nothing can be either created or destroyed." Statements like these are still empirical because it is possible, however unlikely, that they could be disproved, and such a disproof would be an empirical finding. If statements like these are deemed profound and universal, it is not because they are a priori. It is really the difficulty of imagining how they could be wrong that makes them seem nonempirical. Even if they could not be proven incorrect, they are still assertions about the physical world and are discovered by the senses. The statements of mathematics and formal logic can truly be said to be a priori because they are clearly analytic. They cannot be contradicted within the meaning of their defined terms.

Color presents an interesting case. Clearly colors are first apprehended by sight. We then narrow the scope by definition. This makes statements like "nothing that is red is blue" appear obvious and incontrovertible. This is because we learn the definitions of colors by having them pointed out to us (ostensively). The color purple is that particular color that was at one point in our childhood pointed out to us as the color purple. That was the end of it as far as defining the color.

Later we learned that purple could be produced by combining the colors red and blue, but we did not think of that as an alternative definition. However, at least one dictionary defined it that way, stating: "a dark color that is a blend of red and blue."(Webster's New World Dictionary of the American Language, College Edition (Cleveland, Ohio: World Publishing Co., 1960).) Using this definition, "nothing that is red is blue" can be considered false because purple can fit the description of being both red and blue.

By coincidence, the athletic pants I am wearing as I write this happen to be purple. They are made of nylon and so are shiny. The sun is coming through the window and striking them. At certain angles of the folds, I see patches of blue. I don't see any comparable patches of red, but I detect a few scattered, very tiny dots of red.

Of course, someone could argue that in spite of the alternative definition it still remains the case that nothing that is red is blue. I think that such a proponent would still be arguing on the basis of the original ostensive definition. At any rate, the whole point is to show that color has become closely defined by pointing out specific colors. In short, color statements are analytic because they depend on definition. In turn, this makes them appear so incontrovertible that they appear to be a priori (much the same as in the case of Kripke's stick). They are not. They are known through experience, as blind people too well know.

The preceding observations show that a number of supposedly synthetic a priori statements are not a priori at all but more dependent on experience than many Kantians would care to consider. This implies that in general there are a myriad of a posteriori statements and few truly a priori ones. In other words, there are not that many statements out there that can be comprehended readily without the aid of experience. The color statements are good examples of supposedly synthetic a priori statements that really are not.

Kant was correct in disagreeing with the rationalism that he was taught but was mistaken in promoting his favored synthetic a priori statements. They merely state undeniable and obvious truths. They are either analytic a priori or synthetic a posteriori obvious. Doubt about the viability of the synthetic a priori also helps put into question Kant's idea that human minds impose a reality on the world.

The Analytic/Synthetic Distinction

The consequential question arises whether the claimed paucity of a priori statements supports the claim that there is no analytic/synthetic distinction. The analytic/synthetic and the a priori/a posteriori are two different distinctions that do not control each other. Nevertheless, the synthetic and the a posteriori are the most closely related since they both involve empirical statements about knowing things in the world.

It would appear that empirical (a posteriori) statements are more common than nonempirical ones since there are many things in the world and the universe. This could support a claim that there are many more synthetic statements out there than analytic ones. This may be related to the claim that there is no such thing as the analytic/synthetic distinction. Willard Van Orman Quine is the most prominent of those who believe this. In his essay "Two Dogmas of Empiricism,"(W.V. Quine, "Two Dogmas of Empiricism," From a Logical Point of View (Cambridge, Mass.: Harvard University Press, 1953), reprinted in Moser and vander Nat 255.) he tried to make the point. I would like to examine the claim while referring to some of the points he made in his essay.

Quine based his case mostly on observations about the synonymy of words. It is well known that Quine believed that synonymy was a notion that was in need of clarification.(Id. 257) In different essays, he questioned whether any two words could ever be truly counted as synonymous. His concern may be overinflated. While there may be some problems with obtaining a perfectly synonymous match between any two words, there is still the viable and highly useful word-matching performed with the use of synonyms. If there were any serious defect in the process, there would probably not be any such thing as a dictionary or a thesaurus. At the very least, they would not be used much. Perhaps, the problem with vagueness in synonymy stems from the vagueness and ambiguity in the definitions of words in the first place.

Quine also noted a lack of clarity with respect to analyticity and self-contradictoriness.(Id. 255) Others have used his approach of relying on the vagueness of certain concepts in order to gain points in philosophy as well as in other intellectual areas. The point gained can be significant or it may not be. It may also not be as far-reaching as its originator and others might expect. It looks as if that is the result of Quine's quest to undermine the analytic/synthetic distinction. The distinction may have come out weaker as a result of his observations, but it should not have been pronounced dead. Analytic sentences may have a narrower scope but are still viable.

One good example of an acceptable analytic statement is the oft-used "no bachelor is married." The statement can also be put in the form "all bachelors are unmarried." Quine held that the notion of synonymy needed clarification, so presumably "bachelor" and "unmarried man" could not count as synonymous, and the statement could not count as analytic. The fact is that the synonymy of the two terms is hard to ignore.

A stronger objection to the analyticity of the statement could be based on the way that the word "bachelor" was defined. Quine brought this up indirectly in his discussion of how the synonymy of the two terms was determined.(Id.) He pointed out that it was determined by the lexicographer who wrote the nearest dictionary, but then observed that the lexicographer was "an empirical scientist" who based any definition on prior usage of the terms.

Quine's point was related to synonymy, but it can also show that "bachelor" is not a word that is simply given to us defined without any questions being brought up. The lexicographer is not given any special authority to arbitrarily define words and enter them in the dictionary for everyone to accept without question. In fact, "bachelor," like all other words, is discovered in the world. Several features of it are discovered in the world, including its sound, spelling, etymology, and meaning as determined by the numerous people who have used it for many years, maybe centuries.

In that respect, the word cannot be said to be analytic but can be claimed as synthetic. The word "bachelor" was discovered in the world just as were facts like that trees have leaves and that rain is composed of water. That even words are discovered in human society before they are "defined" in a dictionary seems to be the best case for holding that there is no such thing as an analytic statement. All statements are at base synthetic.

Quine rejected the analytic but was not clear on how he saw the status of mathematics and logic, which have been generally accepted as analytic because they follow invariably from initial definitions. Quine did not tell us how we are to look at mathematics and logic if there is no such category as the analytic. Perhaps the best approach to the problem is to accept that the boundary between the analytic and the synthetic is uncertain and that the domain of truly analytic statements is small.

Colors are another good example of synthetic words. Colors were first discovered in nature in an empirical manner. After that, colors became so familiar that they became rigidly identified with their given names. To some rationalists this meant that the colors became known a priori, but that is not the case. If these rationalists continue to insist that colors become known a priori, then they should count statements about colors such as "anything that is red is not blue" as analytic because colors have by now become defined so inflexibly. (I am counting anyone who is a believer in the synthetic a priori as a rationalist.)

The same can be said about other well-accepted and rigid definitions. They may actually first be known empirically but become so ingrained that certain statements related to them come out analytic. Go back to "no bachelor is married." At some point in history, there must have been men who were not married but who had not been given any special name. They were simply referred to as unmarried males and nothing else.

After all, even today, not all languages have specific one-word names for all classes. In English, it could be said that there is no specific unique name for an unmarried woman, "bachelorette" simply being an unimaginative extension of "bachelor." A better example could be "letter carrier," which was formerly known as the "mail man." It could be said that these are not really names but only descriptions. A special one-word name could be "letporter." In Spanish, there is a special one-word name for such a person, "cartero."

It seems that in France in the Middle Ages bacheler became the name for a young man who held a farm as a vassal, pledging loyalty to the overlord in exchange for his protection. Eventually, the name was applied to any unmarried man. This was a synthetic a posteriori fact that arose in the natural world, specifically in the social world of humans.

At first, as the name started being applied to all unmarried males, a person found out about this new development in a posteriori fashion. S/he learned through experience that unmarried men were being designated with a special one-word name. The word had not yet appeared in any dictionary, to say nothing of the fact that there were very few dictionaries in existence at the time. Thus the person may well not have had access to a dictionary to see if indeed the word was being applied to designate all unmarried men.

It is very possible that someone else could have disputed that unmarried men had become known as bachelers. (Apparently as in other cases, the English changed the spelling of the French word.) The second person may not have heard of the new use of the word and insisted that it still only meant a young man possessing a farm as a vassal. To investigate the matter, the two people may have agreed to conduct a survey of the countryside to see if the new definition was being used and to what extent. They would have taken an empirical approach. Incidentally, bacheler may be derived from the Latin "baccalaris," which in turn comes from the word "bacca" for cow, which is one of the things that a farmer ordinarily tends.

Many years later, bacheler became widely accepted as a name for an unmarried man. At some point, the new definition was probably even found in dictionaries along with the earlier one of a young farmer. Eventually, the young-farmer definition became archaic (after the passing of feudalism). The use of bachelor to mean an unmarried man has become so widely accepted that it would seem absurd for anyone to dispute it as the definition of the word. This indicates that the truth of the statement "no bachelor is married" can be known by analyzing the definition of the word "bachelor." Substituting the definition for the word in the sentence gives "no unmarried man is married," which is undeniably true. The sentence is analytic.

To further prove this, its contradiction can be examined. The statement can be contradicted in various ways. Take this version: that bachelor is married. Substituting the definition of bachelor into the sentence gives "that unmarried man is married." Only a person not aware of the correct definition of bachelor would seriously employ the sentence. The contradiction of "no bachelor is married" is patently false, and thus the truth of the sentence itself is analytic.
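The substitution argument can even be sketched formally. As a hedged illustration (the Person structure, its fields, and the theorem name below are my own invention for the sketch, not anything from the philosophical literature), a proof assistant such as Lean makes vivid how "no bachelor is married" is settled by unfolding the definition alone, with no fact about the world consulted:

```lean
-- Hypothetical formalization of the substitution argument.
-- "Bachelor" is *defined* as an unmarried man, so the statement
-- "no bachelor is married" is true by unfolding that definition.

structure Person where
  male    : Bool
  married : Bool

-- The contemporary dictionary definition, taken as given.
def bachelor (p : Person) : Prop :=
  p.male = true ∧ p.married = false

-- "No bachelor is married": the proof is simply the second
-- conjunct of the definition; no empirical premise is used.
theorem no_bachelor_is_married (p : Person) (h : bachelor p) :
    p.married = false :=
  h.2
```

Note that if the definition of `bachelor` were changed, say to "holder of a bachelor's degree," the theorem would no longer follow by unfolding, which mirrors the point made below about statement B changing status as definitions change.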

The statement is undoubtedly analytic because the definition of "bachelor" is so widely accepted. It has been a well-known meaning for centuries. The meaning of words can change over time. Look at any large dictionary, and you will find a number of definitions denoted as archaic. Definitions go out of fashion for one reason or another. New words are invented and sometimes catch on very quickly. New inventions and methods brought on by technology have produced numerous new words. There is a specialized dictionary that tries to keep up with them.(Tuttle Dictionary of New Words (Rutland, Vt.: Charles E. Tuttle Co., 1992).)

At the historical point at which both definitions of "bachelor" were in common use, it could be said that "no bachelor is married" (call this statement B) could not be analytic because there was more than one designation of "bachelor." At that time, it could have been asked which definition of "bachelor" the declarant had in mind. As happens in speech all the time, the listener would have likely guessed the definition correctly from the context yet could have been wrong.

It could be claimed that the statement could not be analytic at that point in history because first the question of which definition was being used had to be answered before proceeding to an inspection of the definition. It could be claimed that having to ask that question showed that the statement could not be analytic because an investigation first had to be made regarding the status of the word "bachelor" in the world.

In his time, Kant apparently did not see a problem with this because he considered statement B as analytic without hesitation even though he lived in a time much closer to feudal times. Perhaps in German there was no such definition as that of young male farmer.

Another way in which statement B could fail to qualify as analytic would be if at some time in the future the meaning of "bachelor" changed. This would probably take a long time, perhaps centuries, although in modern society with much better means of communication, the change could be quicker.

Let us say that in the year 3500, the meaning of "bachelor" stopped meaning unmarried man and completely changed to mean anyone, male or female, possessing an academic bachelor's degree. (Some dictionaries presently include this meaning, but the word by itself is not used much in that sense today. It is customarily stated that a person has a bachelor's degree.) If that happened, statement B would cease to be analytic. It would not be true by looking at the definition of bachelor. The year 3500 definition would not yield statement B. The only way statement B could turn out to be true would be if in fact all people possessing bachelor's degrees turned out not to be married. That would seem to be very unlikely. If it were to turn out to be the case, it would be a synthetic truth. So it turns out that statement B can change roles between being a synthetic and an analytic statement depending upon the contemporary definition of "bachelor." For now, statement B is about as solid an analytic statement as has ever been devised given its universal contemporary meaning.

The long-term variability of the status of the statement is reminiscent of Kripke's stick. The stick started out as an ordinary stick with any statements about it being synthetic. Once it became fixed as a standard, statements about it as the standard for the meter became analytic. If it were to cease being the standard, say because it was sawed into two large pieces, statements about its length would have to cease being analytic. It does appear that even the clearest of analytic statements can ultimately change to being synthetic although not necessarily for the synonymy reasons proposed by Quine.

Nevertheless, Quine went too far in claiming there is no analytic/synthetic distinction. Ultimately, there is an analytic/synthetic distinction although the line between the two is just not as sharply drawn as was once thought. Neither do the analytic statements that can be generated seem all that significant or profound. After all, they are dependent on definitions. Change the definitions and their seemingly eternal truths also change.

This implies that the subjects of mathematics and logic are not as profound or as mystical as they have been imagined by some. They are simply systems of thought that generate true sentences in complete dependence on their given definitions and rules. These truths may or may not conform to what is found in the world. Remember that Euclidean geometry is an example of a system that conforms to short distances in the universe, but with respect to long distances, non-Euclidean geometry is more applicable. A further important result of the weakness of the analytic has been to diminish hope for great success through the sole use of philosophical analysis.



7 Natural Inclinations




From the foregoing chapters, it appears we gain no basic knowledge of nature by any means other than the senses. This especially applies in the case of specific propositions. No one is born with any innate ideas or concepts, nor can anyone suddenly acquire knowledge through any "natural light," as surmised by Leibniz.

The place in which rationalists can make the best case for something appearing innately does not have to do with knowledge as philosophers think of it. Remember that what truly counts as knowledge is propositional knowledge--knowledge that can be put into words. Babies do not come to the world reciting statements that they all know universally. Nor do they reveal them at a later age unless they have heard them from someone in their surroundings. What can be observed is behavior that demonstrates certain inclinations that can more appropriately be called instincts. As time goes by, abilities in a child become apparent that were not taught but show up anyway. Leibniz himself refrained in one passage from claiming that specific ideas are innate in us. Instead, he held they appeared "as natural inclinations, dispositions, habits, or powers."(Gottfried Leibniz, "New Essays on the Human Understanding," Moser and vander Nat, 149.)

This was a very important concession on his part. It certainly seems that he did not have in mind that babies came into the world with any propositions that they believed in or that would later be awakened in them. Rather, it appears that Leibniz was thinking of behavioral dispositions like instincts and abilities. This certainly would contradict the claim that the mind at birth is totally devoid of any information upon which it can act--a pure tabula rasa. Empiricists have agreed with Leibniz on that point. It is hard to disagree. Unfortunately in just a few passages before, Leibniz expressed approval of the belief in innate principles in the following:

The soul originally contains the principles of several notions and doctrines, which are merely roused on certain occasions by external objects, as I hold along with Plato and even the Schoolmen, and with all those who interpret in this sense the passage of St. Paul (Romans, ii. 15), in which he shows that the law of God is written in men’s hearts.(Id. 147)

This muddied the waters as to what Leibniz truly believed. It is said that much of what he wrote was in his younger years, during the time he was eager to please the members of the royal courts with which he was in contact. Those early writings did not necessarily express what he truly believed, especially after he reached his later years. It seems some of this has been uncovered in recent years as a result of the review of his later writings, which had not been studied before.

It can certainly be inferred that if Leibniz agreed with Paul the apostle that "the law of God is written in men’s hearts," he must have proposed in his early years that people were born with propositional knowledge. Laws are always spelled out. Universal emotions and instincts would not satisfy that specific requirement.

Predispositions and Teachings

That children have predispositions at birth is something many observant parents have witnessed. Babies might show distaste at the sight of blood running down someone's arm or at observing bodily injury to someone else. On the other hand, babies who have never themselves suffered serious bodily injury, which is most of them, might not even realize that the other person has been hurt.

There are surely a number of other natural actions and reactions that babies have without being taught them by anyone or without observing anyone else doing them. Two more examples that come to mind are fear of the dark and fear of a menacing stranger. Many children are afraid of the dark. This should not be surprising since it also involves fear of the unknown. Who knows who or what may be lurking in those mysterious dark recesses? Nevertheless, it is not clear that the emotion is a universal one. It may be that a significant number of babies never have any fear of the dark. So it is also clear that even predispositions are not universal.

The innate fear of any unknown persons who would appear in front of the baby's crib groaning, grimacing, and waving their arms in a threatening manner is certainly plausible. This is an image that even a newborn of other species like kittens and puppies may well automatically find frightening.

That suggests a possible explanation for the innate actions of human babies--evolution. It may be that certain of the emotions and actions expressed by human babies are the evolutionary inheritance from the preceding species that have contributed through the eons to the development of humans. There is a good chance that nonhuman babies would have universal fear of falling from a height or be afraid of a threatening being. We know that there are a number of traits that humans share with other animals like the expressions of affection and envy. It should not be surprising to find that some of those same traits appear in humans from birth.

Human babies start making utterances naturally without any instruction on how to do it. They can sometimes make sounds that are similar to real words. One day they speak a real word; later another one. Of course, they would have heard those words spoken by the adults around them. It may even be that the adults in their lives speak the words they want the babies to start saying. Mother may often repeat the word "mama" to baby Jane and not surprisingly that is the first word little Jane utters.

Actual words are not something babies arrive with innately. However, the ability to pronounce the variety of sounds that can make up words is innate. So is the ability to repeat and remember the sounds that make up words. Later the innate ability to understand the definitions of words becomes evident. Overall, the innate ability to learn language will appear. This is what Chomsky talked about, but it only involves innate ability, not any innate knowledge of any specific words, much less phrases, sentences, or moral maxims.

Even in the case of predispositions, it is not clear how pronounced they are. One instance involves the recognition of property. This would involve the early inclination to sense that things can belong to particular individuals and that this should be respected. It is hard to believe that this could be a predisposition--much less an innate maxim--because there are too many instances of children taking items from others and treating them as if they were their own. It seems that it is necessary for parents or others to teach them that items like toys belong to individuals and cannot be taken or even handled freely.

I myself can testify that I was taught that lesson. I was an only child. When very young, there were hardly any other children with whom I could play. We went to a party one evening at which a number of other children my age were present. I was probably a little less than three years old. Given my general isolation from other children, I had never had to deal with the issue that some toys belong to others. Some children brought their toys to the party. I remember sitting on the floor with several other kids. I wanted to examine a toy that a boy had brought and reached to pick it up from the floor in front of him. I did not make any move to appropriate it for myself. The boy, apparently wanting to assert his property rights, immediately punched me in the face. I comprehended the rule right away and put the toy back in front of him. I don't remember having any idea before the incident that things belonged to individual persons. Apparently, the other boy did. He was probably taught the idea, but I had no clue.

It would be hard to believe any psychological experiment that purported to show that very young babies--say ten days old--innately possess the moral maxim that stealing is wrong. It is not obvious from observing human behavior. It is clear from the behavior of thieves that any inborn maxim disapproving of theft is widely violated. Theft is one of the most common crimes. Even respectable members of the community engage in it.

Of course, many thieves know that theft is morally wrong but simply decide to disregard the prohibition. In that case, what is the use of the moral precept being supposedly implanted in every person from birth if deviation from it is so widespread? It would seem that an innate prohibition would be strongly engraved in each individual and that deviation from it would be unusual. Besides, when people are asked why stealing is wrong, they generally respond with an answer based on social reciprocity: if theft were not prohibited, people would be harmed. Others may simply respond that society prohibits it and that it is wise not to engage in it because you can go to jail. One wonders how many people could honestly say they knew the prohibition from birth. In any case, it is not necessary that people know moral rules innately in order to comprehend them and find them acceptable as guides to conduct.

In spite of the close concern that children may show for their own physical well being, it does not necessarily translate into concern for the safety of others. It is not unusual to see babies hit, kick, or throw things at other babies without seeming to have much concern over whether it is wrong. That is not knowledge that they bring with them. They have to be taught that it is not acceptable to engage in physical assaults.

Then there are children who are taught at an early age to be physically aggressive or to steal as a means of financial support. Children who are taught asocial behaviors do not seem to consult any innately ordained prohibitions that they carry with them. Much seems to depend on their social environment. There simply appears to be no evidence of innate, universal precepts, and if there are any, it seems they can easily be circumvented or forgotten.

Innate maxims, moral laws, and other specific knowledge are what rationalists like Descartes and Leibniz suspected we possess. It is by now clear that they were mistaken. (A reason for wondering if they truly believed this was that the thought-controlling Inquisition still posed a terrifying threat to those engaging in free inquiry in their times.) Babies do not come with specific precepts in their minds. The closest thing to predetermined knowledge is their arsenal of predetermined inclinations like the fear of heights or of menacing figures. Even those can vary significantly from individual to individual.

Innate Belief in God

Now a few words about a maxim that some rationalists have claimed to be innate: God exists. It is understandable that they might propose this since the belief seems to be pervasive. It is hard to find persons who will unequivocally and confidently declare that there is no God. Some children might start talking about God at an early age, but it is very likely that this occurs as a result of hearing parents, relatives, and friends make professions of belief in God.

If one entertains the idea that believing in the existence of God is innate, the question arises as to the nature of that God. Surely under innateness, the image of God perceived would be universal and true to that of the real God. Why then have there been such varying conceptions of God put forward around the world? Why do those conceptions almost always conform to those prevailing in the societies in which persons find themselves?

Any idea of an innate conception of God may have been given strength by recent studies by a few neurologists who have thought there might be evidence of some inborn tendency toward religious belief. Included in that is presumably some adherence to a belief in God. I have not found information on how many of those researchers may have been biased by professing religious belief themselves.

The claim so far appears to be only that an area in the brain has been located that may facilitate feelings and thoughts that religious people get, like the desire to submit to a supreme being. I do not know of any examinations of the brains of newborn babies to see what the area looks like at birth. There is no consensus among neurologists supporting the speculation about belief. More study is needed. It is hard to believe that such an area in the brain could be so clearly pinpointed. Even if there were something found like that area, it may well be that it would only indicate that most humans are open to following group leaders. It may not mean that they are specifically predisposed to follow supreme beings.

Following leaders of the group is clearly something humans have been doing for millennia. In the early days, they followed the tribal chief. Later, the men followed the general in battle while the nation as a whole obeyed the commands of the king. Even in families, until recently there was mostly a patriarchal system in which the father held far-reaching and unquestioned authority. Today, there is no longer the degree of ironclad authoritarianism there was in those days, but the prevailing institutions still have chiefs at the top. Kings and presidents still lead countries, presidents and chief executive officers are the heads of corporations, mayors direct cities, and so on. The tendency in humans to follow a top leader is also found in other species referred to as pack animals. Examples are wolves, deer, and zebras.

Then there is "hero worship," the unquestioning admiration of certain individuals. This can happen on both a large scale and a small one. It can be harmful, especially if the hero takes advantage of his charismatic power over others to lead them down an erroneous path. It can also harm the leader by influencing him to misjudge his powers and his good intentions. There are numerous examples in which this leader-of-the-pack tendency has had its downside. It has made people follow and adore too many destructive conquerors and dictators like Alexander, Napoleon, Hitler, and Stalin.

Given these considerations, the spot in the brain that purportedly shows that humans have religious leanings might be nothing more than a place that provides a trait pushing them toward following leaders at the head of a pack. This simpler explanation makes more sense, especially given the parallel with other species. It is hard to find any signs that other species have religious feelings. It also seems hard to pinpoint exact religious feelings in the brain, given that religious expression varies widely among individuals and cultural groups. It may be that the area of the brain only shows a simple leader-following tendency, but humans take it further than the other animals given their greater imagination.

Humans desire the most capable leader they can find who can guide them successfully in their endeavors. With the help of their imagination, humans came to realize that the strongest leader they could find would be the ruler of the universe. They thought it likely that the ruler of the universe would have also created it. That would be so much the better because that would mean that the ruler would know the workings of the universe very well. They called that great leader God. They went on to attribute certain wonderful qualities to this leader and to set down rules by which this leader wanted people to live. The task of figuring out the leader and his rules was then delegated to the leaders known as the priestly class.

The Religious Motives Behind Rationalism

There have been religious motives behind the promotion of rationalism that have helped it to survive even to this day. The discussion here has involved only one type of rationalism: the one professing that some original knowledge can be gained solely by nonsensory (hence allegedly rational) means. The most commonly understood definition of rationalism is that of employment of reason to decide issues rather than reliance on emotion or tradition. This kind of reason is not being questioned here. The ones who demean this type are religious traditionalists, believers in intuition, and those who benefit from the ignoring of reason.

Descartes had a religious agenda in writing The Meditations. In his writings, Leibniz referred to God and even to the authority of Christian scripture. Kant was not as openly religious but clearly indicated that he was concerned with establishing a sound basis for morality. Part of Kant's foundation for morality was the existence of God. There are other rationalists who have had religious ties. One example is Roderick Chisholm, who was Romeo Elton Professor of Natural Theology at Brown University as well as a professor of philosophy concurrently. Then there are the rationalists who have never heard of these philosophers. An example would be the Christians who agree with the apostle Paul in the quote above about the law of God being written in men's hearts.

There are three reasons why religious people would want to believe that humans possess innate knowledge about God and morality.

(1) God wants humans to know his nature and his laws because they are his special creatures.

(2) Humans want to know the nature of God and his commands so that they can know how to please him.

(3) Humans are supposed to be punished for disobeying his laws so they must be able to know what those laws are.

In all this, it would seem a proper assumption that innate knowledge given by God at birth would be universal and reliable. That is, everyone would be furnished with the same exact knowledge so that each person could have the same standing with God as well as the same opportunity to have notice of his commands and thus avoid punishment.

Reason number (1) revolves around the human feeling of self-importance. Actually, much of the value of religion as a whole is that it soothes human egos. Rationalism supposedly demonstrates human exceptionalism because people have knowledge about God (metaphysics) and morality implanted in them at birth. On this view, humans are under no restriction of having to rely on their senses to know God and his commandments; only the lowly animals must depend completely on their senses. According to this belief, God has given humans from their inception the special privilege of this important higher knowledge as laid out by rationalism. This pure reason at birth also proves that there must be a source from which it came, and that source must be God.

Under reason (2), religions promote the active and continuing worship of God and contain a number of rituals to facilitate it. The assumption is that God expects this ongoing worship because it is pleasing to him (Exodus 10:3). He is also pleased by seeing people obey his commandments. They need information in order to properly worship and obey him, and he obliges by furnishing them information at the outset. It is naturally assumed that the same information is provided to each individual because God is just.

Reason (3) is related to (2). It can be considered as an extension of it. Since religions teach that there is divine retribution for misdeeds, it is only natural to expect that God would make everyone aware of what is right and wrong. It would also seem that the moral knowledge would be imparted universally from birth with each individual receiving the same laws and the same understanding of them.

To provide these precepts as innate knowledge to everyone would be a much fairer and more effective method than relying on experience. To depend solely on experience would assuredly bring uneven results that could be unjust. It would mean that the moral knowledge each person received would depend wholly on their particular situation, such as the moral example of those around them. A rationalist could maintain that having innate or nonsensory knowledge makes individuals less inclined to break moral rules even if present circumstances appear to allow it. The rationalist would maintain that having moral knowledge depend solely on experience makes it easier and more acceptable for people to break moral rules. People in a desperate situation may not only break moral rules at that particular time but may also be encouraged to continue to break them later, even if there is no special situation calling for it.

One answer to that line of thinking is that having moral rules implanted in our brains does not guarantee immunity against the later effect of experience on our moral behavior. A later experience, particularly one that makes a dramatic, perhaps traumatic, impact can have a profound influence on our attitude going forward. Experiences in the world and values taught by others can apparently uproot any innate knowledge that might be placed in the brain, since so many people show they are strongly influenced by what they learn from earthly experience. Ultimately, the reason for rationalism's existence has not been that there has been sound evidence for it but that it has been favorable to religious notions.

The intelligence of individuals can also be an important factor. Persons lacking solid intelligence may not be able to understand the moral lessons that they were either taught or experienced personally, especially when those lessons involve subtleties and nuances. Even if those individuals can understand the moral lessons, they might not be capable of assessing when and how to apply them. Low intelligence is always detrimental to making sound moral decisions.

It should be clear that much of the motivation behind rationalism has been religious and egoistic. Perhaps after the realization that religion is much of the impetus behind the belief in rationalism, as it is behind other philosophical doctrines, time will no longer be wasted in discussing rationalism. Instead, philosophers can go on to explore other philosophical questions that need to be answered. Empiricists can go on and confidently rely on the fact that knowledge is first gained through experience however imperfect that process may be. Rationalists might then give up their idea that the brain comes with preinstalled knowledge or that it is so powerful that it can gain knowledge purely through its own actions.

Observational Approach

After finishing this discussion on the sources of knowledge, one wonders whether it may have been better to eschew the entire traditional approach in philosophy of defining and discussing the time-worn and suspect approaches of old philosophers. Instead, it may have been better to deal with the issues on the basis of credible observation, logical analysis, and the related sciences of psychology and neurology. The issue could have been approached directly on the basis of the evidence available. It seems the old philosophical jargon got in the way and caused more confusion than anything else; it is not that useful anymore. The issue would have been the same one: do humans gain knowledge in any way other than through the senses?

8 Investigating Intuition




According to rationalism, knowledge comes innately, but an alternate way to gain it is through intuition. It is not a subject that has received great attention from philosophers; however, many people in the popular culture appear to place great reliance on it. Some psychological professionals even put faith in it. A psychiatrist wrote a book that promoted the acquisition of knowledge through intuition, claiming that the skill could be taught. Like other people with similar controversial beliefs, who are referred to as New Agers, she accused those who are skeptical of such powers of "linear thinking." Intuition boosters may be correct in saying that the skeptical are not imaginative in exploring the possibilities behind intuition, but they should not be reluctant to have their methods examined closely. Unfortunately, it sounds too often as if they are using such accusations as a shield against critical scrutiny.

Testing Intuition

It would seem that the advocates of intuition would not be hesitant to subject intuition to a sensible method of examining whether a statement is properly a bit of knowledge: test whether it is a highly justified belief. (Remember that I earlier rejected the traditional test of "justified true belief" on the basis that we can never be completely sure that a claim is undoubtedly true. We can only approximate belief in its truth through use of a high amount of justification.)

A simple test could be used to determine the effectiveness of intuition. All that would have to be done would be to keep track of the success rate of those claiming to use it. A record would be kept of all the claims to knowledge through intuition made by a person, often called an intuitive or a psychic. Those claims would then be checked to see if they were in fact true, and they would have to be specific and precise. One type of intuition became known as extrasensory perception (ESP) and was later referred to as remote viewing; it most commonly involves intuitive perception of events at a distance.

There would be no allowance for vague predictions like those often made. Examples can be found in horoscopes such as "today you will enjoy a bit of good fortune." A friend of mine once read a similar statement in her horoscope and was very impressed when the next day she found a $10 bill in the parking lot of a convenience store. I said that I thought the statement was too weak, that we can usually find that we experience at least one incidence of good luck every day. It would have been more impressive if the statement had been "you will find a $10 bill in the parking lot of a convenience store." Even better would have been "you will find a $10 bill with serial number IK45852247A," and the numbers matched. One never finds fortune readings that are anywhere near that precise, even when the fortune teller is face to face with the person and thus has the opportunity to tailor the prediction to the specific individual.

The test would gauge the success rate by keeping track of facts correctly pointed out as a percentage of the total number of allegations made. A test like this that was well executed would go a long way toward assessing the reliability of intuition. There is nothing like making an accuracy assessment of a method that purports to reveal the truth.

There are several meanings of intuition in the popular culture as well as in philosophy ("Intuition," The Encyclopedia of Philosophy). Three important philosophical ones are (1) unconscious inference, (2) sensory intuition, and (3) dogmatic or mystical intuition. None of these should be confused with ordinary conscious inference.

Unconscious Inference

The intuition that counts as unconscious inference is at first glance not an inference at all. The apprehension that takes place is immediate, with no time for engaging in any kind of inference. The person who experiences such an apprehension and finds that the observation turns out to be true is amazed by the process. No thought took place unless you count the observation as being a thought. Still the truth is revealed. It is magical. If at a later date, she has another noninferential experience that turns out to provide true information, she could begin to believe that she has discovered a special ability in herself. If the same kind of experience were to repeat itself in the future with the same success, she would more likely become convinced that she has the special ability. She could be included in the unique circle of those people considered to be intuitives.

The facts discovered through intuition need not be profound ones but instead could be very mundane. Here are some scenarios. Beth saw an old man walking on the sidewalk in front of her house. (All names in these scenarios are entirely fictitious.) She felt sure that he had recently bought and moved into a house on the next block. He was not just a visitor, she felt. Beth had not been on the next block for more than a year and thus had no idea if any house had been for sale there recently. Nevertheless, it turned out that the man and his family had just bought a house there and moved in last week.

Tom had the feeling that onions were on sale at the supermarket where he usually shopped. He had not seen or heard any advertisement about it. He went to the market and found that indeed onions were on sale.

Barbara had a frightening feeling that her sister Sophia was suffering a misfortune one afternoon. Barbara lived in southern California while her sister lived on the other side of the country in North Carolina. That evening she telephoned her sister to see how she was doing. Her husband answered. He sounded confused as he revealed that Sophia was in the hospital. She had suffered a heart attack that afternoon.

Charles was at work one day when one of his fellow workers Mike started to have trouble breathing and put his hand to his chest. He was trying to say something in a panic, but it was hard to understand what he was saying. Somebody yelled, "It's a heart attack." Mike started gasping for air and then fell to the floor. He was soon completely quiet and apparently became unconscious. Charles quickly moved into action. He got down on his knees next to Mike and moved Mike's arms to his sides. Four people were now standing around them. Charles took all the necessary steps in administering cardiopulmonary resuscitation (CPR). He worked hard to get Mike to start breathing again. It seemed to him that hours were dragging by. Finally, he got Mike to start breathing. He kept him breathing until paramedics arrived.

In the first three cases, there was very little information upon which to base the factual judgments made. As much as some people would like to say that the judgments could have been based on intuition, there is still the possibility that they were simply lucky guesses. People who believe in the power of intuition like to think of it as some magical power, especially when they conclude that they themselves have the special ability.

Judgments or predictions like those in the first three cases are sometimes called "hunches." A hunch is not generally understood as a total guess; it is taken to rest on more evidence than a guess would involve. On the other hand, a guess can at times be based on some evidence and be called an "educated guess." For instance, if Beth had simply claimed that a man had moved into a house on the next block without a single item of evidence in support, without even seeing him, that would have been a total guess. Instead, she did at least see the man walking in front of her house and was able to observe him, his dress, and his demeanor.

Tom's claim seems the closest to being a guess. There was no mention of any evidence on which he may have relied. However, upon closer examination, it may be that Tom took other facts into consideration in making his prediction and was not aware of doing so. He could have been knowledgeable about it being a time of the year in which onions were being harvested more abundantly and therefore more likely to be put on sale. He could have also noticed that in recent weeks onions had not been put on sale. It was overdue then for that to happen. Also he may have recently overheard a store employee in the produce department telling a customer that a large shipment of onions was due any day. Tom knew that an abundance of any item of produce made it more probable that it would be put on sale.

In spite of Tom's knowledge of all these facts, he did not consciously take them into account in making his prediction of a sale. If asked, he would have said that the thought came to him suddenly--it was more like a feeling than anything else. He could have stated that he didn't remember ever hearing any store employee making any prior statement about onions. Yet, it is possible that even if he was not aware of the statement the experience was still stored somewhere in his brain.

The same could be true of the other items of information that he had known about onions. He was not conscious of those items at the time the thought of the sale came to him. He did not even think of those facts, much less put them into a neat inference. Nevertheless, he came to a conclusion that was not a naked guess. Instead, it was more of a very quick inference made on the basis of facts that Tom knew but of which he was not conscious at the time.

Much the same can be said in the cases of Beth and Barbara. If Beth had been asked afterward what details made her think the man had moved to the next block, she may have responded that she had never seen him before and that he was walking in front of her house, which was on the same street as the house to which he had moved, just a block away. She may have said that it was otherwise just a strong feeling on her part, just like the ones she had felt before when she estimated that something had taken place.

Beth could come up with this assessment of her evidence even if at the time of her observation she was not aware of details. She liked the idea that she possessed certain powers of intuition. This made her less inclined to engage in any introspective analysis of what details she may have observed. She was fond of the thought that the power was immediate and forceful. She wanted to continue believing that she had the special ability of intuition.

There could have been other clues that helped her reach her conclusion. She might have come to realize that it made a difference that the man was walking in the direction in which his new house was located. She was also influenced by subtle clues like his looking around at the houses he passed as if he had never seen them before and wanted to familiarize himself. He was also old enough to be retired, so she got the impression that he lived in the neighborhood. It did not seem to her that at his age he would be likely to hold a job selling products door to door, much less be a gang member spying on residences to determine which ones were good targets for burglary.

These and similar observations could have influenced Beth in coming to her conclusion. Beth was not conscious of engaging in these observations partly because they were simple, common observations that were the kind that people make repeatedly in their lives. It was consequently understandable that she could have made them swiftly without being conscious of it.

Barbara's thought that her sister Sophia was having a negative experience may have also had background evidence to support it. It may be that Barbara knew that her sister had suffered from ill health since childhood. She had contracted pneumonia twice and almost died each time. More recently, she had become sick with diabetes. She had trouble controlling her high blood pressure and her thyroid.

Barbara could claim that she just got that feeling out of nowhere, that there had been no clues. Barbara may consider herself an intuitive or a clairvoyant or someone with paranormal abilities. If Barbara happened to be the type of person who is always fretting about what misfortune could be around the corner, it could add to the likelihood that she would assume some misfortune had struck her sister. Somebody less sympathetic to the idea of anyone seriously having these abilities may just believe she had an accurate hunch based on the fact that she knew her sister suffered various physical maladies.

If Barbara had merely made a raw guess without having any history of her sister having health problems, she could be one of those people who place a high premium on their emotions and on what those emotions tell them about different situations they encounter. They tend to believe in the accuracy of what their intuition supposedly tells them. Whether their reliance is well placed is another matter.

The experience of Charles is different from the other three. Suppose these other facts were present. Thirty years ago, Charles saw a demonstration of how to administer cardiopulmonary resuscitation (CPR). He and five other people made up the group. After seeing the demonstration performed on a dummy, each person in the group got to practice the technique. After that, Charles never saw any other performance of CPR (actual or in training), nor did he get a chance to use it himself in any emergency. He did not reflect much on the technique, essentially assuming he would never have to use it. Charles had not reviewed any instructions on CPR since then by consulting a book or in any other way.

After Charles saved his fellow worker and told the ambulance technicians what he had done, they told him he had taken all the right steps. He could not believe that he had successfully remembered everything he had been taught that many years ago. Charles did not consider the case to be anything but an unusual feat of recall by his memory.

Charles's case is an example of unconscious recall of forgotten memories with perhaps the help of limited unconscious inference. It was uncanny that he was able to recall a complicated procedure, but that is the best explanation. There is no need to resort to something like intuition to explain the incident. Likewise, there are probably many cases of claimed use of intuition that are no more than occurrences of unconscious recall.

Other purported cases of intuition involve no more than the mere making of guesses that turn out to be correct. Some of those cases no doubt involve situations in which a reputed intuitive is already aware of some of the facts--consciously or unconsciously--that support the truth of the proposition under consideration. These include the "educated guesses." Naturally, these cases are more likely to turn out to be true. These claims and predictions should never be considered as discovered through intuition. The knowledge of some of the facts involved takes it away from the realm of intuition and puts it into that of inference. The inference could involve facts known consciously to the intuitive and some unconsciously.

As for naked guesses, they should never be included within the purview of intuition. How do mere guesses get included as intuition? First, those who believe in intuition tend to be emotional people who place great value on what their emotions tell them is the case. Second, because of their faith in intuition, they tend to look for situations in which it appears that intuition produces results and then tend to celebrate and advertise the successes. This can involve their own apparent triumphs, or those that they find out were performed by others. They do not even think about the third problem with intuition--its failures.

It does not occur to them that there could be anything like failure or error in connection with intuition, so they don't even think of searching for it. There is no thought of keeping an accurate record of the claims made by intuition, the successes as well as the failures. They keep no records themselves, nor do they enlist anyone else who might make an objective, accurate study of the accuracy of intuition. The record keeper would have to be a person who was not already convinced that intuition was a reliable method of attaining knowledge. An accurate study would also very much depend on the complete honesty of the intuitive, who would have to report all incidences of having intuition, both the correct and the incorrect ones.

This then is the idea of unconscious inference: it is merely the result of a set of quick observations and inferences made unconsciously. These observations can be combined with conscious ones, and putting them together can produce a quick inference that seems to the person making it like an idea that just popped into the head. What happens is that the brain quickly brings together all past experience and knowledge, joins it to the present observations, and furnishes an insight. There is no guarantee that this insight is an accurate one, but it can turn out to be correct. It all takes place very quickly. It is not a magical occurrence, as intuition seems to be understood. It is a psychological phenomenon that can be studied and understood in everyday terms.

Standards for Measuring Success

This leads to the standards for measuring success. It would certainly seem that the intuitive would need to have a better record than just an occasional good "hit" on a prediction. Measuring the percentage of correct predictions from the total number of predictions made has been mentioned. How high should we expect that percentage to be?

If intuitives on average only turn out to be correct in, say, one out of four cases (25% of the time), that is not reliable even if it may seem amazing each time an intuitive turns out to be correct. What is so impressive about 25%? It would seem that almost anyone could claim to make correct guesses 25% of the time or even as high as 50% of the time. Even then, an expectation that an intuitive or psychic be accurate more often than not (on more than 50% of the attempts) is not high. It is hard to see how anything less can be given much credence.

This would be especially true if any of the guesses were educated guesses, with the intuitive knowing some facts about the situation. Offhand, it would seem that even raw guesses should be accurate more than half of the time. After all, on a simple yes-or-no question, the flip of a coin would point to the right answer 50% of the time. The standard for intuitive accuracy should be one that is at least equal to the accuracy of a simple coin flip.
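The bookkeeping proposed here, tallying every specific claim, marking it verified or failed, and comparing the hit rate against the coin-flip baseline, can be sketched in a few lines of code. This is only an illustration of the record-keeping method described above; the recorded outcomes are invented, not data from any actual study of intuitives.

```python
def hit_rate(outcomes):
    """Fraction of recorded claims that turned out to be true."""
    return sum(outcomes) / len(outcomes)

# Each entry represents one specific, precise claim by the intuitive:
# True if later verified, False if it failed. (Invented for illustration.)
recorded_outcomes = [True, False, False, True, False, False, True, False]

rate = hit_rate(recorded_outcomes)
COIN_FLIP_BASELINE = 0.5  # chance accuracy on a yes-or-no question

print(f"Hit rate: {rate:.0%}")
if rate > COIN_FLIP_BASELINE:
    print("Better than a coin flip")
else:
    print("No better than a coin flip")
```

With the invented record above (3 hits out of 8 claims), the intuitive falls below the 50% baseline; the whole point of the method is that such a record must include the failures, not just the celebrated hits.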

Experiments have to be designed in a way that will not provide any more clues to the subjects beyond those that are clearly and openly known to all subjects and conductors of the experiments. The experimenters have to give greater credit to those intuitives who show sufficient success with little or no clues. They also have to be very careful about hidden clues.

It has been reported that intuitives have helped law enforcement discover certain pieces of evidence to help solve crimes. These cases provide real-time experiments on the effectiveness of psychic predictions. They reputedly solve murder cases by being able to intuit the exact place where victims' bodies were buried. Obviously, this is not a fact that can readily be uncovered by ordinary means. Kidnapping is a crime that could certainly use the services of people with good intuition, especially right after the abductee has been taken. A psychic could quickly determine the place where the victim is being held.

That leads to the question of how much reliability should be expected of intuition in cases where information is not readily confirmable by ordinary investigation. Perhaps in these cases greater leeway should be afforded the intuitives. After all, if it is a cold murder case in the file for ten years that looks like it will never be solved, there is not much to lose if a psychic is consulted. The psychic would only be paid if the victim's body were found.

Suppose a clairvoyant asserts that a murder victim was buried in the northwest corner of the backyard of the house where the woman's boyfriend once lived. The people that live there now are happy to cooperate with the police and readily give them permission to dig in the backyard. The detective takes the psychic to the house, and she points out exactly where she has intuited that the body was buried.

Workers dig deep and wide but find nothing. The psychic then says that she has picked up that the body is buried at another spot in the yard, about 100 feet away. The family approves the new digging. Once again, a wide hole is dug but no skeleton found. The psychic apologizes and then says she feels that the body is located 200 feet away right next to a young tree. Digging there would risk killing it. At what point should reliance on an intuitive end? Should the police have even bothered to dig the second hole?

The answers to these questions depend very much on the particular circumstances. The cost of all the efforts would be an important factor. The detective and any other permanent police employees might not incur an extra cost to the department because they would be collecting their salary anyway, but it is not likely that they would just be sitting around if they did not have this case to work on. There are always crimes to solve. The victims of other unsolved crimes, or their relatives, might not be happy to see detectives spending time on a case that depends solely on the intuition of a psychic. The hole diggers could be volunteers, but more than likely the police department would have to hire them.

It seems that questions regarding ordinary, everyday events can be answered by taking a few simple steps. For instance, Barbara's suspicion about her sister was easy to investigate by picking up the telephone and calling her. Given this, why even think of putting much faith in intuition? The safest way to be assured of the truth is to carry out a regular investigation even if it means more time and effort. An intuitive would have to possess an almost perfect record in order to command any greater confidence. A standard of perfection would be what would be expected of intuition in cases involving sensitive matters like accusations of crimes.

There are situations in which the services of a psychic could be of great use, but they would require great accuracy. There is the type of case in which police officers need to search an occupied house to find drugs. If the occupants were asked for permission to do a search, they would remove or destroy the drugs. In order to prevent this, the police will break down the door and charge in to seize the items before anyone there has a chance to hide them. A case like this caused great national commotion in 2020 in Louisville, Kentucky, when a woman was killed by police as they barged in; her boyfriend had fired at them first.

Needless to say, the police need to be sure that they are targeting the right house, or the effects could be devastating. If the home of completely innocent people is chosen by mistake, it could be a lasting shock to the residents and a serious financial loss to the city if the residents file a lawsuit. Obviously, there is no room for error in these cases. A psychic trying to predict whether there were drugs to be found in a house would need a record of 100%. Other criminal situations may not be as sensitive but nevertheless call for close care.

Perfection is required in at least one other application of intuition. In military combat, some of the operations cannot leave room for mistakes. Suppose an army is moving forward to capture a city and therefore needs to know the exact location of an enemy army that is suspected to be in the area, as well as in which direction it is moving. Assume this information is needed immediately and that no technological tools are available, not even GPS.

If a psychic were consulted and provided information that turned out to be correct, that would be very helpful. However, if it turned out to be wrong, it could be disastrous for the advancing army. It could start moving in one direction to engage the opposing army as directed by the psychic, but the enemy could actually be coming from behind, spot the advancing army, and attack it.

In another example, armed forces could suspect that terrorists were hiding with a large cache of weapons in a house. They would consult a psychic, who would confirm that there were terrorists there. Airplanes would then bomb the house, but it would turn out that only a family with five children was living there and that there were no weapons. While these kinds of devastating mistakes have been made through regular surveillance, it would not be advisable to hire psychics unless they had a better record than the regular intelligence-gathering methods.

In the 1970s, the U.S. government through its Central Intelligence Agency funded some studies of remote viewing. The studies attracted supporters as well as detractors. One of the criticisms was that too often the subjects who were supposed to be adept at remote viewing were provided hints that made them look good. The U.S. military supposedly hired remote viewers at one time to uncover needed information. It is interesting that these organizations did not keep remote viewers in their employ for long. Apparently they were not impressed enough with their production. One would think that if they had found any kind of reliable results, they would have gladly continued to hire the remote viewers.

In 2003, the United States invaded Iraq purportedly because the President's advisers became convinced that Iraq had weapons of mass destruction and was planning to use them. That assessment turned out to be wrong. There was tremendous suffering on account of the misguided and possibly illegal mission. A reliable intuitive could have pointed out the erroneous assessment before an invasion was mounted.

One would think that if a significant number of important cases were being solved by means of intuition, it would have been widely publicized. The psychics would have become well known, and their services would be in high demand not only by police departments but by businesses and governments around the world as well as countless individuals seeking to find lost relatives or personal property. In the medical field they could help to analyze conditions in patients that are difficult for doctors to diagnose. Surely, intuitives with solid records would not only be reported in the media but also followed closely. There has been no credible news of any such intuitive feats.

Instead of reliable intuitives, one hears of those who claim after an event that they had predicted it would happen. Reliable proof that the prediction was made is not offered. Instead, one is supposed to rely upon the testimony of the soothsayer or his close followers that the prediction was clearly made beforehand. I know of one man who claimed that on a radio program he had predicted the attack on the World Trade Center in New York on September 11, 2001, in which two airplanes struck the twin towers. It turned out that all he had said the month before was that he felt a serious air crash would take place in the near future, without specifying the location or any other details.

That is the kind of prediction that certain people make and then take seriously. They probably possess the kind of personality that is more fearful and anxiety-ridden. They regularly make such predictions of impending disaster--large or small--and fail to keep track of the times they are off the mark. There were others who claimed after the fact to have predicted the 9/11 assault. If only they had come up with specific information beforehand, including the identity of the terrorists or at least their specific plan, and provided it to the FBI.

The claim has long been made that animals possess intuition, particularly when it comes to predicting the approach of major storms. Careful studies are needed before it can be said that animals have significant intuitive ability and to what extent. So far, there is no evidence of any significant and reliable human intuitive ability. Human knowledge has to be gained and justified through the old, traditional, and sometimes painstaking sensory means.

Senses and Intuition

It has been claimed that intuition takes place in certain cases in which there is immediate sensory perception of objects. An example would be the observation of a yellow patch without giving it any thought.(The phrase "yellow patch" has been used repeatedly by recent philosophers in discussing these issues. It is just a large, imaginary yellow spot.) The idea came from Kant, who counted this sensory intuition as knowledge. According to him, it was knowledge obtained "without the mediation of concepts." This implied that language, specifically the formation of propositions, was not involved in obtaining this type of knowledge. The sighting of a yellow patch is done without thinking and thus without the use of language. It is simply an immediate perception.

Bertrand Russell also believed in the concept and in his book The Problems of Philosophy called it "knowledge by acquaintance." He contrasted it with "knowledge by description," which obviously involved language. At first glance, knowledge by acquaintance seems obvious, but is it really true?

It is ironic that these philosophers gave so much credence to sensory intuition given that so many other philosophers have been so distrustful of the reliability of the senses. Philosophers have used several examples of sensory error. One would think that at least a quick assessment of the immediate perception would be needed before it could be counted as knowledge. At most it seems that quick perceptions should only be counted as tentative knowledge.

Need for Justification

Another serious question that sensory intuition or acquaintance faces is the long-accepted definition of knowledge. We have seen that since Socrates that definition has been something like "justified true belief." Actually, Socrates did not declare that as the definition of knowledge.("Knowledge," The Oxford Companion to Philosophy) I proposed that the "true" requirement be dropped and that the test instead be "highly justified belief." In either case, the requirement of justification is present, and there is no reason to leave it out if one wants sufficiently adequate reliability in the knowledge one possesses. Given how easy it is to encounter error in one's beliefs, it seems wise to engage in justification of them.

It seems that if one is to maintain a high standard for what is to count as knowledge, one would not consider immediate sensory intuition as knowledge. No matter how obvious such experience may appear, a philosopher would want to leave it open to philosophical scrutiny if necessary.

After all, even in the case of clear observation, looks can be deceiving. In the case of the sighting of a yellow patch, it can be asked whether the patch was possibly an illusion or part of a dream. Of course, one can maintain that in any of these cases one is still perceiving the color yellow. Still, there can be illusions in which an object appears yellow but is not. The existence of an illusion could be established by the testimony of many other people who see the patch and unanimously agree that it is white. An opinion about the color of something can be in error on account of a defect of the eye or the influence of drugs.

If one still wants to insist that a person gains knowledge based merely on her/is report of immediate sensory perception, then that leaves too much room for accepting knowledge on the basis of purely subjective observation. Perhaps those who favor intuition or acquaintance would want to discard the requirement of justification. This would not seem advisable since it would allow for claims to knowledge without providing any support for them. It would seem that even those who do not desire to include justification as part of the definition of knowledge would still want to have some requirement of justification.

The definition of knowledge could perhaps be changed to something like "belief based on reliable information." The requirement that there be information involved seems clear and simple enough. The "reliable" requirement is certainly understandable since no information that was not reliable would be desirable. The problem would come in deciding what is and what is not reliable. There would have to be reasons given on one side or the other on what would be reliable information, and the various reasons would have to be weighed as to their importance. All this reasoning would be unavoidable in determining what is reliable, and it would be nothing other than justification. Reasoning in support of a proposition is justification.

The need for justification for anything like knowledge simply cannot be avoided. Furthermore, justification must involve propositions that are expressed in language, while intuition or acquaintance is nonpropositional in nature. There can be no claim to knowledge that just appears out of nowhere with an automatic guarantee of reliability. There must be some support for accepting an immediate sensation as knowledge.

It can be claimed that acquaintance is simply self-evident and should be accepted on that basis alone. The problem is that what is self-evident to one person may not be self-evident to another at any given time. To convince others to accept a proposition as self-evident usually involves making at least some argument, i.e., engaging in justification. Furthermore, self-evidence as a standard for knowledge needs to be avoided: it could too readily be used to propose that an item is knowledge when it is not. Self-evidence is best reserved for exceptional cases, such as the fact that the sun provides light or the initial undefined terms of geometry.

The assumption with regard to acquaintance is that the apprehension is immediate, clear, and instantly understood by the observer without any prior help or input of information from any other source. Kant held that the making of judgments played no part in what was intuited. But is a complete absence of judgment possible? It would seem necessary to at least be able to state what was being intuited, and in that case some amount of justification would be needed to support the conclusion. In reaching the conclusion "that is a yellow patch," prior experience and memory would have to be enlisted to distinguish that contention from other possible judgments, such as that it is red or purple. It seems that from the start acquaintance has a problem avoiding justification.

Here are some further considerations. When a person perceives a yellow patch, what exactly takes place? Does it involve only an act of immediate acquaintance, or do prior conditions need to exist before the acquaintance can be made? Imagine three different situations in which a child observes a yellow patch at the age of (1) one day, (2) one year, and (3) five years.

At one day, the baby would naturally not have any language. Assume that at (2) one year the child had a limited vocabulary, with no words for the different colors. At (3) five years, the child would have a grasp of the correct names of colors and readily state that the patch was a yellow one.

What would take place when the child was shown the yellow patch? At (1) one day old, it would see something, but it would probably not have enough awareness to distinguish the yellow patch from anything else in its field of vision. It would not be able to tell the color yellow apart from any other color. The most that it would be accomplishing at that time would be that it would be starting to train its eye muscles to focus and to coordinate the areas in its brain that aid in vision. Nothing would take place in its act of viewing that could be considered acquaintance, knowledge by acquaintance, or intuition. It would see nothing that it would recognize in any way because it would be seeing many objects for the first time.

Even upon looking at things after being one week old or even one month old, its memory might not be good enough to remember the first time it observed the yellow patch or any of the subsequent times it observed it. It may not remember any colors or other observations. Each observation may be like the first one with the experience being new each time. If that were the case, it would be strange to say that the baby was gaining knowledge by acquaintance or knowledge of any kind. Instead, there was a simple rudimentary awareness that lasted only as long as an object was being observed.

At (2) one year of age there is a more detailed situation. Although the child would not know the names of objects or colors, its visual ability would certainly be much sharper than at one day of age. It would surely be able to tell objects apart from each other and would be familiar with different colors. It would have experienced the color yellow many times by then as well as many other colors. In its mind, it would be able to tell yellow apart from other colors even though it may not have names for the colors. Surely, it would still in some way sense the difference between colors even though it is hard for us to know this for certain.

By the age of one, it would be able to recognize a yellow patch as having a distinct, recognizable color. The child would also become familiar with a number of other scenes. It would certainly become able to easily distinguish one object from another. All this would suggest that the child would by one year of age be able to gain acquaintance with many things. The child would not have the vocabulary to indicate that it was gaining knowledge. It certainly would still not be capable of articulating any propositions to justify any knowledge.

In addition, the child would be familiar enough with things that it could depend on their appearance remaining the same. There would be a constancy. Objects that possessed the color yellow would continue to be yellow, and the same would be true of other colors. When one toy wooden block was put next to another, it would always be the case that two blocks were there. The same would hold if instead of blocks it were candies or socks or people.

The child at one year would have also become familiar with other regularities. It did not just happen magically that the one year old sensed a familiarity with a yellow patch. It would have gone through many such experiences, several times a day, since it was born. As a result of repeated experiences of this kind, the child would learn to expect and rely on certain regular occurrences that it would recognize. That could be considered acquaintance but not immediate acquaintance. It would have been gradually gained over a period of a year. It could be said that the child gained a belief in view of those repeated experiences. It would be a nonknowing acquaintance. That is not what intuition is claimed to be.

The case of the (3) five year old is not much different from that of the one year old. The five year old would simply have many more experiences under its belt than the one year old. It would have seen many more instances of yellow patches and would therefore be more confident in knowing what a yellow patch looked like. Furthermore, it would have a good enough command of language to explain why it believed that it was perceiving a yellow patch. It could justify the belief by remembering that it had seen yellow patches many times before and that it had been told that the name of the color was "yellow."

The five year old's knowledge of language could also enable it to know certain abstract facts such as that 1+1=2 and to be able to explain that through repeated observations it had come to believe it to be true. Since the five year old would know language, its justifications could be based on propositions.

Learning Space and Time

Kant thought that space and time were known through intuition, although not sensory intuition. But the preceding examination of how humans become acquainted with items like a yellow patch also applies to learning about space and time: it involves a gradual process of experience.

From day one, an infant begins to distinguish one object from other objects. Gradually, it comes to notice that there is a host of objects all around. It also comes to notice that it can move objects within its reach in all different directions. There is never more than one object in one place at the same time. Certain places can be occupied immediately after being left empty by an object. For instance, mother can come and stand in a spot. Later father can then come into the room and stand in the exact same spot where she stood. There is an emptiness in which there are no objects and in which objects can freely move. Sound can be heard coming from all sorts of directions. Sometimes it is heard right in front of the child and at other times at a distance.

Over a period of a year with the child making similar observations on a daily basis, it becomes familiar with objects and where they move. The familiarity becomes so deep that the child gains a knowledge of space that lasts through its lifetime without giving it much thought. The familiarity (belief) with space is justified on a sensory (empirical) basis. It is not a matter of some magical intuition.

The gaining of knowledge of time also involves repeated experiences. At some point in its early life, an infant begins to notice that objects do not always appear together. At times one will appear and after that another one that was not there and then later perhaps two more. In other words, objects can appear sequentially. The same can be said of events.

In the morning, mother appears above the crib and then leaves. She reappears with a small bowl containing soft food. The child eats it slowly, finishes it, and mother takes the bowl away. Many events take place after that with varying intervals between events until it becomes time to sleep again. The child develops a sense of time. Years later it learns about clocks. With their regular movements, they help keep track of the events that are always taking place. They make it easier to keep up with time.

Intuition and Mysticism

Mysticism has been in existence in various forms for about 2,000 years. A common feature is the claim that people can gain knowledge of God or moral truths through intuition. It is usually believed that only a select few mystics can gain such insight. This mystical intuition supposedly provides a nonpropositional knowledge that needs no justification.

This is like the sensory intuition of Kant. It has been said that Kant's idea of sensory intuition gave support to alleged mystical knowledge by claiming that there can be a valid nonpropositional knowledge. With this support, mystics can freely fly to the sky with their claims that they have communed with God and therefore are sure of his wishes and the commands he has set down. They can also discern his nature. All this they can accomplish only through a mysterious power of intuition that does not involve justifying their claims.

A consequence of this is that the mystic having the intuition can declare that certain facts about God are true even though it is only based on her/is own private experience. The mystic often claims that the experience is inexpressible. Nevertheless, everyone else is expected to ignore its subjective, individual nature and rely upon any factual claims the mystic makes on the basis of the mystic’s experience.

The earlier demonstration that there is no such thing as sensory intuition--that it is instead simply knowledge based on numerous prior experiences--undercuts any credence in mystical intuition. Such spiritual intuition is solely an emotional experience isolated to the individual having it. There is no reliable knowledge to be gleaned from it. This can be shown by the conflicting ideas that different mystics say they experience. One mystic can experience God as a single entity, while another mystic can feel the presence of three, and yet another communes with seven hundred. There is no limit on what each individual mystic can experience, and yet it is all supposed to be accepted as truth.

What further makes alleged mystical knowledge questionable is that it cannot be analyzed and compared with knowledge that is already established. For instance, a mystic's claim to intuit that God is unconditional love is allegedly unassailable. It supposedly cannot be refuted by the belief found in certain religions that God punishes people even for seemingly trivial acts and can furthermore condemn some to eternal damnation. That is not unconditional love. Mystics can become indignant when their mystical observations are questioned. It would appear, then, that at the bottom of their claims of mystical experience is a simple desire to shield their wishful feelings from cognitive attack.


9 Two More Skeptical Doubts




There are two other important skeptical doubts besides the one involving the existence of physical objects discussed in Chapter 4. They concern (1) induction and (2) the self. The skeptical doubts about these two subjects were clearly expressed by David Hume and have been avidly discussed by philosophers ever since without any final resolution. Philosophers will not completely agree with the remarks here, but hopefully the comments will shed some light on the two important topics. Induction is important in scientific investigation, and the self has long been of interest to individuals pondering the essence of their nature.

Induction

Induction has been considered the forming of a general conclusion on the basis of particular facts. The scientific method, which proceeds from specific observations to general theories, has been counted as employing inductive reasoning. Scientists as well as the general public accept inductive reasoning without much question and use it all the time. On closer inspection, however, how it works is not as clear as it would appear. There is the question of what constitutes an acceptable set of facts to support an inductive conclusion. Before that can be decided, general criteria need to be formulated for determining what counts as an acceptable set of facts.

Then there are the various types of inductive argument that have been identified, such as ampliative, primary, adductive, elaborated, and proportional. There are many issues to be settled in induction, and it is understandable that there are various opinions on the subject.

To deal with all the issues involved in induction would take at least one book. Therefore, the focus will be on Hume's skepticism on induction since that has influenced a number of philosophers who have followed him, even to the point of denying that there can be such a concept. One approach to the rejection of induction has been through the promotion of the hypothetico-deductive method.(Max Black, "Induction," The Encyclopedia of Philosophy.)

Hume's Exaggerations

One place to begin questioning Hume's conclusions in connection with induction is by looking at the following statement he made:

For all inferences from experience suppose, as their foundation, that the future will resemble the past . . . If there be any suspicion that the course of nature may change, and that the past may be no rule for the future, all experience becomes useless, and can give rise to no inference or conclusion. It is impossible, therefore, that any arguments from experience can prove this resemblance of the past to the future, since all those arguments are founded on the supposition of that resemblance.(David Hume, Enquiry Concerning Human Understanding, Section IV, Part II, 51, quoted in Michael Cohen, "Induction," The Oxford Companion to Philosophy.)

Hume was correct in pointing out that there is no guarantee that the future will resemble the past. Almost everyone appears to make the assumption that it will without question, but upon closer reflection, there is nothing inevitable or necessary about it. We expect a flame to be hot and no doubt have always experienced it so. Yet it is not an absolute certainty that it should always be hot without exception. Theories of physics explain and predict the continuing association of heat with a flame, but in turn, there is no unbending certainty that the laws and theories of physics must always remain the same. We cannot be absolutely sure that tomorrow everything in the world will work in the same way that it has up to now.

Hume was insightful in coming up with his observation but went too far in implying that there is much possibility that "the course of nature may change." His claim that "all experience becomes useless" was also extreme--a clear exaggeration that cannot be taken seriously. It is the kind of claim which, if believed, gives the skeptical position much greater weight than it deserves. The fact is that the long experience of observing the same effect result from the same cause without any known exception has to count for something, as is attested by the vast body of knowledge that has been accumulated in the sciences. The systematic knowledge that specific effects follow from particular causes has given us a great amount of understanding of, and power over, certain aspects of the world.

An example of such knowledge is that of spinning a magnet, with its surrounding magnetic field, inside a coil of wire. This action creates an electric current in the wire. The effect is so regular and expectable that in physics it was long ago labeled the law of electromagnetic induction. (This use of "induction" has nothing to do with what we are examining here. Presumably, it comes from the idea that a moving magnet "induces" an electric current in a wire.) Given a large enough coil and magnet, along with other factors, a person touching the coil can be electrocuted. Experts in the field of electricity, such as physicists and electrical engineers, would consider it a certainty that an electric current passes through the wire.
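For readers who would like to see this regularity stated precisely, the law can be written in standard physics notation (a brief illustration added here, not part of the philosophical argument). Faraday's law of electromagnetic induction says that the voltage induced in the coil is proportional to how fast the magnetic flux through it changes:

```latex
\mathcal{E} = -N \, \frac{d\Phi_B}{dt}
```

Here E (the script letter above) is the induced electromotive force, the voltage that drives the current; N is the number of turns in the coil; and Phi_B is the magnetic flux through the coil. Spinning the magnet makes the flux change continually, which is why a current flows for as long as the magnet keeps turning.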

A philosophical skeptic like Hume would be well advised not to touch the wire in order to test the law and find out whether cause and effect could fail to work in that particular instance. The past experience of measuring electric current in circuits of this kind, and the unfortunate experience of witnessing people actually electrocuted in such situations, has been very useful even if it cannot be guaranteed that the law will take effect in every future test.

Hume further exaggerated in claiming that experience "can give rise to no inference or conclusion."(Id.) If an electric current has been produced every known time a magnet has been spun inside a wire coil, it is a very sound expectation that it will happen again. Another sound and useful inference is that it is not a good idea to touch the wire because it can electrocute.

Not only are these sound inferences, but their opposing conjectures are not. That is, to suppose that no current will flow through the wire the next time a magnet is turned, or that touching a live wire carrying current will not electrocute, is not a sound assumption given the long history of experience with electric wires. In fact, almost anyone would call such a supposition downright foolish. We should not be concerned that any drastic changes in the regular processes will occur in the future.

Induction is Not Deduction

It has been pointed out that all Hume accomplished in claiming that inductive inferences could be unreliable was to show that induction is not deduction. The conclusions obtained through deduction are more reliable due to the nature of the method. Because deduction is more abstract, its results appear more certain. The syllogism "if A-->B and B-->C, then A-->C" is not open to error or dispute, but then it is so general that it can only turn out to be incontrovertible. Deductive proofs were awarded a special place by philosophers because they are abstract and involve only safe processes of thought. This is understandable, since philosophers have long been so intent on finding certainty.

Deduction seems infallible only in its abstract form. Take the two premises mentioned before: A-->B and B-->C. Standing alone, these abstract premises pose no problem; they have to be considered noncontroversially correct. Substitute actual, real-world statements for the variables, however, and problems can arise. People could disagree that the specific statement substituted for the variable A in fact implies the one substituted for the variable B. An example could be the claim that legalization of marijuana implies that more people will become addicted to drugs.
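The incontrovertibility of the abstract form can even be checked mechanically. The short sketch below (an illustration of my own, not part of the original argument) tries every possible combination of true and false for A, B, and C and confirms that whenever A-->B and B-->C both hold, A-->C holds as well:

```python
from itertools import product

def implies(p, q):
    # Material implication: "p implies q" fails only when p is true and q is false.
    return (not p) or q

# Try every assignment of True/False to A, B, and C.
counterexamples = []
for a, b, c in product([True, False], repeat=3):
    premises_hold = implies(a, b) and implies(b, c)
    if premises_hold and not implies(a, c):
        counterexamples.append((a, b, c))

print(counterexamples)  # prints [] -- no assignment ever falsifies the inference
```

Substituting real-world statements for A, B, and C, as discussed above, reintroduces dispute--but the dispute is over whether the premises are true, not over the form of the inference.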

This is another example where the quest for certainty has complicated matters more than necessary and impeded understanding of the inductive approach. While it may be of benefit to compare deduction and induction, it does not mean that if induction does not measure up to the standards of deduction, it should be considered an inadequate means of reaching conclusions. Many philosophers starting with Hume have downgraded induction in this manner. They have gone so far as to count induction as not rational simply because its conclusions do not follow strictly from its premises.

To define rationality in that manner is too narrow. Rationality involves the proper use of reason in reaching conclusions through inferences, but this need not be restricted to deduction. The reaching of correct conclusions in our everyday lives through what can only be called inductive methods is testimony to the viability of induction. My conclusion that my lovebird is about to suffer an epileptic attack, based on her walking erratically, twisting her head, and spreading her wings, is generally on the mark. It is possible that I could be wrong and that this time it is a false alarm, but most of the time an epileptic attack does occur. It could be said that there is a high probability that an attack will take place whenever those signs are observed.

Support for Induction

Attempts have been made to place induction on a foundation of probability, notably by John Maynard Keynes and Rudolf Carnap.(Black) These attempts have not met with success, but there does appear to be at least a loose relationship between induction and probability. Inductive reasoning cannot bring certainty, but it seems indisputable that it brings conclusions that are highly probable. There are many examples like the one involving my epileptic bird that show that nondeductive inferences made in everyday life are rational.

In scientific situations, inductive inferences have to be made all the time. For example, botanists observe a number of plants they have never seen before that look the same and have similar characteristics and decide they have discovered a new species. Chemists examine several samples of a liquid and determine they have the same chemical formula as phenobarbital and infer without hesitation that the samples are indeed phenobarbital. Medical doctors can take the samples of phenobarbital and conclude on the basis of their past training and experience that they will have the same sedative effect in the future that they have previously observed on their patients. Psychologists observe certain sets of behaviors in their patients and conclude that the patients are suffering bipolar depression, schizophrenia, or the like. They can then suggest the proper treatment. Induction is simply too prevalent and useful not to be considered seriously as a rational procedure. It should not be dismissed because it does not measure up to the standards of deduction or because it has been difficult to find a noncontroversial justification for it.

Probably the best way to justify induction is through induction itself. This is what i did above in discussing how induction supports the conclusion that it is a good idea to avoid touching a wire that has electricity running through it. This immediately brings up the accusation of circularity. The accusation is understandable, but it may be that this is the only way to justify induction. It is a straightforward demonstration: if inferences of this kind have worked repeatedly in a variety of situations at various times in the past, then the general case can be made that they will continue to work in the future.(A short discussion of the considerations for accepting inductive support of induction can be found id. at 173.) It seems that evidence for induction such as this, which brings forward multitudes of examples from different areas of earthly existence through millions of years, should count for something. Even if nothing like a deductive proof for induction will ever be forthcoming, it should still be given attention and consideration.

Perhaps the way to look at what happens when a conclusion is derived from a set of alleged facts is that one is not arriving at an infallible, rock-hard, inerrant fact but rather is only finding the best assumption on which to proceed based on the given facts. It is what Gilbert Harman aptly termed "the inference to the best explanation."("Inference to the Best Explanation," The Oxford Companion to Philosophy.)

Take as an example Don who plans to go to his regular job tomorrow and afterwards stop by the grocery store. The night before, however, he wonders whether he should make the necessary preparations such as setting out the clothes he is going to wear, sorting documents in his briefcase that he needs to take back to the office, and writing a grocery shopping list. For all his life and for as long as anyone else can remember, every day has arrived with unerring predictability. The sun has always risen as expected although on cloudy days it has been covered from sight.

However, it has occurred to Don that there is a possibility that tomorrow may not be the normal day that everyone always expects. After all, there is no certainty that the future will resemble the past. Tomorrow could possibly be an exceptional day in which the sun does not rise. Maybe Don would just be wasting time to make all his daily preparations if something like that happened.

Almost anyone would agree that Don should make his usual preparations because the probability is very small that tomorrow will be significantly different from other days. That tomorrow will be like every other day may only be an assumption, but it is a very solid one nonetheless. This may well be an illustration that conclusions derived through inductive reasoning are no more than assumptions, at least by the standards set down by Hume. Yet they may be very strong and reliable assumptions. If this is the case, it may clarify why inductive reasoning, while not as reliable as deductive reasoning, should nevertheless be established securely as a sound mode of reasoning.
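Don's worry about tomorrow's sunrise is in fact a classic example in the history of induction, and Laplace's "rule of succession" offers one traditional way to quantify how strong such an inductive assumption is. The following is a minimal sketch, not anything proposed in this book; the function name is mine:

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: after observing `successes`
    out of `trials` independent trials, estimate the probability
    that the next trial also succeeds as (s + 1) / (n + 2)."""
    return Fraction(successes + 1, trials + 2)

# If the sun has risen on every one of, say, 10,000 observed days,
# the estimated probability that it rises tomorrow is 10001/10002,
# very close to 1 but never exactly 1 -- certainty is never reached.
p = rule_of_succession(10_000, 10_000)
print(p)  # 10001/10002
```

The formula captures the chapter's point in miniature: each uniform past observation pushes the probability of the next occurrence closer to 1 without ever delivering the deductive certainty Hume demanded.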

The Self and the Soul

Another skeptical doubt of Hume brings up the questions of the nature of personal identity and the soul. Hume pointed out that he was unable to detect any unique, single self within himself. He concluded that people were "nothing but a bundle or collection of different perceptions, which succeed each other with an inconceivable rapidity, and are in perpetual flux and movement."(David Hume, A Treatise of Human Nature, (Penguin Books, 1969), Book I, Part IV, Sect. VI, 300, quoted in Justin Broakes, "Hume, David," The Oxford Companion to Philosophy.) This came to be known as the bundle theory of the self. The self has been considered as synonymous with the soul. As with other issues, Hume was content to express skepticism in relation to the self but did not explore any further. Nevertheless, his observations on the matter have been considered frequently in analyzing the question of personal identity.

It is clear that in his comments, Hume was trying to overthrow the widespread idea that everyone possesses a soul that is found in one place in the body. The soul was thought to be made of a substance wholly different from the physical substance that makes up the body as well as all other objects in the world. This substance was thought of as spiritual or mental substance. This was dealt with in Book I of this trilogy, The Predominance of the Physical World.

This dualistic structure is still followed by most people who are not philosophers. Ask anyone where their individual spirit is located, and they will not be able to give a credible response. Most likely they will believe that it is not known where the soul is located--that it is a mystery--but that it has to exist nonetheless.

There is less talk today than in the past of a disembodied soul, one that can exist entirely separate from the body and still have all the mental and emotional functions that it had while residing in a body. Put in a different way, there is not as much belief in ghosts anymore. This is probably due to the ever growing number of physical explanations for the many phenomena that involve human beings. Demons were once thought to bring medical maladies, such as hallucinations and peritonitis. It has since been discovered that these maladies have purely physiological causes.

The Evidence for a Soul

Nevertheless, many people who believe in an afterlife believe in disembodied souls as a means of reaching it. That is, they do not believe that they get to the next life through resurrection. Like the ancient Greeks, they probably find it hard to believe that anything can be resurrected from the remains of a dead body, especially if the resurrection is to be attempted thousands of years later. They may not accept resurrection but nonetheless believe there is a path that a person follows to the afterlife. They will claim that the soul passes on even if the body is left behind to slowly decompose.

They will not be specific on how the soul is extracted, preserved, and transported. They likely will admit being ignorant of the process but will still believe that the soul survives. They claim it is eternal. The soul is generally considered to go to heaven, which is supposed to be somewhere in the sky or beyond. Another advantage in believing in a disembodied soul is that since it has no weight it can travel much faster and get to heaven sooner.

Another way in which belief in disembodied souls manifests itself is in the idea that ghosts visit us on earth. Another name for them is spirits. It is taken for granted that these ghosts do not consist of any physical material but are instead composed of a spiritual substance that can fly and easily pass through heavy walls. Ghosts are said to be persons who have recently died and who appear to their relatives or close friends. They want to pay a visit or to communicate important information. In most cases, the ghosts eventually discontinue their visits. This cessation seems to coincide with the time that the grieving persons stop feeling intensely for their deceased; the appearances mostly occur to survivors who are deeply aggrieved. These post-death apparitions are reported only rarely, and usually they appear to the more credulous persons who already believed in ghosts.

Another instance in which the apparition of ghosts is reported is in the supposed case of haunted houses. Persons who lived there, and often were brutally murdered there, supposedly desire to return to visit. Perhaps they are restless because of the way they were dispatched from earth. Again this haunted house phenomenon is rarely reported. Most people never get to see ghosts.

There is no evidence of any soul inhering in the physical body during earthly life. The main obstacle to the idea is that physiologists have not identified any locus in the body for the soul. They have identified what are apparently all the organs and tissues as well as numerous chemicals taking the form of such things as hormones and enzymes, but there doesn't seem to be anything remotely comparable to a soul.

Since nothing is observed to leave a dying human body, it appears that the soul could only be in some nonphysical or spiritual state. Hence people who follow this line of thought inadvertently back themselves into the dualistic belief in a physical body and a soul mysteriously composed of some other very different substance. A dualist can take a modified position and repudiate the idea of a disembodied soul. S/he could instead maintain that a nonphysical soul exists somewhere in the body and only there. It cannot exist independently in any disembodied state. Once the body dies, the soul goes with it to oblivion.

There is little or no evidence that the soul exists in any nonphysical form, i.e. there is no soul made of any spiritual substance. There is no evidence that it survives the body after death. This is even true of any divine souls, since it is difficult to conceive what nonphysical substance could consist of, and there is no account of anyone carefully inspecting the divine makeup. In addition, no divine being has been known to reveal what he is made of. Most accounts of divine beings point to their being physical.

No single physical part of the body appears to play the role of the soul. It would seem most likely for the soul to be located someplace within the brain, but such is not the case. The brain appears to be disorganized, at least from the standpoint of what would be expected. There does not appear to be a master command center that directs and coordinates all the various functions, and one would think that the soul would be this command center. Instead the different functions that the brain performs are spread out in different parts of the brain. Contiguous areas of the brain may control functions that show no relation to each other. There are times when an area of the brain that controls a particular task, say walking, is injured. The person can no longer walk or walks with a limp. Other areas of the brain can then learn to take over those functions, and the person can walk again, even if not as before.

The reason the human brain does not exhibit the simple and neat organization one would expect is that it was gradually formed by evolution. First, the reptilian brain was formed in animals. It is responsible for primal desires and emotions like fear and aggression. A more recent section found in the human brain is the frontal cortex located behind the forehead. It is involved with reasoning.

In spite of what should be clear to anyone studying the nature of the soul--that it is an abstraction and only figurative--there are still those who insist on believing in a nonphysical soul. They even try to enlist the aid of science. The peculiar results of quantum physics are sometimes used to try to bolster bizarre beliefs.

In 2008 a physicist on a radio program on National Public Radio called "New Dimensions" tried to justify the existence of the soul on the basis of the quantum phenomenon of nonlocality. The physicist did a fairly good job of describing what happens in nonlocality but not well at all in justifying the jump to the conclusion that it proved that a nonphysical soul had to be real. It would seem that at most nonlocality may suggest that the idea of what has been called extrasensory perception or paranormal communication is possible. This kind of contact allegedly involves silent, purely mental communication at long distances engaged in by humans as well as other mentally advanced animals.

Other physicists have debunked the idea that facets of quantum physics provide support for a purely spiritual realm and its alleged features. Victor Stenger, a professor of physics and astronomy at the University of Hawaii, wrote a book about this.(Victor Stenger, Physics and Psychics (Prometheus Books, 1990).)

Personality

In spite of scant evidence for a soul that is separate from the body, there may be room for believing in a more modest and limited one. Even someone who believes that all that constitutes a person is physical substance may well accept that there is a part of individuals that is the most important with respect to relating to them. There will certainly be physical characteristics by which to identify a person such as black hair, a snub nose, above average height, a bloated belly, a missing right foot. Yet these physical identifiers are not what will most likely make us stand out in the minds of others, especially those who know us closely. It is actually our intangible, nonphysical characteristics that are ultimately more important to other people.

Nonphysical traits include what are called emotional, psychological, and mental traits. Although these three groupings go by different names, they include many of the same characteristics such as kindness, generosity, confidence, shyness, aggressiveness, arrogance. What we deal with in everyday social contact with people is the sum of these intangible emotional characteristics found in a different blend in each individual.

A pleasant face is nice to look at, and in a sexual context, physical attractiveness makes a big difference. In the end, physical appearance is not nearly as important as how a person interacts with us in the emotional sense. A spouse may be a great sexual partner, but if her/is emotional characteristics make her/im unbearable to live with, the marriage is not likely to last, or if it endures, to be a happy one. The total of these characteristics is what is called personality and is what allows us to feel comfortable with persons or not, what brings us to like them or not.

This impact that personality has on the reaction other people have to us can influence a number of things such as whether we have many friends or whether we get a promotion at work. When people remember us fondly, it is our personality to which they mostly react. Personality is what people at least partly have in mind when they talk of the soul or the self.

The terms personality, soul, and self have not been used interchangeably before, and philosophers have not talked about them as if they were synonymous. All three of the names try to capture that nonphysical essence of an individual that tells so much about her/is behavior. It has little to do with considerations of physical strength, ability, or appearance. The terms are used in connection with a general description of a person's nonphysical nature such as "she has a winsome personality" and "he has a dark soul."

"Soul" has traditionally been used with immortality and the afterlife and thus is more closely associated with dualism. "Self" is associated with "mind," both of which are often used in a dualistic context. "Personality" is more recent and more commonly used by physicalists, those who think there is only a physical composition of objects of the world including human bodies.

Personality is not linked to any special substance of its own but is instead simply an abstract concept that captures behavior. It has only a metaphorical use and does not require that it be made of any separate substance or material. The physical body in all its varied complexity including the neurological portion provides all the mechanisms necessary for support of the behavioral actions that constitute personality.

Personal Identity--Composition

Keeping in mind that there is no such thing as a nonphysical soul but that instead all mental characteristics are based on the body will help clarify much of the confusion that has been created in the philosophical discussion of personal identity.

From the common, nonphilosophical point of view, the identity of a person is easy to determine. If the person looks like the person s/he is purported to be, then s/he fits the identity of that person. In cases in which such simple identification is not sufficient, additional tests can be employed. This can happen in a case where the person's face has been disfigured in an explosion or after a lapse of time of many years. Recent fingerprints and voice prints can be compared to any available ones from the past that belonged to the original person. Questions can be asked of the person to see if the events s/he claims to have happened in her/is past are consistent with what is agreed by her/is acquaintances to have happened to her/im.

This is consistent with what has been called a timeline. It has been said that from birth a person follows a timeline, that is, her/is body follows a continuous line in space from the time that s/he is born until the time her/is body stops living. This provides one definitive way of finding whether the person before us today is the same person we met in Omaha 30 years ago, i.e. of determining her/is personal identity. That is, if only it were possible: we would simply trace back her/is timeline for 30 years to determine whether s/he was the same person. Of course, this is not possible as a practical matter since we are prohibited from actually following a timeline; it could be done precisely only if we could reverse time. In theory, however, it can certainly be said that timelines exist. This theoretical use of a timeline requires the use of a physical body and thus adopts the body as at least one of the criteria for determining personal identity.

The following discussion will not concern personal identity in the sense sometimes intended by “finding your identity,” which usually refers to your personal proclivities and how they relate to the role you appear destined to play in life, including what career you want to choose. That involves trying to choose how to mold your personality. It is important but will not be treated here.

Memory as the Criterion

The main focus of philosophers in the topic of personal identity has been the determination of whether the criterion for personal identity is the body or the memory. The impetus for the use of memory as a criterion is understandable since it is extremely important for the appreciation of living. The tendency to employ memory has too often been entangled with the desire of many to establish a separate soul.

Philosophers have resorted to the liberal use of puzzle cases in attacking the problem of the criteria for personal identity. John Locke in his deliberations came up with the first puzzle case, and philosophers that followed him have devised more and more elaborate schemes, none of which have definitively settled the matter.

Locke believed that the soul or self is determined by consciousness.(John Locke, Essay Concerning Human Understanding, 2nd ed., 1694, Chapter 27, reprinted in John Perry, ed. Personal Identity (Berkeley, California: University of California Press, 1975) 48.) In turn, he used consciousness and memory almost interchangeably.(Id. 50) Later writers did not combine the three terms so readily but neither did they delineate any clear separation of them. Locke did make a clear effort to distance the soul or consciousness from any dependence on any kind of substance whether physical or spiritual.

It is also clear from his writing that the nature of an individual's soul does not depend on the existence of any particular body parts. This makes sense because we have all observed or at least imagined that the removal of a finger or even the amputation of a leg does not have to alter the personality of a person. Of course, the soul (as well as the body) must depend for its continued existence on such parts as the heart and the brain. There are also situations in which a person with an amputated leg or missing hand can be psychologically affected by the unusual condition.

Locke settled on memory as the criterion of personal identity. He pointed out that if the soul of a prince were to enter into the body of a cobbler and supplant the soul of the cobbler, the person of the prince would then reside in the cobbler. By using the word "person," Locke apparently meant to identify it with personal identity. He admitted that those who knew the cobbler would still identify him to be the one with the body of the cobbler. According to Locke, they would be identifying the "man" in that case as opposed to the "person." This is another point at which Locke implied he was accepting the dualistic approach, which in turn caused confusion. Locke was also impliedly admitting that many people would use the body criterion of personal identity.

Some philosophers later criticized Locke’s adoption of memory as the criterion of personal identity. Joseph Butler was the first. He pointed out that the body had to be the criterion of personal identity because memory presupposes the existence of the body. This would become evident in a case of memory failure.

Suppose that Lanny, a young man who has a low I.Q., claims he went on vacation last year to Addis Ababa. As far as anyone knows, he has never been out of the United States. If he went anywhere, his parents would accompany him. They have denied that he went to Addis Ababa. Just in case he happens to be correct, a check will be run.

This check will be an investigation to find out if Lanny's body went to Addis Ababa. Lanny could provide documentary evidence such as a passport, airline tickets, credit card statements, etc. Witnesses can state that they saw him there. Those trying to ascertain whether Lanny went to Addis Ababa will surely find it absurd if any witness were to assert that they saw his memory or soul there but did not notice that his body was there. Any check on the presence of a person's memory has to involve the presence of the person's body.

In all the puzzle cases inspired by Locke's prince and cobbler, the transfer of memories between different bodies is mentioned without much discussion about how it could be undertaken and if it would even be possible. Philosophy is supposed to be a theoretical inquiry in which abstract possibilities can be examined without the encumbrance of having to get mired in practical details. This approach often allows for a quicker clarification of questions, but there has to be a limit on how far this can be taken. It is expected that philosophical problems and their solutions have some connection to real human questions and problems. If a theoretical analysis is far removed from what can actually take place in human life, it may not be of much use.

This may well be the case when in these puzzle scenarios philosophers talk of transferring the soul of someone to the body of someone else. In the first place, the philosophers do not even present a clear, established definition of the soul, so it is not even clear in theory what is being transferred. Even assuming that precise clarity is not important, it would still be good to have an idea how the transfer is supposed to take place. Since no one has come up with a good description of a nonphysical soul, the soul will have to be counted as physical. In that case, the closest part of the human body that can be identified with the soul is the brain. This is consistent with investigations that have located memories in the brain.

Some Unforeseen Consequences

Locke assumed that friends and family would recognize the soul (memory or personality) portion even if it were located in a different body. Locke apparently believed this when he said, "[E]very one sees he would be the same person with the prince . . ."(Id. 44) Locke is talking here about the person of the prince in the cobbler's body. The belief is understandable since others develop a great acquaintance with the personality of an individual. That acquaintance can be positive or negative. For instance, people can remember the affability, the kindness, and the joviality of a person, or they can remember the bad temper and negative attitude toward life.

We can agree that typically the friends and family of a person continue to fully accept the person if s/he were to undergo a permanent physical change, especially one that was not noticeable. This is why a number of philosophers (including Locke) have thought that personal identity follows memory or personality rather than the body. Yet, there are situations in which others have difficulty adjusting to the new physical state.

Take cases in which something drastic happens to the individual. A man becomes a quadriplegic, or becomes almost completely paralyzed, or suffers severe burns to the face that leave it very ugly and hard to look at. Such an unfortunate individual may well lose some of his friends and have difficulty finding new ones. The man himself may find his new condition intolerable, become very depressed, and perhaps even go so far as to commit suicide. That has been known to happen.

It could be said that these types of cases affect how an individual perceives himself as a person. In turn, the individual's perception of himself could alter his personality to an extent that his friends and family would find it hard to relate to him in the way they did before. A bodily change can also directly change personality as in the event of a brain injury or tumor. In these cases, the idea that personal identity is only affected by personality would be proved wrong. Bodily conditions can be very important in some instances.

Physical appearance could make a difference in the prince-cobbler case. If certain details are added, it does not seem so obvious that everyone would be satisfied with the idea that the personal identity of the prince would be determined by his personality. Assume that the prince is a handsome, energetic, physically strong, and sexy 22-year-old, while the cobbler is 75 years old, partially deaf, suffers from diabetes, has a hunchback, shows a very wrinkled face, and is impotent. Both are married. The princess is an attractive, energetic, physically healthy, and sexy 20-year-old. The wife of the cobbler is 72 years old, overweight, and unattractive.

The prince's personality moves into the cobbler's body, as Locke specified, and the cobbler's personality is made to reside in the prince's body. Assume that each personality returns to its usual abode with the members of its respective family. The prince’s personality shows up in the castle in the cobbler's body, while the cobbler’s personality appears at his house in the youthful body of the prince.

The cobbler observes that for the most part he feels like the same person as before but is more energetic and free from his former physical maladies. He notices that young women look at him with more of a smile than they used to. They used to sometimes stare at him as he walked, which didn't feel very reassuring; the cobbler figured they were just taking notice of his hunchback. His wife and children and grandchildren are suspicious about what has happened. They are incredulous as they ask many questions. They wonder why a young man would want to accept the soul of an old cobbler.

One of the things that keeps confusing the cobbler's family is that he always speaks with the quickness and cadence of a young man, much different from what they were accustomed to. While the cobbler's family is happy to see that with his young new body his old maladies and diseases have disappeared, they have difficulty adjusting to how much energy he now has. His wife and his children, who are now in their forties, find it hard to keep up with him in this respect.

For several days they ask questions that try to trip up the young man to make sure he is not an impostor. They eventually become cautiously assured that the cobbler's memories are in that body. More than anyone else, the cobbler's wife has to deal with his new boundless energy. At first it was fun and exciting, the way it was when they first got married, but now it has become exhausting. With age, she has definitely become more reserved and quiet. He wants her to keep up with him in going to different places.

What is worse, sex becomes a pointed issue, specifically frequency. Before, it seemed that the two had reached an acceptable accommodation. He didn't show much interest, and she was fine with that. Now it seems he is eager every day. He has said he will be satisfied with no less than four times per week, which to her is wildly unrealistic. She realizes she could simply refuse when she is not interested, which is most of the time. She also fears that if she takes the avenue of refusal he could stray or, even worse, ask for a divorce. She still loves him in spite of all the changes and would hate to see a divorce. She has noticed that young women look him over a lot and he responds accordingly. Unbeknownst to her, he has begun to wonder whether straying with one or another young woman could be more understandable than he ever before imagined.

The prince's father and mother are having a hard time accepting that he now looks older than they do and has more infirmities. At the same time, he still exhibits the same immaturity and impulsiveness of the prince they used to know. If he at least showed some of the maturity and wisdom that usually comes with age, they could find it easier to believe him.

His brother and sister feel much the same way. They find it aggravating that he agrees to go along with them to a function like a party or participate in an athletic event and then he doesn't show the enthusiasm and stamina that he had before. At a party before, he would usually be one of the last ones to leave. Now he doesn't show much interest and then wants to leave early because he needs to rest.

The prince used to be athletic. He used to do anything that involved riding horses, including playing polo and entering a steeplechase. Now he has trouble controlling the horse on an easy ride. Part of the problem is that his back and hips begin to hurt not long after the endeavor begins. The prince himself gets very frustrated with the weak and infirm body with which he has been saddled. He is very angry with the philosophers who talked him into getting involved in this soul-swapping experiment. He has instructed the army and police to find them and bring them to him. The philosophers must know they are in trouble, for there has been no success in finding them.

The princess may be even unhappier than the prince. She too is affected by the prince's low level of energy. There are things they used to do that they don't anymore. Now he always wants to go to sleep early. He used to play often with their son but not anymore. The worst thing is their sex life. They used to both enjoy sex several times per week. He was always eager to engage. Even after he showed up with the cobbler's body, he was just as eager for sex as ever, but then he did not perform well. His former body was much more attractive. This has made him despondent and barely interested. It makes them both frustrated. The families on both sides have great trouble adjusting and wish they could return to the status quo.

One has to ask how the community would refer to the transformed men. Assume they saw the body of the cobbler walking down the street. They would likely say, “There goes the cobbler with his jumbled brain imagining he is the prince.” When they saw the prince’s body, they would think of him as the prince thinking strangely like the cobbler. They would identify them by their bodies, not by their memories or personalities. All this shows that physical appearance, and therefore the body, is the more important element in personal identity.

We know that serious injury to the head can bring about dramatic differences in personality. Perhaps the best known case is that of Phineas Gage, the victim in 1848 of a rock-blasting explosion that sent an iron bar through his left cheek, the front of his brain, and then the top of his head. He had been dependable and temperate but afterward became capricious, obstinate, and given to the vilest profanity. Friends could no longer recognize him as the same person. Since then many cases of personality change caused by brain injury or tumor have been observed and studied. (Antonio Damasio, Descartes' Error (New York: G.P. Putnam's Sons, 1994), 3-19.) Experiences can also affect personality, especially traumatic ones. Being the victim of an assault, kidnapping, or other violence can bring about the change in personality known as Post Traumatic Stress Disorder (PTSD).

It becomes clear that correlating personal identity with memory is not as simple as Locke and others thought. Identity depends greatly on a person's body. While the removal of some parts of a person's body may not make much difference in how she interacts with others, removal of crucial parts may have an effect. A change of the entire body can have an even greater impact. Change the narrative so that the prince and the cobbler are both single, healthy, and about the same age, and there could still be significant differences in how their families interacted with them.

In their thought experiments, Locke and other philosophers assumed that a personality was completely transferred to another body and that personal identity was associated with that personality regardless of the body to which it was transferred. Furthermore, they assumed that the same personality could be considered as remaining the same personal identity that had originally been transferred. They did not stop to consider that the personality, and thus the personal identity, would in time be affected by the changed circumstances and surroundings that would naturally come about. All of this goes to show that bodily condition and external physical conditions are strongly involved in determining personal identity. It is not solely a matter of personality or memory.

The transfer of a personality does not appear to be possible in any way other than as a physical process. The details in carrying it out involve physical assessments of how to do it. The assumption that a personality continues being the same after it is transplanted into another body cannot be correct. A new personality has to come about as a result of the interplay of the old and the new physical factors. These involve the adaptation of the personality to the new body as well as the effect of the surrounding physical environment.

Memory has to be physical. As Butler observed, it is dependent on the body. Memory does not simply float somewhere in the air near or above an individual's body. This becomes more obvious if we think of it in the plural. Memories are stored in the physical body of the person, in particular they remain in the brain.

Observations and tests have been made for a long time showing that memories are physically dependent phenomena. If an individual is tired or has been deprived of food or sleep, he may have more trouble remembering events. Certain drugs can affect what and how many memories a person can recall. It is well established that brain damage can cause the loss of memories. Dementia, including Alzheimer's disease, is the best known ailment that makes memories gradually vanish. Scientists are certain that there is a physical cause of Alzheimer's. It is not a problem of the soul.

Perhaps a comparison to how a computer stores information would provide further illumination. A computer stores information permanently on a hard drive. The computer can be shut off, and if it is turned on several days later the information can be retrieved from that hard drive. The information is organized in files. The hard drive has sometimes been called physical memory because it permanently stores whatever information its user chooses to save. In contrast, there is what is normally called memory, specifically random access memory (RAM). The information held there is important but cannot be considered permanent since it disappears when the computer is turned off.

The permanent storage takes place physically in the molecules of the materials that make up the hard drive. There is no way the information dangles in the air. It must reside in a physical substrate. The information is stored and can be erased by a physical process. Few people would want to deny that this is the case, even the most ardent spiritualists. Very much the same can be said for the memory found in biological beings. Memories can be created and also destroyed. In computers, information can be erased unintentionally by a surge in electricity or a strong magnetic field. Memory in a brain can be destroyed by injury or disease. Dementia causes a drastic erasure of the human hard drive.

Body as the Criterion

Even when it appears that memory is the criterion, it is seen to ultimately be dependent on the body, which after all stores memory. This is the first major indication that the body criterion of personal identity is actually the correct one. The second indication is the simple and common identification of people by their bodily appearance.

The third element of proof is related to the second one. This is identification of the body by very precise technical means. Fingerprints and voice prints were mentioned before as ways to identify individuals. Each of us has unique prints by which we can be identified with extremely high accuracy. More recently there is the method of genetic identification, i.e., DNA testing. It is close to infallible. At times corpses cannot be identified because their bodies have been greatly altered. DNA analysis is then used, which obviously involves bodily identity.

The fourth indication is the refutation of the memory criterion. First consider what happens to the memory criterion with changes brought about by time. The simplest question to ask that illustrates the problem is whether women 80 years old are always the same persons as they were as children of 8 years of age.

There could be deep changes in a woman's personality. An old woman might laugh just as easily as when she was a girl and ask many questions. Yet she could have a more sour outlook and be unduly suspicious of others. At the same time, maturity could have helped her become more patient than when she was an energy-filled child. Those who had known her as a girl may recognize similarities but also differences, including those that normally appear in all people. Then there is the question of memories retained. At the extreme would be the case in which she developed full-blown Alzheimer's disease. In that case what connection is there between the little girl and the totally oblivious woman? How can it be said that the two share the same personality?

Of course, many physical changes would also take place in the span of years. Nevertheless a physical identity could still be established through documents and witnesses showing body continuity over time. Then there would be the very reliable fingerprint and genetic identification.

Locke believed there was an "extending consciousness backwards." According to this, if an older person has memories of having experiences as a younger person, it can be said that they are still the same person. If this is so, how many memories are required? Is a single memory sufficient? Genetic makeup, a bodily criterion, does a better job of showing a connection.

Philosophers have devised various puzzle cases to try to show or to refute that a common personal identity persists for a long time on the basis of memory. One well-known scenario was designed by Thomas Reid as a refutation of Locke's memory criterion. It is the Brave Officer Paradox. As a boy, General Jones was flogged for stealing apples. When he was a young officer, he performed a brave deed. At that time, he remembered the flogging. Today, as a general, he remembers performing the brave deed but cannot remember the flogging, no matter how hard he tries or how much others try to refresh his memory. He is simply incapable of remembering it.

Some philosophers claim that Reid's criticism can be answered. By pointing out that there is a relationship between each succeeding stage in a person's life, the continuity of the personality is allegedly preserved. In the case of General Jones, there are three stages. The first stage is related to the second stage by the recollection of Officer Jones in the second stage of the flogging that took place in the first stage. The second stage is related to the third stage on the basis that General Jones remembers that he performed a brave deed in the second stage. On the basis of these two relations, which were delineated as ancestral by philosopher Anthony Quinton, the relation between the identity of General Jones and the boy is established.(See the discussion by Perry in Perry, 16-20.)

This just doesn't work. It is a contrived attempt to establish a connection between the General and the boy that has simply been broken by the General's inability to remember the flogging. Recollections of events that took place in between do not cure the problem. If you are going to base personal identity on memory and an old man doesn't remember what happened to him as a boy, then you have a break in personal identity by memory. Assume that as a general he doesn't remember anything about his life as a boy, not just the flogging incident. Pointing to intermediary connections at stages of life in between does not avoid the problem of losing memories of boyhood and therefore the connection to personal identity as a boy. It is interesting that Reid only considered the forgetting of one memory. Certainly forgetting only one memory should not cause concern about personal identity—nor should ten memories. The more interesting case is the total loss of memory.

There are persons who lose their memories totally due to brain injury or serious dementia such as Alzheimer's disease. Suppose that this is serious enough that they cannot be said to be the same person. They remember very little of who and what they once knew and they lose the personality they once had. An individual in this situation can hardly be called a person. Her friends might become less close; after all she would no longer be the same person they knew before. She would no longer recognize them or remember any of their common past experiences. In extreme cases of injury, people often refer to someone like that as a "vegetable." She is not normally aware of herself and does not interact very well with others.

If the criterion is personality or memory, then this person has lost her personal identity. Yet if you were to tell the families and friends of these persons that they no longer had a personal identity, they would likely disagree. They would refer to the body of the individual and point out that it still had a resemblance to the person she was before she lost her mental faculties. There would be body continuity. They might point out that the person had a history, i.e. had lived her life in a continuous series of physical locations. In other words, they would present a timeline argument, which uses the body criterion.

There is logic in what the old acquaintances have to say. The fact that Al, as an example, has lost his mental faculties should not mean that he has lost his identity as a person. His friends could recount the various mental acts this man performed in past years. He may no longer have the same mental capacity, but the body observed today is the closest object of identification to the person his acquaintances knew before. It seems natural for them to continue to say that the same body continues to have the personal identity of the man they knew before, even if there is no trace of any personality contained in it. Al's family still considers him to be a family member.

They could claim that the mental characteristics and the memories of Al are all still fully stored inside his brain, but at the moment, they are frozen. Perhaps neurological science tomorrow could unlock that data and restore Al to his previous mental state, or at least to a partial state in which his mind was in before. The dementia cases with total loss of memory are strong proof that the body criterion for personal identity has to be the correct one.

The personality and memory are without a doubt ultimately based on the physiological characteristics of the particular body that holds the associated personality. Here is one way to put it: you cherish your personality but expect that others will recognize you by your bodily identity. Consider also nonhuman animals. It is believed that they do not have many long-term memories; some people do not believe they have souls. Yet they do have individual identities that can be determined by their DNA.

The current idea of considering a person who has lost his mental faculties through dementia to be a person who retains all rights is consistent with espousing the body criterion. Incidentally, this is what is behind opposition to euthanasia of humans under any circumstance. Opponents in effect believe that the body is the criterion for personal identity. Yet, there is still the understandable tendency to want to say that he cannot retain a personal identity if he has lost his personality. Can this quandary be resolved? There may be good reasons for society to allow euthanasia even if the body criterion is the proper one. This would be in cases where it is almost hopeless that an individual will ever recover the ability to appreciate life.

A completely demented person is no longer a self-conscious being, or more specifically, a human being. A human being can be considered a self-conscious animal. He has lost his identity as a human being even if he has retained his individual personal identity, however limited that may be. If we distinguish the two types of identity--personal identity and identity as a human, i.e., a member of the set of self-conscious beings--we clarify the problem of personal identity. Persons who descend into a demented consciousness retain their identity through their body but cease to be counted as regular human beings because of their loss of personality.

The Fate of the Soul

The philosopher Derek Parfit opined that the question of personal identity does not matter. Yet it does for those who believe that the soul and its memories continue to live after death. For them, personal identity has to be based on memory or soul in order to assure that identity continues in an afterlife. However, if it is the case that personal identity can only be based on the physical body, that presents some serious problems for the survival of the individual after death. With the death of the body, which has to include the brain, the means for the continuation of psychological characteristics and memories are extinguished.

It seems to be sadly the end of the road for the person. What can possibly survive after that? The body disintegrates and with it the entire basis for a continuation of any semblance of personality. The only survival that can take place is in the memories of friends and in any creations of the person such as artifacts constructed or letters written. This is the reason spiritualists try so desperately to find a foundation for some nonperishable mental element of the person to survive.

All is not lost, however. It is possible that in the future some device may be invented that can store the characteristics and memories of each person. It would be something like an external hard drive for a computer. It could even be something smaller like a flash drive. That information could then be stored in a safe location. The person would be completely reconstructed in the future inside an exact replica of the original body constructed on the basis of her/is genetic material. It would contain the exact same brain configuration that would produce the same personality with accompanying memories as before.

While those in charge of the process of personal reconstitution were at it, they could make some beneficial modifications. They could correct any physical defects. If the person had been nearsighted, they could give him/er the perfect vision s/he had always longed to have. No one would have to settle for being homely; everyone could be made attractive. Even after being recreated, individuals could ask for any modifications they later desired.

This may well become possible at some time in the far future--if the human race can manage to survive long enough. Unfortunately, this would mean that all who have not and will not have the opportunity to store the necessary information from their brain (and possibly their nervous system) will not be able to experience this resurrection.

There are those who can already hope to have their bodies (or perhaps their brains will suffice) frozen through the knowledge of cryogenics. In this way, they can hope to preserve themselves and their individual information that can be useful in case they can be resurrected in the future. However, can they trust that there will be those who will care enough to safeguard the frozen bodies for maybe thousands of years into the future? People can pay a company perhaps millions of dollars today in exchange for a promise to faithfully take care of their bodies in the future, but that is no assurance that the company will still exist thousands of years from now or that its employees might not decide to disregard old contracts. Nor is there any guarantee that those who do finally revive the bodies will have friendly intentions.

For those who harbor religious hopes, there is frankly little evidence for the resurrection of conscious beings in the future. They can rely on faith, as they often say they do. They had better hope that God or whoever the caretakers of the universe may be are keeping exact records of all the memories and makeup of each individual so that each person can be accurately reproduced.

10 Weighing and Verifying




This chapter is a treatment of two topics that have not been discussed much by philosophers. The first one does not clearly fall in the category of epistemology but belongs rather to logic.

The topics are the following:

(1) A method of reaching conclusions that is not treated in logic.

(2) A method for trying to avoid the bias produced by self-interest in reaching conclusions.

A Method of Reaching Conclusions

Ever since Aristotle, logic has been a major subject in philosophy. It has involved the analysis of thought in order to understand how to arrive at correct conclusions. The simplest topic in logic is sentential or propositional logic. The next step in complexity is first order predicate logic. Starting about 1900, a more mathematical logic extended the scope of logic.

The best known form in logic going back to the time of Aristotle is the syllogism. A well-worn example is

All men are mortal.

Socrates is a man.

Socrates is mortal.

There are a number of other forms that show how sound reasoning takes place and guarantee that it is always consistent. There is one form of reasoning, however, that has not been dealt with in logic, though people use it in a wide variety of contexts. This is the method of weighing the reasons in favor of taking an action or adopting a belief against the reasons opposed. It could be called weighing the pro's and con's or simply weighing.

In ordinary conversation, people talk about weighing the pro's and con's related to a decision to be made. No further details are considered in applying the method, and logic textbooks have nothing to say about it. It needs more analysis than it has been given. The way people ordinarily describe making a decision, you write the reasons in favor in one column, the Pro's, and the reasons against in another column, the Con's. Supposedly, if the number of reasons in the Pro's column is greater than the number in the Con's column, that indicates the way the decision should be made. For example, if there are two reasons or factors for taking an action in the Pro's column but four in the Con's column, the decision would presumably be to not take the action under consideration.

This is illustrated by the following example. Ruth needs to buy groceries. She would like to go today, but she has enough food to enable her to postpone the trip until tomorrow. Here are the considerations for going today.

Pro

(1) Need broccoli.

(2) Need oranges.

(3) Need pears.

(4) These items are on sale today.

(5) Want to buy today's newspaper.

(6) Have nothing much to do now.

Con

(1) It is raining and is not supposed to rain tomorrow.

(2) New boyfriend said he would call in the next few hours but not tomorrow.

(3) It is cold today and supposed to be warmer tomorrow.

Analysis of these decisions can also be imagined as employing a set of double scales consisting of two trays hanging by chains from the ends of a rod balanced on a fulcrum. Weights are put on each tray to see whether the set of weights on one side is heavier than the set on the other. This type of scale is sometimes pictured as the scales of justice.

In Ruth's case, six weights would be placed on the left tray representing the six reasons on the Pro side, and three weights would be placed on the right tray representing the Con side. All weights would be equally heavy. Since there are six reasons in favor of going today and only three in opposition, it would seem clear that Ruth should go today. Still, she feels no desire to go in spite of the scale results; she would rather go tomorrow. She believes in using a rational approach in making decisions and realizes that, by this procedure, she should go now. Why does she nevertheless feel so much hesitation about going today?
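The simple counting procedure can be sketched in a few lines of Python. This is only an illustration of the method as described; the reason labels are paraphrases of Ruth's lists, not wording from the text.

```python
# Simple pro/con count: every reason carries equal weight,
# and the longer list decides the question.

pros = [
    "need broccoli",
    "need oranges",
    "need pears",
    "items on sale today",
    "want today's newspaper",
    "nothing much to do now",
]
cons = [
    "raining today, dry tomorrow",
    "boyfriend may call today",
    "cold today, warmer tomorrow",
]

decision = "go today" if len(pros) > len(cons) else "go tomorrow"
print(len(pros), len(cons), decision)  # -> 6 3 go today
```

With six reasons against three, the tally says to go today, which is exactly the result Ruth finds herself resisting.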

The truth is that it is not as simple as putting the favoring factors on one side, the opposing factors on the other, and then counting which side contains more factors. In this case, Ruth is impelled to go tomorrow for two overriding reasons. The first is that she wants very much to be at home when her new boyfriend calls. It is clear that she is in love and wants to talk to him often. This reason by itself is enough to cause Ruth to stay home today. In addition, she is a person who does not like cold weather and tries to avoid it whenever possible. It is very tempting to postpone the trip given that it is cold today, and the weather forecast says it will be warmer tomorrow. This in spite of the fact that weather forecasts can be wrong.

What then is Ruth to do? Would she be arbitrary and irrational if she were to disregard what the weighing procedure says she should do? Should she go to the supermarket in spite of her strong feelings against going today?

If one were to insist that Ruth follow the decision procedure, it could be used as a good example of what critics of logical processes mean when they point out that such procedures are cold and heartless. They have a point when discussing procedures like this one that do not take feelings into account. Acceptable logical procedure should leave room for how people feel. This should be done even in cases of negative feelings like hatred and anger.

In addition to demonstrating a lack of regard for feelings, the procedure of simple pro's and con's has another flaw which is closely related. It does not have a way of expressing the importance or significance of each factor involved, whether it is emotional or not. The method needs to be modified to take account of the weight of each factor with regard to importance. We can construct a weight for each factor in a range from 1 to 10 with 1 representing the least measure of importance and 10 the greatest. A different range could be used in order to furnish greater precision such as 1 to 100 or 1 to 1,000. This would result in a much different outcome from the prior approach of giving equal weight to all the factors and then simply letting the number of factors on each side of the scale make the determination.

Let us say Ruth would assign the weights to the factors in this way. Factors (1), (2), (3), and (5) on the Pro side would have an equal weight of 1. (4) and (6) on the Pro side would weigh the same at 2. On the Con side, (1) would weigh 2 while (3) would weigh 4. (2) would bear the greatest weight of all at 6. The total on the Pro side would be 8, while on the Con side it would be 12. This is consistent with what Ruth apparently really wants to do.
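The weighted variant can be sketched the same way. The weights below are the illustrative values the text assigns to Ruth's reasons on the 1-to-10 scale; the labels are again paraphrases.

```python
# Weighted pro/con tally: each reason carries a weight expressing
# its importance, and the heavier total decides the question.

pro_weights = {
    "need broccoli": 1,
    "need oranges": 1,
    "need pears": 1,
    "items on sale today": 2,
    "want today's newspaper": 1,
    "nothing much to do now": 2,
}
con_weights = {
    "raining today, dry tomorrow": 2,
    "boyfriend may call today": 6,
    "cold today, warmer tomorrow": 4,
}

pro_total = sum(pro_weights.values())
con_total = sum(con_weights.values())
decision = "go today" if pro_total > con_total else "go tomorrow"
print(pro_total, con_total, decision)  # -> 8 12 go tomorrow
```

Although the Pro side still has twice as many reasons, the weighted totals of 8 against 12 reverse the earlier result, matching what Ruth apparently really wants to do.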

This weighted method, rather than giving the factors equal weight, is closer to the way people actually reach decisions even if they are not aware of it. It would be nice if the more complexly weighted method offered a fully reliable means for making decisions and resolving disputes. You would just determine the factors involved, decide on a weight to give each one, and observe which side wound up with the greater weight.

Taking a closer look at what happens in the process shows why it is still not as easy as it may appear. In fact, it becomes clear why so many decisions and disputes are hard to resolve. The procedure is consistent and precise in its operation. It is easy to understand and apply once the factors are chosen and their respective weights are decided upon. However, choosing first the relevant factors and then their respective weights is the heart of the difficulty. The first hurdle can be deciding how many factors are to be counted. This determination can bring up significant disagreement. Certainly, each side would want to maximize the number of factors that could be counted as favoring its side while downplaying those aiding the other side.

Assuming that the factors to be allowed on each side are agreed upon, there remains the most difficult task: deciding how much weight each factor is to receive. Those assessments can be very subjective, and it is hard to overcome the problem posed by strongly held differing opinions.

In the situation confronting Ruth, another woman may well come to a different decision. This could be the case if she had a better tolerance for cold weather and maybe even enjoyed being in the cold. It would be even more likely that this woman would decide to go to the store today if she could be more patient about receiving a call from her new boyfriend. It is clear that the differing judgments of the two women with respect to those two factors would be subjective. It could not be said that either one of them was wrong in holding her own individual opinion.

The problem with subjective opinion can be found in wide-ranging social issues. Take, for instance, the burning, fractious issue of whether abortion should be legal. One of the important reasons that would surely be taken into account would be whether an abortion procedure itself is a threat to the life of the mother. Those opposed to abortion have pushed that as a reason for prohibiting abortion in general. Those in favor have held that the death of the mother is a rare occurrence and therefore not a significant factor. They would push for not even considering that as a reason against allowing abortion. Barring their being able to block it as a factor, they would surely claim that it should not be given much weight. On the other side, abortion opponents would advocate that the item be given great weight.

It is a matter of values. How do you decide how much weight to put on the value of a human life? There can be widely varying opinions on what it should be. The problem involved in this issue is illustrative of the stalemates that can occur in connection with the multitude of issues that one might try to resolve with this improved pro-con procedure.

Nor are these problems avoided by resort to the procedures used in traditional logic; in fact they may be even harder to resolve. These problems are present because logic is a study of the validity of arguments in terms of their form and not of their content. Logic studies deductive arguments that contain statements known as premises. Logic does not examine whether the specific statements are true but instead only looks at whether the forms used to derive conclusions from initial premises are valid. Once the proper forms of the arguments are found, they are universally valid, and their conclusions are guaranteed to be true as long as the premises are true.

An example is the following where the arrow means "implies":

premises

If a --> b and b --> c,

conclusion

then a --> c.

As long as the two premises are true, the conclusion will always be true.

With logic, there are still problems in trying to decide which values are to be chosen for the variables in the premises. Then there is the problem of how much importance or weight each premise is to have in comparison with other premises used.

If one tries to put the pro-con procedure into a logical form, it looks like it would read something like this:

If the factors in favor of decision "d" outweigh those against it, then choose "d." Otherwise, do not.

These forms are not helpful when the specifics have to be applied. What is evident then with regard to the use of any abstract procedure is that the easy part is deciding what the proper general procedure should look like, that is, deciding what the variables should be and how they relate to each other. The real problem comes in deciding what values are to be inserted into those variables. In the pro-con procedure, the values involve reasons and the weight each one is to carry.

In the past, people (especially philosophers) hoped to establish an abstract method such as that found in logic by which disputes could clearly be resolved. This examination of the pro-con procedure makes it clear that the goal is very difficult to reach given how much variation there is in human opinion on what is reasonable and what is important. Whatever area of interest you choose--clothes, food, music, movies, sports, travel, pets--there are variations in taste and preference, and those are just the light subjects.

Go into areas like politics, religion, morality, social organization, family arrangements, waging war, care of the environment, and other large social issues, and you can find tension (even among friends) and even violence caused by disagreements. Given this, is it any wonder that there is so little hope for far-reaching universal agreement on much including the more important issues that can have a profound effect on peace and enjoyment on this planet? Just the simple acknowledgement of basic facts would go a long way. In spite of this, there is little else to do but to keep trying to reach agreement on as many issues as possible in the search for harmony in a better life. The consequences of failure could be not just frustrating but catastrophic.

Verifiability and Bias

The Verifiability Principle was first promoted by the logical positivists in the 1920's. It can generally be taken as a mandate that any belief be sufficiently verified before it is adopted, but it was intended to be more specific than that. The principle said that if a statement was to be considered true, the proponent must, at least in principle, show how it could be shown to be true or false. Analytic statements were exempted from this requirement. Imperative and emotive statements like "my cup of joy overflows when I think of you" do not have to undergo the test because they do not claim to state facts. Logical positivists referred to factual statements as being cognitive.

The Verifiability Principle was widely criticized by theologians, philosophers, and others. For one thing, there were various formulations proposed by different people. Then there were questions like these: what does it mean for a statement to be verifiable if in practice it cannot be verified, and is the Verifiability Principle itself verifiable? The biggest reason the idea received so much attention and criticism was probably that the logical positivists were not content to simply say that their rule required that a statement be supported by a way to prove it through empirical observation. They went further and claimed that an unsupported statement was not just false or questionable but was cognitively or factually meaningless. They also stated it another way: they said it was "nonsense" from a factual standpoint. Calling it "nonsense" was intended to mean that it literally made "no sense." This idea was derived from comments by Ludwig Wittgenstein in his Tractatus Logico-Philosophicus.

The logical positivists intended to fully discredit the traditional, nonempirical statements of metaphysics. These were the kinds of statements made by the religious and by idealist philosophers such as Hegel and Josiah Royce, statements like "reality is spiritual." These kinds of statements did not seem open to confirmation and were usually not intended to be so by those who uttered them. As with many religious-like statements, they were expected to be believed on faith, and faith is ultimately belief without evidence. It was no surprise that the religious and their sympathizers reacted strongly to the logical positivists' claim that a statement that could not be empirically verified was nonsense.

The logical positivists went too far in their characterization of what it means to fail to verify a statement. A better formulation would have been this: for a statement to be considered true, a procedure must be specified for showing it to be true. Otherwise, the statement would not be called meaningless but would simply be suspect and possibly downright false.

Methods other than empirical ones could be used to show the verifiability of a statement, but they would have to be reliable in showing the truth. As a practical matter, empirical observation would have to be included as at least part of the procedure with regard to many statements that do not at first glance appear to involve empirical observation. This is because these statements at least in part involve physical phenomena, which must be verified by empirical means. Examples are "God holds the world in his hands" and "flowers fill the world with serenity."

This proposed version of the Verifiability Principle could be accused of being too weak in not requiring empirical observation in all cases, but it implicitly requires a well-accepted method of substantiation. It certainly precludes emotive, poetic, and metaphysical claims that are passed off as fact but are no more than the unsupported allegations of a prophet or poet that sound good to his followers.

Any such procedure would have to be rigorous--certainly much more rigorous than the light scrutiny that metaphysical and poetic statements have traditionally received. It would have to be credible, intersubjective, unbiased, and not self-serving. The fact that the principle would no longer require empirical evidence exclusively would not mean the standards of proof would be relaxed.

Metaphysical claims such as that there is a deeper reality than ordinary reality or that God has certain traits would still have to be backed up and could still fail to be credible if they were not sufficiently supported. It is amazing that so many metaphysical claims have in the past been accepted only on the word of one highly admired person--Plato and Kant being good examples.

An Additional Principle

An impetus toward believing many metaphysical pronouncements is that they are self-serving. Claims that there is a heaven that people can go to, that heaven offers great joy, or that God is love are examples of beliefs that promise to bring great benefit. It is understandable that people can be influenced by an unconscious self-serving bias toward readily believing such claims. They are very soothing. There is not much incentive to question them.

Surely everyone would agree that it is a good idea to try to overcome this distorting bias in order to have a better chance of getting at the truth, even if the truth is not as comforting as continuing to believe in those claims. With this in mind, I would like to propose a principle that would be used in addition to the Verifiability Principle. It would be ancillary to it, at least with respect to testing metaphysical assertions. The principle would be this: a metaphysical claim that inures to the benefit of humans should not only satisfy the Verifiability Principle but should in addition be supported by double proof. This extra evidence would be offered in an attempt to counteract the natural bias toward affirming self-favoring assumptions. This additional requirement could be called the Anti-Self-Serving Principle. It would be akin to the pronouncement that extraordinary claims require extraordinary proof. However, it would only be used to test metaphysical claims that propose a benefit to humans.

It is difficult to see how double proof would be provided by proponents of self-serving assertions. The fact is that even scant proof is rarely offered. Assertions are simply made. If nothing else, perhaps this new principle, along with the Verifiability Principle, can call attention to the bias.

At any rate, whether one accepts the Verifiability Principle or the Anti-Self-Serving Principle, the foregoing considerations should be a reminder that there should be a constant respect for belief based on evidence. Before adopting a belief, especially with strong confidence, one should seek to verify it in the light of all the evidence involved, both pro and con. This could be known as a modest verification principle. It could also be called the Evidentiary Principle.

A tendency has been pointed out in all people that has been named "confirmation bias": we are attracted to facts and opinions that confirm our already existing opinions. The Evidentiary Principle would help to counteract this. It would give our opinions a stronger foundation. Testing them against all the evidence, even the most contradicting facts, is the best way to get to the truth. An ongoing effort should be made to examine and possibly disconfirm the reasons for holding a belief. The Anti-Self-Serving Principle would be another tool to aid in this endeavor.


Part 2