In what follows, I shall use the term phenomenal consciousness as Block (1997) does, to denote states with a subjective feel, which can be immediately recognised if not defined. Van Gulick (2004) prefers to use the term qualitative consciousness for subjective feelings or qualia (such as the experience of seeing red), and defines phenomenal consciousness in a richer sense, as including the overall structure of experience, in addition to sensory qualia.
Although scientists do not use the term "phenomenal consciousness", they employ a closely related term, primary consciousness, which "refers to the moment-to-moment awareness of sensory experiences and some internal states, such as emotions" but excludes "awareness of one's self as an entity that exists separately from other entities" (Rose, 2002, p. 6). The main criterion used by scientists to verify the occurrence of primary consciousness in an individual is his/her capacity to give an accurate verbal or non-verbal report on his/her surroundings.
Since some non-human animals can give non-verbal reports of events in their environment, the relation between phenomenal and primary consciousness is philosophically significant for the purposes of this thesis.
Many contemporary philosophers argue that the question of which animals possess phenomenal consciousness can only be answered by carefully distinguishing it from other notions of "consciousness". Table 4.1 lists some of the more commonly cited distinctions.
My research on the subject of consciousness has led me to conclude that only one of the philosophical distinctions drawn between the various forms of consciousness is of any help in resolving the Distribution Question (Which animals are phenomenally conscious?). The distinction between transitive creature consciousness and phenomenal consciousness is relevant to the Distribution Question; the other philosophical concepts of consciousness lack relevance, for one or more of the following reasons:
(a) although they may help to sharpen our philosophical terminology, they have no bearing on the Distribution Question;
(b) they are poorly defined;
(c) they are inapplicable to non-human animals;
(d) they fail to "carve reality at the joints" as far as animal consciousness is concerned - that is, they apply to too many animals (including animals that cannot plausibly be described as phenomenally conscious) or too few (e.g. humans and great apes only).
During the course of my research, I also found that the distinction between transitive and intransitive creature consciousness, which was supposed to be purely conceptual, turned out to be a real distinction.
I also discovered that what appeared to be a robust nomic connection between wakefulness (defined according to brain-based criteria) and phenomenal consciousness, had been entirely overlooked by philosophers, because of a conceptual distinction they had already formulated between these two notions of consciousness.
Finally, in Table 4.2 below, I propose three concepts of consciousness that I have uncovered in the scientific literature, which (I believe) do a better job of "carving reality at the joints" as far as animals are concerned than existing philosophical categories. Regrettably, these concepts of consciousness are almost entirely ignored in the current philosophical literature relating to consciousness.
Table 4.1 - Various philosophical usages of the term "consciousness".
Based on Rosenthal (1990, 2002); Dretske (1997); Block (1995, 1997, 1998, 2001); Carruthers (2000, 2004); Lurz (2003); and Van Gulick (2004).
N.B. For additional distinctions between different kinds of consciousness, see Van Gulick (2004): http://plato.stanford.edu/entries/consciousness/
Term | Definition | Comments |
1. Creature consciousness | Consciousness as applied to a living organism (e.g. a bird). (N.B. The distinction between creature consciousness and state consciousness was first suggested by Rosenthal (1986).) |
My verdict: The distinction between creature consciousness and state consciousness is philosophically significant, but has no bearing on the question of which animals are phenomenally conscious. |
2. State consciousness | Consciousness as applied to mental states and processes (e.g. a bird's perception of a worm). | My verdict: See comments for creature consciousness above. Additional comments: Ned Block (1997) has criticised the concept of state consciousness as a mongrel concept, and proposed a distinction between two different types of state consciousness: access consciousness and phenomenal consciousness. See comments below. |
Selected varieties of creature consciousness | ||
Intransitive creature consciousness | Being awake as opposed to asleep (e.g. a bird possesses intransitive creature consciousness if it is awake and not asleep or comatose). | My verdict: 1. Poorly defined concept. Assumes that "wakefulness" and "sleep" can be defined uniformly across different classes of creatures. Fails to distinguish between two very different criteria for wakefulness and sleep in animals - behavioural criteria, which are satisfied to some degree by nearly all animals, and brain-based criteria, which are only satisfied by mammals and birds (Shaw et al., 2000).
2. The philosophical distinction between phenomenal consciousness and wakefulness is actually counter-productive to attempts to resolve the Distribution Question, as it overlooks what appears to be a nomic connection between an animal's satisfying the brain-based criteria for wakefulness and its being able to give an accurate report of its surroundings - which is how scientists routinely assess the presence of consciousness. There is a sharp contrast between the EEG patterns of human patients in states of global unconsciousness (deep unconscious sleep, coma, PVS, general anaesthesia and epileptic states of absence) and the EEG of patients in a state of waking consciousness, who are able to give "accurate, verifiable report" of events in their surroundings (Baars, 2001, p. 35). Additionally, everyday experience shows that no matter how hard we try, we cannot rouse a sleeping person to brain wakefulness without thereby making her (a) alert to her surroundings (primary-conscious) and (b) phenomenally conscious. Moreover, some neuroscientists believe brain sleep to be intimately related to phenomenal consciousness (Cartmill, 2000; Baars, 2001; White, 2000), and some (Baars, 2001; Cartmill, 2000) have even suggested that wakefulness - defined according to brain criteria - is a reliable indicator of phenomenal consciousness across all animal species. The connection between having a brain that is awake and being phenomenally conscious may well turn out to be nomic in animals. 3. Intransitive consciousness defined according to behavioural criteria can exist in the absence of phenomenal consciousness, as shown by the condition of persistent vegetative state (PVS), which has been defined as "chronic wakefulness without awareness" (JAMA, 1990). I describe this condition in the Appendix. PVS patients display a variety of wakeful behaviours, all of which are generated by their brain stems and spinal cords.
Studies have shown that activity occurring at this level of the brain is not accessible to conscious awareness in human beings (Rose, 2002, pp. 13-15; Roth, 2003, p. 36). Conclusion: The term intransitive creature consciousness is inadequate as it now stands. Additional Notes: Animal sleep that also satisfies electrophysiological criteria is called true or brain sleep. Brain sleep is defined by various criteria, including: EEG patterns that distinguish it from wakefulness; a lack of or decrease in awareness of environmental stimuli; and the maintenance of core body temperature (in warm-blooded creatures) (White, 2000; see also Shaw et al., 2000). |
Transitive creature consciousness | Consciousness of objects, events, properties or facts (Dretske, 1997). Example: a bird's consciousness of a wriggling worm that looks good to eat. Also called perception. | My verdict: 1. Poorly defined concept. Transitive creature consciousness, in its broadest sense, could be said to be a property of all cellular organisms, as they all possess senses of some sort (see chapter two, part B). In a narrower sense, the term applies to all organisms with a nervous system, as they possess "true" senses (Cotterill, 2001).
2. Research shows that an animal can possess transitive consciousness in the absence of phenomenal consciousness (pace Dretske, 1997). The vomeronasal system, which responds to pheromones and affects human behaviour, but is devoid of phenomenality (Allen, 2004a, p. 631) is one good example; the phenomenon of blindsight in humans and monkeys (Stoerig and Cowey, 1997, pp. 536-538; p. 552) is another. 3. Transitive and intransitive creature consciousness are not co-extensive in nature; hence the distinction between them is real, and not merely conceptual. A few animals (e.g. alligators) show no sign of behavioural sleep. Additionally, the term "behavioural sleep" has been defined for animals but not for bacteria, protoctista, plants, or fungi (Kavanau, 1997, p. 258). Yet all of these creatures could be said to possess transitive consciousness to some degree. Conclusion: The criteria for possession of transitive creature consciousness need to be more clearly specified. For instance, blindsight varies across patients in its degree of severity, and the specificity of the responses shown by these patients varies accordingly (Stoerig and Cowey, 1997, pp. 536-538). Which of these responses count as bona fide instances of transitive creature consciousness? |
Selected varieties of state consciousness | ||
Access consciousness | According to Block, "a representation is access-conscious if it is actively poised for direct control of reasoning, reporting and action" (1998, p. 3). Direct control, according to Block, occurs "when a representation is poised for free use as a premise in reasoning and can be freely reported" (1998, p. 4). Elsewhere, Block (1995) stipulates that an access-conscious state must be (i) poised to be used as a premise in reasoning, (ii) poised for rational control of action, and (iii) poised for rational control of speech. Block (2001) now prefers to use the term global access instead of access consciousness. |
My verdict: 1. Because access consciousness presupposes rationality in a fairly explicit sense, it is doubtful whether it applies to any non-human animals. Block's claim that some non-linguistic animals (e.g. chimps) have access-conscious states (1995, p. 238), and indeed that "very much lower animals" are access-conscious too (1995, p. 257), is therefore puzzling.
2. Access consciousness can exist in the absence of phenomenal consciousness, in certain situations. The strongest evidence for this claim comes from recent studies of the mammalian visual system (discussed in Carruthers 2004b). Research by Milner and Goodale (1995) suggests that each human brain has two visual systems: a phenomenally conscious system that allows the subject to select a course of action but which she cannot attend to when actually executing her movements, and an access-conscious system that guides her detailed movements but is not phenomenally aware. However, these findings relate to just one sensory modality (sight) and only apply to a limited class of animals (mammals). The case of the distracted driver, who is supposedly able to navigate his car home despite being oblivious to his visual states, is not a convincing example of access consciousness in the absence of phenomenal consciousness (Wright, 2003). See Appendix. Conclusion: Block's notion of access consciousness is a philosophically useful one. Nevertheless, it is of no use in helping us to answer the Distribution Question: it is, if anything, even more cognitively demanding than phenomenal consciousness, as it occurs only when an internal representation is "poised for free use as a premise in reasoning and can be freely reported" (1998, p. 4). To answer the Distribution Question, we need to define a "weaker" notion of consciousness that many animals could plausibly be said to satisfy even if they lacked phenomenal consciousness. Additional comments: Rosenthal (2002) faults Block's definition of access consciousness, on the grounds that one's ability to rationally control one's actions does not require consciousness of any kind. He finds Block's new definition equally problematic: global access is neither necessary nor sufficient for consciousness. |
Phenomenal consciousness | Block (1995) defines phenomenally conscious states as states with a subjective feel or phenomenology, which we cannot define but can immediately recognise in ourselves. Recently, Block (2001) has forsworn the term "phenomenal consciousness" in favour of what he calls phenomenality. Van Gulick (2004) defines phenomenal consciousness more broadly than Block: it applies to the overall structure of experience and involves far more than sensory qualia (raw subjective feelings, such as the experience of seeing red). |
My verdict: 1. The question of which animals are phenomenally conscious will most likely be answered by neurologists. At present, neither scientists nor philosophers can agree on what phenomenal consciousness is, how it first arose in organisms, or even what it is for (i.e. what function it serves). However, there is a broad scientific consensus on the neurological conditions for consciousness (see below).
2. It is likely that human beings are capable of having phenomenally conscious experiences without access consciousness, due to lack of attention or rapid memory loss. In his discussion of the refrigerator that suddenly goes off, Block cites "the feeling that one has been hearing the noise all along" as evidence for inattentive phenomenality (1998, p. 4). The most straightforward way of explaining this case is the hypothesis that "there is a period in which one has phenomenal consciousness of the noise without access consciousness of it" (1998, p. 4). Additional comments: Rosenthal (2002) has criticised Block's (2001) account of phenomenal consciousness for its ambiguity between two very different mental properties, which Rosenthal refers to as thin phenomenality (the occurrence of a qualitative character without a subjective feeling of what it's like) and thick phenomenality (the subjective occurrence of mental qualities). Rosenthal considers only the latter to be truly conscious. |
Reflexive consciousness, also known as introspective or monitoring consciousness, or as awareness of one's own states. | According to Block (1995, 2001), a state is reflexively conscious if it is the object of another of the subject's states (e.g. when I have a thought that I am having an experience). Alternatively, "a state S is reflexively conscious just in case it is phenomenally presented in a thought about S" (Block, 2001, p. 215). Similarly, Rosenthal (1986, 1996) defines a conscious mental state as a mental state one is aware of being in.
Conscious states in this sense require the existence of mental states that are about other mental states. "To have a conscious desire for a cup of coffee is to have such a desire and also to be simultaneously and directly aware that one has such a desire" (Van Gulick, 2004). |
My verdict: May not be applicable to non-human animals. It has yet to be shown that any non-human animals are capable of reflexive consciousness. Lurz (2003) considers the idea of a non-human animal having thoughts of any kind about its mental states to be highly implausible. (On Lurz's "same-order" account, a creature's experiences are conscious if it is conscious of what its experiences represent - i.e. their intentional object - even if it is not conscious that it is perceiving.)
Philosophers are currently divided over whether awareness of one's mental states is a requirement for having phenomenal consciousness (see Rosenthal, 2002; Dretske, 1995). |
Self-consciousness | Block (1995) defines self-consciousness as the possession of the concept of the self and the ability to use this concept in thinking about oneself. | My verdict: May not be applicable to non-human animals. "As yet, the only evidence that an animal may have an awareness of the 'self' versus awareness of other individuals has been demonstrated in chimpanzees and possibly orang-utans and dolphins" (Emery and Clayton, 2004, p. 41; see also Gallup, Anderson and Shillito, 2002; Reiss and Marino, 2001). This evidence comes from mirror tests.
Some philosophers (Leahy, 1994) question even this evidence, arguing that mirror tests merely indicate that an animal possesses consciousness of its own body, as opposed to true self-consciousness. Have we raised the bar too high? "If ... [t]he self-awareness requirement ... is taken to involve explicit conceptual self-awareness, many non-human animals and even young children might fail to qualify" (Van Gulick, 2004). |
Table 4.2 - Proposed new categories of animal consciousness, which I have uncovered in the scientific literature. The names of the categories are my own.
Term | Definition | Comments |
Integrative consciousness | The kind of consciousness which gives an animal access to multiple sensory channels and enables it to integrate information from all of them. | Mammals possess this kind of consciousness; snakes appear to lack it:
It seems that a snake does not have a central representation of a mouse but relies solely on transduced information. The snake exploits three different sensory systems in relation to prey, like a mouse. To strike the mouse, the snake uses its visual system (or thermal sensors). When struck, the mouse normally does not die immediately, but runs away for some distance. To locate the mouse, once the prey has been struck, the snake uses its sense of smell. The search behavior is exclusively wired to this modality. Even if the mouse happens to die right in front of the eyes of the snake, it will still follow the smell trace of the mouse in order to find it. This unimodality is particularly evident in snakes like boas and pythons, where the prey often is held fast in the coils of the snake's body, when it e.g. hangs from a branch. Despite the fact that the snake must have ample proprioceptory information about the location of the prey it holds, it searches stochastically for it, all around, only with the help of the olfactory sense organs (Sjolander, 1993, p. 3). Finally, after the mouse has been located, the snake must find its head in order to swallow it. This could obviously be done with the aid of smell or sight, but in snakes this process uses only tactile information. Thus the snake uses three separate modalities to catch and eat a mouse (Dennett, 1995b, p. 691). |
Object consciousness | Awareness of object permanence; ability to anticipate that an object which disappears behind an obstacle will subsequently re-appear. |
Reptiles appear to lack the concept of object permanence:
A snake has no ability to anticipate that a mouse running behind a rock will reappear. Cats and other predatory mammals are able to anticipate that the prey will reappear (Grandin, 1998). |
Anticipatory consciousness | Ability to visually anticipate the trajectory of a moving object. | Mammals can "lead" moving prey they are attacking by anticipating their trajectories - an ability that depends on their visual cortex (Kavanau, 1997, p. 255). Pigeons also possess this ability (Wasserman, 2002, p. 180). There is no evidence that fish and amphibians possess this ability. |
The contemporary philosophical debate about animal consciousness is split into several camps, with conflicting intuitions regarding the following four inconsistent propositions (Lurz, 2003):
1. Conscious mental states are mental states of which one is conscious.
2. To be conscious of one's mental states is to be conscious that one has them.
3. Animals have conscious mental states.
4. Animals are not conscious that they have mental states.
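The inconsistency can be made explicit with a minimal formalisation (the predicate letters are my own, not Lurz's): let C(x,s) mean that s is a conscious mental state of creature x, A(x,s) that x is conscious of s, and T(x,s) that x is conscious that it has s, with a ranging over non-human animals:

```latex
% A sketch of Lurz's (2003) inconsistent tetrad; notation mine, not Lurz's.
% C(x,s): s is a conscious mental state of creature x
% A(x,s): x is conscious of s
% T(x,s): x is conscious that it has s
\begin{align*}
(1)\ & \forall x\,\forall s\,[C(x,s) \rightarrow A(x,s)]
      && \text{conscious states are states one is conscious of}\\
(2)\ & \forall x\,\forall s\,[A(x,s) \rightarrow T(x,s)]
      && \text{to be conscious of a state is to be conscious that one has it}\\
(3)\ & \exists a\,\exists s\,C(a,s)
      && \text{some animal has a conscious mental state}\\
(4)\ & \forall a\,\forall s\,\neg T(a,s)
      && \text{no animal is conscious that it has a mental state}
\end{align*}
% From (3), (1) and (2) we may derive T(a,s) for some animal a,
% which contradicts (4); the tetrad is therefore inconsistent.
```

Each of the positions described in Table 4.3 below can then be read as a decision about which one of these four premises to abandon.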
Common to all of the above positions is an underlying assumption: that the difference between phenomenally conscious mental states and other states can be formulated in terms of concepts which already exist within our language. This assumption may turn out to be wrong: we may require new linguistic terminology to formulate this distinction properly.
I would suggest that the "original sin" of philosophers who have formulated theories of phenomenal consciousness was to suppose that the requirements for subjectivity could be elucidated through careful analysis. Now, an analytical approach might work if we had a good idea of what consciousness is, or why it arose in the first place, or what it is for. In fact, we know none of these things. Table 4.4 below illustrates this point: it lists a selection of theories regarding why consciousness exists.
Although a scientific consensus on the "why" of consciousness remains elusive, there is an abundance of neurological data relating to how it originates in the brain. In section 4.A.2, I review what scientists have discovered about primary consciousness, and I discuss the neurological requirements for consciousness.
Table 4.3 - Key positions in the contemporary philosophical debate on "consciousness" (Lurz, 2003)
School of thought | Description of school's position | Comments |
Higher-order representational (HOR) theories of consciousness | Accept propositions 1 and 2, and either 3 or 4. HOR theorists argue that a mental state (such as a perception) is not intrinsically conscious, but only becomes conscious as the object of a higher-order state. Higher-order states are variously conceived as thoughts (by HOT theorists) or as inner perceptions (by HOP theorists). | Dretske (1997) objects that HOR theories fail to explain the practical function of consciousness and thus effectively marginalise it. More recently, higher-order theorists have formulated their own proposals regarding the function of consciousness (see Carruthers, 2000). For an overview of theories of the function of phenomenal consciousness, see Table 4.4 below. |
Exclusive HOR theorists (Carruthers, 2000, 2004) | Accept propositions 1, 2 and 4 but reject 3 - that is, they allow that human infants and non-human animals have beliefs, desires and perceptions, but insist (Carruthers, 2000, p. 199) that we can explain their behaviour perfectly well without attributing conscious beliefs, desires and perceptions to them. | My comment: Internally consistent, but almost certainly sets the bar for having phenomenal consciousness too high. The most natural way of explaining the similarity in responses between humans and monkeys with blindsight (Stoerig and Cowey, 1997) is to suppose that both lack the same thing: phenomenal consciousness. Recent experiments with binocular rivalry, demonstrating that humans and monkeys make identical reports about what they see when conflicting data is presented to their left and right visual fields (Logothetis, 2003), suggest even more strongly that monkeys experience the world as we do. However, Carruthers could reply that there is no need to postulate higher-order states here: the monkeys simply have fluctuating first-order perceptions, which they have been conditioned to respond to by pulling a lever.
One may disagree with Carruthers' contention that an ability to distinguish between the way things appear and the way they really are is a pre-requisite for phenomenal consciousness, but it is an internally consistent position. (Allen (2002), by contrast, proposes that any animals that can learn to correct their perceptual errors are phenomenally conscious, though he does not make this a necessary requirement.) I argue in the Appendix to chapter 4 part A that the meagre experimental evidence available suggests that only human beings meet Carruthers' requirement for phenomenal consciousness. On the other hand, Carruthers' (2000) positive arguments in support of his theory of the origin of consciousness are rather unconvincing, and have been subjected to a detailed critique by Allen (2004a) (summarised in the Appendix to chapter 4 part A). |
Inclusive HOR theorists (Rosenthal, 1986, 2002) | Accept propositions 1, 2 and 3 but reject 4. Rosenthal (2002) construes an animal's awareness of one of its mental states as the animal's having a thought that it is in that state. Such a thought requires a minimal concept of self, but "any creature with even the most rudimentary intentional states will presumably be able to distinguish between itself and everything else" (2002, p. 661). | My comment: Attributes an implausible level of cognitive sophistication to non-human animals. Rosenthal's (2002) HOT theory requires an animal to have the higher-order thought that it is in a certain state, before the state can qualify as conscious. This is a very strong requirement. According to HOT theorists, mental states do not become conscious merely by being observed; they become conscious by being thought about by their subject. This means that animals must have non-observational access to their mental states. As Lurz (2003) remarks, this is an implausible supposition for any non-human animal: "it is rather implausible that my cat... upon espying movement in the bushes... is conscious that she sees movement in the bushes, since it is rather implausible to suppose ... that my cat has thoughts about her own mental states".
Lurz's (2003) same-order (SO) account | Presents itself as a via media between HOR and FOR. Accepts propositions 1, 3 and 4 but rejects 2. Lurz grants the premise that to have conscious mental states is to have mental states that one is conscious of, but queries the assumption (shared by HOR and FOR theorists) that to be conscious of one's mental states is to be conscious that one has them. Lurz suggests that a creature's experiences are conscious if it is conscious of what its experiences represent - their intentional object - even if it is not conscious that it is perceiving. |
My comment: Sets the bar for having phenomenal consciousness too low. Lurz's criteria could be satisfied by many animals that scientists agree are not phenomenally conscious (see below).
The example given by Lurz (2003) is that of a cat who notices a movement in the bushes and then behaves in a way that warrants our saying that she is paying attention to it. This (according to Lurz) implies that she is conscious of what she is seeing, and hence conscious of what a token visual state of hers represents, and thus in some way conscious of the mental state itself. However, the cognitive requirements that Lurz is imposing on animals are hardly exacting, as they seem to include nothing more than: (a) a capacity for paying attention, which exists in a rudimentary form even in fruit-flies (Van Swinderen and Greenspan, 2003), and (b) a capacity for object recognition, which is also found in honeybees (Gould and Gould, 1988) (see Appendix to chapter two part D). However, the available neurological evidence suggests that these animals lack the wherewithal for phenomenal consciousness (see below). |
First-order representational (FOR) accounts of consciousness (Dretske, 1995) | Accept propositions 2, 3 and 4 but reject 1. First-order representational (FOR) theorists believe that if a perception has the appropriate relations to other first-order cognitive states, it is phenomenally conscious, regardless of whether the perceiver forms a higher-order representation of it (see Wright, 2003). For example, Dretske argues that a mental state becomes conscious simply by being an act of creature consciousness. Thus an animal need not be aware of its states for them to be conscious. On this account, consciousness has a very practical function: to alert an animal to salient objects in its environment - e.g. potential mates, predators or prey. However, attention is not a pre-requisite for consciousness: "You may not pay much attention to what you see, smell, or hear, but if you see, smell or hear it, you are conscious of it" (Dretske, 1997, p. 2). |
My comment: Sets the bar for having phenomenal consciousness too low. Dretske's theory appears to have been falsified by empirical evidence indicating that transitive creature consciousness can occur in the absence of phenomenal consciousness - as illustrated by the vomeronasal system, which responds to pheromones and affects human behaviour but is devoid of phenomenality (Allen, 2004a, p. 631), and the phenomenon of blindsight in humans and monkeys (Stoerig and Cowey, 1997, pp. 536-538; p. 552). There is also a massive body of neurological evidence (discussed below) indicating that phenomenal consciousness can only occur in animals with the right kind of brains: being able to perceive stimuli is not enough.
Additional comments: Lurz (2003) argues against Dretske on linguistic grounds: it is counter-intuitive to say that an animal could have a conscious experience of which it was not conscious. However, this argument overlooks the possibility that there may be different degrees of phenomenality - as shown by phenomena such as peripheral vision, so-called "distracted driving" and change blindness (for a discussion, see Hardcastle, 1997; Wright, 2003). A better argument against Dretske is that while some of a conscious animal's experiences may well be first-order states as he proposes, it would be improper to describe the creature as phenomenally conscious if all of its experiences were of this sort. Finally, Dretske's (1997) assertion that attention is not required for consciousness is at odds with his argument that consciousness must serve a practical function in promoting an animal's survival. An animal completely lacking the ability to pay attention to a salient stimulus would not survive very long in the wild. |
Table 4.4 - Theories of what consciousness is for: a brief overview
1. Consciousness is an epiphenomenon (Huxley). | |
2. Conscious feelings exist because they motivate an animal to seek what is pleasant and avoid what is painful (Aristotle). | |
3. Consciousness arose because it enabled its possessors to unify or integrate their perceptions into a single "scene" that cannot be decomposed into independent components (Edelman and Tononi, 2000). | |
4. Consciousness arose because it was more efficient than programming an organism with instructions enabling it to meet every contingency (Griffin, 1992). | |
5. Consciousness arose to enable organisms to meet the demands of a complex environment. However, environmental complexity is multi-dimensional; it cannot be measured on a scale (Godfrey-Smith, 2002). | |
6. Consciousness evolved to enable animals to deal with various kinds of environmental challenges their ancestors faced (Panksepp, 1998b). | |
7. Consciousness arose so as to enable animals to cope with immediate threats to their survival such as suffocation and thirst (Denton et al., 1996; Liotti et al., 1999; Parsons et al., 2001). | |
8. Consciousness gives its possessors the advantage of being able to guess what other individuals are thinking about and how they are feeling - in other words, a "theory of mind" (Whiten, 1997; Cartmill, 2000). | |
9. Consciousness arises as a spin-off from such a theory-of-mind mechanism (Carruthers, 2000). | |
10. Brain activity (as defined by EEG patterns) that supports consciousness in mammals is a precondition for their entire complex array of survival and reproductive behaviours (e.g. locomotion, hunting, evading predators, mating, attending, learning and so on) (Baars, 2001). | |
11. Activities that are essential to the survival of our species - e.g. eating, raising children - require consciousness (Searle, 1999). It must therefore have a biological role. | |
12. Animals receive continual bodily feedback from their muscular movements when navigating their environment. Conscious animals have a very short real-time "muscular memory" which alerts them to any unexpected bodily feedback when probing their surroundings. A core circuit in their brains then enables them to cancel, at the last second, a movement they may have been planning, if an unexpected situation arises. This real-time veto-on-the-fly may save their lives (Cotterill, 1997). |
Neuroscientists commonly distinguish between primary and higher-order forms of consciousness (Edelman, 1989). Both forms, as defined below, qualify as phenomenal in the philosophical sense. In this section, I focus on primary consciousness, as most of the discussion of animal consciousness pertains to this kind of consciousness. The current evidence for secondary or "higher-order" consciousness in non-human animals is summarised in the Appendix to Part B.
Table 4.5 - Different scientific usages of the term "consciousness"
Term: Primary consciousness (also called "core consciousness" or "feeling consciousness").
Definition: "Primary consciousness refers to the moment-to-moment awareness of sensory experiences and some internal states, such as emotions" (Rose, 2002, p. 6). Relevance to animals: Rose (2002) remarks that "[m]ost discussions about the possible existence of conscious awareness in non-human animals have been concerned with primary consciousness" (2002, p. 6).
Term: Secondary consciousness (also known as "extended consciousness" or "self-awareness").
Definition: "Higher-order consciousness includes awareness of one's self as an entity that exists separately from other entities; it has an autobiographical dimension, including a memory of past life events; an awareness of facts, such as one's language vocabulary; and a capacity for planning and anticipation of the future" (Rose, 2002, p. 6).
Our investigation in this section will focus on the evidence for primary consciousness in animals, as most of the evidence for consciousness in animals pertains to this form of consciousness.
The standard observational criterion used to establish the occurrence of primary consciousness in animals is accurate report (AR). I summarise the problems associated with this criterion in Table 4.6. I conclude that while evidence of accurate report in animals is highly suggestive, it cannot establish that animals possess phenomenal consciousness. Seth, Baars and Edelman (2005) propose to resolve this deadlock by defining consciousness in terms of neurological criteria.
Are there any other behavioural indicators that might allow us to unambiguously identify primary consciousness in animals? After examining three categories of behavioural indicators - Panksepp's criteria for affective consciousness; the behavioural indicators for conscious pain; and hedonic behaviour in animals - I concluded that the answer was in the negative. While some of the behaviours cited do indicate the occurrence of phenomenal consciousness, positive identification of a phenomenally conscious state cannot be made without either verbally interrogating the subject (as in some forms of accurate report) or checking that the behaviour is regulated by parts of the brain that are associated with phenomenal consciousness (see Table 4.10 below).
As the interrogation of non-human animals is highly problematic (for reasons discussed in Table 4.6), it follows that phenomenal consciousness in animals ultimately has to be defined as a neurological state in order for us to make some headway in identifying it. Behavioural indicators alone are too weak to settle the matter of which, if any, animals are phenomenally conscious. However, the combination of behavioural and neurological evidence constitutes a very powerful case for the occurrence of phenomenal consciousness in these animals.
Table 4.6 - Summary of problems associated with using the behavioural criteria for primary consciousness as an indicator of phenomenal consciousness
What are the behavioural criteria for primary consciousness? The following criteria are used by neurologists to identify primary consciousness in human beings. From the clinical perspective, primary consciousness is defined by
How are these criteria assessed in humans and other animals?
Definitional problems:
Is primary consciousness a necessary condition for phenomenal consciousness?
Is primary consciousness a sufficient condition for possessing phenomenal consciousness?
Conclusion:
Table 4.7 - Affective indicators of phenomenal consciousness
Definition: Panksepp (1998, 2001, 2003f) and Liotti and Panksepp (2003) have proposed that we possess two distinct kinds of consciousness: cognitive consciousness, which includes perceptions, thoughts and higher-level thoughts about thoughts and requires a cortex, and affective consciousness, which relates to our feelings and arises within the brain's limbic system.
Criteria for affective consciousness:
For Panksepp (2002b), the numerous resemblances between human affective behaviour, neurobiology, anatomy and pharmacology and those of non-human animals constitute very strong evidence for the occurrence of affective consciousness in non-human animals: "Overwhelming evidence shows that animal brains elaborate many states of affective consciousness... Of course there is no ultimate 'proof' in science, merely the weight of evidence. To me it remains a mystery that certain scientists can ignore the mass of relevant evidence from (i) behavioral reinforcement studies; (ii) place preference-aversion studies; (iii) manifest and ubiquitous emotional vocalizations; (iv) neuroethological studies evoking the same emotional behavior from the same human/animal brain analogs and (v) the coherent translations between human and animal psychopharmacological work" (Panksepp, 2002b).
An example of affective consciousness identified in non-human animals:
Panksepp and Burgdorf (2003c), in an article entitled "'Laughing' rats and the evolutionary antecedents of human joy?", discuss their recent discovery that play- and tickle-induced ultrasonic vocalisations in rats are analogous to laughter in human children. The authors identify no fewer than twelve points of resemblance between rat "laughter" and children's laughter, and argue that alternative non-mentalistic explanations are not well supported.
My comment: The investigation of an "affective consciousness" in animals is scientifically productive. However, even proponents of a separate "affective consciousness" admit that it cannot be defined using behavioural criteria alone: Panksepp's own criteria in the quote cited above (2002b) include neurological and psychopharmacological analogies between humans and animals.
Table 4.8 - Behavioural indicators of phenomenally conscious pain
Definition: The International Association for the Study of Pain (1999; see Rose, 2002) defines "pain" as a conscious experience. In particular: (i) pain is an unpleasant sensory and emotional experience associated with actual or potential tissue damage, or described in terms of such damage; (ii) pain is always subjective; (iii) pain is sometimes reported in the absence of tissue damage and the definition of pain should avoid tying pain to an external eliciting stimulus (Rose, 2002, p. 15). Nociception, defined as "the activity induced in ...nociceptive pathways by a noxious stimulus" (2002, p. 15), "does not result in pain unless the neural activity associated with it reaches consciousness" (Rose, 2002, p. 16).
Behavioural criteria: Various behaviours in animals have been regarded at different times as indicators of pain. These include: stress responses in all cellular organisms; nociceptive responses to noxious stimuli in nearly all animals; the presence of pain-killing opiates within the brainstems of various kinds of animals; flavour aversion learning; classical and instrumental conditioning; self-administration of analgesics; and pain-guarding. However, clinical assessments of pain in human patients do not rely solely on these criteria, as they are considered insufficient to define the occurrence of pain.
My comment:
Example - do fish feel pain? Rose (2002), after conducting an exhaustive review of the literature relating the neurology and behaviour of fish to the clinical indicators used by neurologists to assess pain, concluded that consciousness of any kind in fish is "a neurological impossibility" (2002, p. 2). More recently, Rose (2003a) has written a devastating critique of a much-publicised report by Sneddon, Braithwaite and Gentle (2003) which claimed to have identified evidence of pain-guarding in fish.
Table 4.9 - Hedonic behaviour as evidence of phenomenal consciousness in animals
Definition: Hedonic behaviour can be defined as pleasure-seeking activity on an animal's part.
Criteria: Various indicators have been proposed as evidence of conscious pleasure in animals, including self-stimulation, intoxication, drug addiction, the phenomenon of satiety and, most impressively, (i) the willingness of some animals to make hedonic trade-offs, whereby they expose themselves for a short time to an aversive stimulus in order to procure some attractive stimulus (Cabanac, 2003), and (ii) the occurrence of "rational" and "irrational" forms of pursuit in animals (Berridge, 2003). Irrational pursuit occurs when an animal desires something it neither likes nor expects to like; it can be identified when an animal whose dopamine system has been pharmacologically stimulated is suddenly presented with a "rewarding" stimulus, which cues hyperactive pursuit of that stimulus.
My comment: Neither the willingness of some animals to make hedonic trade-offs, nor the presence of "rational" and "irrational" forms of pursuit, can be treated as an unambiguous indicator of phenomenally conscious pleasure in animals. Berridge (2001, 2003) presents evidence from human studies that irrational desires need not be conscious: humans can be influenced to like or dislike something simply by subliminal exposure to stimuli of which they report being unaware.
4.A.2.3 Neural pre-requisites for consciousness
(I would like to acknowledge a special debt of gratitude here to Dr. Jaak Panksepp, Dr. James Rose and Dr. David Edelman, for their patience in answering my queries. Any errors here are entirely my own.)
Table 4.10 summarises the neurological indicators for consciousness, which make the attribution of primary consciousness to mammals highly plausible. I propose that the only way to resolve the argumentative impasse regarding phenomenal consciousness in animals is to re-define phenomenal consciousness: instead of regarding it as tied to certain forms of behaviour, we would do better to simply define it in terms of the neurological conditions which generate it.
Problems arise when assessing consciousness in creatures whose brains are different in design from our own: here, we have to rely on analogy. As functional analogies between the brains of mammals and other animals are incomplete at present, we cannot definitively conclude that birds are phenomenally conscious, and the question of whether octopuses are conscious must remain even more speculative.
The major divisions of the brain. Diagram courtesy of Dr Anthony Walsh, Chairman, Department of Psychology, Salve Regina University, Rhode Island.
The reticular activating system (RAS) comprises parts of the medulla oblongata, the pons and midbrain and receives input from the body's senses - excluding smell. When the parts of the RAS are active, nerve impulses pass upward to widespread areas of the cerebral cortex, both directly and via the thalamus, effecting a generalised increase in cortical activity associated with waking or consciousness. Image courtesy of Dr. Rosemary Boon, founder of Learning Discoveries Psychological Services.
Divisions of the cerebral cortex. Diagram courtesy of Dr. Gleb Belov, Department of Mathematics, Technical University of Dresden, Germany.
Table 4.10 - Key neurological features of primary consciousness
What are the distinguishing neural properties of primary consciousness? There are three major properties of consciousness that are fairly well accepted by neurobiologists (Seth, Baars and Edelman, 2005):
Which parts of the brain are required for primary consciousness?
Which structures in the cerebral cortex are required for primary consciousness?
Which parts of the neocortex are required for primary consciousness?
Table 4.11 - Is consciousness possible in brain-damaged mammals, which lack a cerebral cortex?
(a) Evidence for consciousness without a cortex
Some neurologists (Panksepp, 1998, 2001, 2003f; Denton et al., 1996; Denton et al., 1999; Egan et al., 2003; Liotti et al., 2001; Parsons et al., 2000; Parsons et al., 2001) question the current neurological consensus and argue that conscious feelings may occur outside the cerebral cortex. Perhaps their most interesting evidence comes from studies of hydranencephalic children (who have little or no cerebral cortex) and decorticate animals (whose cerebral cortex has been removed). After carefully examining their technical arguments in scientific journals, I concluded that: (i) for animals whose cerebral cortex was removed during infancy, the trauma of decerebration may have affected the neural development of their brain stem, effectively "corticising" it so that some parts were able to take over some of the functionality normally handled by a cerebral cortex (vertical plasticity);
(b) Could consciousness be located in the cerebellum?
The cerebellum has sometimes been proposed as an alternative site for consciousness outside the cerebral cortex. Apart from the cerebral cortex, it is the only structure in the brain with the ability to rapidly integrate diverse kinds of information; the brains of non-mammals lack comparable structures. Interestingly, the cerebellum, located at the back of the brain, "contains probably more neurons and just as many connections as the cerebral cortex, receives mapped inputs from the environment, and controls several outputs", and yet "lesions or ablations indicate that the direct contribution of the cerebellum to conscious experience is minimal" (Tononi, 2004), and "removal of the cerebellum does not severely compromise consciousness" (Panksepp, 1998, p. 311). Activity in the cerebellum is thought not to be associated with consciousness because different regions of the cerebellum tend to operate independently of one another, with little integration of information between regions (Tononi, 2004).
Table 4.12 - Which animals satisfy the neurological criteria for primary consciousness?
Mammals
Birds
Reptiles
My verdict: phenomenal consciousness in reptiles is possible, but unlikely, given the massive neurological dissimilarities between mammals and reptiles.
Fish and amphibia
Evidence for complex agency in fish: According to Culum Brown (New Scientist, 12 June 2004, p. 42), fish have a fantastic spatial memory, equal to that of any other vertebrate, including non-human primates. Fish can also recognise individuals and keep track of complex social relationships. The report cites the introductory chapter of Fish and Fisheries (September 2003) as claiming that fish are "steeped in social intelligence, pursuing Machiavellian strategies of manipulation, punishment and reconciliation ... exhibiting stable cultural traditions and cooperating to inspect predators and catch food". Of course, it needs to be borne in mind that there are 28,500 species of fish; some are a lot "smarter" than others. Rose (2002) writes that "[i]n spite of the diversity and complexity among species, the behaviors of fishes are nonetheless highly stereotyped and invariant for a given species" (2002, p. 9, italics mine). He adds that these behaviours are controlled almost entirely by the brainstem (which, as we have seen, is devoid of consciousness) and remain strikingly preserved even if the fishes' cerebral hemispheres are removed.
Octopuses
Honeybees
My verdict: the fact that bees have impressive conceptual abilities does not tell us that they are conscious, for the simple reason that we do not know what consciousness evolved to do in the first place. As we have seen, intentional agency is certainly possible without consciousness; why not insight?
Table 4.13 - Status of philosophical arguments for the occurrence of phenomenal consciousness in different kinds of animals
The failure of similarity arguments
Where does that leave us? While the similarity arguments beloved by philosophers can be used to make a strong cumulative case that conscious feelings are widespread among mammals, the massive dissimilarities between the neocortex of the mammalian brain and the much more primitive structures in the brains of birds and reptiles effectively undermine any arguments for conscious feelings in these animals that are based on "similarity" alone.
Analogical causal arguments for phenomenal consciousness
As we cannot yet identify structures in the brains of birds that are homologous to the mammalian neocortex, any argument for consciousness in non-mammals must instead be based on structures which play an analogous causal role in regulating behaviour that equals or surpasses that of mammals in cognitive sophistication. Because birds meet all the other neural requirements for consciousness and equal mammals in behavioural sophistication, I conclude that birds are probably phenomenally conscious.
Arguments against consciousness in lower vertebrates
Because the brains of all vertebrates are built according to the same basic pattern (Rose, 2002), we can formulate a counter-analogical argument that fish and amphibians are not phenomenally conscious, on account of the massive neural and behavioural disparities between these vertebrates and conscious mammals.
Invertebrates
For invertebrates, whose brains are too unlike those of mammals to permit even a functional comparison of their brains with ours, an inferential approach is required to ascertain whether they have conscious feelings: we need to identify behaviour on their part that cannot be plausibly explained except in terms of phenomenally conscious states. Edelman, Baars and Seth (2005) make some useful suggestions regarding future neurophysiological and behavioural research with these creatures.
So far, our investigation points to at least three distinct senses in which interests can be ascribed to creatures:
For some philosophers, a capacity for phenomenal consciousness is regarded as a sine qua non for having interests and being morally relevant. However, the above summary suggests that the ethical divide between mindless organisms and animals with minimal minds is greater than that between animals with minimal minds and phenomenally conscious animals, and the division between the simplest organisms and assemblages lacking intrinsic finality is greater still. Animals' interests, whether conscious or not, can be measured and can be harmed by our actions. In the Appendix, I provide specific examples of how the welfare of fish (who lack phenomenal consciousness) can be measured using specific indices, and of how it can be harmed by practices such as aquaculture and angling.
Of course, we have a strong prima facie duty to refrain from treating phenomenally conscious animals cruelly, and a duty (under more restricted circumstances) to be kind to them. For companion animals, that would entail befriending them. Logically, any animals that lacked phenomenal consciousness (such as goldfish) could not serve as true "companions".