Does operant conditioning require phenomenally conscious experiences? The attribution of beliefs and desires to animals capable of operant conditioning invites the question of whether they have phenomenally conscious experiences. I shall use this term to mean an experience such that there is something it is like for the subject to have it. Unfortunately, a great deal of philosophical confusion has arisen because terms such as "perception", "feeling", "awareness", "experience", "sentience", "subjective experience", "conscious experience" and "phenomenally conscious experience" have no agreed meaning in the literature, and are used in different senses by different authors.
There is an argument which purports to show that the ability to undergo conditioning implies a capacity for phenomenally conscious experience. Bermudez (2000, p. 194) argues that "learning through conditioning works because primary reinforcers have qualitative aspects. It is impossible to divorce pain's being a negative reinforcer from its feeling the way it does." However, this argument may prove too much: presumably Bermudez would not regard bacteria's aversion to certain stimuli as an indication of pain, so why should responsiveness to negative reinforcers be any different? If, however, we restrict the scope of the argument to operant conditioning, then we do have an additional reason to regard negative reinforcers as painful: the animal is an agent which tries to avoid the stimulus. Why does it try? A plausible answer is: because it finds the stimulus painful. An alternative, non-phenomenal answer would be that its sensors register a dangerous stimulus (e.g. heat). On this account, the fly is fleeing danger, not pain.
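To make the non-phenomenal reading concrete, here is a minimal sketch of avoidance learning in which the negative reinforcer enters the learning rule only as a scalar penalty. It is my illustration, not a model drawn from Bermudez or from the fly literature, and every name and number in it is arbitrary; the point is simply that nothing in the formal structure of reinforcement requires the penalty to feel like anything.

    import random

    actions = ["stay", "flee"]
    value = {a: 0.0 for a in actions}   # learned value of each action
    ALPHA = 0.5                         # learning rate (arbitrary)

    def penalty(action: str) -> float:
        """Heat beam: staying is punished, fleeing is not."""
        return -1.0 if action == "stay" else 0.0

    for trial in range(20):
        # epsilon-greedy choice: mostly pick the better-valued action
        if random.random() < 0.1:
            act = random.choice(actions)
        else:
            act = max(actions, key=value.get)
        # move the chosen action's value toward the reinforcement received
        value[act] += ALPHA * (penalty(act) - value[act])

    print(value)   # "stay" ends up devalued; the agent learns to flee

The "reinforcer" here is the bare number -1.0; whether anything it is like accompanies it is exactly what the learning rule leaves open.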
The contemporary philosophical debate over phenomenally conscious experiences in animals is often cast as an argument between those (e.g. Dretske, 1995; Lurz, 2000) who think that they can be explained in terms of first-order representations and those (e.g. Carruthers, 1998, 2000, forthcoming) who believe that they require higher-order representational states, whereby an animal represents its own mental states to itself. According to the former account, phenomenally conscious experience is common in the animal kingdom (Dretske, 1995). If the latter account is correct, it is relatively rare, and possibly unique to human beings (Carruthers, 2000).
Before adjudicating between the two sides, I should set out what they agree about. Both Dretske and Carruthers agree that many animals possess what is known as creature consciousness: they are conscious (awake) at some times and not others, and they are conscious of objects, events, properties and facts. Thus Searle's observation that "you cannot eat, copulate, raise your young, hunt for food, raise crops, speak a language, organize social groups, or heal the sick if you are in a coma..." (1999, p. 63) is not germane here. The authors also agree that most animals are sentient, and that many animals possess concepts, but few possess higher-order mental states.
After reading the exchanges between Carruthers (1998, 2000) and Lurz (2000), my impression is that neither side has been able to score a knock-out blow against the other. Nor have neurophysiological studies resolved the issue, as findings are open to multiple interpretations. A case in point is the occurrence of blindsight in brain-damaged monkeys. Carruthers (forthcoming) criticises Dretske (1995) for assuming that because blindsight entails the loss of phenomenally conscious visual experience in humans, it must do the same in brain-damaged monkeys.
As I see it, the real point at issue between the two philosophical camps is the significance of phenomenal experience for human cognition and action. Carruthers (forthcoming) describes its role as "almost epiphenomenal": only rarely does a human agent act as she does "in virtue of the phenomenally conscious properties of her experience". According to Carruthers, it is involved only when we think about our experiences as such, or describe our experiences to others, or distinguish between the way things are and the way they seem. He argues that animals can do none of these things, as they lack a "theory of mind". Dretske, on the other hand, argues that "there are many things people with experience can do that people without experience cannot do... That is why we, and a great many other animals ... have conscious experiences" (1995).
Dretske (2001) argues that if an animal has organs whose biological function it is to collect and store information gathered from a sensory modality (e.g. colour vision) which guides its behaviour, then that animal is aware, or conscious, in that sensory mode, of what it senses. Since honey bees have eyes whose function it is to see things in colour, which helps them in foraging for food, Dretske regards them as having visual awareness or consciousness. He counters Siewert's suggestion (1998) that they may be "buzzing blindsighters" by asking what properties conscious experiences have that the bees' internal representations of colour lack.
I would suggest that we can resolve the issue for Drosophila by asking what it needs to pay attention to in order to navigate its way around its environment. I have argued that in order to undergo operant conditioning, a fly has to compare the sensory inputs arising from motor movements that are controlled from within with the sensory feedback it gets from its external environment. In other words, it has to pay attention to its sensory inputs, distinguishing internal from external signals. Does subjectivity get a foothold here? Not necessarily. For although the fly can distinguish "inside" from "outside" signals, they need not have that meaning to the fly. To the fly, they may merely be type A and type B signals. Such a characterisation is objective: it preserves the inner-outer distinction we alluded to earlier, but without any first-person import. If it is correct, flies are agents but not subjects.
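To make the suggestion concrete, here is a minimal sketch of the comparator this account assumes. The agent predicts the sensory consequences of its own motor command (an efference copy) and tags any input matching the prediction as "type A" (self-generated), and anything else as "type B" (external). The code is my illustration, not drawn from any model of Drosophila, and all names and numbers are arbitrary.

    def classify_signal(sensory_input: float, predicted_feedback: float,
                        tolerance: float = 0.1) -> str:
        """Compare the actual input against the expected consequence of
        the agent's own movement, and label the signal accordingly."""
        if abs(sensory_input - predicted_feedback) <= tolerance:
            return "type A"   # explained by the agent's own action
        return "type B"       # unexplained: attributed to the environment

    # A fly turning left expects its visual field to sweep right; input
    # matching that expectation is self-generated, a mismatch is external.
    print(classify_signal(1.0, 1.05))   # type A ("inside")
    print(classify_signal(1.0, 0.30))   # type B ("outside")

Nothing in this classification requires the labels to mean "inside" and "outside" to the system itself: the distinction is drawn in wholly third-person terms, which is the sense in which the fly could be an agent without being a subject.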
I conclude that the case for insect subjectivity remains unproven at this stage. I also wish to note, however, that none of the human analogies offered for the absence of phenomenally conscious experience in animals is a plausible model for the fly. Carruthers (1998) cites the following cases: absent-minded driving; sleepwalking; experience during mild epileptic seizures; the experiences which guide fast-reaction activity; and blindsight. Blindsight, as Lurz (2000) points out, does not normally elicit purposeful and intelligent behaviour; the same goes for sleepwalking. Fast reactions are not controlled by the subject, and a subject having an epileptic seizure is unable to control her bodily movements. And the fly, unlike the absent-minded driver, is not distracted by anything: on the contrary, its attention is focused on avoiding the heat beam.
On the other hand, there may be a principled reason why no good human analogy is available: "we have no idea how to imagine, from the inside, a perceptual state that isn't phenomenally conscious" (Carruthers, forthcoming).