Ned Block (1942- )
Two types of consciousness
According to Block, "Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action." Block feels that it is possible to have phenomenal consciousness and access consciousness independently of each other, but in general they do interact.
There is no generally agreed upon way of categorizing different types of consciousness. Block's distinction between phenomenal consciousness and access consciousness tries to distinguish between conscious states that either do or do not directly involve the control of thought and action.
Phenomenal consciousness. According to Block, phenomenal consciousness results from sensory experiences such as hearing, smelling, tasting, and having pains. Block groups together as phenomenal consciousness the experiences of sensations, feelings, perceptions, thoughts, wants and emotions. Block excludes from phenomenal consciousness anything having to do with cognition, intentionality, or with "properties definable in a computer program".
Access consciousness. Access consciousness is available for use in reasoning and for direct conscious control of action and speech. For Block, the "reportability" of access consciousness is of great practical importance. According to Block:
"reportability ... is often the best practical guide to A-consciousness" [Note: Block often uses the terms "P-consciousness" and "A-consciousness" to refer to "phenomenal consciousness" and "access consciousness".]
Also, access consciousness must be "representational" because only representational content can figure in reasoning. Examples of access consciousness are thoughts, beliefs, and desires.
A potential source of confusion is that some phenomenal consciousness is also representational. The key distinction lies in the reason for the classification: representational content belongs in the access consciousness category because of its representational aspect, whereas elements of phenomenal consciousness are assigned to the phenomenal consciousness category because of their phenomenal content.
An immediate point of controversy for Block's attempt to divide consciousness into the subdivisions of phenomenal consciousness and access consciousness is that many people view the mind as resulting (in its entirety) from fundamentally computational processes. This computational view of mind implies that ALL of consciousness is "definable in a computer program", so Block's attempt to describe some consciousness as phenomenal consciousness cannot succeed in identifying a distinct category of conscious states.
As mentioned above, Block feels that phenomenal consciousness and access consciousness normally interact, but it is possible to have access consciousness without phenomenal consciousness. In particular, Block believes that zombies are possible and a robot could exist that is "computationally identical to a person" while having no phenomenal consciousness. Similarly, Block feels that you can have an animal with phenomenal consciousness but no access consciousness.
Block believes that we can have conscious experiences that are not possible to produce by any type of computational algorithm and that the source of such experiences is "the hard problem" of consciousness. Block's position with respect to consciousness is analogous to that of Vitalists who defined Life as being in a category distinct from all possible physical processes. Biologists refute Vitalism by describing the physical processes that account for Life. In order to refute Block's claim about the distinction between phenomenal consciousness and access consciousness, it is up to biologists and artificial consciousness researchers to describe computational algorithms that account for consciousness.
Why are some neurobiologists and computer scientists sure that Block's dualist division of consciousness is wrong? What is the source of Block's certainty that there are non-computational forms of consciousness? One example of phenomenal consciousness discussed by Block is a loud noise that you do not consciously notice because you are paying attention to something else. Block is sure that you were aware of the noise (phenomenal consciousness) but just not "consciously aware" (access consciousness). Many scientists would say that in this case, you were not "consciously aware" of the noise, but it is almost certain that portions of your unconscious brain activity responded to the noise (you could electrically record activity in the primary auditory cortex that is clearly a response to action potentials arriving from the ears due to sound waves from the noise). This suggests that Block's controversial "non-computational" category of phenomenal consciousness includes brain activity that others would categorize as being unconscious, not conscious.

Some unconscious brain activity can begin to contribute to consciousness when the focus of one's conscious awareness shifts. This suggests that some of what Block calls phenomenal consciousness is brain activity that can either take place outside of consciousness or as part of consciousness, depending on other things that might be going on in the brain at the same time. If so, we can ask why the consciously experienced version of this kind of brain activity is computational while the unconscious version is not.
Block stresses that he makes use of introspection to distinguish between phenomenal consciousness and access consciousness. Presumably this means that when the loud noise was not noticed, it was not accessed by introspection. Block has thus defined a category of consciousness that is outside of our "conscious awareness" (although he says we are "aware" of it in some other way) and not accessed by introspection. Maybe it is this inaccessibility of some cases of phenomenal consciousness that motivates Block's idea that such forms of consciousness cannot be computational. When experiences are accessible to introspection and available for inclusion in reasoning processes, we can begin to imagine computational algorithms for their generation.
Forms of phenomenal consciousness that are open to introspection
In his 1995 article, Block went on to discuss more interesting cases. For example, upon starting to "pay attention to" the loud noise (see above) that was previously ignored, the experiencer might notice that there had been some earlier experience of the noise, just not of the type that we "pay attention to"; a type of experience that had been just "on the edge" of access consciousness.
In Ned Block's entry for "Consciousness" in the 2004 Oxford Companion to the Mind, he discusses another example that he feels distinguishes between phenomenal consciousness and access consciousness.
As described by Block, Liss performed an experiment in which he presented test subjects with visual stimuli: views of 4 letters. The 4 letters were shown to the test subjects in two different ways:
1) "long" stimulus, e.g. 40 msec, followed by a second visual stimulus, a "mask" known to make the first stimulus (the letters) hard to identify
2) "short", e.g. 9 msec, without a second stimulus (the mask).
- "Subjects could identify 3 of the 4 letters on average in the short case but said they were weak and fuzzy. In the long case, they could identify only one letter, but said they could see them all and that the letters were sharper, brighter and higher in contrast. This experiment suggests a double dissociation: the short stimuli were phenomenally poor but perceptually and conceptually OK, whereas the long stimuli were phenomenally sharp but perceptually or conceptually poor, as reflected in the low reportability."
This experiment demonstrates a distinction between
- i) reportability of names of the letters
- ii) perceptual sharpness of the image.
Block's definitions of these two types of consciousness lead us to the conclusion that a non-computational process can present us with phenomenal consciousness of the forms of the letters, while we can imagine an additional computational algorithm for extracting the names of the letters from their forms (this is why computer programs can perform character recognition). The ability of a computer to perform character recognition does not imply that it has phenomenal consciousness or that it shares our ability to be consciously aware of the forms of the letters that it algorithmically matches to their names.
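To make the point concrete, here is a minimal, hypothetical sketch of character recognition by template matching: a purely mechanical procedure that maps letter *form* to letter *name*, with no claim that the program experiences anything. The 3x3 bitmaps and letter set are invented for illustration; real OCR systems are far more sophisticated.

```python
# Toy templates: '#' marks an inked cell, '.' an empty one.
TEMPLATES = {
    "T": ["###", ".#.", ".#."],
    "L": ["#..", "#..", "###"],
    "I": [".#.", ".#.", ".#."],
}

def distance(a, b):
    """Count mismatched cells between two equal-sized bitmaps."""
    return sum(ca != cb
               for row_a, row_b in zip(a, b)
               for ca, cb in zip(row_a, row_b))

def recognize(bitmap):
    """Name the input by the template with the fewest mismatched cells."""
    return min(TEMPLATES, key=lambda name: distance(TEMPLATES[name], bitmap))

# A slightly degraded "L" (one cell flipped) is still named correctly.
degraded_L = ["#..", "#..", "#.#"]
print(recognize(degraded_L))  # -> L
```

The algorithm is entirely "definable in a computer program" in Block's sense: nothing in it corresponds to what it is like to see the letter, only to classifying its shape.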
If Block's distinction between phenomenal consciousness and access consciousness is correct, then it has important implications for attempts by neuroscientists to identify the neural correlates of consciousness and for attempts by computer scientists to produce artificial consciousness in man-made devices such as robots. In particular, Block seems to suggest that non-computational mechanisms for producing the subjective experiences of phenomenal consciousness must be found in order to account for the richness of human consciousness or for there to be a way to rationally endow man-made machines with a similarly rich scope of personal experiences of "what it is like to be in conscious states". Other philosophers of consciousness such as John Searle have similarly suggested that there is something fundamental about subjective experience that cannot be captured by conventional computer programs.
Many advocates of the idea that there is a fundamentally computational basis of mind feel that the phenomenal aspects of consciousness do not lie outside of the bounds of what can be accomplished by computation. Some of the conflict over the importance of the distinction between phenomenal consciousness and access consciousness centers on just what is meant by terms such as "computation", "program" and "algorithm". In practical terms, how can we know if it is within the power of "computation", "program" or "algorithm" to produce human-like consciousness? There is a problem of verification; can we ever really know if we have a correct biological account of the mechanistic basis of conscious experience and how can we ever know if a robot has phenomenal consciousness?
Many neurobiologists and computer scientists feel that philosophers such as Block and Searle are overly-pessimistic about the power of "computation", "program" or "algorithm" to produce human-like consciousness. The study of "computation", "program", "algorithm" and consciousness is too primitive for us to be able to trust our intuitions about exactly what is possible for computational algorithms to accomplish. Further, it may not matter what we call physical processes that can generate consciousness as long as we can figure out what they are and how to work with them. Thus, neurobiologists and computer scientists feel justified in continuing to search for the physical basis of consciousness and for ways to endow man-made devices with human-like consciousness. Further, despite warnings from philosophers, neurobiologists and computer scientists often suspect that conventional physical accounts of brain processes and some form of computational algorithm can be found to explain consciousness and allow us to instantiate it in robots.
Some philosophers such as Thomas Nagel have claimed a fundamental distinction between the first person experience of consciousness and any third person account of the mechanisms by which consciousness is generated. If philosophers can be overly pessimistic about what neuroscientists and computer scientists can accomplish from the third person perspective, they might also be overly enthusiastic about the reliability of first person introspection. Some philosophers have been fundamentally skeptical about our ability to be certain about anything we observe from the first person perspective. Despite any sense we may have that we cannot be wrong about our subjective evaluations of our own consciousness, it may be wise to keep an open mind and remain open to the possibility that phenomenal consciousness is not a distinct category from access consciousness. For example, the two may lie at the ends of a continuous spectrum of consciousness in which some forms of consciousness are easier to imagine as being algorithmically generated than others.
The results of the experiment of Liss (discussed above) can have several interpretations. Viewing printed letters can lead to activation of many different brain regions and brain processes. Some parts of the brain that are devoted just to visual processing make heavy contributions to our ability to form a clear mental image of the shape, form and color of letters. These brain regions allow us to become aware of visual features but we are almost totally unable to have any introspective insight into how we become aware of shape, form and color. Other parts of the brain are required for our normal ability to report the names of letters that we see. The experiment shows that by controlling the exact conditions under which experimental subjects are asked to report their experience of the letters, conditions will either favor awareness of letter form or awareness of the names of the letters. Presumably, a sufficiently detailed analysis of brain activity would reveal how the variable test conditions of the experiment result in different patterns of activity in various parts of the brain and would allow for an account of the results of the experiment in terms of the details of brain function.
Most philosophers participate in introspective efforts to understand the steps involved in their own linguistic competencies. Introspection can allow us to be aware of mental processes that seem to have a linear sequence for the production of speech or lines of reasoning. Computer science also has an established history of defining explicit algorithms by which strings of words can be placed in grammatically correct orders and various theorem generating programs now exist which seem to replicate some aspects of reasoning. Thus, introspection combined with knowledge of what computer science has and has not yet accomplished provides philosophers with certain intuitions about the nature of consciousness and the nature of computation. In particular, Block has been led to suspect that phenomenal consciousness is fundamentally outside of the range of things that can be done with programs.
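The "explicit algorithms by which strings of words can be placed in grammatically correct orders" mentioned above can be illustrated with a small sketch. The grammar and lexicon below are invented for illustration: a toy context-free grammar expanded by a purely mechanical recursive procedure.

```python
import random

# Each symbol maps to a list of possible productions (lists of symbols).
# Symbols absent from the grammar are terminal words.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["robot"], ["philosopher"]],
    "V":   [["observes"], ["imagines"]],
}

def generate(symbol="S", rng=random):
    """Expand a symbol into a word list by choosing productions at random."""
    if symbol not in GRAMMAR:          # terminal word: emit it as-is
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])
    return [word for part in production for word in generate(part, rng)]

print(" ".join(generate()))  # e.g. "the robot imagines a philosopher"
```

Every sentence the procedure emits is grammatically well formed, yet the procedure plainly involves no understanding or experience, which is why such examples inform, without settling, intuitions about what computation can accomplish.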
- ^ Block, N. (1995). "On a confusion about a function of consciousness". Behavioral and Brain Sciences 18 (2): 227-287.
- ^ Block, N. (2004). "Consciousness" (in R. Gregory (ed.) Oxford Companion to the Mind, second edition 2004).
- ^ Liss, P., (1968). “Does backward masking by visual noise stop stimulus processing?” Perception & Psychophysics 4, 328-330.
- ^ For a short account, see the Wikipedia entry for phenomenal and access consciousness. Charles Siewert provides a more detailed analysis in his article "Consciousness and Intentionality" in The Stanford Encyclopedia of Philosophy.
- ^ "What is it like to be a bat?" by Thomas Nagel in The Philosophical Review LXXXIII, 4 (1974): 435-50.
- ^ On Certainty by Ludwig Wittgenstein. Publisher: Harper Perennial (1972) ISBN: 0061316865.
- ^ Güven Güzeldere described such intuition about the distinctions between phenomenal consciousness and access consciousness as segregationist intuition. See "The many faces of consciousness: a field guide" in The Nature of Consciousness: Philosophical Debates. Publisher: The MIT Press (1997) ISBN: 0262522101.
Alternatives to Block's two categories of consciousness
- "Naturalizing consciousness: A theoretical framework" by Gerald M. Edelman in Proc Natl Acad Sci U S A (2003) Volume 100 pages 5520–5524.
- "A neuronal network model linking subjective reports and objective physiological data during conscious perception" by Stanislas Dehaene, Claire Sergent, and Jean-Pierre Changeux in Proc Natl Acad Sci U S A (2003) Volume 100 pages 8520–8525.
- "An information integration theory of consciousness" by Giulio Tononi in BMC Neurosci. (2004) 5, 42.
- "A framework for consciousness." by F. Crick and C. Koch in Nat Neurosci. (2003) Volume 6, pages 119-126.
- "Consciousness" by J. R. Searle in Annu Rev Neurosci. (2000) Volume 23, pages 557-578.
- This essay about Block's division of consciousness into two distinct categories was originally written for the "Consciousness studies" wikibook.