SUMMARY AND REVIEWS OF THE LITERATURE ON CONSCIOUSNESS
JOHN SEARLE AND DAVID CHALMERS

Reviewed by ANIL MITRA PhD, 1999; updated May 2003
REFORMATTED
February 2015


Document status: Thursday, February 05, 2015; no further action needed; may be useful if I write specifically on consciousness and mind.


OUTLINE

The Reviews

John Searle, New York Review of Books, The Mystery of Consciousness, 1997

John Searle, The Rediscovery of The Mind, 1992

David Chalmers, The Conscious Mind, 1996

Copyright and Latest Update

Footnotes

 


CONTENTS

The Reviews

John Searle, New York Review of Books, The Mystery of Consciousness, 1997

Consciousness as a Biological Problem

Francis Crick, the Binding Problem and the Hypothesis of Forty Hertz

Gerald Edelman and Reentry Mapping

Roger Penrose, Kurt Gödel, and the Cytoskeletons

Consciousness Denied: Daniel Dennett’s Account

David Chalmers and the Conscious Mind

Israel Rosenfield, the Body Image, and the Self

Conclusion: How to Transform the Mystery of Consciousness into the Problem of Consciousness

John Searle, The Rediscovery of The Mind, 1992

Introduction

What’s Wrong with the Philosophy of Mind

The Recent History of Materialism: The Same Mistake Over and Over

Breaking the Hold: Silicon Grains, Conscious Robots, and Other Minds

Consciousness and its Place in Nature

Reductionism and the Irreducibility of Consciousness

The Structure of Consciousness: an Introduction

A Dozen Structural Features

Three Traditional Mistakes

The Unconscious and its Relation to Consciousness

Consciousness, Intentionality, and the Background

The Critique of Cognitive Reason

The Proper Study

David Chalmers, The Conscious Mind, 1996

Introduction: Taking Consciousness Seriously

Two Concepts of Mind

What is Consciousness?

A catalog of conscious experiences

The phenomenal and psychological concepts of mind

The double life of mental terms

The two mind-body problems

Two concepts of consciousness

Supervenience and Explanation

Supervenience

Reductive explanation

Logical supervenience and reductive explanation

Conceptual truth and necessary truth*

Almost everything is logically supervenient on the physics*

THE IRREDUCIBILITY OF CONSCIOUSNESS

Can Consciousness be reductively explained?

Naturalistic Dualism

The Paradox of Phenomenal Judgment

TOWARD A THEORY OF CONSCIOUSNESS

The Coherence Between Consciousness and Cognition

Absent Qualia, Fading Qualia, Dancing Qualia

Consciousness and Information: Some Speculation

APPLICATIONS

Strong Artificial Intelligence

The Interpretation of Quantum Mechanics

Copyright and Latest Update

Footnotes

 


The Reviews


John Searle, New York Review of Books, The Mystery of Consciousness, 1997

Consciousness as a Biological Problem

The commonsense definition of consciousness - as states of sentience and awareness - is simple…

An analytic definition (in terms of an underlying essence) and the establishing of consciousness-matter (brain) relations are hard.

Problem 1: Cartesian Dualism seems to place consciousness outside natural science.

Eccles and others believe it is outside the explanations of science.

The ideas “mental”, “physical”, “materialism”, “idealism”, “monism”, “dualism”…are thought of as clear, and it is assumed that the issues have to be posed and resolved in these traditional terms. This is a source of problems. Once this framework is accepted, the reality of consciousness implies dualism and the negation of the scientific world view that took 400 years to develop. This motivates many scientists (Dennett is one) and philosophers to eliminate consciousness by reducing it to something else. A small number of scientists and philosophers (Chalmers, Penrose and others) accept dualism - this, too, is problematical.

Searle believes that the old categories and ingrained habits of thought need to be discarded. Consciousness is a biological problem: brain causes consciousness.

Problem 2: That the brain “causes” consciousness is “philosophically loaded” - it seems to imply dualism.

But the implication holds only on the “effect temporally follows cause” notion of causality.

An example of cause-effect that is not temporal is the solidity of a table that is caused by the behavior of molecules but is not an extra event - it is a feature of the table.

Consciousness is like that - it is a feature of the brain.

Problem 3: How do publicly observable phenomena - brain processes - produce the private, subjective characteristic of all consciousness? This is the hard or philosophical problem…I called it the fundamental problem.

It is related to, but definitely not the same as, the scientific problem of explaining in detail the kinds and particulars of mental states and processes on the basis of the kinds and particulars of the brain’s anatomical divisions and its physiological states and processes.

Problem 4: Taking the computer metaphor of the mind too literally.

Strong AI: mind is nothing but a computer program. Searle refutes this:

The Chinese room argument - minds have semantics; programs do not - they are purely syntactical. The Chinese room showed that semantics is not intrinsic to syntax. This argument due to Searle is famous; it dates to the early 1980s.

Searle provides a newer argument, one that he thinks is more powerful and decisive: syntax (computation) is not intrinsic to physics (computers) but is due to interpretation, i.e., syntax is context dependent.

Francis Crick, the Binding Problem and the Hypothesis of Forty Hertz

Searle thinks that the breakthrough to explaining consciousness will come from some simple system. Still, says Searle, this does not detract from Crick’s neurobiological explanations of vision - a good choice since so much work has been done on it.

Crick’s philosophical mistakes.

Thinks the problem of qualia is about communicating the subjective aspect.

Talks eliminative reduction but practices causal emergentism - the argument against elimination is that the two features exist despite reduction, yet Crick advances Patricia Churchland’s mistaken criticism of the anti-reductionist argument.

Talks about the neural correlates of consciousness - which goes against reduction, since a correlation is between two things - but correlation explains nothing; explanation requires a causal theory. Further, Crick sometimes denies direct perceptual awareness using a bad 17th-century philosophical argument, due to Descartes and Hume, that interpretations can sometimes be mistaken.

Crick talks of the binding problem - how the brain unifies the different aspects of a perception into a unified whole. Based on work by Wolf Singer in Frankfurt and others, spatially separated neurons corresponding to shape, color, and movement fire synchronously in the range of 40 Hz; Crick and Koch speculate that these synchronized firings might be the brain correlates of visual consciousness. Searle thinks this is not an explanation; it is at most a correlation. He draws an apt analogy to an internal combustion engine: combustion correlates with motion, but correlation is not explanation.

AM: an explanation would be that the chemical energy is really the sum of molecular potential energies; this energy is liberated in combustion, raising molecular kinetic energies and momenta, which in turn cause greater forces to be exerted by individual molecules; these aggregate as a greater total force exerted by the burning gas on a piston, which causes motion. A black-box macroscopic explanation could have been given, but the microscopic aspect makes the analogy useful in the present context. There are two key points: (1) the identification of micro and macro physical properties and (2) that at each point in the chain of events the causation is the invocation of a law - Newton’s second law of motion.
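A minimal symbolic sketch of that micro-to-macro chain (the notation - N molecules of mass m in volume V, piston face area A and mass M_p - is mine, for illustration only): liberated chemical energy raises the total molecular kinetic energy, which raises the gas pressure, which yields a net force on the piston, which accelerates the piston by Newton’s second law:

\[
\Delta E_{\text{chem}} \;\rightarrow\; \Delta\Big(\sum_{i=1}^{N}\tfrac{1}{2}\,m\,v_i^{2}\Big)
\;\Rightarrow\; P \approx \frac{N k_B T}{V}
\;\Rightarrow\; F = P A
\;\Rightarrow\; a = \frac{F}{M_p}.
\]

Each arrow invokes a physical law or identity (energy conservation, the kinetic-theory relation between temperature and molecular kinetic energy, the definition of pressure, Newton’s second law) - which is just the point made above: at each step the causation is the invocation of a law.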

An explanation of consciousness would be similar, but an ingredient is missing: what is the identification of consciousness with physical or biological levels that do not include explicit mention of consciousness[1]? The identification could be microscopic or macroscopic or both. However, it poses a problem not present in the physical case, for the identification of the liberation of chemical energy with motion is based on a chain of reasoning through a law of physics. Where does the identification of consciousness with the physical level come in? It need not be through a law, and three solution ideas arise: (1) just as force is causally related to motion through a causal law, ask what the law-like causal powers of consciousness are; (2) the flip from the “as if” or third-person to the phenomenal or first-person mode of description could well be a simple identity once enough is known about bio-physiology and about consciousness and mind; and (3) identify the material realm as a phase or territory within a graded idealism that includes the case of classical or other materialism and, possibly, natural rather than bridging laws. An alternative to idealism is to identify a realm of experience within a field that may be left unspecified. The field may turn out to be matter, being, mind…and the relation to the field may be effect or correlate (matter), phase (being), embedding or identity (mind). Lack of specification provides an overview within which specifications may be situated and related; e.g., causal relationship is a case within correlation. There are various reasons for a disposition toward a causal relation, but the final explanation may or may not be causal; therefore a hierarchy in which cause is situated within correlation provides a research program within which specific ideas may be tested without the rigidity of unnecessary commitment. The arrangement allows for awareness to be the world seen through itself. A final solution may be formulated from a number of solution ideas and include one or more of the above. These solution ideas, especially (1) and (3), define research projects on which I have done work.

The fundamental problem of metaphysics is not so much the question of idealism vs. materialism and so on as the question of the meaning of “idea”, “being”…and, relative to humankind, not so much the given meanings or usages as the potential meanings and usages…

…and potential, relative to humankind, does not mean possible but, rather, what is true but not yet known.

…and the fundamental psychological issue of metaphysics is to use but not be wedded to given/received meanings, usages and knowledge.

Gerald Edelman and Reentry Mapping

“Of all the neurobiological theories of consciousness I have seen, the most impressively worked out and the most profound is that of Gerald Edelman.”

But:

“…it is possible that a brain could have all these functional, behavioral features, including reentrant mapping, without thereby being conscious.”

And, recalling Crick’s ideas:

“…the problem is the same as the one encountered before: how do you get from all these structures and their functions to the qualitative states of sentience or awareness that all of us have.”

AM: For this critique by Searle to be significant and reasonable:

Explanation of subjectivity must be tractable on scientific/material accounts.

The problem is, of course, the explanatory gap between subjectivity and brain/matter which, in scientific description, is devoid of subjectivity.

Tractability: the gap is merely a gap and not an unbridgeable chasm.

There should be a specification of the kind of criteria that an explanation should satisfy. This follows from the fact that it is not obvious what these criteria might be - presumably the system should have certain kinds of effects or causal powers…but what are these powers and how will we demonstrate that those causal powers, even if equal to the causal powers of consciousness, imply consciousness? Of course, the criteria may not yet be known and may be tied to the explanation.

It seems to me that that criterion (set) is going to have to be magic (in the sense of insight, a change of Gestalt…but not in the abracadabra sense)…or plain obvious (seen but not recognized…); Searle’s “consciousness is a feature of the brain” is a candidate but is not yet very helpful.

Roger Penrose, Kurt Gödel, and the Cytoskeletons[2]

The main ideas of Roger Penrose as presented in The Emperor’s New Mind, 1989 and Shadows of the Mind, 1994, are:

(a) Minds are capable of non-computable (non-algorithmic) processes; therefore minds cannot even be simulated by computers or algorithms.

(b) The mathematical theories of classical and today’s quantum physics are computable, and therefore the physical elements underlying minds and mental processes must include phenomena that will require description by some new and future non-computable physical theory.

(c) Penrose extends the argument in (a) to quantum computers, defined as machines that obey the rules of today’s quantum mechanics. The argument is that this class of quantum computers is equivalent to the class of classical computers enhanced by a randomizing element. Therefore quantum computers are also limited to computable operations. This is also the argument for the computability of quantum mechanics to date.

(d) Penrose’s candidate for the new physics is quantum gravity.

Obviously, this discussion omits a wealth of detail and subtlety, including some crucial issues such as how a neuron might deploy quantum behavior. The problem with the latter is how a superposition of neuronal computations could be maintained given that each neuronal signal disturbs the environment.

Briefly, the proposed and tentative resolution is “quantum coherence in microtubules.”

Searle’s responses:

“…from the fact that a certain type of computational simulation cannot be given of a process under one description it does not follow that another type of computational simulation cannot be given of the very same process under another description.” (p. 71).

And: ”An intelligent version of Weak AI[3] should attempt to simulate actual cognitive processes. Now, one way to simulate cognitive processes is to simulate brain processes…” (p. 72) The point of Searle’s argument is that simulation of brain processes is not a simulation at the level of mathematical reasoning. Penrose’s argument, according to Searle, “shows that there cannot be a computational simulation at the level of mathematical reasoning.” Searle continues, “Fine, but it does not follow that there cannot be computational simulation of the very same sequence of events at the level of brain processes, and we know that there must be such a level of brain processes because we know that any mathematical reasoning must be realized in the brain.” (p. 74).

“How is it even conceivable that his hypothetical quantum mechanics could cause conscious processes? What might the causal mechanisms be?” (p. 84).

“In any case the motivation for his whole line of reasoning is based on a fallacy…there is no problem whatever in supposing that a set of relations that are noncomputable at some level of description can be the result of processes that are computable at some other level.” (p.85).

Consciousness Denied: Daniel Dennett’s Account

No summary.

David Chalmers and the Conscious Mind

Traditional philosophy of mind:

Monists.

Idealists.

Materialists.

Dualists.

Substance dualists.

Property dualists: the same substance, e.g. a human being, can have both types of property.

Although most people are probably dualists, most professionals in philosophy, psychology, AI, neurobiology and cognitive science are materialists; Thomas Nagel and Colin McGinn are among the relatively few property dualists and the very few substance dualists are those such as Sir John Eccles who have a religious commitment to the soul.

A problem with materialism: a lot of mental data - pains, beliefs, desires, thoughts… - are left over after the material facts. Materialists think they have to get rid of mental facts by reducing them to material phenomena or showing that they don’t exist. Even when they are reduced, mental facts continue to be factually present, to be psychologically significant, and to be causally efficacious despite the alleged reduction; and elimination is obviously false.

Brief history of 20th century reductive materialism

Radical or logical behaviorism - in contrast to the research program of methodological behaviorism - insists that there are no mental facts, no mental phenomena. Mental states are (and mental terms are at most terms that refer to) patterns of behavior where behavior is body movement without a mental component. Chief proponents: Gilbert Ryle, Carl Gustav Hempel.

Problems:

Intuitively false: pain and pain behavior are different.

Omits causal relations: belief causes behavior…in contrast to behaviorism’s claim that belief is behavior.

Circularity: a mental state cannot be analyzed without reference to other mental states (beliefs and desires require each other).

The Identity Theory or Physicalism: mental states are brain states…J.J.C. Smart and others.

Problems:

What characteristic of brain states makes them mental - as opposed to brain states that are not mental?

It is too restrictive to say that only the brain can have mental states.

Functionalism: mental states are physical states but it is their causal relations that make them mental. Any system with the right causal relations is mental.

Resolves some of the objections to (1) and (2) above.

Utterly implausible - it leaves the mind out of the mind, e.g. pain is not pain’s causal relations…but functionalism is the best form of materialism available and so is the most widely held philosophy of mind today. In the version linked to computers it is the dominant theory in cognitive science.

Problems:

Although brains are not necessarily the only things that can have minds, not everything that has the right causal relations has mental states - the Chinese Room argument. Recall that functionalism is also known as black box functionalism…if functionalism is interpreted in a way that eliminates the black box(es) and input-output relations at a certain level of description then it becomes physicalism.

Leaves out qualia…the red-green inversion shows that different subjective states can be functionally identical. In functionalism, mental states (functional states) are still physical states.

Does not explain how different physical states have the same causal relations.

Strong AI: a version of functionalism in which the computational state of a computer is exactly like a functional state in a brain. Mental states are information processing states of (a program implemented in) the brain.

Problems:

Scientific commonsense: pains are qualitative experiences caused by specific neurobiological processes.

According to functionalism, pains are physical states in brains or anything else with the right causal relations (read program)…the physical states don’t cause the pain, they are the pain.

Philosophers sympathetic to the functionalist perspective have a choice when explaining consciousness: accept the irreducibility of consciousness and deny functionalism (Thomas Nagel) or keep functionalism and deny irreducibility (Dennett).

Chalmers’ position: he wants both functionalism and (property) dualism: functionalism (and strong AI) are adequate accounts of mind up to consciousness which must then be tacked on and which is not subject to functionalist analysis...Chalmers calls this “non-reductive functional dualism.” (p. 249)...”Cognition can be explained functionally; consciousness resists such explanation.”...Searle says:

This is peculiar because functionalism evolved to avoid irreducibility of mental phenomena and so avoid dualism.

Chalmers uses standard arguments to prove functionalism can’t account for consciousness but does not accept the same arguments against functionalism in general.

Although functional organization is not consciousness the two always go together - “As long as the functional organization is right conscious experience will be determined.”

“The Conscious Mind” is a symptom of desperation in cognitive studies: the main research program - computer functionalism - would be hard to give up but a remotely plausible functional account of consciousness has not been given.

Chalmers’ “solution” is to keep the ideology (functionalism) and accept consciousness and its irreducibility, which the cognitive studies community is largely ready to accept.

Chalmers argues that consciousness is not physical because it is not logically supervenient on the physical.

...and that there are two aspects to mental states - a functionalist one (in which pain, for example, is not conscious) and the subjective one in which pain is a conscious sensation.

...the relation between the two occurs on account of the “principle of structural coherence” - there is a 1-1 relation between the structure of consciousness and functional organization.

Chalmers’ argument for the principle: without it, conscious states could change without any change in behavior (functional organization), but this is impossible since a change in mental content must be mirrored in a change in functional organization - which does not follow, since that is what was to be proved and since functional organization is not the same as brain state.

The implausibility of Chalmers’ position

As noted above, psychological terms have two quite distinct meanings - a functionalist one referring to material entities and a conscious meaning referring to conscious entities.

Consciousness is explanatorily irrelevant to everything physical that happens in the world. In particular, consciousness is irrelevant to human behavior.

Even your own judgments about your consciousness cannot be explained - neither entirely nor even in part - by your consciousness.

Consciousness is everywhere - the absurd view called pan-psychism.

Chalmers’ responses

That the brain causes consciousness is equally implausible (Searle: but that’s irrelevant since it’s empirically true).

Shifts the burden of argument: critics should prove why not pan-psychism.

Searle’s response

The absurdities follow from joining property dualism to the contemporary functionalist-computationalist account of mind.

By dropping property dualism and functionalism:

There are not two definitions of psychological terms. Rather, only systems capable of consciousness can have any psychology at all.

There is no compulsion to say that consciousness is explanatorily irrelevant.

As a result consciousness is essential where its own representation is concerned.

There is not the slightest reason to adopt pan-psychism.

Resolution

Assuming the Cartesian vocabulary and modern science forces materialism, which omits consciousness and so forces some false strategy like functionalism. So: jettison the Cartesian categories and recognize that consciousness is a biological process.

Israel Rosenfield, the Body Image, and the Self

No summary.

Conclusion: How to Transform the Mystery of Consciousness into the Problem of Consciousness

Some final suggestions:

The philosophical importance of computers, like that of any new tool, is exaggerated. This explains the attachment to AI and computer functionalism, which is anti-biological.

Brains matter crucially for consciousness.

Crick, Edelman and Penrose are on the right track.

There is no unifying principle of neuroscience today.

One approach: through the unconscious, e.g. what is the difference between blindsight and conscious sight?


John Searle, The Rediscovery of The Mind, 1992

Introduction

The book addresses the denial or the reduction of simple and obvious truths about mind - the existence of qualitative conscious states, of intrinsically intentional states such as beliefs, desires and intentions - in the prevailing view in mainstream philosophy of twenty years ago[4] and in the new discipline of cognitive science. A related concern was the denial that “consciousness and intentionality are biological processes caused by lower-level neuronal processes in the brain, and neither is reducible to something else. Furthermore, consciousness and intentionality are essentially connected in that we understand the notion of an unconscious intentional state only in terms of its accessibility to consciousness.”

“I try to overcome Cartesian shibboleths such as property dualism, introspectionism, and incorrigibility, but the main effort is to locate consciousness within our general conception of the world and the rest of our mental life.”

What’s Wrong with the Philosophy of Mind

The first three chapters criticize the dominant views in the philosophy of mind - materialism and dualism.

Chapter 1 analyses the nature and source of the materialist and (property) dualist confusions over mind; their powerful hold; six varieties of “unlikely theories of mind,” the (often unstated) theses that are the foundation of modern materialism and their origins, and “undermining the foundations.”

The basic hold of materialism has origins in i) the hold, often implicit, of the Cartesian view, and ii) the idea that to be scientific is to be objective and that mental states are subjective. Searle points out that the subjectivity of mental states is ontological - their mode of existence is first-person experience - but that this does not preclude epistemically objective knowledge of them: although a pain is private and experienced subjectively, its existence is an objective fact.

The Recent History of Materialism: The Same Mistake Over and Over

Appendix: Is There a Problem about Folk Psychology?

Regarding the following sections, comments are added only where the material is not covered in the popular account (The Mystery of Consciousness), is not contained in other comments here, and is not peripheral to the present purpose of general understanding.

The Mystery of Materialism

Behaviorism

Type Identity Theories: A type identity theory says that every type of mental state corresponds exactly to a specific type of brain state.

Token-Token Identity Theories… “For token instance of a mental state, there will be some token neurophysiological event with which that token instance is identical.” This removes the objection to type identity theories that “…it seems too much to expect that every type of mental state is identical with some type of neurophysiological state.”

Black Box Functionalism

Strong Artificial Intelligence: Mind is nothing but a computer program. The union of artificial intelligence and functionalism is computer functionalism. “…one of the most stunning aspects of this union was that it turned out that one can be a thoroughgoing materialist and still believe, with Descartes, that the brain does not really matter to the mind.”

Eliminative Materialism: Mental states and processes do not exist. One version of this is the attack on “folk psychology,” the everyday account of psychology - the claim that beliefs and desires as understood in common human relations (unburdened by academic apparatus) are, if not wholly false, then at most approximate, that this will be revealed to be so by a “mature cognitive science,” and that, therefore, beliefs, desires and so on do not exist. Technical objections center around the fact that folk psychology is not a research project. Commonsense objections are that eliminative materialism seems crazy - it seems crazy to “say that I never felt thirst or desire, that I never had a pain, or that I never actually had a belief, or that my beliefs and desires don’t play any role in my behavior.” Defenders of eliminative materialism “almost invariably respond with the heroic-age-of-science maneuver…that giving up belief in beliefs is like giving up the belief in flat earth…” (P. S. Churchland, “Reply to McGinn,” Times Literary Supplement, Letters to the Editor, March 13, 1987).

Naturalizing Content: This is based on a twofold idea, i) marginalize consciousness since we can get along with intentionality alone, ii) naturalize intentionality through externalist causal theories of reference. The idea is to get “an account of intentional content solely in terms of causal relations between people, on the one hand, and objects and states of affairs in the world, on the other.”

The objections to recent materialism are both commonsense and technical. The commonsense objections: it leaves out or denies mind or intentionality, makes caricatures of the straw idols it criticizes, and imputes mind to things that don’t have minds. The technical objections are more specialized - they are usually tailored to the version of materialism being considered. Common themes are circularity and inadequacy.

Breaking the Hold: Silicon Grains, Conscious Robots, and Other Minds

Argues against any logical or necessary connection between mental states, especially conscious ones, and behavior.

Argues that behavior is not the sole basis on which we know of the existence of other minds - that except when doing philosophy there really is no “problem” about other minds, that our knowledge of other minds is, rather than being problematic or hypothetical, an unlabored recognition of similar causes from similar effects.

Consciousness and its Place in Nature

Chapters 4 to 8 characterize consciousness. Beyond materialism and dualism, chapter 4 locates consciousness in relation to the rest of the world.

Consciousness and the “Scientific” World View:

Meaning of consciousness: Searle: although it is not possible to give a definition in terms of necessary and sufficient conditions or genus and differentia, or to give a noncircular verbal definition, it is necessary to say what he means by the notion because it is often confused with several others.

Consciousness by example (ostension): the state of subjective experience - not necessarily passive, since it includes action and motivation - sentience.

Consciousness is often confused with conscience, self-consciousness, cognition. Some philosophers e.g. Block, Two Concepts of Consciousness, use a sense in which there is no sentience, in which a total zombie could be conscious. Searle knows of no such sense, and, in any case, this is not his sense.

It is on-off; but once on it is a rheostat - there are different degrees of consciousness.

Objective of this chapter: situate consciousness with respect to an overall scientific conception of the world - especially the atomic theory and the evolutionary theory of biology…which are so fundamental, so well established they are not “up for grabs”, not optional for reasonably well educated citizens.

Atomic theory:

Atomism: big things are made up of and explained in terms of little ones.

Causal explanation: the properties of the big things are caused by the properties of the little ones acting according to laws of nature.

Evolutionary theory:

Life based primarily in C, N, O, H atoms.

Darwin and Mendel.

An example of atomism.

Subjectivity

Its ontology and relation to science.

Consciousness and the mind-body problem

The subjectivity of consciousness makes for the (putative) consequent difficulty (Nagel) or impossibility (McGinn) of the mind body problem.

Consciousness and selectational advantage

“One of the advantages conferred on us by consciousness is the much greater flexibility, sensitivity, and creativity we derive from being conscious.”

…and “The behavior of the mechanist traditions…blind us to these facts…”

Reductionism and the Irreducibility of Consciousness

Accounting for the irreducibility of consciousness according to the standard patterns of scientific reduction:

Causally emergent properties of a system are those whose explanation requires, in addition to the properties of the components and their environmental relations, the causal interactions among the elements.

Reduction - “A” is nothing but “B”…Types of reduction:

Ontological.

Ontological property.

Theoretical.

Logical or definitional.

Causal.

Consciousness is an irreducible feature of physical reality

It is a feature because consciousness is a causally emergent feature of the brain

Irreducibility: that consciousness is nothing but (in the sense of ontological reduction) brain processes leaves out the essential ontological feature of, e.g., pain - i.e., its subjectivity

Irreducibility of consciousness has no deep consequences

…because irreducibility is a trivial consequence of the pragmatics of our definitional practices of reduction

AM: although consciousness is bio-physical, we do not know how it is bio-physical and so it remains a mystery in the sense of wonder though the irreducibility is not mysterious in any sense

Supervenience comes in constitutive and causal flavors (check to see whether these are related or identical to the logical and natural varieties that Chalmers uses) and only the causal one is significant for mind-body issues…it is bottom-up causation…and beyond this is not useful in philosophy

The Structure of Consciousness: an Introduction

What are the structural features of consciousness?

Two crucial issues that Searle says he understands insufficiently to make much comment.

Temporality…the asymmetry between the experience of space and time - unlike the experience of physical reality, consciousness itself is experienced only as temporally extended…the mismatch between physical and phenomenal time.

Society…that society - the category of “other people” - plays a special role in the experience and structure of consciousness; that the capacity for assigning a special status to other loci of consciousness is both biologically based and a Background presupposition for all forms of collective intentionality.

A Dozen Structural Features

1. Finite Modalities

The senses - the traditional five, the sense of balance, proprioception. Proprioception includes physical sensations - of hot and cold, of pain…and sensory awareness such as the position of my arms and legs (the kinesthetic sense) and the feeling in my right knee. The stream of thought includes words, images, feelings such as emotions, felt drives such as thirst, and the contents of all conscious intentionality.

The limitation to this particular set of modalities is evolutionarily contingent; other species have other sensory modalities. Searle does not explain whether the finiteness of the number of modalities is necessary; it is presumably necessarily related to the physical properties of the world and contingently related to our particular evolutionary history and (set of) niche(s).

The modalities may be tinged or have tones…Searle points out the unpleasant/pleasant dimension; the origin of the dimension and its polarity are, no doubt, adaptive.

2. Unity

Normally, the stream of conscious experience contains disparate elements that are present in consciousness as a unity. The world presents as one world. Unity exists in two dimensions: horizontal unity is the organization of conscious experiences through small stretches of time; vertical unity is the simultaneous awareness of the diverse features of any conscious state - what Kant called the “transcendental unity of apperception.”

The latter is a generalized binding problem - a problem whose neurophysiological understanding is rather primitive[5].

“Without these two features we could not make normal sense of our experiences.”

3. Intentionality

Most, but not all, consciousness is intentional. It is difficult to distinguish descriptions of objects from descriptions of experiences of objects because the features of the objects are precisely the conditions of conscious experiences of them…the two vocabularies of description are the same.

Unlike the objects, the experiences are always perspectival - they are from a point of view; therefore, all intentionality is aspectual - every intentional state has an aspectual shape.

4. Subjective feeling

“What-it-is-like”; the key feature of consciousness…and the source of philosophical puzzlement concerning consciousness.

5. The Connection Between Consciousness and Intentionality

“Only a being that could have conscious intentional states could have intentional states at all, and every unconscious intentional state is at least potentially conscious.”

…and, from the next Chapter, “The notion of an unconscious mental state implies accessibility to consciousness.”

6. The Figure-Ground, Gestalt Structure of Conscious Experience

Related to attention. Figure and ground are both in consciousness; only the figure is in attention. The discrete form may be modified as a discrete-continuous form. A special case of structuredness: all (normal) perception is perception as; all (normal) consciousness is consciousness as.

7. The Aspect of Familiarity

The aspect of familiarity is not a separate feeling but part of normal consciousness; it is not due to experience under familiar circumstances but is part of the constitution of consciousness; familiarity found even in unfamiliar scenes that have some familiar shape or element; familiarity comes in degrees; familiarity includes the experience of organization and order.

“…So these features hang together: structuredness, perception as, the aspectual shape of all intentionality, categories, and the aspect of familiarity.”

8. Overflow

Conscious states in general refer beyond their immediate content; the context is part of the baggage…

9. The Center and the Periphery

Attention. Attention goes away from where it is not needed. What is conscious and not at the center - not in attention - is peripheral…but not, as is sometimes mistakenly thought, (in the) unconscious.

10. Boundary conditions

Part of the situatedness of a scene though not even in the periphery; pervasive…and this is noticeable in its breakdown - for example the sense of disorientation when one is suddenly unable to recall time and place…

11. Mood

The color or tone - elated, depressed, cheerful, neutral - of an experience. Moods are conscious but not necessarily intentional. A mood by itself never constitutes the whole content of a conscious state. We are always in some mood or other - even neutrality of mood has the feel of some mood…

12. The Pleasure/Unpleasure dimension

Regarding whole conscious states, ones that possess the unity and coherence described above, “it seems to me there is always a dimension of pleasure and unpleasure.” And, “As with mood, we must avoid the mistake of supposing that the intermediate and therefore nameless positions on the scale are not on the scale at all.”

Three Traditional Mistakes

All conscious states are self-conscious.

Consciousness is known by a special faculty of introspection.

Knowledge of our own conscious states is incorrigible. We cannot be mistaken about such matters.

Searle analyses these mistakes. He believes 2 and 3 - and perhaps 1 - have a common origin in Cartesianism. He believes they are all mistaken. Incorrigibility and introspection have nothing to do with the essential features of consciousness…(but) several recent attacks on consciousness, such as Dennett’s, are based on the mistaken assumption that if we can show that there is something wrong with the doctrine of incorrigibility or introspection, we have shown that there is something wrong with consciousness.

The Unconscious and its Relation to Consciousness

How do we account for the unconscious and its relation to consciousness?

The Unconscious:

…refers to mental states that I am not thinking about and/or have repressed - in contrast to non-conscious phenomena that are non-mental.

The connection principle

The notion of an unconscious mental state implies accessibility to consciousness.

For a state to be mental (conscious or unconscious) it must be intentional. This implies 1) unconscious states must be genuinely intentional…not just “as-if” intentional, and 2) any theory of the unconscious must account for the fact that intentional states represent their conditions of satisfaction only under certain aspects, and those aspects must matter to the agent.

The Argument for the connection principle

The ontology of unconscious mental states, at the time they are unconscious, consists entirely in the existence of purely neurophysiological phenomena.

The following is Searle’s summary of his seven point argument for the connection principle:

“To summarize: The argument for the connection principle was somewhat complex, but its underlying thrust was quite simple. Just ask yourself what fact about the world is supposed to correspond to your claims. When you make a claim about unconscious intentionality, there are no facts that bear on the case except neurophysiological facts. There is nothing else there except neurophysiological states and processes describable in neurophysiological terms. But intentional states, conscious or unconscious, have aspectual shapes, and there is no aspectual shape at the level of the neurons. So the only fact about the neurophysiological structures that corresponds to the ascription of intrinsic aspectual shape is the fact that the system has the causal capacity to produce conscious states and processes where those specific aspectual shapes are manifest.”

Two Objections to the Connection Principle

No summary.

Could There Be Unconscious Pains?

No summary.

Freud on the Unconscious

Freud thinks that the ontology of the unconscious is the same as that of the conscious.

Remnants of the Unconscious

In Searle’s ontology unconscious phenomena do not and cannot keep their unconscious shape; Searle refers to the naïve picture in which unconscious phenomena keep their conscious shape.

Four notions of the unconscious

As-if metaphorical attributions of intentionality to the brain which are not to be taken literally.

Freudian cases of shallow (repressed) unconscious phenomena; these are potentially conscious.

Shallow unconscious mental phenomena that just do not form the contents of consciousness at any given point in time. These are potentially conscious and are what Freud called the pre-conscious.

Deep unconscious mental phenomena that are in principle inaccessible to consciousness. There is no evidence for their existence and postulation of their existence violates a logical constraint on the notion of intentionality.

Consciousness, Intentionality, and the Background

What are the relations between consciousness, intentionality, and the Background capacities that enable us to function as conscious beings in the world?

The Background

The Background is the set of mental capacities, abilities, and general know-how that enable mental states to function; it is non-intentional. The Background is the mental correlate of context.

The Thesis of the Background

Intentional phenomena such as meanings, understandings, interpretations, beliefs, desires and experiences function only within a set of Background capacities.

Intentional phenomena only determine conditions of satisfaction relative to a set of non-intentional capacities - the Background.

The Network

Intentional states function only within a Network of intentional states that is part of the background.

Consciousness, Intentionality, and the Background

All conscious intentionality - all thought, perception, understanding etc. - determines conditions of satisfaction relative to a set of capacities that are not and could not be part of that very conscious state.

The Critique of Cognitive Reason

Criticisms of the dominant paradigm in cognitive science

Searle distinguishes three positions:

The view that the brain is a digital computer, or cognitivism.

The view that the mind is a computer program, or Strong AI.

The view that the operations of the brain can be simulated on a digital computer, or Weak AI.

Searle agrees with Weak AI but disagrees with Strong AI. Searle has previously given arguments against Strong AI, and shown that the truth of Weak AI is somewhat trivial. He notes that the issue of cognitivism might seem to lose much of its interest due to the judgment against Strong AI. This, however, is not the case since, granted that there is more to the mind than the syntactical operations of a digital computer, it might nonetheless be the case that mental states are at least computational states, and mental processes are computational processes operating over the formal structure of these mental states. He adds that that position is held by a fairly large number of people. The chapter is a criticism of cognitivism.

Paraphrase of the summary of this chapter’s criticism of cognitivism:

Computation is syntactic: it is symbol manipulation. Syntax, however, is not intrinsic to physics; it is not discovered in the physics, it is assigned to it.

Therefore, though a computational interpretation could be assigned to the brain - a computational interpretation can be assigned to anything - the brain is not intrinsically a digital computer.

Computers (and programs) are designed to facilitate computational use. However, we assign or interpret syntax and semantics to or in the physics…the causal explanations of the computer are, then, the causal powers of the machine as a physical system together with the intentionality of the interpreter or “homunculus”. The standard, tacit way out of this is to commit the homunculus fallacy[6]. These arguments cannot be avoided by supposing that the brain is doing “information processing.”

The Proper Study

How to study the mind and how to not make so many obvious mistakes

Consciousness, in all its forms including the active ones, is the special feature of the brain, which is a biological organ. All of the great features that have been thought of as special to the mind are dependent on consciousness: subjectivity, intentionality, rationality, free will, mental causation. More than anything else, it is neglect of consciousness that results in so much barrenness and sterility in psychology, the philosophy of mind and cognitive science. The study of mind is the study of consciousness. The future will give us not only new explanations but new forms of explanation.

The connection principle - the principle that unconscious states are mental states by virtue of the fact that they are accessible to consciousness - eliminates a whole level of deep unconscious causes. The normative element that was supposed to be inside the system in virtue of its psychological content now comes back when a conscious agent outside the mechanism makes judgments about its functioning.

There are two levels of mental process: causative (brains) and intentional (including consciousness). The functional level is not an extra level: it is a causative level defined in terms of our interests, i.e., the functional level is normative. At the intentional level, intentional phenomena are causal but are related to normative phenomena such as truth and falsity, success and failure, consistency and inconsistency, rationality, illusion and the conditions of satisfaction generally. Any process by which mental contents are related may or may not have mental content at all in addition to that of the relata, even though of course talk and thoughts of that principle will have content referring to the principles. There is a distinction between processes such as rule following, which have content that functions causally in the production of behavior, and those processes that associate mental contents with stimuli, output behavior and other mental contents.

The foregoing argues against the following typical strategy in cognitive science: discover patterns such as those found in perception and language and postulate combinations of mental representations that will explain the pattern in the appropriate way. When there is no conscious or shallow unconscious representation, postulate a deep unconscious representation. Thus Searle argues against Chomsky’s Universal Grammar and deep unconscious rule following in, for example, language acquisition - the argument is that rules should be shown to be causally efficacious and not merely predictive; he further argues that the phenomena of language acquisition can be much more simply accounted for by the following hypothesis: There is a language acquisition device innate in human brains that constrains the form of languages that human beings can learn. And, so says Searle, he arrives inadvertently at a defense of connectionism which shows, for example, the nature - in principle - of input-output conversion without positing rules, principles, inferences in between and is not obviously false or incoherent in the way that the traditional cognitivist models that violate the connection principle are.

Conclusion: how to do research in mind and consciousness. 1. Stop saying things that are obviously false, 2. Recognize that the brain produces consciousness in all its forms, 3. Focus on what actual facts in the world are supposed to correspond to claims we make about the mind, and 4. Rediscover the social character of the mind.


David Chalmers, The Conscious Mind, 1996

Introduction: Taking Consciousness Seriously

No review.

Two Concepts of Mind

What is Consciousness?

In the basic meaning, to be conscious is to have subjective experience. Related terms are phenomenal, qualia, experience and “what it is like”. Following Thomas Nagel, a being is conscious if there is something it is like to be that being. These are used in the same semantic sense. This is the sense used in this book and a central sense of interest in modern philosophy and cognitive science.

Alternate but related meanings are awakeness, awareness, to know, to have or focus attention, to introspect[7], to be able to report, to be self-conscious, to be conscious of consciousness, higher consciousness, and even conscience. Except for conscience, these meanings fall into two clusters: variations, specializations and evolved forms of the phenomenal sense of the previous paragraph; and variations or aspects of the objective or third-person counterparts of the phenomenal sense.

Introduces the aspect of information at the start.

A theory of consciousness should do at least the following: “…it should give the conditions under which physical process give rise to consciousness, and for those processes that give rise to consciousness, it should specify just what sort of experience is associated. And we would like the theory to explain how it arises, so that the emergence of consciousness seems intelligible rather than magical.”

A catalog of conscious experiences

Visual, auditory, tactile, olfactory, gustatory experiences.

Hot and cold.

Pain.

Other bodily sensations. Headaches, hunger, itches, tickles, urinary urgency, proprioception - the sense of where one’s body is in space. Perception of time.

Mental imagery.

Conscious thought.

Emotions.

The sense of self.

The phenomenal and psychological concepts of mind

The double life of mental terms

The two mind-body problems

Two concepts of consciousness

In these sections Chalmers is setting up the distinction between the phenomenal and what he calls the psychological aspects of mental states and processes. The psychological, as distinguished from the phenomenal, is studied behaviorally - in laboratories in Departments of Psychology steeped in the empiricist-behaviorist tradition - and is characterized by information processing. Thus, for example, there are two meanings of pain: the phenomenal meaning - the feeling - and the psychological or functionalist, materialist, information meaning. And, regarding the co-occurrence of the two kinds of mental property: “It is a fact about the human mind that whenever a phenomenal property is instantiated, a corresponding psychological property is instantiated.”

The following shows some of the correspondences between the two:

Phenomenological: consciousness as such. Psychological: awareness.

Test: a mental notion is likely phenomenological if there could not be an instance of it without any particular phenomenological quality; it is likely psychological if there could be an instance of it without any particular phenomenological quality.

Problematicity: on the phenomenological side, explaining how subjective experience arises from or is caused by the brain is the hard problem of consciousness - a major philosophical-scientific problem. On the psychological side there are significant technical, scientific problems in explaining the inter-relations among psychological phenomena and the causal relations between psychology and action, but no major philosophical issue; philosophically, this is the easy problem of consciousness.

Varieties: on the phenomenological side, the varieties of phenomenal experience; on the psychological side, awareness, awakeness, introspection, reportability, self-consciousness, attention, voluntary control, knowledge.

Supervenience and Explanation

Supervenience

The framework for the basis of consciousness in the physical: B is supervenient on A if A determines B.

“B-properties supervene on A-properties if no two possible situations are identical with respect to their A-properties while differing in their B-properties.”

In local supervenience the situations are individuals; in global supervenience they are worlds.

In logical supervenience the possibility is logical; in natural supervenience it is consistent with the laws in our world.

When B-properties supervene logically on A-properties, A-facts entail B-facts, i.e. it is logically impossible for A-facts to hold without B-facts holding.

Materialism is the doctrine that all positive facts (“there exists” facts, as opposed to negative “there does not exist” facts) about the world are globally logically supervenient on the physical facts.
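These definitions can be put compactly. The notation below (w, w' ranging over possible situations or worlds, =_A meaning “indiscernible with respect to A-properties”) is mine, added for illustration, not Chalmers’:

\[
B \text{ supervenes on } A
\;\iff\;
\forall w, w' :\; \big(w =_{A} w'\big) \rightarrow \big(w =_{B} w'\big).
\]

For logical supervenience the quantifier ranges over all logically possible situations, so the A-facts entail the B-facts; for natural supervenience it ranges only over situations consistent with the natural laws of our world. Materialism, in these terms, is the claim that every positive fact about a world is fixed once its physical facts are fixed, with the quantifier read in the global, logical sense.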

Reductive explanation

…is explanation in terms of something else, usually simpler or lower-level. Reductive explanation of a phenomenon is not reduction of the phenomenon.

Logical supervenience and reductive explanation

A natural phenomenon is reductively explainable in terms of some low-level properties when it is logically supervenient on those low-level properties.

Conceptual truth and necessary truth*

Almost everything is logically supervenient on the physics*

THE IRREDUCIBILITY OF CONSCIOUSNESS

Can Consciousness be reductively explained?

No - consciousness is not logically supervenient on physical properties. And the relevant physical aspect when it comes to mental states is functional organization - “the pattern of causal organization embodied in the mechanisms responsible for the production of my behavior.”

Naturalistic Dualism

Chapter three showed that consciousness cannot be explained in physical terms. This chapter asks: is consciousness physical? Or, can consciousness be reduced to physical properties of individuals? The answer is no! Consciousness is not physical because facts about consciousness are facts about our world that are over and above the physical facts. However, consciousness supervenes naturally on the physical - a relation that is contingent on the natural laws that happen to hold in our world.

The Paradox of Phenomenal Judgment

The psychological and the phenomenological come together in phenomenological judgment.

TOWARD A THEORY OF CONSCIOUSNESS

The Coherence Between Consciousness and Cognition

Needs a non-reductive theory. Coherence is empirical. A coherence principle is that second-order judgments (“I’m having a red sensation” as opposed to the first-order “That’s red”) track consciousness: consciousness and awareness go in lock-step. The principle of structural coherence goes beyond this: the various structural features of consciousness correspond directly to structural features that are represented in awareness.

Absent Qualia, Fading Qualia, Dancing Qualia

This chapter contains Chalmers’ arguments for the principle of structural coherence - that there must be a direct correspondence between consciousness and functional organization.

The argument is from consideration of fading and dancing qualia but amounts to the following: without the principle of structural coherence, individuals’ conscious states could change without change in behavior (functional organization) but this is impossible since change in mental content must be mirrored in a change in functional organization.

Consciousness and Information: Some Speculation

Moving toward computer functionalism, this chapter argues that information processing is the hallmark of mental activity and so of consciousness and that any information processing system, such as a thermostat - or a computer, must be conscious even if that consciousness is elementary and uninteresting.

This is the double-aspect principle of information: all information is realized in experience. This is pan-psychism, which Chalmers advocates and hedges at the same time. The hesitations come out in talk of “proto-phenomenal” experience, and “it may be better to say that a rock contains systems that are conscious: presumably there are many such sub-systems, none of whose experiences count canonically as the rock’s (any more than my experiences count as my office’s).” Chalmers therefore sees a need to constrain the double-aspect principle by narrowing the class of physically realized information spaces that have phenomenal counterparts: information that is available for global control, certain kinds of amplification or certain kinds of causation.

APPLICATIONS

Strong Artificial Intelligence

Strong Artificial Intelligence is the thesis that there is a nonempty class of computations such that the implementation of any computation in that class is sufficient for a mind, and in particular, is sufficient for the existence of conscious experience.

Here Chalmers elaborates and argues for this thesis and defends it from attack by a number of standard arguments: the internal objections that even if a computer could simulate human behavior it would not have phenomenal experience - “the Chinese room”, “syntax and semantics”, “a simulation is just a simulation” - and the external objections that a computer could not even simulate behavior - objections from “rule following” (a computer could not perform creative acts), from Gödel’s theorem, from uncomputability, and from the fact that brain processes may be continuous.

The Interpretation of Quantum Mechanics

Argues in favor of the beauty and simplicity of Everett’s interpretation despite the fact that it is the craziest interpretation.


Copyright and Latest Update

No Copyright by Reviewer, Anil Mitra PhD

Thursday, February 05, 2015

Footnotes

[1] All of the works I have read, including those reviewed by Searle, suffer from this defect. Furthermore, although Searle recognizes the problem and its fundamental nature, I do not see any adequate resolution of the problem or any adequate description of an approach  or criteria for an approach in Searle's work - including his, in my view correct, idea that the Cartesian habits of thought are hard to discard, that those habits are largely responsible for the fundamental mind-body confusions of the current academic milieu and that the correct approach is to recognize mental processes, consciousness as features of the brain - as topics in biology, “the most important problem in modern biology.” Personally, I wonder whether it is the Cartesian categories as such that are responsible for the problems. Perhaps the notions mind and body under other names or in nameless form are biologically, and possibly culturally, ingrained as capacities of adaptation to a certain bio-social environment that does not include knowledge or need for knowledge of any ultimate, or even extra-environmental nature of the world.

[2] Comment on Chapter 3 - review of Gerald Edelman's work on the theories of neuronal group selection and reentrant mapping as the basis of mental processes including consciousness - is omitted at least for now. Edelman's work is brilliant and interesting but I do not need to consider it for the purpose of this overview which is to summarize the fundamentals. I have read his Bright Air, Brilliant Fire and recognize that for any future consideration of a detailed bio-physical theory of consciousness, familiarity with Edelman's ideas would be useful.

[3] The position that mental process can be simulated by a computer. It is not clear whether this means some, all, or all types of mental process. The inclusive version is problematic. Weak AI is the view that Penrose rejects in (a) and his rejection is a statement that though some mental process may be simulable others are not. Searle opposes Penrose's position and, for that opposition to have substance, Searle must mean that all mental processes are simulable.

[4] Searle is writing in 1992.

[5] But, see Neuro-Psychology - the Binding Problem.

[6] “The homunculus fallacy is endemic to computational models of cognition and cannot be removed by the standard recursive decomposition arguments. They are addressed to a different question.”

[7] Apperception is the perception, especially the reflective apprehension, by the mind of its own inner states.