What is consciousness?
When we think of consciousness, we do so on several levels-
1. Informal - as an 'everyday' human animal, ie as someone who subjectively experiences its repeated presence (being awake) and absence (being asleep) on a diurnal (daily) basis
2. Formal - as an objectively conceived and scientifically studied biological process. Adult humans also come to objectively conceive of life itself as a form of thematic consciousness, and thence may conceive of its absence, death, as a form of (obviously permanent) unconsciousness. Of course, permanent unconsciousness can have no experiential analogue, but therein lies an uncertainty which is exploited by religions, which may lure followers with promises of an afterlife.
If we think about consciousness, we also become aware that we are talking about several types-
1. Awake vs asleep - presence or absence of self and space in which one is placed (as above)
2. Aware vs unaware - presence or absence of an object, item or quality in that space
3. Attending vs not attending - attentional focus on an object that we are otherwise always aware of
In terms of machine control metaphors, awareness is a 'bottom-up' or 'subductive' construct, while attention is a 'top-down' or 'supervisory' one. Even at this taxonomic level of analysis, consciousness clearly involves multiple cybernetic levels, ie the 'whole of mind'.
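The bottom-up/top-down distinction can be caricatured as a toy two-level control loop. This is a sketch only: the function names, salience scores and goal weights are invented for illustration, not part of TDE theory's formal vocabulary.

```python
# Hypothetical sketch: awareness as a bottom-up salience filter,
# attention as a top-down supervisory selector over what awareness admits.

def awareness(sensory_field, threshold=0.5):
    """Bottom-up: admit every item salient enough to enter the 'space'."""
    return {item: s for item, s in sensory_field.items() if s >= threshold}

def attention(aware_items, goal_weights):
    """Top-down: the supervisory level selects one aware item to focus on,
    biased by current goals."""
    if not aware_items:
        return None
    return max(aware_items,
               key=lambda item: aware_items[item] * goal_weights.get(item, 1.0))

field = {"phone": 0.9, "kettle": 0.6, "hum_of_fridge": 0.2}
aware = awareness(field)                    # the fridge hum never reaches awareness
focus = attention(aware, {"kettle": 2.0})   # a goal biases focus toward the kettle
```

Note that attention can only select among items that awareness has already admitted, which is the sense in which the two levels are stacked.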
In The Principles of Psychology, William James discussed the experimental ascending bisection of the frog spinal cord. As the position of the scalpel's cut moved higher and higher, more and more of the animal's reflexes were destroyed, until, beyond some point in the brainstem, the frog lost consciousness. James continued to make higher and higher cuts until he compromised those reflexes involved in breathing and other life-critical functions, at which point the frog died. When he cut other sections of the brainstem, those associated with memory and learning, he found that the frog was stuck in the same moment in time, unable to learn, and therefore (presumably) unconscious of the past. When he cut into the parts we now know to be motor planning areas, he found the frog could not anticipate, which he interpreted as an inability to be conscious of impending events.
Clearly James was well ahead of his time in considering both memory and motor representations as virtual forms of the more conventional physical type of consciousness. Following his example, there is one more division we must make- we can be conscious of self and/or other things on two 'planes'-
1. Physical/ spatial, ie spatio-temporal - we experience self or other things in physical spacetime
2. Virtual/ mnemonic, ie synto-semantic - we experience self or things in virtual memory-time, eg when speaking or thinking about them, either in the past or in the future. The most flexible kind of language of thought (LOT) hypothesis is implicitly assumed. Clearly, dividing conscious items and spaces into physical and virtual categories strongly connotes a Computational Theory of Mind (CTM), in which the common feature is immediate autonomous control over memory area access.
The philosopher Jerry Fodor divides unknowns into problems and mysteries. Consciousness, he claims, is not just a scientific problem to be solved, but a mystery upon which 'light' must be shed before it can be regarded as a 'mere' problem. This is just pseudo-scientific nonsense: it is either a problem or it is not. Fodor was one of the main proponents of the Computational Theory of Mind, but in later work retreated from that position.
Marvin Minsky* is, arguably, a most cogent commentator on this topic. He asks, for example, why we use the term 'consciousness' only of shorter-term memories. He (correctly, in my opinion) considers 'consciousness' to be the word we use to describe the contents of our current 'self-memorised moment' or, in the terms of TDE theory, the current Situation Image (SI). Following his lead, we might then suggest the use of 'imagination' for a future or putative SI, and 'memory' for an SI from the past.
Yet one can see why he might tread warily in this area. Before the 1990s, using such poorly defined terms in peer-reviewed publications could lead to a certain loss of reputation and, consequently, of funding. In the most cited work concerning the true nature of consciousness, by Shiffrin & Schneider (1977)**, the authors don't even use the 'c' word: they use the term 'controlled processing' to mean consciousness, and 'automatic processing' to mean subconsciousness (or unconsciousness, depending on which convention you use).
Yet the point they make is clear enough- the difference between conscious (controlled) and non-conscious (automatic) cognitive processes comes down to information criteria (ie novelty & relevance). Sensorimotor patterns that have been well learned, ie rehearsed, are much more likely to become automatised. We are all aware of doing things on 'autopilot': after driving to work while preoccupied with a looming deadline, or while listening to something interesting on the radio, we can recall next to nothing about the journey. If our minds have automatised a certain behaviour, they have memorised the precise functional and temporal relationship between the contextual conditions to be satisfied and the contentful actions to 'fire' when those criteria are met.
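The condition-action pairing just described can be sketched in a few lines of code. This is a toy illustration only: the rule table, the context tuples and the split into an 'automatic' and a 'controlled' path are all invented for the example, not drawn from Shiffrin & Schneider.

```python
# Toy sketch of controlled vs automatic processing:
# well-rehearsed context -> action pairings 'fire' without deliberation,
# while novel contexts fall through to a slow, controlled path.

automatic_rules = {
    ("red_light", "driving"): "brake",
    ("green_light", "driving"): "accelerate",
}

def act(context, deliberate):
    """Fire a stored habit if the context matches a learned rule;
    otherwise invoke the slow, controlled (conscious) process."""
    if context in automatic_rules:
        return automatic_rules[context], "automatic"   # fires on autopilot
    return deliberate(context), "controlled"           # novel: needs attention

# A rehearsed context is handled automatically...
action, mode = act(("red_light", "driving"), deliberate=lambda c: "assess")
# ...while a novel context cannot be handled on autopilot.
novel, mode2 = act(("detour_sign", "driving"), deliberate=lambda c: "slow_and_look")
```

The table lookup is the 'memorised functional relationship' between condition and action; anything not in the table is, by definition, novel enough to demand controlled processing.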
Automatisation makes a lot of sense. Indeed, it is hard to imagine building any type of efficient cognitive system that did not use some kind of prioritisation scheme when deciding which memories to keep and which to delete (or allow to be overwritten). The more that brains are studied, the more one is struck by the sheer efficiency and common sense that evolutionary processes display in their adaptational development.
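Such a prioritisation scheme might, very roughly, be sketched as a score-and-evict policy that scores each candidate memory by novelty times relevance, the information criteria mentioned above. The numbers, labels and capacity limit are invented for illustration; this is a sketch of the principle, not a model of the brain.

```python
# Toy prioritisation scheme: keep only the highest-scoring memories,
# where score = novelty * relevance; the rest are allowed to be overwritten.
import heapq

def retain(memories, capacity):
    """Return the labels of the top-'capacity' memories by novelty * relevance."""
    scored = [(m["novelty"] * m["relevance"], m["label"]) for m in memories]
    return {label for _, label in heapq.nlargest(capacity, scored)}

episodes = [
    {"label": "routine_commute", "novelty": 0.1, "relevance": 0.3},
    {"label": "near_miss",       "novelty": 0.9, "relevance": 0.9},
    {"label": "radio_story",     "novelty": 0.6, "relevance": 0.5},
]
kept = retain(episodes, capacity=2)   # the bland commute is allowed to be overwritten
```

This is exactly the 'autopilot' phenomenology from the driving example: the routine, low-novelty episode never earns enough priority to be kept.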
TDE theory goes further than Shiffrin & Schneider, in that it allocates anatomical locations to both conscious and non-conscious procedures. Automatic behaviours are stored in the cerebellum and activated by its anatomically related neural structures (the basal ganglia and the RAS). Controlled (conscious) processing occurs in the cerebrum, though not in every part of it.
In the multi-part illustration above, the middle diagram depicts the two-level, conscious/involuntary architecture used by the CNS. To the left are the diagrams which depict its conceptual development. At right is the neuroanatomical 'packaging' arrangement, depicted as a perspective sketch.
The split-brain operations used to control severe, intractable epilepsy provide an important piece of information for this puzzle. In an amazing set of 'live' experiments, Sperry & Gazzaniga*** hypothesised, and others subsequently confirmed, the existence of two completely separate consciousnesses, one within each cerebral hemisphere. This conclusion applies not just to epileptics but to each and every one of us: were it not for the corpus callosum, each of our minds would split into two separate people.
This is an extraordinary and, some might say, counter-intuitive result. Yet it is exactly the result predicted when the TDE is recursively elaborated into the multi-level TDE-R, as the diagram below demonstrates.
When two separate TDEs are used as a model, the split-brain condition is simulated. Each TDE has a conscious part (one part of the parietal lobe, plus the frontal lobe) and an unconscious part (the other part of the parietal lobe, plus the temporal lobe).
When the TDEs are recursively combined into a TDE-R, as is the case for all neuro-anatomically intact people, only the left cerebral hemisphere (LCH) and part of the Reticular Activation System (the ascending RAS, or ARAS) support conscious processing, while the other part of the RAS (the descending part) and the right cerebral hemisphere (RCH) support non-conscious operations. This is because, in the TDE-R, the LCH, a part of every TDE which is normally accessible to voluntary attention, is functionally equivalent to the 'global frontal lobe', while the RCH, a part of every TDE which is NOT normally conscious, is functionally equivalent to the 'global temporal lobe' (see diagram below).
Of course, there is much more to the topic of consciousness than is covered by the preceding discussion. What has been covered here is the computational mechanism that drives it.
*Minsky, M. (1991) "Machinery of Consciousness", Proceedings, National Research Council of Canada, 75th Anniversary Symposium on Science in Society, June 1991
** Shiffrin, R., and Schneider, W. (1977). Controlled and automatic human information processing: II. Perceptual learning, automatic attending, and a general theory. Psychol. Rev. 84: 127-190.
*** Gazzaniga, M. S. and Sperry, R. W. (1964) Some comparative effects of disconnecting the cerebral hemispheres. Fed. Proc. 23, Part 1, 359 (Abstr.).
------------------------------ Copyright 2013 Charles Dyer------------------------------