In the previous post of this series, I argued that artificial consciousness is a matter of degree. Now we will begin to consider: degree of what?
This is a blog, not a book, so I will be content with the informal definition of consciousness that most people have: awareness of the environment and especially awareness of self. By awareness, we mean something more than the ability to think. What we have in mind is thinking about the fact that we’re thinking. This is the difference between a problem-solving sort of intelligence and consciousness, is it not?
A common chess-playing computer is an excellent problem-solver. As such, it exhibits its own specialized intelligence. However, we don’t consider it to be conscious, even in its specialty. That’s because it doesn’t think about what it’s doing, no matter how well it does it.
Could a machine think about thinking, and thus possess rudimentary self-awareness? I suggest that it could. To see why, let’s take a step back and ask what distinguishes thought from unthinking reflex.
Keeping this informal, rational thought consists of “mulling things over.” In order to gain distance from the raw, sensory input, we construct symbols that we can mentally arrange and rearrange. We are so used to doing this that we don’t even think of it as symbol-manipulation. Yet whenever we use language as an aid to thought, we are using symbols, even if we’re not using language out loud. And whenever we picture something in our minds, that picture is a symbol (it is certainly not the real thing).
We might think of an unthinking reflex as a direct reaction to stimulus without any intermediary symbols, and thought as the introduction of a symbol-processing phase in between stimulus and response. (This simplifies the situation. The distinction between direct reaction and reaction through symbols is not very clear. I’ll develop this theme in the next post.)
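To make the contrast concrete, here is a toy sketch of a reflex versus a symbol-mediated response. This is purely my own illustration, not the model the post will develop; every name in it is hypothetical.

```python
# Toy contrast between an unthinking reflex and symbol-mediated "thought".
# All names are illustrative, not from the post.

def reflex(stimulus):
    # Direct stimulus -> response mapping; no intermediate representation.
    return "withdraw" if stimulus == "heat" else "ignore"

def think(stimulus):
    # Step 1: encode the raw stimulus as a symbol.
    symbol = {"kind": "percept", "content": stimulus}
    # Step 2: arrange and rearrange symbols before responding
    # ("mulling it over").
    candidates = [{"kind": "plan", "action": a, "about": symbol}
                  for a in ("withdraw", "investigate", "ignore")]
    # Step 3: choose a response based on the symbolic arrangement.
    chosen = candidates[0] if symbol["content"] == "heat" else candidates[1]
    return chosen["action"]

print(reflex("heat"))  # -> withdraw
print(think("cold"))   # -> investigate
```

The point of the sketch is only structural: `reflex` maps stimulus straight to response, while `think` interposes a symbol-processing phase in between.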
If thinking is symbol-processing, then thinking about thinking consists of constructing a second level of symbols about the first level, and manipulating those.
Humans can think about thinking about thinking about thinking, and so on, to an arbitrary depth.
Could a computer program do the same thing? Yes. Easily. In a future post I’ll explain one way to do it. But first we must ask whether an intelligence must understand this reflection on its own symbols in order to be conscious. That, too, will be considered in the next post.
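One toy way a program can build symbols about symbols, to arbitrary depth, is simply to let a symbol's content be another symbol. This sketch is my own illustration under that assumption, not the approach the author promises to explain; all names are hypothetical.

```python
# A symbol can be about a raw percept (level 1) or about another
# symbol (level 2, 3, ...). Names are illustrative only.

class Symbol:
    def __init__(self, content):
        self.content = content  # a raw stimulus, or another Symbol

    def level(self):
        # Depth of reflection: how many symbols stand between this
        # thought and the raw stimulus.
        if isinstance(self.content, Symbol):
            return 1 + self.content.level()
        return 1

percept = Symbol("red light")   # thinking (a symbol for a stimulus)
thought = Symbol(percept)       # thinking about thinking
reflect = Symbol(thought)       # thinking about thinking about thinking

print(percept.level())  # -> 1
print(reflect.level())  # -> 3
```

Nothing stops the nesting at three: wrapping `reflect` in yet another `Symbol` gives level 4, and so on, mirroring the arbitrary depth described above.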
I realized why I’m having trouble with machine-based artificial consciousness. The missing ingredient is EMOTION. Emotion has biochemical origins; it arises from a complex interplay between genetics and environmentally induced learning.
One could argue that emotion is quite a bit more primitive than the ability to think about thinking.
What do you think? 😉
I agree that emotion is part of what makes us human, but is it an essential ingredient in consciousness?
I’ve recently been exposed to some Eastern philosophy. Some Eastern schools of thought value the ability to think about the world in a clear-eyed way — to see things exactly as they are without letting our emotions and prejudices cloud our vision. Are they advocating that we lose even a smidgen of consciousness? No; they call this “mindfulness” and regard it as a higher consciousness.
Or consider the biologist who is analyzing data from the field. She does her best to avoid letting emotion color her reflections. Is she shutting off her own consciousness? I’m sure you’d agree she is not. In fact, she is trying to let the best of her consciousness (best in that situation, anyway) rule the day.
When you said that emotion is quite a bit more “primitive” than the ability to think about thinking, you exposed an important point. Emotion arises to a significant degree from the more primitive parts of our brain — precisely those parts that predominate in animals whose degree of consciousness is much less than our own.
Sure, there are some situations where a lack of emotion would almost seem like a lack of consciousness. But even then, I’d say emotion and consciousness are distinct things. The psychopath who kills without emotion is not unconscious; he is just sick.
In a future post, I’ll argue that there are fewer essentials to consciousness than we commonly think. Perhaps emotion is one of the supposed essentials that isn’t.