Peter Watts' replies
Dec. 27th, 2006 03:01 pm

So I emailed yesterday's post to Peter Watts. Here (edited slightly to remove the greetings and email headers) is his reply:
I *welcome* debate. I love arguing. And truth be told, I find my own ruminations on this whole sentience thing to be extremely unpalatable. I really hope you're right and I'm wrong.
But if you are, it ain't because of fire.
The way I see it, what you've done is show that some kind of higher reasoning process must have been necessary to override instinctive fire-aversion responses. Fair enough. You also make a good case that those reasoning processes have to be able to look several steps ahead, i.e., recognise that long-term gain sometimes necessitates short-term pain. What you *haven't* done is show me why those processes would have to be self-aware.
This is what you said:
But fire is different. The instinct of every land animal (except man) is to avoid fire. Fire kills. So, to master the use of fire, somebody (actually, probably a lot of somebodies with burnt fingers) had to make a conscious decision to override their instinct. Not only that, they had to do so with a plan. The ape-man who said "fire pretty, me want" and grabbed it died. The ape-man who had a vision of fire as a useful tool (sometimes) succeeded. Consciousness, then, gives humans the ability to make long-term plans. You don't build a spaceship without the ability to do long-term planning.
What you've done here is slip consciousness into the argument as though it were impossible to make a "plan" or have a "vision" without it. True, you don't build a spaceship without the ability to do long-term planning: but who ever said that planning process had to involve sentience? The simplest commercial chess program is smart enough to sacrifice its own pieces in pursuit of checkmate; chess is the archetypal long-term planning scenario. Yet no one has ever claimed that these programs were sentient.
Perhaps more fundamentally, recent peer-reviewed research strongly suggests that even *we* do our best complex problem-solving on an unconscious level, that our brains perform better at multivariable analysis when the conscious parts are actively excluded from the process.
The unconscious mind can handle more variables, perform more complex analyses, is far faster, and metabolically much cheaper than the conscious mind. Further, the conscious mind seems to get its input not from the sensory systems directly, but premassaged after those inputs have already been interpreted by nonconscious modules. So what does consciousness actually do, other than *observe* the results of the subconscious mind's analysis?
And why wouldn't the nonconscious parts of the mind figure out uses for fire much more rapidly than the self-aware homunculus? How do we know it hasn't?
Cheers,
P.
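Watts' chess-program point, that accepting a short-term loss for a long-term win takes lookahead but not self-awareness, is easy to make concrete. Here's a toy sketch (my own illustration, not anything from his email): a tiny game tree where a "greedy" chooser that only sees immediate payoffs picks the losing move, while a plain exhaustive lookahead, with no sentience anywhere in the loop, picks the sacrifice.

```python
# Each move maps to (immediate payoff, dict of follow-up moves).
# The "sacrifice" line costs 5 now but wins 100 later; the "greedy"
# line gains 10 now but walks into a forced loss of 50.
TREE = {
    "greedy":    (10, {"forced loss":    (-50, {})}),
    "sacrifice": (-5, {"winning attack": (100, {})}),
}

def lookahead_value(moves):
    """Best total payoff reachable from this position (0 if no moves remain)."""
    if not moves:
        return 0
    return max(payoff + lookahead_value(rest) for payoff, rest in moves.values())

def greedy_move(moves):
    # Depth-0 chooser: considers only the immediate payoff of each move.
    return max(moves, key=lambda m: moves[m][0])

def planning_move(moves):
    # Full-lookahead chooser: evaluates each line to its end before choosing.
    return max(moves, key=lambda m: moves[m][0] + lookahead_value(moves[m][1]))

print(greedy_move(TREE))    # → greedy
print(planning_move(TREE))  # → sacrifice
```

Nothing in `planning_move` observes itself or models its own reasoning; it just sums payoffs down each branch. That's the gap Watts is pointing at between long-term planning and sentience.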
My reply is coming soon.