Dec. 27th, 2006

chris_gerrib: (Default)
So I emailed yesterday's post to Peter Watts. Here (edited slightly to remove the greetings and email headers) is his reply:

I *welcome* debate. I love arguing. And truth be told, I find my own ruminations on this whole sentience thing to be extremely unpalatable. I really hope you're right and I'm wrong.

But if you are, it ain't because of fire.

The way I see it, what you've done is show that some kind of higher reasoning process must have been necessary to override instinctive fire-aversion responses. Fair enough. You also make a good case that those reasoning processes have to be able to look several steps ahead, i.e., recognise that long-term gain sometimes necessitates short-term pain. What you *haven't* done is show me why those processes would have to be self-aware.

This is what you said:


But fire is different. The instinct of every land animal (except man) is to avoid fire. Fire kills. So, to master the use of fire, somebody (actually, probably a lot of somebodies with burnt fingers) had to make a conscious decision to override their instinct. Not only that, they had to do so with a plan. The ape-man who said "fire pretty, me want" and grabbed it died. The ape-man who had a vision of fire as a useful tool (sometimes) succeeded. Consciousness, then, gives humans the ability to make long-term plans. You don't build a spaceship without the ability to do long-term planning.

What you've done here is slip consciousness into the argument as though it were impossible to make a "plan" or have a "vision" without it. True, you don't build a spaceship without the ability to do long-term planning: but who ever said that planning process had to involve sentience? The simplest commercial chess program is smart enough to sacrifice its own pieces in pursuit of checkmate; chess is the archetypal long-term planning scenario. Yet no one has ever claimed that these programs were sentient.

Perhaps more fundamentally, recent peer-reviewed research strongly suggests that even *we* do our best complex problem-solving on an unconscious level, that our brains perform better at multivariable analysis when the conscious parts are actively excluded from the process.

The unconscious mind can handle more variables, perform more complex analyses, is far faster, and metabolically much cheaper than the conscious mind. Further, the conscious mind seems to get its input not from the sensory systems directly, but premassaged after those inputs have already been interpreted by nonconscious modules. So what does consciousness actually do, other than *observe* the results of the subconscious mind's analysis?

And why wouldn't the nonconscious parts of the mind figure out uses for fire much more rapidly than the self-aware homunculus? How do we know it hasn't?

Cheers,
P.

My reply is coming soon.
chris_gerrib: (Default)
A housekeeping note - I've re-tagged all entries related to my ongoing debate with Peter Watts as Blindsight so I can track them.

Peter, in his reply (posted earlier today) says "planning doesn't require sentience." He cites computerized chess programs as an example. He's correct. I was in error in my post, which focused on planning. (He's been ruminating on this subject for years; I just got started last week.) Before I move on, I will point out that my day job for the last ten years has been herding computer networks large and small, and I know exactly how that computer chess program works - it calculates every possible move out to some number of turns ahead. It's a brute-force approach that requires at least a 33 MHz chip (33 million clock cycles per second) to work.
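As an aside, the brute-force lookahead I'm describing can be sketched in a few lines. This is a toy minimax search over a hand-built game tree, not anything from a real chess engine (real engines add alpha-beta pruning, move ordering, and positional evaluation), but it shows how a dumb exhaustive search "sacrifices" short-term gain for a better guaranteed outcome:

```python
def minimax(node, maximizing=True):
    """Exhaustively search a game tree and return the best score the
    maximizing player can guarantee. Leaves are numeric final scores;
    interior nodes are lists of moves, with the two players alternating."""
    if not isinstance(node, list):  # leaf: a terminal score
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Branch A dangles a tempting +3 leaf, but with best play the opponent
# can hold it to 1. Branch B never offers more than 4, yet guarantees 2.
tree = [
    [[3, -5], [-5, 1]],  # branch A: forced down to 1
    [[2, 2], [4, 2]],    # branch B: worst case is still 2
]

print(minimax(tree))  # prints 2 - the search prefers the "modest" branch B
```

No understanding of the game is involved anywhere: the program just grinds through every line of play and picks the branch with the best worst case, which is exactly the "crank the numbers" behavior at issue.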

Peter points out that the conscious mind gets inputs "predigested" (my words) from the subconscious. So what exactly does the conscious mind do? Let's stick with the computer chess example. The computer can calculate all possible variations of moves and the probabilities thereof. But until a computer reaches up and tweaks the player's nose, it ain't sentient. The computer can't plan outside the rules of chess - it can't think outside the rules of the game, or "think outside the box." I would argue that the first time we can prove humans thought outside the box was when they controlled fire. There have been others - agriculture, metalworking and animal husbandry come to mind.

My contention is that computers and non-sentient beings are very good at following rules. But not following rules is what gets you rocketships and the Internet. Don't get me wrong - once the problem gets narrowed to a "crank the numbers" level, the subconscious may be faster. (Or, it could be that going to sleep frees up enough clock cycles in the brain to do the calculations.)

Here's another way to look at it. Humans can sleepwalk, even sleep-eat. These are fairly complicated but routine behaviors, performed without the conscious mind being engaged. Ever hear of anybody sleep-cooking? Successfully sleep-driving? If you want complex activities, living things seem to need consciousness. Again, Peter's forgotten more about evolution than I'll ever know, but everything I've read suggests that the smarter the animal, the bigger and more developed the cerebral cortex. If the cerebral cortex is the source of "consciousness," maybe animals are more "conscious" than we thought. But if you have to draw a line in the sand, my vote is for controlling fire.
