Newsgroups: comp.ai.philosophy
Path: utzoo!utgpu!watserv1!maytag!watdragon!violet!cpshelley
From: cpshelley@violet.uwaterloo.ca (cameron shelley)
Subject: Re: Testing for machine consciousness
Message-ID: <1990Nov1.151507.26595@watdragon.waterloo.edu>
Sender: daemon@watdragon.waterloo.edu (Owner of Many System Processes)
Organization: University of Waterloo
References: <3499@media-lab.MEDIA.MIT.EDU> <1990Oct4.154655.23004@canon.co.uk> <oliphant.4676@telepro.UUCP> <1990Oct30.091654.25318@canon.co.uk> <1990Oct31.023922.13795@watdragon.waterloo.edu> <1990Oct31.142817.1999@canon.co.uk>
Date: Thu, 1 Nov 90 15:15:07 GMT
Lines: 44

In article <1990Oct31.142817.1999@canon.co.uk> rjf@canon.co.uk writes:
>In article <1990Oct31.023922.13795@watdragon.waterloo.edu> cpshelley@violet.uwaterloo.ca (cameron shelley) writes:
>>
>>
>>  I'd like to inject a few comments regarding testing for machine
>>consciousness.
>>
>>  Firstly, why do we accept the belief that other humans are conscious?
>>(I use the word "belief" advisedly, since I think that knowledge of
>>another's subjectivity is problematic.)  I would argue that we use a
>>genetic analogy: I am human (which is now a genetic term), and I am
>>conscious; therefore since this other individual is human, he or she
>>is also conscious.  In other words, we believe ourselves to be conscious,
>>and we believe that the genetic connection between ourselves and other
>>humans is 'close' enough to preserve that property.
>
>I think cameron is on the right lines here, but I don't think he's
>quite got there.  For one thing, his account suggests that this is an
>intellectual phenomenon, but I don't think that can be true.  For
>another, he puts self-consciousness before belief in others'
>consciousness.  I think that we identify with, and therefore by (my)
>definition believe in the consciousness of, other people, long before
>we become self-conscious (even if we don't at that stage put it in
>quite those terms).
>

You're quite right that I was treating "this" as an intellectual
phenomenon, but I think that is reasonable, since I was addressing
'testing for consciousness' and not its evolution or acquisition.
I agree that consciousness is deeply connected to social interaction,
but I question the attempt to strictly order identification of others
as prior to identification of self.  I'm not saying it must be the
other way around, only that the ordering is more ambiguous than you
seem to suggest.

Do you see your suggestions as having an impact on testing for
consciousness in machines?

[rest deleted, sorry! :>]
--
      Cameron Shelley        | "Fidelity, n.  A virtue peculiar to those 
cpshelley@violet.waterloo.edu|  who are about to be betrayed."
    Davis Centre Rm 2136     |  
 Phone (519) 885-1211 x3390  |				Ambrose Bierce
