Newsgroups: comp.ai.philosophy
Path: utzoo!utgpu!watserv1!maytag!watdragon!violet!cpshelley
From: cpshelley@violet.uwaterloo.ca (cameron shelley)
Subject: Re: You can't get semantics by playing with syntax.
Message-ID: <1990Nov20.151400.2252@watdragon.waterloo.edu>
Sender: daemon@watdragon.waterloo.edu (Owner of Many System Processes)
Organization: University of Waterloo
References: <1990Nov19.215824.7547@cbnewse.att.com>
Date: Tue, 20 Nov 90 15:14:00 GMT
Lines: 55

In article <1990Nov19.215824.7547@cbnewse.att.com> rhb@cbnewse.att.com (richard.h.bradley) writes:
>
>The argument can be suggested by considering an analogy used elsewhere.
>As digestion requires chemical interaction with a substrate, so thought
>requires semantical interaction with some object.  Formal simulation -
>incremental description of the process - will not digest an apple or
>create an idea.
>
[...]

>Through I/O devices, models of formal systems are able to interact with
>external physical objects.  Thus internal syntactical operations
>are able to affect and be affected by external things.  This critical I/O
>interface introduces all the semantics that should be relevant.
>
  This leaves unaddressed the fact that the objects of consideration
may not exist and therefore have never truly presented themselves to 
the I/O interface (I am speaking of intelligent creatures here of course).
The ability to anticipate, and therefore to deal *meaningfully* with,
things that may never exist or occur is, I think, a vital component of
any model of thought.  Also, by saying that I/O introduces all the semantics
that "should be relevant" (relevant to what?), you seem to be arguing
against any form of innate knowledge.  Are you therefore suggesting a 
behaviourist model of learning?

>The correctness or error of the subject statement may turn out to be
>unimportant.  If the output of the entire system is best explained as an
>interaction of thoughts, ideas, and circumstances, then it seems practical
>to ascribe thoughts and ideas to the system.  Although to affect the world
>an idea must have substance, the particular substance would seem not to be
>significant.  (Perhaps a dualist will disagree.)
>
  If by "substance", you mean the medium in which thoughts are processed,
then I agree.  I am unsure, however, whether the output of the entire
system can be described in terms of high-level entities like "thoughts"
or "ideas".
The gap between volition (to use a single example) and execution is 
quite large for non-trivial systems like people.  That is to say, the
thought "I am hungry, so I'll go to the store and get something to eat"
and the realization of the necessary sequence of actions are very
different, yet it is the actions which are observed at the "output".

  What is important to note, then, is that both input and output are
structured, but "thoughts" are still at liberty to ignore that structure
(the imagining implied above).  And since people are apparently capable
of acting on "input" never received, i.e. translating a pure "idea"
into a structured series of actions, I think we can still conclude
that the semantics of "thoughts" or "ideas" are related to observable
structures - but not necessarily in any obvious or even direct 
fashion.

--
      Cameron Shelley        | "Logic, n.  The art of thinking and reasoning
cpshelley@violet.waterloo.edu|  in strict accordance with the limitations and
    Davis Centre Rm 2136     |  incapacities of the human misunderstanding..."
 Phone (519) 885-1211 x3390  |				Ambrose Bierce
