[HN Gopher] A landscape of consciousness: Toward a taxonomy of e...
       ___________________________________________________________________
        
       A landscape of consciousness: Toward a taxonomy of explanations and
       implications
        
       Author : danielam
       Score  : 90 points
       Date   : 2024-07-01 11:44 UTC (4 days ago)
        
 (HTM) web link (www.sciencedirect.com)
 (TXT) w3m dump (www.sciencedirect.com)
        
       | bzmrgonz wrote:
       | This is a wonderful project, I had no idea there was so much
        | fragmentation in the topic of consciousness. Maybe we should feed
       | these writings and concepts to AI and ask it to give us any grand
       | unifying commonality among them, if any.
        
         | bubblyworld wrote:
         | I would love to be wrong about this, but I don't think anyone
         | knows how to do that yet. You're basically asking for automatic
         | most-likely hypothesis generation given a set of input data.
         | Concepts about consciousness in this case, but you could
         | imagine doing the same with scientific data, system traces
         | around bugs and crashes, etc. That would be wild!
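          | 
          | To make "most-likely hypothesis generation" concrete, here is
          | a toy Python sketch (purely illustrative; the hypotheses and
          | data are made up): it only _scores_ a fixed set of candidate
          | hypotheses against observations. The part nobody knows how to
          | automate well is generating good candidates in the first
          | place.
          | 
          |     import math
          | 
          |     observations = [0.9, 1.1, 1.0, 0.95, 1.05]  # made-up data
          | 
          |     # Hypothesis A: values cluster near 1.0 (unnormalized Gaussian)
          |     def likelihood_clustered(x):
          |         return math.exp(-((x - 1.0) ** 2) / 0.02)
          | 
          |     # Hypothesis B: values are spread uniformly over [0, 2]
          |     def likelihood_uniform(x):
          |         return 0.5 if 0.0 <= x <= 2.0 else 1e-9
          | 
          |     hypotheses = {"clustered near 1.0": likelihood_clustered,
          |                   "uniform on [0, 2]": likelihood_uniform}
          | 
          |     # Rank candidates by log-likelihood of the observed data
          |     scores = {name: sum(math.log(f(x)) for x in observations)
          |               for name, f in hypotheses.items()}
          |     print(max(scores, key=scores.get))  # "clustered near 1.0"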
        
         | russdill wrote:
         | It's precisely the type of thing that current LLMs are not
          | suited for. They excel at interpolating between existing
         | writings and ideas. They do really poorly when trying to do
         | something novel.
        
           | mistermann wrote:
           | On their own yes, but as human-like intelligent agents
           | running within a larger framework it's a different story.
        
         | superb_dev wrote:
         | Just skip all the thinking ourselves and see if some AI can
         | come up with plausible sounding nonsense? I'm not interested
        
         | dcre wrote:
         | You should probably try thinking about it instead.
        
         | poikroequ wrote:
         | Why do people feel the need to AI everything?
        
       | cut3 wrote:
       | This topic is so interesting. If I were creating a system for
       | everything, it seems like empty space needs awareness of anything
       | it could expand to contain, so all things would be aware of all
       | other things as a base universal conscious hitbox.
       | 
       | Panpsychism seems neat to think about.
        
         | CuriouslyC wrote:
         | You don't need empty space. All the processing power can be
         | tied to entities, and space emerges from relationships between
         | entities.
         | 
          | Want something fun to think about? What if the Heisenberg
          | uncertainty principle is basically a function of the
          | information capacity of the thing being examined? To make a
          | computer analogy, imagine you have 8 bits of information -
          | using 6 for position leaves 2 for momentum, for example.
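          | 
          | A toy sketch of that bit-budget idea (my own illustration in
          | Python, not real physics): with a fixed information budget,
          | every bit spent sharpening position is a bit taken away from
          | momentum, so the product of the two resolutions stays
          | constant, loosely echoing an uncertainty relation.
          | 
          |     TOTAL_BITS = 8      # fixed information budget (assumed)
          |     RANGE = 1.0         # both quantities live in [0, 1)
          | 
          |     for pos_bits in range(TOTAL_BITS + 1):
          |         mom_bits = TOTAL_BITS - pos_bits
          |         dx = RANGE / (2 ** pos_bits)  # position resolution
          |         dp = RANGE / (2 ** mom_bits)  # momentum resolution
          |         # dx * dp is the same for every split of the budget
          |         print(pos_bits, mom_bits, dx * dp)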
        
       | brotchie wrote:
        | Two things I'm absolutely convinced of at this point:
        | 
        | 1. Consciousness is primitive. That is, interior experience is a
        | fundamental property of the universe: any information system in
        | the universe that has certain properties has an interior
        | experience.
        | 
        | 2. Within the human population, interior experience varies
        | vastly between individuals.
       | 
        | Assertion 1 is informed through reading, introspection,
        | meditation, and psychedelic experience. I've transitioned across
        | the whole spectrum from being a die-hard physical materialist to
        | high conviction that consciousness is primitive. I'm not
        | traditionally panpsychic, which most commonly postulates that
        | every bit of matter has some level of conscious experience. I
        | really think information and information processing is the
        | fundamental unit (realized as certain configurations of matter)
        | and certain information systems (e.g. our brain) have an
        | interior experience.
       | 
        | Assertion 2 is informed through discussion with others. Denial
        | of Chalmers's hard problem doesn't make sense to me. Like it
        | seems logically flawed to argue that consciousness is emergent.
        | Interior experience can't "emerge" from the traditional laws of
        | physics, it's like a nonsense argument. The observation that
        | folks really challenge this makes me deeply believe that the
        | interior experience across humans is not at all uniform. The
        | interior experience of somebody who vehemently denies the hard
        | problem must be so much different from my interior experience to
        | the extent that the divide can't be bridged.
        
         | tanepiper wrote:
          | You've worded my position here too.
         | 
          | 20s - a rabid Dawkins-reading Atheist. 40s - I think Dawkins
          | is an idiot and my favourite book is "Beelzebub's Tales to His
          | Grandson"
        
           | tasty_freeze wrote:
           | You don't come off as being a nuanced thinker if those are
           | your two positions on Dawkins. I can understand disagreeing
           | with him, but calling him an idiot impugns you more than him.
        
             | mistermann wrote:
             | Assuming your model of him is correct.
             | 
             | How many videos of him "destroying" theists have you
             | watched on TikTok? I've seen 100+, and agree that he's an
              | idiot, _amazingly_ so. Watch carefully the words he uses as
              | he "proves" his "facts".
        
               | s1artibartfast wrote:
               | Have you considered that TikTok may not be a full
               | representation of the human being?
               | 
                | It is one thing to say someone spews bullshit on TikTok,
                | and another to call them an idiot.
               | 
               | Do you use a purity testing approach to determining
               | idiocy?
        
               | mistermann wrote:
               | A full representation is not necessary. If a Human has
               | errors in any single sentence, they have errors in their
               | corresponding model. These details _are the essence of
               | the very point of contention_.
               | 
               | > Do you use a purity testing approach to determining
               | idiocy?
               | 
               | If one is claiming logical and epistemic superiority, as
               | he literally _and explicitly_ does, and _arrogantly_ so
               | (followed by roars of applause from the audience), I will
               | judge him by those standards. I will also mock him,
               | _because he is sooooo dumb_ , while he chastises others
               | for the same thing (which he is typically not wrong
               | about, to be fair).
               | 
               | Live by the sword, die by the sword.
        
               | s1artibartfast wrote:
               | Would you agree that this may make the error of judging
               | them by their worst output, and not their best, or even
               | average?
        
               | mistermann wrote:
               | Oh yes, if I could bet money on it, I'd say it's even
                | likely! However, I think this touches on an under-
                | appreciated phenomenon: the difference between what
                | people say when they are "speaking their mind" in
                | literature/studies (slow, deliberate, careful error
                | checking, often involving other people), versus the
                | _more_ real, unfiltered version that comes out during
                | real-time speech (which emerges directly from cognition).
               | 
               | In the atheism community in particular, there are some
               | very strange beliefs about human beliefs. For example, a
                | recurring claim I hear from atheists is that what object-
                | level atheists believe, in fact, is _the formal
                | definition_ of atheism: merely/only a _lack of belief_
                | in God(s), _but no negative belief_. The silliness of
                | this should be obvious, but I've had no luck getting any
                | of them to realize (or even contemplate) this, not even
                | one person. The irony is mind-boggling. (The same
                | phenomenon exists in science as well, interestingly,
                | which is part of why I claim it is a fact that it is an
                | underpowered methodology/ideology for navigating reality,
                | _a false prophet_ if you will.)
               | 
               | I propose that if he (and all other people) didn't
               | actually hold these flawed beliefs in the first place,
                | they wouldn't come out during real-time speech... or if
                | they did ~accidentally, they would be recognized and
                | corrected (which scientists _in public venues_ tend to
                | do, _begrudgingly_).
               | 
               | All people suffer from this problem, and I think our
               | culture buries it, for reasons I have no theory about
               | (some sort of psychological game theory maybe?).
               | 
                | Another way of putting it: if you only consume content
                | from someone that was written offline, you are getting a
                | misrepresentation of that person. I think this is a huge
                | deal, yet another important part of reality that our
                | culture behaves as if it weren't there. We are like
                | teenagers at best, on an absolute scale, imho.
        
               | gjm11 wrote:
               | I think _both_ what you get in real time _and_ what you
               | get when someone has time to consider and check for
                | errors and polish are "real". It's not that each person
               | has a perfectly well-defined set of beliefs and ideas,
               | and (1) you get to see them unfiltered in real time and
               | only see a fake version when the person has time to
               | polish, or (2) you get to see them accurately when the
               | person has time to make sure they're getting it right,
               | and only see a rough draft full of mistakes in real time.
               | People are more complicated than that.
               | 
               | Some people are good at getting things right in real
               | time. Some people are good at getting things right when
               | they're careful. I think it is a mistake to call someone
               | stupid because they're bad at one of those things --
                | unless for some reason one of them _matters_ more than
                | the other (e.g., they're in a position where they have
               | to make quick decisions all the time; or they are
               | primarily a writer who can always think and check and
               | polish before publishing).
               | 
               | I'm not convinced that the definition-of-atheism thing
               | has much to do with the real-time versus polished
               | distinction. But:
               | 
               | I think both atheists and theists sometimes cherry-pick
               | definitions for tactical purposes. "Oh no, I can't
               | possibly be expected to offer evidence that there are no
               | gods, because being an atheist just means not positively
               | believing in gods." "Of course so-and-so's evil actions
               | don't reflect badly on Christianity -- if he were a
                | _real_ Christian he'd be acting better." "Why should I
                | care what liberal Muslims say? The most authentic Muslims
                | are obviously the ones who behead people while shouting
                | _Allahu akbar_." Etc., etc., etc.
               | 
                | But it's not _silly_ to define "atheist" as "not
                | positively believing in any gods"; that's a property that
               | many people have, it's perfectly reasonable to have a
               | name for it, and the only problem with using "atheist"
               | for it is that most people use that word to mean
               | something a bit different. And the trouble is that there
               | aren't "enough" words; we've got "atheist" and "agnostic"
               | and "deist" and "theist" but there are more gradations of
               | belief than that. (Absolutely certain there are no gods.
               | Of the opinion that on balance there are probably no
               | gods. Of the opinion that whether there are gods is
               | unknowable in principle. Of the opinion that it's
               | knowable in principle but one doesn't in fact have enough
               | evidence. Inclined to think that there's probably
               | something godlike but we can't know anything much about
               | it. Thoroughly convinced that there is something godlike,
               | but that we can't know much about it. Somewhat convinced
               | by arguments for a particular religion but still unsure.
               | Firmly committed to a particular religion and confident
               | that it's right. Etc.)
               | 
               | Personally, I use "atheist" to mean "overall of the
               | opinion that there are no gods, whether or not certain of
               | this", "non-theist" to mean "not positively believing in
               | any gods", "agnostic" to mean "substantially unsure
               | whether there are gods, whether or not one thinks it's
               | knowable in principle", "theist" to mean "overall of the
               | opinion that there is at least one god, whether or not
               | certain of this", and I qualify those terms in whatever
               | ways might be necessary if I want to say that someone's
               | certain there's no god or kinda-halfheartedly-Hindu or
               | whatever. I am somewhat prepared to argue that all those
               | choices are better than the alternatives. But if someone
               | else only uses "atheist" and "theist" for people who feel
               | completely certain about whether there are any gods, or
               | uses "atheist" to mean "not positively believing in
               | gods", or something, that's a defensible choice so long
               | as they take the trouble to be clear about what they mean
               | and refrain from cheating by equivocation.
        
               | mistermann wrote:
                | The distinction, or better, the _phenomenon_ I am trying
                | to get at is that people (including genuinely smart
                | people) commonly mix up ~intent with ability / actual
                | behaviour.
               | 
                | For example, consider someone who says "I am moral,
                | because I am _a Christian_", but then sneaks off and
                | cheats on his wife. Or consider a physics teacher who
                | says "You can learn physics from me, because _I am
                | knowledgeable in physics_", but then starts lecturing
                | and the content is incorrect.
               | 
               | So too with "atheists" who believe that _simply declaring
               | oneself to be_ a certain way is adequate to _achieve_ the
               | intent.
               | 
                | Note that atheists (also Scientific Materialist
                | fundamentalists like Sean Carroll or NDT, etc) are just a
                | particularly common (and hilarious, because of the irony
                | + self-confidence) manifestation of this abstract
                | phenomenon. It is common in any ideology, derived from
                | fundamental flaws in how our culture teaches (or doesn't
                | teach) how to think ("Use logic, evidence, critical
                | thinking, etc."...except no methodology accompanies the
                | motto; people think simply _declaring it_ to be so
                | _makes it_ so).
               | 
               | Or maybe another angle to think of it from: rewind 200
               | years and consider how Western culture was broadly ok
               | with racism and slavery - that's how we currently are
               | with our cultural norms on thinking.
        
               | gjm11 wrote:
               | This seems like a separate thing from what you were
               | talking about before. (Unless you're referring again to
               | the "no positive belief in gods" definition of "atheism",
               | but I don't think you can be since in fact if you
               | sincerely declare that you have no positive belief in any
               | gods then that _does_ pretty much mean that you in fact
               | have no such belief.)
               | 
               | If all you're saying is that some people make a lot of
               | fuss about being rational, informed by evidence, etc.,
               | and then fail to be sufficiently rational, informed by
               | evidence, etc. -- well, yes, people are fallible and
               | think too highly of themselves, and I expect that state
               | of affairs to continue at least until the Glorious
               | Transhuman Future, should any version of that ever
               | arrive, and probably beyond. I don't see any particular
               | reason to think that atheists overestimate themselves
               | more than theists do.
               | 
                | Also, I think more highly of e.g. Sean Carroll's
                | rationality than it seems you do, and I see _absolutely
                | no reason_ to think that he isn't genuinely attempting,
               | with more success than most, to apply logic and critical
               | thinking to evidence. If you claim he's just mouthing
               | those words and thinks that saying them makes him
               | rational, then I would be interested to know on what
               | grounds you think so.
               | 
               | Also also, although I don't find Richard Dawkins super-
               | impressive when he argues against religion[1], if you
               | think he isn't substantially more rational and more
               | evidence-led than most of the people he argues with[2]
               | then I fear you're severely overstating his shortcomings.
               | 
               | [1] His writing on evolutionary biology is generally very
               | good.
               | 
               | [2] Most, not all. I am not claiming that there is no one
               | reasonable on Team Theism.
        
               | mistermann wrote:
               | > since in fact if you sincerely declare that you have no
               | positive belief in any gods then that does pretty much
               | mean that you in fact have no such belief.
               | 
                | If one _declares_ one's mind to operate in a specific
                | way, it operates that way? How does that work? And what
                | should one think about the plentiful evidence available
                | online of Atheists (Scientists, Rationalists, Experts,
                | etc) _demonstrating_ that their perceived/intended
                | cognition isn't how their actual cognition works? _Shall
                | we ignore it_?
               | 
                | I am pointing to a _genuinely_ interesting phenomenon
                | here... it is _always and everywhere, right in front of
                | (behind?) our noses_. Perhaps there's something about it
                | that makes it ~not possible to see it?
               | 
               | > If all you're saying is that some people make a lot of
               | fuss about being rational, informed by evidence, etc.,
               | and then fail to be sufficiently rational, informed by
               | evidence, etc.
               | 
               | It isn't, and the evidence of that is right there above.
               | 
               | > well, yes, people are fallible and think too highly of
               | themselves, and I expect that state of affairs to
               | continue at least until the Glorious Transhuman Future,
               | should any version of that ever arrive, and probably
               | beyond.
               | 
               | This is an interesting _and common_ (a literal philosophy
               | professor succumbed to it under testing not more than a
               | week ago) behavior when the idea of improving upon
               | cultural defaults is suggested: framing it as an
               | uninteresting,  "everyone knows" fact of life, or an
               | absurd strawman, _or both_.
               | 
                | Out of curiosity: do you ever observe patterns of
                | cognitive behavior in (_all!_) Humans? It's really
                | quite interesting, I highly recommend it!
               | 
               | > I don't see any particular reason to think that
               | atheists overestimate themselves more than theists do.
               | 
               | Have you gone looking for it?
               | 
                | Regardless: this is not the point of contention, so let's
                | try to avoid sliding the topic.
               | 
               | > Also, I think more highly of e.g. Sean Carroll's
               | rationality than it seems you do, and I see absolutely no
               | reason to think that he isn't genuinely attempting, with
               | more success than most, to apply logic and critical
               | thinking to evidence.
               | 
                | The point isn't whether he's better than most, the point
                | is that he suffers from the very same problems he mocks
                | others for - it is usually to a lesser degree, _perhaps_,
                | but on an absolute scale, _how bad is he_?
               | 
               | It seems to me that Theists, Conspiracy Theorists, Trump
               | supporters, _all the usual suspects_ are always fair game
                | for criticism, but when the same is done to The Right
                | People, for some reason that's inappropriate, _if not
                | outright disallowed_. And yet: is openness to criticism
               | not often held up as _why_ these superior disciplines are
               | superior?
               | 
               | > If you claim he's just mouthing those words and thinks
               | that saying them makes him rational, then I would be
               | interested to know on what grounds you think so.
               | 
                | More like: "mouthing those words and thinking that _he
                | is_ rational" (in an ~absolute sense, as opposed to
                | _more rational_).
               | 
               | > Also also, although I don't find Richard Dawkins super-
               | impressive when he argues against religion[1], if you
               | think he isn't substantially more rational and more
               | evidence-led than most of the people he argues with[2]
               | then I fear you're severely overstating his shortcomings.
               | 
                | To me, his debates are like the movie Dumb and Dumber:
                | Dawkins is dumb, his opponents are _typically_ dumber.
                | Have you ever seen him go up against someone with some
                | mental horsepower? I haven't, but check out this video
                | with NDT opining on philosophy, in discussion with Kurt
                | Jaimungal (a heavyweight in my books):
               | 
               | "Philosophers Are USELESS!" Neil & Curt Clash on Physics
               | 
               | https://www.youtube.com/watch?v=ye9OkJih3-U
               | 
                | _How embarrassing_. But also useful - Neil (and Rich,
                | and to a lesser degree Sean) are like walking poster boys
                | for the phenomenon I'm discussing.
        
               | tasty_freeze wrote:
               | I agree Dawkins shouldn't be talking about theology in
               | general as he obviously hasn't studied it. He is fine to
               | great in explaining the evolutionary evidence why young
               | earth creationism is wrong, but is out of his depth when
               | discussing the bigger picture. That doesn't make him an
               | idiot.
        
               | mistermann wrote:
               | Ok, what term do you believe should be applied to someone
               | who professionally mocks people's intellectual
               | shortcomings (real or imagined...this gets into another
               | variation of the phenomenon), and in so doing
               | demonstrates that he too suffers from the very same
               | abstract problem, to the cheers of audiences who also
               | suffer from the same problem (many of whom will then
               | recommend his work, spreading this mind virus (both data
               | & methodology) ever further)?
               | 
               | Is the irony of this circle jerk of delusion not a bit
               | thick?
        
             | tanepiper wrote:
             | Well my position comes from some of the positions Dawkins
             | has publicly stated, when he really didn't need to speak up
             | in those circles.
             | 
             | Maybe you find 'idiot' a strong word? The problem with
             | someone like Dawkins is he is 'clever' - someone who
             | doesn't understand that in his position it's better to not
             | wield it like a weapon. This is why I prefer someone like
              | Sean Carroll, who absolutely entertains some bonkers ideas,
              | but never from a position of superiority or dismissal -
              | rather, he challenges them rationally.
        
         | carrozo wrote:
         | What are your thoughts on what Donald Hoffman has been
         | pursuing?
         | 
         | https://en.wikipedia.org/wiki/Donald_D._Hoffman
        
           | brotchie wrote:
           | Compelling.
           | 
           | I really buy his argument about our interior experience being
           | a multimodal user interface (MUI) over stimulus from some
           | information system. We describe the universe in terms of a 4D
           | space-time with forces and particles, but this is really the
           | MUI we've constructed (or evolution has constructed) that
            | maximizes our predictive power when "actuating" our MUI
            | (e.g. interacting with that external system).
           | 
            | I hadn't thought much about this before, and kinda rejected
            | it on first reading of Hoffman's work, but I think I grok it
            | now. Because our internal experience is a MUI, and that MUI
            | (4D space-time, particles) can't be considered a "true
            | reality" - it's just an interface - other conscious entities
            | are more "real" than our MUI. That is, the fundamental true
           | reality that really matters is other conscious agents (e.g.
           | Conscious Realism).
           | 
           | A slightly more wacky theory I like to think about is how
           | this intersects with the simulation argument. If our reality
           | isn't ring 0 (e.g. there's an outer reality that is actually
           | time-stepping our universe), then the conscious interior
           | experience we have in our reality may be due to the
           | properties of reality in the outer universe "leaking through"
           | into our simulation.
           | 
            | This actually aligns well with Hoffman's MUI argument. We
           | live in some information processing system. Through evolution
           | we've constructed a MUI that we see as 4D space time. But
           | this doesn't at all reflect the true reality of our universe
           | being a process simulated in the ring 0 reality. Conscious
           | Realism then arises because ring 0 reality has properties
           | that imbue pattern of information processing with interior
           | experience.
        
         | CuriouslyC wrote:
         | Assertion 1 is quite weak. The stronger version is that
         | consciousness is the mechanism by which the universe processes
         | information, and choice (as we experience it) is the mechanism
         | by which the universe updates its state. Under this assertion,
         | the laws of physics are nothing more than an application of the
         | central limit theorem on the distribution of conscious choices
         | made by all the little bits of the universe involved in the
         | system. This view also implies that space and reality are
         | "virtual" or "imaginary" much like George Berkeley suggested
         | 300 years ago.
        
           | brotchie wrote:
            | I'm starting to buy this argument after rejecting it before
            | (primarily through ignorance of the meaning of
            | "consciousness is the mechanism by which the universe
            | processes information").
           | 
           | Also intersects with Hoffman's argument re: Conscious
           | Realism. The only real thing is conscious experience, and
           | "reality" as used in common parlance is just a multimodal
           | user interface constructed to maximize evolutionary fitness.
        
         | naasking wrote:
         | > The interior experience of somebody who vehemently denies the
         | hard problem must be so much different from my interior
         | experience to the extend that the divide can't be bridged.
         | 
         | Internal experiences are probably a bit different, but it's a
         | mistake to think this is the only reason to deny the hard
         | problem. We all experience perceptual illusions of various
         | types, auditory, visual, etc. In other words, perceptions are
          | useful but deeply flawed. Why do you think your perceptions of
          | subjective, qualitative experience don't have these same
          | issues? I see no factual reason to treat them differently,
         | therefore I simply don't naively trust what my perception of
         | conscious experience suggests might be true, eg. that
         | subjective experience is cohesive, without gaps, ineliminable,
         | ineffable, etc.
         | 
         | Once you accept this fact, the hard problem starts looking a
         | lot like a god of the gaps.
        
           | kkoncevicius wrote:
            | To me it seems your reply is conflating consciousness with
            | perception. Perception is not the consciousness. An auditory
            | illusion, for example, is just signals to the senses and your
            | senses misrepresenting the inputs. The consciousness is the
            | part which is aware of these sensations. Whether they are
            | accurate or not is not the point. The point is that you are
            | aware of them.
        
         | kkoncevicius wrote:
         | I have an alternative, more generous explanation of your 2nd
         | point - the people you talk about haven't done much reflection
         | about consciousness yet and are so immersed in it that they
         | cannot separate their own conscious experience as an entity to
         | talk about. Like fish and water. Just like you said - it was
         | also your position when you were younger. The people you label
         | as having a different interior experience might be in the same
         | position as your younger self.
        
       | tasty_freeze wrote:
        | I've never understood why Chalmers's reasoning is so
        | captivating to people. The whole idea of p-zombies seems absurd
        | on its face.
       | Quoting the article:
       | 
        | (quote)
        | 
        | His core argument against materialism, in its original form, is
        | deceptively (and delightfully) simple:
        | 
        | 1. In our world, there are conscious experiences.
        | 
        | 2. There is a logically possible world physically identical to
        | ours, in which the positive facts about consciousness in our
        | world do not hold.
        | 
        | 3. Therefore, facts about consciousness are further facts about
        | our world, over and above the physical facts.
        | 
        | 4. So, materialism is false.
        | 
        | (endquote)
       | 
        | Point 2 is textbook begging the question: it imagines a world
        | which is physically identical to ours but in which consciousness
        | is different. That bakes in the presupposition that
        | consciousness is not a physical process. Points 3 and 4 then
        | "cleverly" detect the very contradiction he has planted and
        | claim victory.
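        | 
        | For reference, the way this argument is usually formalized
        | (with P the conjunction of all physical truths and Q some
        | phenomenal truth such as "someone is conscious") makes explicit
        | where the disputed step sits:
        | 
        |     (1) $P \land \lnot Q$ is conceivable.
        |     (2) If $P \land \lnot Q$ is conceivable, then $P \land
        |         \lnot Q$ is metaphysically possible.
        |     (3) If $P \land \lnot Q$ is possible, then materialism is
        |         false.
        |     (4) Therefore, materialism is false.
        | 
        | The objection above is, in effect, that (1) or (2) fails: if
        | consciousness just is a physical process, then a world
        | satisfying P already contains it, so $P \land \lnot Q$ is not
        | genuinely conceivable (or its conceivability does not entail
        | its possibility).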
        
         | codeflo wrote:
         | If you believe that what we describe as "consciousness" is
         | emergent from the ideas a material brain develops about itself,
         | then it's in fact not logically possible to have a world that
         | is physically identical to ours yet does not contain
         | consciousness. So indeed, premise 2. sneaks in the conclusion.
         | 
         | To illustrate this point, here's an argument with the same
         | structure that would similarly "prove" that gravity doesn't
         | cause things to fall down:
         | 
         | 1. In our world, there is gravity and things fall down.
         | 
         | 2. There is a logically possible world where there is gravity
         | yet things do not fall down.
         | 
         | 3. Therefore, things falling down is a further fact about our
         | world, over and above gravity.
         | 
         | 4. So, gravity causing things to fall down is false.
        
           | goatlover wrote:
           | But Chalmers doesn't think that approach works, nor any other
           | physicalist attempt to explain consciousness. The problem
           | with what you stated is that you're substituting ideas about
           | consciousness for sensations. And those aren't the same
           | thing. We experience sensations as part of being embodied
           | organisms, and then we think about those sensations.
        
             | codeflo wrote:
             | I'm making an argument about the validity of an argument. A
             | rebuttal to that can never be "but the conclusion is true".
        
             | tsimionescu wrote:
             | It's quite clear if you approach these things logically
             | that Chalmers doesn't do a lot of thinking before coming up
             | with these arguments. All of his arguments boil down to "if
             | we assume that consciousness is different from everything
             | else, then it's different from everything else". He gets
              | way, way too much attention for someone who is sub-mediocre
              | in his reasoning.
             | 
             | He also doesn't understand what computation is, even though
             | he often makes confident statements about it. He thinks
             | computation is a subjective process, that something only
             | counts as a computation if someone interprets it as such,
             | which is simply wrong, not a debatable topic. And this is
             | the core of one of his other arguments about why
             | consciousness can't be a computational process.
        
               | mensetmanusman wrote:
               | There are a number of ways to determine that
               | consciousness is not a computational process. Roger
               | Penrose is a good source on this.
        
               | tsimionescu wrote:
                | There is not. It's by far the most likely
                | explanation, and even if you don't agree with that, it is
               | at least completely consistent with everything we know
               | about computation.
        
           | mistermann wrote:
           | > If you believe that what we describe as "consciousness" is
           | emergent from the ideas a material brain develops about
           | itself, then it's in fact not logically possible to have a
           | world that is physically identical to ours yet does not
           | contain consciousness.
           | 
            | This sneaks in an implicit axiom: that the brain is not only
            | necessary, but is also sufficient, _necessarily_, for
            | consciousness (implicitly ruling out some unknown outside,
            | non-materialistic force(s)).
        
           | empath75 wrote:
           | I don't think your point 2 is directly analogous to his point
           | 2.
           | 
            | Because a world where things do not fall down is not
            | physically identical to a world in which they do.
           | 
           | I think the point of arguing about p-zombies is this. Do you
           | believe it's possible for a human being to exhibit all the
           | external characteristics of consciousness without an internal
           | conscious experience? And if you believe that's true, then
           | you can posit a world which is physically indistinguishable
           | from our world through any experiment in which consciousness
           | simply does not exist, because, as far as I know, there is no
           | test that can prove that an individual does have an internal
           | consciousness, and isn't merely mimicking it. Most arguments
           | that p-zombies don't exist sort of rely on the internal
           | conscious experience of the person making it, which no one
           | else has access to -- "I have an internal conscious
           | experience of the world, other people are similar to me and
           | so they must also have those experiences."
           | 
           | That is _not_ true about a world in which gravity does not
           | exist for obvious reasons. That universe would be very
            | different from ours and easily distinguishable through
            | experiment.
           | 
           | I think his point does hinge on whether it's possible for
           | p-zombies to exist, but it's not as silly as you all are
           | making it out to be, and it is not begging the question.
           | 
            | I actually think his weakest points are parts 3 and 4,
            | because I think mostly the problem is that we don't really
            | have good
           | definitions of consciousness and related concepts let alone a
           | complete physical explanation of their origins, and his whole
           | argument hinges on the fact that we currently don't have a
           | way to test for internal conscious experience, but I think
           | that might not always be true.
        
             | observationist wrote:
             | You experience your own consciousness - your own model of
             | your self, time, and the world as perceived through your
             | physical sensory apparatus. This could give you a
             | probability of 100% certainty of your own consciousness.
             | You're a good skeptic, though, and after much
             | consideration, you decide that, despite absence of evidence
             | to the contrary, you'll allow for 2% uncertainty, since you
             | might be a simulation specifically designed to "feel"
             | conscious, or some other bizarre circumstance.
             | 
             | Knowing this, you compare your own experience with reports
             | by others, and find that, despite some startling variety in
             | social and cultural practice, humans all more or less go
             | through life experiencing the world in a way that more or
             | less maps to your own experiences. You find that even Helen
             | Keller, despite her tragic disability, wrote about
             | experiences which you can simulate for yourself. You
             | conclude that if you somehow swapped places with her, she
             | would be able to map the sensory input of your physical
             | sensors to her own experience of the world, and vice versa.
             | 
             | This leads you to think that our physical brains are
             | performing a process that models the world, and it does so
             | consistently. After reading up on people's experiences, you
              | also learn that our subjective experiences are constructed,
              | moment by moment, by a combination of these world models,
              | real-time stimulation, and abstract feedback loops of
              | conscious and unconscious thinking.
             | 
             | The more you read, the more evidence you have of this
             | strange loop being the default case for every instance of a
             | human having a brain and being alive.
             | 
             | The Bayesian probability that you are conscious, because of
             | your brain, given all available evidence, approaches 100%
             | certainty. You conclude your brain is more or less the same
             | as anyone else's brain, broadly speaking, and this is
             | supported by the evidence provided by a vast majority of
             | accounts from other similarly brained individuals through
             | all of human history.
             | 
             | Since your brain doesn't have a particular difference upon
             | which to pin your experience of consciousness, and the
             | evidence doesn't speak to any need for explanation, Occam's
             | razor leads you to the conclusion that the simplest
             | explanation is also the best. The living human brain is
             | necessary and sufficient for consciousness, and
             | consciousness is the default case for any living human
             | brain.
             | 
             | The posterior probability that any given human (with a
             | "normal" living brain) is conscious approaches 100%
             | certainty, unless you can specifically provide evidence to
             | the contrary. Saying "but what if p-zombies exist" makes
             | for a diverting thought experiment, but it's rationally
             | equivalent to saying "but what if little invisible unicorns
             | are the ones actually experiencing things" or "what if
             | we're all in the Matrix and it's a simulation" or "what if
             | we're just an oddly persistent Boltzmann brain in an
             | energetic nebula somewhere in the universe."
             | 
             | Without evidence, p-zombies are a plot device, not a
             | legitimate rational launching off point for theorizing
             | about anything serious.
             | 
             | Humans are conscious. We have neural correlates, endless
             | recorded evidence, all sorts of second hand reporting which
             | can compare and contrast our own first hand experiences and
             | arrive at rational conclusions. Insisting on some arbitrary
             | threshold of double blind, first hand, objective replicable
             | evidence is not necessary, and even a bit shortsighted and
             | silly, since the thing we are talking about cannot be
             | directly shared or communicated. At some point, we'll be
             | able to map engrams and share conscious experience directly
             | between brains using BCI, and the translation layer between
             | individuals will be studied, and we'll have chains of
             | double blinded, replicable experiments that give us
             | visibility into the algorithms of conscious experience.
             | 
             | Without direct interfaces to the brain and a robust
             | knowledge of neural operation, we're left with tools of
             | abstract reasoning. There's no good reason for p-zombies -
             | they cannot exist, given the evidence, so we'd be better
             | served by thinking about things that are real.
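              | 
              | A toy version of that update in Python (the numbers are
              | invented purely for illustration, not measured from
              | anything): start from an agnostic prior and fold in
              | independent pieces of evidence that are assumed more
              | likely if the other person is conscious than if not.
              | 
              |     prior = 0.5
              |     # (P(evidence | conscious), P(evidence | not conscious))
              |     evidence = [(0.95, 0.20),   # coherent self-reports
              |                 (0.90, 0.30),   # similar neural correlates
              |                 (0.99, 0.50)]   # behavioural similarity
              | 
              |     posterior = prior
              |     for p_c, p_not_c in evidence:
              |         num = p_c * posterior
              |         posterior = num / (num + p_not_c * (1 - posterior))
              | 
              |     # climbs toward 1.0 as consistent evidence stacks up
              |     print(posterior)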
        
               | pixl97 wrote:
               | >since you might be a simulation specifically designed to
               | "feel" conscious
               | 
               | I would argue this is actually consciousness also. If
               | (and yea, it's a big if) consciousness is an internal
               | model/simulation of how we experience reality, then a
               | simulation of a simulation is still a simulation.
        
               | observationist wrote:
               | I agree - once you've settled your math on consciousness,
               | you can go back and modify the priors based on new
               | evidence. One of the crazier suppositions that actually
               | makes a dent in the posterior probability is the
               | simulation hypothesis.
               | 
               | If all civilizations that develop computation and
               | simulation capabilities converge to the development of
               | high fidelity simulations, then it's highly likely that
               | they would create simulations of interesting periods of
               | history, such as the period of time when computers, the
               | internet, AI, and other technologies were developed. We
               | just so happen to be living through that - I still put my
               | odds of living in base reality somewhere above 98%, but
               | there is a distinct possibility that we're all being
               | simulated so that this period of history can be iterated
               | and perturbed and studied, or some such scenario.
               | 
               | Maybe someone ought to start studying the science of
               | universal adversarial simulation attacks, to elicit some
               | glitches in the matrix. That'd be one hell of a paper.
        
               | pixl97 wrote:
               | >That'd be one hell of a paper
               | 
                | Until 'they' restore the checkpoint and arrange for your
                | team's plane to fall out of the sky.
        
             | pixl97 wrote:
             | Is a video game a conscious experience for a computer?
             | 
             | Now imagine an internal video game in a computer system
             | that is being generated from the real world inputs of what
             | it sees/hears/feels around it. You take outside input,
             | simulate it, record some information on it, and output
             | feedback into the real world.
             | 
             | Many people would say this isn't consciousness, but I
             | personally disagree. You have input, processing,
             | introspection, and output. The loops that occur in the
              | human brain are more complex, but you have the same things
              | occurring. There are electrical processes and chemical
              | reactions occurring in the human mind. Just because we
              | don't understand their exact workings doesn't mean they
              | are unrelated to consciousness. Moreover, we can turn this
              | consciousness off with drugs and stop said electrical
              | processing.
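              | 
              | A bare-bones Python sketch of the loop being described
              | (all names are invented, and nothing here is claimed to be
              | consciousness - it just shows input, processing,
              | introspection, and output feeding back into the world):
              | 
              |     world = {"light": 7}
              |     model, log = {"light": None}, []
              |     for _ in range(3):
              |         obs = world["light"]        # input from the world
              |         model["light"] = obs        # processing: update model
              |         log.append(dict(model))     # introspection: record state
              |         # output: act on the world based on the model
              |         world["light"] += -1 if model["light"] > 5 else 1
              |     print(log)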
        
             | lumb63 wrote:
             | To elaborate on your statement, we all think in very
             | different ways. Recently there was an academic test posted
             | here that evaluated "how" a person thinks (internal
             | monologue, use of images, how memories are recalled, etc.).
             | After my girlfriend and I both took the test and I saw how
             | differently we both think, I was shocked. Had we not taken
             | this quiz I'd have assumed the inside of her mind
             | fundamentally worked the same way as mine does. But that is
             | seemingly very far from the truth.
             | 
              | While I can visualize, use internal monologue, vividly
              | recall my memories, etc. at will, by default I do none of
              | the above, and my thoughts are opaque to me. For her, she
             | almost exclusively uses her internal monologue when
             | thinking, and her entire thought process is consciously
             | visible to her. It's entirely conceivable that other people
             | might not have an experience of "consciousness" resembling
             | anything like what my idea of consciousness is.
        
               | marcellus23 wrote:
               | That sounds like a really interesting test, do you have a
               | link?
        
             | nonce42 wrote:
             | Midazolam/Versed sedation seems pretty close to a p-zombie.
             | You can have someone who seems completely awake, walking
             | around and interacting normally, but if you ask them later
             | they were completely unconscious from their own
             | perspective. So self-reported consciousness isn't always
             | accurate. And it also seems that consciousness is very
             | closely tied to memory.
             | 
             | (I'm not arguing a particular position, but trying to
             | figure out what to make of this. Also, this is based on
             | what I've read, not personal experience.)
        
               | david-gpu wrote:
               | _> You can have someone who seems completely awake,
               | walking around and interacting normally, but if you ask
               | them later they were completely unconscious from their
               | own perspective_
               | 
               | Were they unconscious, or are they now unable to remember
               | what they did? I.e. amnesiac.
        
             | RaftPeople wrote:
             | > _Do you believe it 's possible for a human being to
             | exhibit all the external characteristics of consciousness
             | without an internal conscious experience?_
             | 
             | Nobody knows whether conscious experience is a requirement
             | or not to "exhibit all of the external characteristics".
             | 
             | It's possible that the only way to get from state N to
             | state N+1 is to include the consciousness function as part
             | of that calculation.
             | 
             | A counter to this would be that a lookup table of states
             | would produce the same external characteristics without
             | consciousness.
             | 
             | A counter to that counter would be that the consciousness
             | function is required to produce state N+1 from state N. The
             | creation of the lookup table must have invoked the
             | consciousness function to arrive at and store state N+1.
             | 
             | The thing we just don't really know is whether state N+1
             | can be derived from state N without the consciousness
             | function being invoked.
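              | 
              | A tiny Python sketch of the lookup-table point (the
              | transition function here is an arbitrary stand-in, not a
              | claim about brains): the table reproduces the function's
              | outputs exactly, but building it required invoking that
              | function for every state.
              | 
              |     def transition(state):  # stand-in "consciousness function"
              |         return (state * 31 + 7) % 101
              | 
              |     lookup = {s: transition(s) for s in range(101)}
              | 
              |     # From the outside the two are indistinguishable, but
              |     # constructing the table called the function for every state.
              |     assert all(lookup[s] == transition(s) for s in range(101))
              |     print(lookup[42], transition(42))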
        
           | mensetmanusman wrote:
            | Your Step 2 is so off as an analogy that it's possible you
            | don't understand Chalmers's point.
        
             | chr1 wrote:
              | What is Chalmers saying then? As I understand it, he is
              | saying that there can be a world where consciousness does
              | not exist, but all possible physical experiments cannot
              | distinguish between that world and our world. But that
              | simply means the consciousness he is looking for has
              | absolutely no consequence, and therefore his point has no
              | value...
        
         | patrickmay wrote:
         | Well and succinctly put. One would have to be a philosopher to
         | be willing to consider p-zombies further.
        
           | RaftPeople wrote:
            | Some of the arguments I've read by philosophers seem like
            | they are focusing on pure logic to prove a point or find
            | weakness in another, but the linkage between those logical
            | statements and reality doesn't always seem to be considered.
            | 
            | It's almost like they are purely focused on symbolic logic,
            | even if the statements and symbols don't truly map to
            | reality unambiguously, or without contradictions.
        
           | empath75 wrote:
            | I actually think it's worth asking whether these advanced
            | AIs are a kind of p-zombie.
        
             | Filligree wrote:
             | P-zombies are supposed to be physically identical to
             | humans. They're problematic because they lack
             | consciousness, yet mysteriously talk about it anyway --
             | among other reasons.
             | 
             | Advanced AI is definitely not physically identical to
             | humans, and there's a well understood reason why they might
             | talk about consciousness despite lacking it which doesn't
             | apply to p-zombies.
        
         | amelius wrote:
         | > it imagines a world which is physically identical to ours but
         | consciousness is different there
         | 
         | So a world where people discuss consciousness but where it does
         | not exist? That sounds very implausible.
        
         | sornen wrote:
          | Chalmers in point 2 is not saying to imagine such a world, but
          | that such a world is logically possible. Chalmers gives as an
          | example of a logical impossibility a male vixen, since it is
          | contradictory. He states "... a flying telephone is
          | conceptually coherent, if a little out of the ordinary, so a
          | flying telephone is logically possible." Nevertheless, the
          | claim that zombies are logically possible may be begging the
          | question that consciousness is non-physical.
        
           | tasty_freeze wrote:
            | I'm still missing the point, I guess, as I don't think the
            | proposed world is logically possible.
           | 
           | One might as well say: it is logically possible to have a
           | universe where the physics are identical to our present
           | world, except the core of the Sun is chocolate... therefore
           | fusion can't be the explanation for why our Sun radiates so
           | much energy.
           | 
           | Getting back to the zombies, presuming there could be a
           | zombie clone of me which is indistinguishable from the real
           | me but it isn't conscious is one that needs far more support
           | than just asserting it. I've heard people try to explain:
           | well, imagine if a powerful computer was simulating you in
           | every respect, that would be a p-zombie. But that is question
           | begging, as it presumes that such a creature wouldn't be
           | conscious.
           | 
           | I feel the same way about Searle's Chinese Room -- the power
           | of the argument is there only if you have already decided
           | that consciousness is mystical.
        
           | tsimionescu wrote:
           | But it's _not_ logically possible if consciousness is a
           | material process, a consequence of computation in the human
           | brain (and potentially other places). So you can 't prove
           | that consciousness is not materialistic by assuming it's not
           | materialistic.
        
           | idiotsecant wrote:
           | I have known some pretty vixen-y males...
        
         | goatlover wrote:
         | No, you have to take that argument in context of his other
         | arguments against physical explanations for consciousness. What
         | he's saying is that the physical facts do not adequately
          | account for conscious experiences, which is why we can conceive
          | of a physically identical universe that lacks consciousness. Why
         | he (and some other philosophers) think this is so is part of
         | their larger arguments.
        
           | Ukv wrote:
           | > which is why we can conceive of a universe physically
           | identical lacking consciousness
           | 
           | I can conceive of it about as well as I can conceive of an
           | identical universe lacking the Internet. Both seem like
           | fairly direct logical contradictions, slightly disguised by
           | referring to concepts that we generally think about more
           | abstractly rather than their physical composition.
        
           | tsimionescu wrote:
           | If you examine all of his arguments, they all end up in the
           | same place. He assumes that consciousness is not necessary
           | for human-like or even animal-like intelligence, and he works
           | from there back to the same conclusion.
        
         | hackinthebochs wrote:
         | > Point 2 is textbook begging the question: it imagines a world
         | which is physically identical to ours but consciousness is
         | different there.
         | 
         | It's not begging the question. He gives reasons for (2) that
         | support the premise. That the premise essentially leads
         | inexorably to (4) is a feature of the argument structure, not a
         | bug. You have to engage with his reasons for (2) in determining
         | whether or not the argument succeeds.
        
         | root_axis wrote:
         | Why is phenomenological subjective experience a thing at all?
         | Unless you're a proponent of panpsychism, we have to ask why
         | living beings have it, but other natural processes do not. From
         | this perspective, it's actually easier to conceive of a world
         | like ours _without_ subjective experience than one with it.
        
           | dragonwriter wrote:
           | > Unless you're a proponent of panpsychism, we have to ask
           | why living beings have it, but other natural processes do
           | not.
           | 
           | No, we don't.
           | 
           | Because, while an individual may experience it, the
           | conjecture that other things do or do not have it is without
           | exception an unverifiable assumption, not an observed
           | phenomenon which calls for an explanation.
           | 
           | > it's actually easier to conceive of a world like ours
           | without subjective experience than one with it.
           | 
           | This is demonstrably false, because no experiencer can
           | observe subjective experience other than their own, and,
           | despite that, describing the world in terms of subjective
           | experience occurring in a vast array of other beings beside
           | the speaker is near-universal, to the point that we
           | pathologize _not_ viewing the universe that way.
        
             | root_axis wrote:
             | > _the conjecture that other things do or do not have it is
             | without exception an unverifiable assumption_
             | 
             | Yes, but ultimately, either they do or they don't, and the
             | default assumption is that they don't, unless you favor
             | panpsychism.
        
           | jstanley wrote:
           | The real problem with the p-zombies thing is _why do
           | p-zombies talk about consciousness?_.
           | 
           | You and I have the experience of _talking about being
           | conscious_ because we have the experience of _being conscious
           | in the first place_. But for a p-zombie to behave exactly the
           | same as us, it would have to have some other mechanism for
           | talking about being conscious that is not dependent on being
           | conscious. So that would mean that _our_ talking about
           | consciousness can be explained by some mechanism other than
           | our being conscious in the first place! Our experience of
           | consciousness, and our _talking about_ experiencing
           | consciousness, could just happen to be one massive
            | coincidence rather than being causally linked! Doesn't seem
           | likely, ergo no p-zombies.
           | 
           | https://www.lesswrong.com/posts/kYAuNJX2ecH2uFqZ9/the-
           | genera...
        
           | naasking wrote:
           | > From this perspective, it's actually easier to conceive of
           | a world like ours without subjective experience than one with
           | it.
           | 
           | Welcome to eliminative materialism!
        
           | pixl97 wrote:
           | >we have to ask why living beings have it, but other natural
           | processes do not.
           | 
           | The entropy gradient is the first one. Of course lots of
           | natural processes have a bit of energy generated so it's not
           | the only part.
           | 
           | Memory is the next one. Being able to take some of that
           | entropy and put it in a more orderly fashion while rejecting
           | the disorganized part. Crystals can use energy to create
           | ordered systems, but they don't have any means of error
           | correcting.
           | 
           | Execution/replay/updating of said memories based on input
           | from the world around it.
           | 
           | Now before humans came along 'life' was the only thing that
           | could readily manage these traits, and that was after a few
           | billion years of trial and error. Humans now have been going
           | down the path of giving other non-living objects these
           | traits. Advanced computer systems are getting closer to the
           | point of recording information and acting on it in a manner
           | that certainly seems like a subjective experience.
        
             | mensetmanusman wrote:
             | Entropy gradients are non explanations.
             | 
              | Pointing to statistical mechanics and saying the derivative
              | of an exponential is an exponential is an odd way to explain
              | qualia.
        
               | pixl97 wrote:
               | They are not an explanation, they are a requirement. You
               | don't have consciousness in a rock because nothing is
               | happening in a rock.
        
         | nabla9 wrote:
         | > 4. So, materialism is false.
         | 
         | Physicalism is conditionally false.
         | 
         | If p-zombies are logically incoherent, then consciousness does
         | not exist. It's an illusion. This is the argument made by Daniel
         | Dennett. We are zombies.
         | 
         | I mean, it's obvious to any physicalist that we don't really
         | feel anything. There is no I, no soul, no suffering in a
         | rock, pea soup, or a human. It's all physical process.
        
         | theptip wrote:
         | A simple counter-argument to p-zombies that I like (I first
         | encountered it from Yudkowsky) is:
         | 
         | If there was no conscious experience in this identical p-zombie
         | world, it would be impossible to explain why everybody falsely
         | claims they have conscious experience. If people stop claiming
         | this, then the world is physically different, as people no
         | longer act and produce artifacts such as HN posts discussing
         | the phenomenon.
         | 
         | Or, my summary would be: conscious experience is causal, and so
         | you cannot get the same universe-wide effects without it.
        
           | naasking wrote:
           | Not impossible actually. People often claim falsehoods are
           | real, eg. demons, ghosts, deities, etc. If you believe it's
           | possible that p-zombies could generate false beliefs about
           | these other things, then consciousness would just be another
           | falsehood.
        
             | theptip wrote:
             | You would need some coordinating mechanism to ensure that
             | all the p-zombies have the same hallucination. Including
             | some detailed state machine that takes sense inputs, and
             | processes it to produce the qualia present in each waking
             | moment, plus the valence attributed to each moment of
             | qualia.
             | 
             | Since a non-p-zombie can sit down and interrogate the
             | details of their conscious experience, then write a book
             | about it, which others read and agree upon the fine details
             | - I find it hard to come up with a p-zombie substitute that
             | wouldn't just be consciousness by another name.
        
               | naasking wrote:
               | > You would need some coordinating mechanism to ensure
               | that all the p-zombies have the same hallucination
               | 
               | You mean like talking and writing? As I said, how would
               | p-zombies invent organized religion or other common
               | spiritual beliefs that are false?
               | 
               | > Since a non-p-zombie can sit down and interrogate the
               | details of their conscious experience, then write a book
               | about it, which others read and agree upon the fine
               | details
               | 
               | First, assuming you think that we're not p-zombies, not
               | everybody agrees on the properties of consciousness even
               | though we presumably have such an introspective ability.
               | Therefore what you describe is a kind of fictional just-
               | so story, and clearly doesn't actually happen in all
               | cases.
               | 
               | Second, when there is no fact of the matter as with
               | p-zombies and consciousness, any argument about
               | consciousness only has to be rhetorically persuasive, not
               | factual. Why do so many people agree on the broad
               | properties of ghosts, eg. translucency, pass through
               | physical objects, etc. despite such things not existing?
               | There is no fact of the matter being described, so people
               | just need to like the story being told, or perhaps how
               | it's told.
               | 
               | In such a world, Chalmers and other philosophers of
               | consciousness are just persuasive writers that spin a
               | convincing just-so story connecting fictional internal
               | states to real world observations, and people/p-zombies
               | run with it thinking they learned something meaningful.
               | 
               | Suffice it to say, I think this is uncomfortably close to
               | our world.
        
         | naasking wrote:
         | Chalmers' argument is more of an intuition pump to clarify your
         | thinking. If premise 2 seems plausible, then you probably
         | cannot be a physicalist. If you're a die-hard physicalist, then
         | you probably have to deny premise 1 and/or 2.
        
         | FrustratedMonky wrote:
         | "logically possible world physically identical"
         | 
         | "Point 2 is textbook begging the question"
         | 
         | -> It is a thought experiment.
         | 
         | He is proposing a possible postulate to spur talking about the
         | ideas. It isn't a 'proof'.
         | 
         | Just like we don't argue with Schrodinger about the absurdity
         | of a half dead cat.
        
         | sireat wrote:
         | The thing is we are acting like 99.9% p-zombies for most of our
         | interactions with the world - that is, we are acting
         | unconsciously for most of what we do.
         | 
         | The question is where does that subjective 0.1% - the rider on
         | top of the elephant - come from?
         | 
         | We generally do not pay attention to walking, breathing or
         | brushing teeth and so on.
         | 
         | We can do more complex tasks as well once we achieve
         | unconscious mastery in some subject.
         | 
         | With proper training we can play chess or tennis ("The Inner
         | Game of Tennis") at a high level, without paying attention. In
         | fact it can be detrimental to think about one's performance.
         | 
         | It is Dennett's "multiple drafts model" - but where does
         | the subjective experience arise when some model is brought to
         | the foreground "thread"?
         | 
         | Thus the allure of Chalmers' zombies. Why not have a 100% zombie?
         | 
         | There are many stories of people seemingly being conscious, yet
         | not really.
         | 
         | Black out drunks driving home from a bar.
         | 
         | Hypoglycemic shock is another example - my wife's diabetic
         | friend was responding seemingly logically and claiming to be
         | conscious, yet she was not and paramedics were barely able to
         | save her.
         | 
         | A human being can achieve very high levels of unconscious
         | mastery in multiple subjects.
         | 
         | A very tired me gave a remote lecture (on intermediate Python)
         | during Covid where I switched spoken languages mid-lecture and
         | even answered questions in a different language. Meanwhile I
         | was half asleep and thinking about a different subject
         | matter. I was not really aware of the lecture material - I had
         | given the same lecture many times before - I was on autopilot.
         | 
         | Only after watching the Zoom recording did I realize what had
         | happened.
         | 
         | Thus, are there some actions that our zombie (unconscious)
         | states are unable to produce?
         | 
         | Presumably, subjective experience helps in planning future
         | actions. That could be one avenue to explore.
        
           | kkoncevicius wrote:
           | Just wanted to say how much I like your answer. To me one of
           | the bigger puzzles is why people have such different takes on
           | consciousness. To some the idea of p-zombies is immediately
           | clear. To others it is nonsense. But during my conversations
           | about the topic I was never able to explain it adequately to
           | someone sceptical. And from my perspective they (the
           | sceptics) always conflate being conscious with talking,
           | learning, thinking, remembering, etc.
        
           | causal wrote:
           | Good points. Part of my issue with these discussions is the
           | poor vocabulary we have for consciousness. Your comment alone
           | probably touches on several types of consciousness. The kind
           | of consciousness we're experiencing undoubtedly varies
           | greatly by time and circumstance.
        
         | mensetmanusman wrote:
         | You can't skip over step 1, this is an axiom. I see more
         | materialists denying they have consciousness and free will.
         | Kind of funny actually.
        
         | andoando wrote:
         | P-zombies is a thought experiment to demonstrate the hard
         | problem of consciousness, it is not, in itself, an argument
         | against materialism.
         | 
         | I can certainly imagine a robot that imitates all human
         | behaviors. If you hit it, it goes "Ow" and retracts; if you ask
         | it if it's conscious it says "yes". We can imagine this being
         | created out of completely physical electrical components,
         | so the question becomes what is the difference between the
         | imitation of consciousness and the consciousness we experience?
         | This is interesting as in this day and age, we can totally
         | imagine building such a robot, yet we'd have a tough time
         | believing it is actually conscious.
         | 
         | Now, whether you are a materialist or not depends entirely on
         | whether or not you believe conscious experience like yours can
         | emerge out of physical components.
         | 
         | My take on this is: Materialism/physicalism is ill defined and
         | materialism/physicalism and dualism are compatible. We consider
         | completely mysterious properties like energy and now even
         | randomness (things happen one way or another for no reason) as
         | being fundamental physical facts about the universe.
         | Theoretically, how does this differ from saying consciousness
         | is a fundamental physical property?
         | 
         | Moreover, you have to consider the fact that the "material"
         | world IS an imagination of the mind. Whatever facts or
         | attributes we assign to being material is limited to the mental
         | facilities of the brain.
         | 
         | At the end of the day, the question is, what fundamental facts
         | do we need to explain the observations that we make? I can
         | observe that I feel, see, hear things. Can the fundamental
         | facts of the current physics model explain this? No? Then we
         | must add to it an additional property, making it part of the
         | standard physics model. If you want proof, you must either A.
         | explain how consciousness can emerge out of existing known
         | "material" processes, or B. admit consciousness as a "material"
         | property and define the process by which it combines, reduces,
         | etc. (the combination problem of panpsychism).
        
       | codeflo wrote:
       | As a thought experiment, imagine we were to scan the position of
       | every molecule in the human body to the Heisenberg limit of
       | accuracy. Imagine we were to plug the resulting model into a
       | physics simulation that models every biochemical and physical
       | interaction with perfect fidelity. This is a thought experiment,
       | and what I suggest isn't ruled out by physics, so let's assume
       | it's technologically possible. Would the simulated being be
       | "conscious" in the same way the original human is? Would it
       | experience "qualia"?
       | 
       | If you think the answer might be no, then congratulations, you
       | actually believe in immaterial souls, no matter how materialist
       | or rationalist you otherwise claim to be.
        
         | mistermann wrote:
         | Not necessarily, one could be a Pedant.
        
         | vundercind wrote:
         | Only holds if whatever hardware that's pretending to be the
         | matter can act exactly like the matter without _being_ the same
         | thing.
         | 
         | For the distinction, consider the difference between a
         | simulation of a simple chemical process in a computer--even a
         | perfectly accurate one!--and the actual thing happening. Is the
         | thing going on in the computer the same? No, no matter how
         | perfect the simulation. It's a little electricity moving
         | around, looking nothing whatsoever like the real thing. The
         | simulation is _meaning_ that we impose on that electricity
         | moving around.
         | 
         | That being the case, this reduces to "if we recreate the matter
         | and state exactly, for-real, is that consciousness?" in which
         | case yeah, sure, probably so.
         | 
         | This doesn't work if the _thing_ running the simulation
         | requires interpretation.
        
           | s1artibartfast wrote:
            | Exactly. The parent post does not address the issue of
           | _representation_ vs reality.
           | 
           | You can simulate every molecular interaction in a fire, but
           | that does not mean the simulation gives off the same heat.
           | You can write a perfectly accurate equation for splitting an
           | atom, but the equation does not release energy.
        
             | codeflo wrote:
             | > You can simulate every molecular interaction in a fire,
             | but that does not mean the simulation gives off the same
             | heat
             | 
             | It would to a simulated being standing next to the fire.
             | 
             | > You can write a perfectly accurate equation for splitting
             | an atom, but the equation does not release energy.
             | 
             | It releases simulated energy inside the simulation.
             | 
             | Every material interaction is simulated. If you believe
             | that consciousness can't exist in the simulation, then you
             | believe that consciousness is not a material interaction,
             | q.e.d.
        
               | vundercind wrote:
               | It's some electrons moving around. Any further meaning of
               | that is only what we assign to it.
               | 
               | Unless your "computer" is identical matter in the same
                | arrangement and state as the original, actually doing
               | stuff.
               | 
               | This is why the "what if you slowly simulated the entire
               | universe on an infinite beach moving rocks around to
               | represent the state? Could anything in it be conscious?"
                | thing isn't very interesting to me.
               | 
               | No, you're just shoving rocks around on sand. They don't
               | mean anything except what you decide they do. Easy
               | answer.
        
               | MrScruff wrote:
               | Doesn't materialism imply that a perfectly accurate
               | simulation of the universe would be identical to the
               | universe we live in? If not, in what possible way could
               | the two be distinguished?
        
               | s1artibartfast wrote:
               | >Every material interaction is simulated. If you believe
               | that consciousness can't exist in the simulation, then
               | you believe that consciousness is not a material
               | interaction
               | 
               | I think that is missing the point. You are literally
               | changing the material and medium by conducting a
               | simulation.
               | 
               | Releasing simulated energy within a simulation is not
               | identical to releasing real energy in the real world. The
               | former is purely representational, and even a perfectly
               | simulated object retains this property, and lack of
               | equivalence.
               | 
                | A simulated atom is not a real atom, no matter their
                | similarity.
        
             | hackinthebochs wrote:
             | When we refer to heat we refer to the increase in entropy
              | that has specific effects on things in our world. We can
             | also describe a generic "disordered" state, which doesn't
             | imply a similar kind of causal compatibility. A simulation
             | of entropy is equally disordered with no caveats despite
             | not being exactly entropy due to the causal compatibility
             | issue. Why think consciousness is like entropy rather than
             | like disorder?
             | 
             | In other words, why think consciousness is a physical
             | property of some specific kind of matter rather than an
             | abstract property that can supervene on any sufficiently
             | robust physical substrate?
        
               | s1artibartfast wrote:
               | >why think consciousness is a physical property of some
               | specific kind of matter rather than an abstract property
               | that can supervene on any sufficiently robust physical
               | substrate?
               | 
                | I'm open to the idea that consciousness could arise on
                | different substrates, but hold that the substrate is
                | relevant, and we don't have a working definition of
               | "robustness".
               | 
               | This is part of a bigger challenge in which I claim there
               | is no such thing as a perfect simulation.
               | 
                | Simulation requires representation, which requires that
                | some differences remain unrepresented. Otherwise
                | you have just created the same thing, not a
                | simulation.
               | 
                | You can simulate the interaction of a physical object
                | with other physical objects using an electronic object
                | interacting with other electronic objects, but the two
                | are not the same. The electronic object still cannot
                | interact with a physical object.
        
               | hackinthebochs wrote:
                | >Simulation requires representation, which requires that
                | some differences remain unrepresented
               | 
               | Right. But presumably not all physical properties of
               | brains are necessary for consciousness. For example, the
               | fact that action potentials happen on the order of 1ms to
               | 1 second probably isn't intrinsic to consciousness. That
               | is, there is no in principle problem with having
               | consciousness supervene on neural networks that have much
               | longer cycles for the basic substrate of communication.
               | It also probably doesn't matter that neurons communicate
               | through opening and closing ion channels, are made of
               | carbon atoms, are constructed by protein assemblies
               | produced from DNA, etc.
               | 
               | What we need is to represent every necessary feature and
               | relationship involved in manifesting consciousness. As a
               | first pass estimation, it seems very likely that only the
               | information-bearing structures are necessary for
               | consciousness. But the information-bearing structures are
               | substrate independent. It follows that a perfect
               | simulation of these information-bearing structures
               | engaged in their typical dynamics would be conscious.
               | 
               | >You can simulate the interaction [...] but the two are
               | not the same.
               | 
               | This idea that something needs to be "the same" is doing
               | a lot of work in your argument. But most relationships
               | involved in brains are incidental to consciousness. We
               | need to get clear on what the necessary features of
               | consciousness are. This is the only measure of sameness
               | that matters, not superficial resemblance.
        
           | amelius wrote:
           | Ok, next experiment.
           | 
           | Imagine you took a brain and replaced one neuron by a
           | transistor (or gate) that performs the exact same function as
           | the neuron.
           | 
           | Now replace more and more neurons until all neurons are now
           | transistors.
           | 
           | Would the resulting being be conscious and experience qualia,
           | like the original did? If not, at what point was there a
           | notable change?
           | 
           | https://en.wikipedia.org/wiki/Ship_of_Theseus
        
             | vundercind wrote:
              | The same function, down to quantum, gravity, etc. effects
             | on everything around it, and accepting and reacting to
             | same? Yeah probably, but we're back to having to "run" this
             | on the same arrangement of actual matter as the original.
             | 
             | [edit] there's an obvious attack on this, but I'll go ahead
             | and note my position on it: the whole premise that we can
             | do any of this without just _using actual matter the
             | ordinary way_ is so far into magical territory that we
             | might as well ask "what if lephuchans simulated it?" or
             | "what if god imagined the simulation?"--well ok, sure, I
             | guess if magic is involved that could work, but what's the
             | point of even considering it?
             | 
             | "What if a miracle occurred?" isn't a rebuttal to the
             | position that consciousness as we know it likely can't be
             | simulated by simulating physics, because you can rebut
             | anything with it. Its admission to a discussion is the same
             | as giving up on figuring out anything.
        
               | mistermann wrote:
               | > there's an obvious attack on this, but I'll go ahead
               | and note my position on it: the whole premise that we can
               | do any of this without just using actual matter the
               | ordinary way is so far into magical territory that we
               | might as well ask "what if lephuchans simulated it?" or
               | "what if god imagined the simulation?"--well ok, sure, I
               | guess if magic is involved that could work, but what's
               | the point of even considering it?
               | 
               | That's one of the main points of using a thought
               | experiment: by declaring axioms explicitly, by fiat
               | ("true" _by definition_ ), it prevents the mind from
               | taking advantage of _thought terminating_ get out of jail
               | free cards like this, it forces people to argue their
               | point.
        
           | swid wrote:
           | Are you questioning if physics is computable? Even if it is
           | not fully, we must be able to approximate it quite well.
           | 
           | Suppose we scan more than just the person, but a local
            | environment around them, and simulate the whole box.
           | 
           | The update that occurs as the person sits in the room
           | involves them considering their own existence. Maybe they
           | create art about it. If the simulation is to produce accurate
           | results, they will need to feel alive to themselves.
           | 
           | We agree we can simulate an explosion and get accurate
           | results; if we can't get an accurate simulation of a person,
           | why?
        
           | hackinthebochs wrote:
           | >It's a little electricity moving around, looking nothing
           | whatsoever like the real thing.
           | 
           | Why does "looking ... like the real thing" have any relevance
           | for consciousness? What property of a conscious substance is
           | captured by this "looking like" criteria? Is consciousness
           | (partially) a feature of the substrate itself or how the
           | substrate moves through some background consciousness aether,
           | or something else? If you can't articulate what this special
           | criteria is, then why think a simulation isn't conscious,
           | which by assumption reproduces all information dynamics of
           | the physical phenomena?
        
         | BobbyJo wrote:
         | > If you think the answer might be no, then congratulations,
         | you actually believe in immaterial souls
         | 
         | If you scan a body of water, and simulate it perfectly, the
         | resulting simulation will not be wet. You can't separate a
         | material process from the material _completely_. Consciousness
         | may be a result of carbon being a substrate in the
         | interactions. It might be because the brain is wet when those
         | processes happen. There is plenty of room between believing a
         | perfect computational simulation is not conscious and believing
         | in immaterial souls.
        
           | codeflo wrote:
           | > If you scan a body of water, and simulate it perfectly, the
           | resulting simulation will not be wet.
           | 
           | It will be wet to the simulated being that's swimming in it.
           | 
           | > Consciousness may be a result of carbon being a substrate
           | in the interactions.
           | 
           | Are you conscious? If so, how did you find out that you're
           | made from actual carbon atoms and not simulated ones?
        
             | BobbyJo wrote:
             | > It will be wet to the simulated being that's swimming in
             | it.
             | 
              | Which has entirely different qualia to us, the beings
              | whose consciousness we are trying to unravel.
             | 
             | > Are you conscious?
             | 
             | That's the big question.
             | 
             | > If so, how did you find out that you're made from actual
             | carbon atoms and not simulated ones?
             | 
             | If I assume I'm conscious, whatever my atoms are, they are
             | the atoms of concern with regard to said consciousness.
        
         | amelius wrote:
         | What if you forked the simulator? Would there be two
         | consciousnesses experiencing qualia?
        
         | igleria wrote:
         | I feel like I've seen this exact post this week on the same
         | topic on this website. Am I going mental, or is hackernews
         | merging duplicate posts?
        
         | strogonoff wrote:
         | The "total scan" argument, when presented to further a
         | physicalist stance ("surely if we were to scan you using some
         | fantasy tech X we would get an exact copy of you, including any
         | consciousness if it exists, and to deny that is to believe in
         | ghosts"), is unconvincing on at least two counts: 1) fantasy
         | tech illustrating axiomatic belief in particular physical
         | models that are in vogue today but will not hold in the long
         | run (Heisenberg limit? who is she?); and 2) believing that the
         | only alternative to monistic materialism is body-soul dualism,
         | a philosophical naivete that is depressingly common among STEM
         | folk.
         | 
         | The most obvious objection is that perceived time-space is a
         | map of some territory fundamentally inaccessible to us, that
         | modern physical models on which the argument depends are likely
         | only covering (imperfectly) a minuscule part of that territory,
         | and that the map may never be fully precise and complete
         | regardless of technology (since a map that is fully precise and
         | complete is _the_ territory).
        
         | QuadmasterXLII wrote:
         | Instead of one person, simulate the earth starting several
         | million years before the dawn of man. If you think the humans
         | that evolve in some simulation runs are not conscious, you
         | believe in souls. If in the fake world you predict that in some
         | large fraction of simulations of the unfeeling simulacra will
         | nonetheless invent and argue about the concept of qualia, you
         | believe in immaterial souls. If you suspect that because they
         | don't have souls they won't write books about qualia you
         | believe in material souls, souls that both affect and are
         | affected by protons and electrons, souls that physics will
         | eventually find.
        
         | basil-rash wrote:
         | Scott Aaronson's take on this is certainly worth a read for
         | anyone interested in consciousness, quantum mechanics,
         | computability theory, etc: https://arxiv.org/abs/1306.0159
        
         | root_axis wrote:
         | > _This is a thought experiment, and what I suggest isn't
         | ruled out by physics_
         | 
         | It is actually ruled out by the uncertainty principle. A
         | simulation with perfect fidelity is not a simulation, it's the
         | thing itself.
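         | 
         | (For reference, the relation at issue is the uncertainty
         | inequality Δx · Δp ≥ ħ/2: position and momentum cannot both be
         | pinned down exactly, so a scan of "perfect fidelity" is ruled
         | out in principle, not merely in practice.)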
        
         | birktj wrote:
         | Not necessarily: this assumes that it is possible to perfectly
         | simulate physics on computers. It is not obvious that this is
         | true. For one, it assumes that physical interactions happen in
         | discrete time steps (or are at least equivalent to a process
         | that happens in discrete time steps). It also assumes that it
         | is possible to perfectly scan all the properties of some
         | piece of matter (which we know is not possible).
        
         | filipezf wrote:
         | As a sibling comment put it, the Scott Aaronson post has lots of
         | interesting questions about this. Do the aliens who are
         | watching us being simulated inside matrix think we are
         | _actually_ conscious? What if they freeze the program for 100
         | years, or run the computation encrypted, or if the 'computer'
         | is just a human inside a room shuffling papers? Is the computer
         | simulation of a water drop _wet_?
         | 
         | I found this article [0] very insightful, where they basically
         | propose that consciousness is relative to _whom_ you ask. We
         | inside the simulation may attribute consciousness to each
         | other. The aliens running it may not. What is relevant is the
         | degree of isomorphism between our simulated brain processes and
         | their real ones. So things will advance from all these back-
         | and-forth nebulous arguments only when neuroscience becomes
         | able to explain mechanistically _why_ people claim to be
         | conscious.
         | 
         | [0] A Relativistic Theory of Consciousness.
         | https://www.frontiersin.org/journals/psychology/articles/10....
        
         | grishka wrote:
         | The answer might be no because it's neither proven nor
         | disproven that the universe as we all perceive it is the
         | fundamental layer of reality.
        
       | kkoncevicius wrote:
       | To me it seems that the hard problem of consciousness can be
       | stated a lot simpler, like so:
       | 
       |  _How can we tell if another person is conscious or not?_
       | 
       | As far as I see, this is not possible and will never be possible.
       | Hence the "hard problem".
        
         | pmayrgundter wrote:
         | Mechanical telepathy may be possible, and for me could possibly
         | answer this question. You and they put on a device whereby you
         | tap into their conscious experience, see through their eyes,
         | hear the voices in their head...
        
           | TaupeRanger wrote:
           | Even then, you're still only experiencing your own, single,
           | unitary stream of experience, you've just replaced or
           | superimposed parts of it with signals from another nervous
           | system. But even if you can somehow replace/superimpose the
           | signals coming through their optic nerve, for example, into
           | your own experience, that still doesn't answer the question
           | of whether or not they have their own stream of experience to
           | begin with. That is simply unknowable, outside of the
           | reasonable assumptions we all make to avoid solipsism (but
           | they are still assumptions at the end of the day).
        
           | Thiez wrote:
           | That seems unlikely, as neural networks don't all develop
            | exactly the same. There is some natural variance in which
            | brain regions perform which function. E.g. Broca's area (very
           | important for speech) only 'usually' lies on your left
           | hemisphere. We know from experiments that stimulating certain
           | areas of the brain produces certain (predictable) feelings,
           | but to stimulate the brain in such a way to transfer an exact
           | thought or a specific vision would seem impossible without a
           | very detailed scan of the source and destination brain, and a
           | complex remapping in between. And some experiences may not be
            | transferable if the target doesn't have the required
           | circuitry.
        
         | poikroequ wrote:
         | We don't know if dark matter exists, but we can still observe
         | it by its gravitational effect on large astronomical objects.
         | We can't be sure it's dark matter, but it's "likely" dark
         | matter.
         | 
         | I believe we can do something similar with consciousness. We
         | can make measurements or observations of a person and conclude
         | they are "likely" consciousness.
         | 
         | Maybe we can't ever be 100% certain whether a person is
         | conscious or not. But nothing in science is 100% certain. No
         | matter how much evidence we have, it would only take a single
         | counterexample to disprove a well established scientific
         | theory.
        
           | mistermann wrote:
           | > But nothing in science is 100% certain.
           | 
           | Scientists are arguably "in" (a part of) science, and they
           | are often extremely certain (as a consequence of being
           | culturalized humans).
        
       | detourdog wrote:
       | I'm still thinking about what Helen Keller discussed in her paper
       | on consciousness and language.
       | 
       | https://news.ycombinator.com/item?id=40466814 The paper is linked
       | and discussed in this thread. Her description of the void before
       | having language is eye opening.
        
       | pmayrgundter wrote:
       | Robert Kuhn is a really impressive dude. I've been occasionally
       | running across his videos on YT from these interviews. I'm very
       | impressed that he's rolling it all up into a written research
       | project as well.
       | 
       | "I have discussed consciousness with over 200 scientists and
       | philosophers who work on or think about consciousness and related
       | fields (Closer To Truth YouTube; Closer To Truth website)."
       | 
       | https://www.youtube.com/playlist?list=PLFJr3pJl27pJKWEUWv9X5...
       | 
       | https://www.youtube.com/@CloserToTruthTV/videos
        
       | utkarsh858 wrote:
       | Vedic philosophy has an interesting take on the problem of
       | consciousness.
       | 
       | It takes consciousness to be emanating from particles the size of
       | atoms. It terms those atomic particles 'atma' (in English, souls;
       | some even call them 'spiritons'!). Those particles are
       | fundamental to the universe and indivisible, like quarks, bosons,
       | etc. Like radiation emanating from the sun, it treats
       | consciousness as 'emanating' from the soul.
       | 
       | Each and every living being, starting from the size of a cell,
       | has a soul that (partially) feels the mechanisms of its body. A
       | multi-cellular organism is then explained as a universe in itself
       | where millions of cells with souls are thriving. The organism
       | then contains a 'chief soul' directing the working of the whole
       | body (which would be us in the case of humans). Further, the
       | philosophy expands this concept to the real universe, in which
       | all organisms with their individual consciousness are directed
       | by a chief 'super consciousness' (in Vedic terminology it is
       | termed paramatma, which some translate as roughly the concept of
       | God). It then expands further by saying that there are (almost)
       | infinite parallel universes, but that's another thing...
        
         | Thiez wrote:
         | That's a nice story but does it make any testable predictions?
         | Because it appears to introduce many new concepts that would be
         | measurable with particle physics, yet mysteriously have never
         | been observed. And if these magic soul particles don't interact
         | with matter in measurable ways, how do you know their size?
        
       | robwwilliams wrote:
       | What a massive and impressive coverage. The author, Robert Kuhn
       | of Closer to Truth (https://closertotruth.com), ends this beast
       | with a request to readers:
       | 
       | > Feedback is appreciated, critique too--especially explanations
       | or theories of consciousness not included, or not described
       | accurately, or not classified properly; also, improvements of the
       | classification typology.
       | 
       | I think RK would enjoy Humberto Maturana's take on cognition and
       | self-cognition. Maturana usually does not use the word
       | "consciousness".
       | 
       | Start with Maturana's book with Francisco Varela:
       | 
       | Autopoiesis and Cognition: The Realization of the Living (1970)
       | 
       | The appendix of this book is important ("The Nervous System").
       | Last few pages blew my brain or mind ;-)
       | 
       | Thinking about consciousness without thinking more deeply about
       | temporality is one problem most (or perhaps even all) models of
       | consciousness still have.
       | 
       | Since Robert Kuhn works on thalamocortical activity, the theme of
       | timing should resonate.
        
       | Animats wrote:
       | _" The implications of consciousness explanations or theories are
       | assessed with respect to four questions: meaning/purpose/value
       | (if any); AI consciousness; virtual immortality; and survival
       | beyond death."_
       | 
       | This is theology. What's it doing in Elsevier's "Progress in
       | Biophysics and Molecular Biology"?
       | 
       | Most of the classical arguments in this area are now obsolete.
       | The classic big question, presented in the article, was, "Out of
       | meat, how do you get thought?". That's no longer so mysterious.
       | You get some basic processing elements from molecular biology.
       | The puzzle, for a long time, was, can a large number of basic
       | processing elements with no overall design self-organize into
       | intelligence. Then came LLMs, which do exactly that.
        
       ___________________________________________________________________
       (page generated 2024-07-05 23:01 UTC)