[HN Gopher] What is it like to be a bat? (1974) [pdf]
       ___________________________________________________________________
        
       What is it like to be a bat? (1974) [pdf]
        
       Author : bookofjoe
       Score  : 48 points
       Date   : 2023-05-01 11:58 UTC (2 days ago)
        
 (HTM) web link (warwick.ac.uk)
 (TXT) w3m dump (warwick.ac.uk)
        
       | mjb wrote:
        | > But bat sonar, though clearly a form of perception, is not
        | similar in its operation to any sense that we possess, and there
       | is no reason to suppose that it is subjectively like anything we
       | can experience or imagine.
       | 
       | Philosophy aside, bat sonar is different from the senses we
        | possess in a really interesting way. Our eyes have excellent
       | spatial resolution (up/down, left/right), some rough depth
       | resolution (from stereo), and no innate sense of speed. Our brain
       | processes the signal to fake even better spatial resolution,
       | infer more about depth (small vs far away) and more about speed
       | (angle changes, among other things).
       | 
       | Bat sonar is completely different. Spatial resolution is poor.
       | But they have first-class depth and speed information! They don't
       | necessarily know where something is, but know exactly how far
       | away it is, and how fast that distance is changing. One must
       | suppose that their brains synthesize more spatial information
       | from these senses, but that spatial information is still not
       | going to feel reliable.
       | 
       | I'd love to be able to experience that for an hour. To live in
       | this world where distance and speed are primary senses, and
       | cross-range information is much fuzzier. What an incredibly
       | different way to see the world.
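The depth-and-speed primacy described above falls straight out of the physics of a sonar pulse. A minimal sketch (not from the paper; the constants and numbers are illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def range_from_delay(echo_delay_s):
    """Round-trip echo delay -> one-way distance to the target."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def closing_speed_from_doppler(f_emitted_hz, f_received_hz):
    """Doppler shift of the echo -> radial (closing) speed.

    For a reflected pulse the shift is applied twice, out and back,
    so v ~= c * (f_received - f_emitted) / (2 * f_emitted) for v << c.
    """
    return SPEED_OF_SOUND * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

# A 10 ms round trip puts the target about 1.7 m away:
print(range_from_delay(0.010))                    # 1.715
# An 80 kHz call returning at 80.5 kHz means the target is closing
# at roughly 1 m/s:
print(closing_speed_from_doppler(80_000, 80_500))
```

Direction, by contrast, has to be reconstructed from ear-to-ear differences and beam shape, which is why the cross-range picture is comparatively fuzzy.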
        
         | enkid wrote:
         | Interestingly enough, humans have been able to use echo
         | location, primarily in blind individuals.
         | https://en.m.wikipedia.org/wiki/Human_echolocation
        
           | mjb wrote:
           | Yeah! As far as I can tell from the research, what's going on
           | there is primarily getting range information from the
           | amplitude of the return (rather than delay or phase like
            | "real" sonar uses). But still, it's very useful that our
            | brains are plastic enough to allow us to develop new senses!
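A toy sketch of that amplitude-based ranging, assuming nothing but two-way spherical spreading (real echoes also involve atmospheric absorption and target strength, so this is only the idealized form):

```python
def received_intensity(r, source_intensity=1.0):
    """Echo intensity off a small target after two-way spherical
    spreading: 1/r^2 going out, 1/r^2 coming back -> 1/r^4."""
    return source_intensity / r**4

def range_from_intensity(i_received, source_intensity=1.0):
    """Invert the 1/r^4 law to estimate range from echo loudness."""
    return (source_intensity / i_received) ** 0.25

echo = received_intensity(2.0)     # a target 2 m away
print(range_from_intensity(echo))  # recovers 2.0
```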
        
             | retrac wrote:
             | > rather than delay or phase like "real" sonar uses
             | 
             | The brain makes use of both phase and delay extensively to
             | extract information about its environment. It's used in
             | part to locate the direction of a sound. And while I can't
             | speak for others, I can tell things about the property of
             | surfaces based on how they reflect sound. Soft environments
             | - muted sound absorbing surfaces - sound, for lack of a
             | better way of putting it, mushy and soft. Bang something
             | metallic, the clang gets swallowed up. Or it rings out in a
             | large hollow room with hard floors. The lack or excess of
             | echoes, measuring changes in delay and phase, is how that's
             | picked up. Still, nothing compared to the bats.
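The directional use of delay mentioned here is usually modeled as the interaural time difference. A sketch of the standard path-length-difference approximation (the head width is an illustrative round number):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
EAR_SPACING = 0.18      # m, rough ear-to-ear distance (illustrative)

def interaural_time_difference(azimuth_deg):
    """Delay between the two ears for a distant source at the given
    azimuth (0 = straight ahead): ITD ~= d * sin(theta) / c."""
    return EAR_SPACING * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source directly to one side arrives about half a millisecond
# earlier at the near ear; the brain resolves differences far smaller.
print(interaural_time_difference(90) * 1000)  # ~0.52 ms
```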
        
         | pessimizer wrote:
         | Humans unconsciously echolocate. A lot of claims of
         | "blindsight" turned out to be unconscious echolocation
          | (obstructing hearing killed the blindsight).
         | 
         | > Researchers from the 1940's through the present have found
         | that normal, sighted people can echolocate - that is, detect
         | properties of silent objects by attending to sound reflected
         | from them. We argue that echolocation is a normal part of our
         | perceptual experience and that there is something 'it is like'
         | to echolocate. Furthermore, we argue that people are often
         | grossly mistaken about their experience of echolocation. If so,
         | echolocation provides a counterexample to the view that we
         | cannot be mistaken about our own current phenomenology.
         | 
         |  _How Well Do We Know Our Own Conscious Experience? The Case of
         | Human Echolocation_
         | 
         | https://faculty.ucr.edu/~eschwitz/SchwitzAbs/Echo.htm
        
         | GuB-42 wrote:
         | There are video games based on the concept of echolocation, but
         | the ones I know implement it a bit like a pulsing flashlight in
          | a wireframe-style scene. Probably not at all how it would
          | feel.
         | 
         | Maybe attempting a more realistic depiction of echolocation
          | could lead to interesting gameplay. There are already some
          | interesting games based on lidar.
        
         | tokyolights2 wrote:
         | The way you put it makes it sound almost like bat sonar is some
         | kind of Fourier transform of vision. Like solving a physics
         | problem by transforming the position space into momentum space.
         | Cool stuff :)
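The analogy can be made concrete: in a Fourier pair, sharp localization in one domain means complete delocalization in the conjugate domain. A minimal numpy illustration:

```python
import numpy as np

t = np.linspace(0, 1, 1000, endpoint=False)  # 1 s at 1 kHz sampling
signal = np.sin(2 * np.pi * 50 * t)          # a pure 50 Hz tone
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])

# The tone is spread over the whole time axis but collapses to a
# single point in the frequency domain:
print(freqs[np.argmax(spectrum)])  # 50.0
```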
        
       | kelseyfrog wrote:
       | I mean, what's it like to be you or me? How do we determine
       | empathic distance and how is it related to phenomenological
       | distance w.r.t. what-is-it-like-to-be-ness?
       | 
       | You can partially explore these spaces even within one's own
       | mind, such as "what is it like to be me on DMT?" or "what is it
       | like to be me in a week/month/year?"
       | 
       | We can even frame this as a Mary's Room[1] experiment in terms
       | of, "Knowing everything there is to know about bats, would you
       | learn anything new by _being_ a bat? " If the answer is yes, then
       | we can't, without being a bat, know everything there is to know
       | about _being_ a bat.
       | 
       | 1.
       | https://web.ics.purdue.edu/~drkelly/JacksonWhatMaryDidntKnow...
        
         | ldhough wrote:
         | It seems to me like the answer to "would you learn anything new
         | by _being_ a bat? " is necessarily yes, because you would at
         | least learn the answer to the question itself.
        
       | mynameisash wrote:
       | I've been enjoying Jeffrey Kaplan's YouTube videos on philosophy.
       | He has one on this subject[0], and I recall getting a lot out of
       | it. Might be time to re-watch it.
       | 
       | [0] https://www.youtube.com/watch?v=aaZbCctlll4
        
         | nebulous1 wrote:
         | Very Bad Wizards also have an episode on it if podcasts are
         | more your thing: https://www.verybadwizards.com/175
         | 
          | Nagel discussion starts at 50:00
        
       | visarga wrote:
       | I think the closest we might get to actually learning what it is
       | like to be a bat is by unsupervised learning + a bit of manual
        | labelling on top. A neural net could learn their states and
        | dynamics by training on a million hours of bat recordings. These
       | representations will already encode bat states and values, so we
       | just need the bridge to human language, which is easy to build
       | with a pretrained language model.
       | 
       | This approach works for any species, neural nets can do it
       | because they can do unsupervised learning. I bet we'll see pet
       | translator apps popping up. Maybe we can monitor the environment
       | by listening in on animal chatter.
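A toy version of the proposed pipeline: embed recordings without labels, cluster the embeddings, then attach a few human labels to whole clusters. The 2-D "embeddings" and the cluster names here are entirely made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Pretend these are learned embeddings of 100 bat-call recordings,
# falling into two acoustic types:
calls = np.vstack([rng.normal([0, 0], 0.1, size=(50, 2)),
                   rng.normal([3, 3], 0.1, size=(50, 2))])

def kmeans(points, k, iters=20):
    # Seed one center per blob for determinism in this toy example.
    centers = points[[0, 50]].copy()
    for _ in range(iters):
        dists = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            for j in range(k)])
    return labels, centers

labels, centers = kmeans(calls, k=2)
# The "bit of manual labelling on top": a human listens to a few calls
# from each cluster and names it (names hypothetical).
cluster_names = {0: "feeding buzz", 1: "social call"}
print([cluster_names[labels[i]] for i in (0, 99)])
```

The open question Nagel raises, of course, is whether any such bridge to human language tells us what the calls are like for the bat, rather than merely what they correlate with.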
        
         | pocketsand wrote:
         | This is an interesting idea, but I don't find it particularly
         | germane to Nagel's question. To someone with a hammer,
         | everything looks like a nail. To someone with an LLM,
         | everything looks like a set of data to be trained on, I
          | suppose.
        
         | goatlover wrote:
         | We still won't know what the sonar experience is.
        
           | visarga wrote:
           | But we know how it relates to everything. Neural nets are
           | good at that. And then we can cluster the data and interpret
           | it, or correlate it with a visual signal.
        
             | arolihas wrote:
             | If you did that with all the colors of the rainbow, do you
             | think that is anywhere near the experience of seeing the
             | color red? It seems pretty clear to me that it doesn't even
             | come close and isn't remotely relevant.
        
         | canjobear wrote:
         | The point is that no amount of external modeling can give you
         | any knowledge of "what it's like" subjectively.
        
           | visarga wrote:
           | I think it can to a degree, just like LLMs imitate human
           | language to a degree. That imitation can only happen by
           | accurately modelling humans.
        
       | zvmaz wrote:
       | One excellent book on consciousness is Mcginn's The Mysterious
       | Flame [1]. I knew consciousness is a hard problem, but the book
       | made it clear how baffling and utterly mysterious it is. I am
       | still absolutely flabbergasted when I think about it (how on
       | earth does consciousness arise from material "meat"?! [2] Where
       | are pain and color and subjective experience really located in
       | the material universe?). It also made me skeptical of people who
       | think AI will be sentient [3] while we are in the complete dark
       | about consciousness in biological organisms.
       | 
       | [1] https://www.amazon.com/Mysterious-Flame-Conscious-Minds-
       | Mate...
       | 
       | [2]
       | https://www.mit.edu/people/dpolicar/writing/prose/text/think...
       | 
       | [3] https://twitter.com/lexfridman/status/1653051310034305025
        
         | tasty_freeze wrote:
         | > Where are pain and color and subjective experience really
         | located in the material universe
         | 
         | My take on this is that you are thinking about it wrong to
         | conceptualize the experience of color as a discrete thing.
          | Let's start with a different example. When I look at a
          | picture of, say, Brad Pitt, it triggers many networks in my
          | brain: good or bad feelings about movies he has been in,
          | thoughts about him as a person from things I've read about
          | him, that he is a man, that he was married to Jennifer
          | Aniston, that he was married to Angelina Jolie. Each of those
          | ties activates its own network of associations. My qualia
          | regarding Brad Pitt isn't a single thing -- it is simply what
          | I experience when that set of networks is activated at
          | whatever strength they are triggered.
         | 
          | I believe a programmed neural network could experience things
          | in the same way, but currently they are small and the
          | topologies are not designed to permit self-awareness or
          | metacognition, though at some point they might. Such a network
          | could suffer distress upon realization of its finiteness and
          | would have a genuine desire not to be terminated.
         | 
         | Taking a step back, a "tornado" isn't a thing as much as it is
         | a pattern. When that pattern is disrupted the tornado doesn't
         | exist even though every atom and every erg of energy can be
         | accounted for. Likewise, these experiences are a pattern of
         | activation and not a thing that exists independently other than
         | as a pattern.
        
         | visarga wrote:
         | > how on earth does consciousness arise from material "meat"
         | 
         | Let me try. It is a way to perceive with goals in mind. Nothing
         | special, just perception, future reward prediction conditioned
         | on current state, and learning from outcomes.
         | 
         | The whole specialness of consciousness is that it carries
         | inside not just our external world, but also our plans and
         | goals. So in this place where they meet, and where there are
         | consequences to be had for each decision, this is where
         | consciousness is. [*]
         | 
         | I support the 5E theory of cognition - embodied, embedded,
         | extended, enacted, and enactive cognition. I think you need to
         | look not at the brain but the whole game to find consciousness.
         | 
         | [*] Consequences for an AI agent could be changing its neural
         | network, so it updates its behaviour, and changes in the
         | external situation - the agent might not be able to backtrack
         | past a decision they take.
        
           | mkehrt wrote:
           | [flagged]
        
             | visarga wrote:
             | What is consciousness, if not perceiving-feeling-acting-
             | learning loop?
        
             | tasty_freeze wrote:
              | Your response is needlessly insulting. The person you are
              | replying to has taken a complex topic and expressed their
              | model of consciousness in a short, clinical way. Then you
              | insult them as not being a real human for describing their
              | model clinically.
             | 
             | A more charitable response might be: I don't understand how
             | your model addresses the origin of consciousness. Could you
             | elaborate on that?
             | 
             | Personally, I understood their point and didn't question
             | whether a human wrote it.
        
           | jamiek88 wrote:
           | You and I experience the world very differently.
           | 
           | Do you have an internal monologue?
        
             | goatlover wrote:
             | Some people don't. Some people also think visually and
             | translate their visual images to words when communicating.
             | There are even some people who don't feel pain, which can
             | be a problem.
        
               | jamiek88 wrote:
               | That's why I asked! I wanted to know if that lack affects
               | one's interpretation of their own consciousness hence our
               | different thoughts on the matter.
        
               | goatlover wrote:
               | I recall reading where some philosophers were skeptical
               | that people actually had mental images when visualizing.
               | But an experiment was performed asking people to rotate
               | mental images in their head versus calculate the
               | rotation, and there was a measurable difference between
                | the two activities. The person writing the article
                | suspected that the skeptical philosophers were bad at
                | visualization and assumed everyone else was likewise
                | unable to rotate images in their mind, which is a
                | logical fallacy.
        
         | naasking wrote:
         | Here's one attempt:
         | 
         | A conceptual framework for consciousness,
         | https://www.pnas.org/doi/10.1073/pnas.2116933119
        
         | adamisom wrote:
         | Ha, for me I'd invert your last sentence: I'm skeptical of
          | being sure AI _won't_ be sentient for the same reason.
        
           | rhn_mk1 wrote:
           | I'm skeptical of AI _intentionally_ being made sentient,
           | given how badly in the dark we are.
           | 
           | But this being in the dark also makes it hard to rule out AI
           | _accidentally_ becoming sentient.
        
       | bondarchuk wrote:
       | I am increasingly of the opinion that the phrases "what it is
       | like to be a bat", or "there is something it is like to be a
       | bat", are simply a linguistic sleight of hand masking a plain old
       | dualistic standpoint. Just because these sentences make sense in
       | our everyday language, does not mean they are suitable for
       | technical and rigorous philosophical discussions. For one, the
       | "something" in the 2nd formulation (or simply the answer asked
       | after by the 1st formulation) is most readily interpreted as an
       | object, closing the door to any process-like interpretation of
       | consciousness. Also, "being like something" is strictly a
       | judgment made by a single subject regarding two experiences of
       | that same subject, so it is not at all clear that this is a
       | relation which can be validly posited to exist between two
       | distinct subjects' experiences. I guess my point is I can easily
       | imagine a complete physicalist explanation of consciousness which
       | would still not lead to a valid answer of "what is it like to be
       | a bat" due to quite obvious limitations of language or the plain
       | invalidity of the question.
       | 
       | Edit: not to mention the reliance on a dummy pronoun: what is
       | _it_ like to be a bat - what is _what_ like to be a bat?
       | 
       | Also, you'll notice in many subsequent discussions that refer
       | back to this paper there's a strange reliance on repeating these
       | exact formulations. If there really was some insight here it
       | would be possible to phrase it in different ways.
        
         | o_nate wrote:
         | I don't understand what you mean by sleight of hand. It seems a
         | very straightforward question that makes sense in our everyday
         | use of language, as you admit. Just because it's difficult to
         | rigorously analyze this statement into scientific concepts, it
         | doesn't follow that the question is invalid. In fact the point
         | of the question is to show shortcomings in our current science.
         | 
         | Also, I would dispute the assertion that there is something
         | unique or special about this formulation. There are many
         | synonymous ways of phrasing the question: e.g., describe the
         | subjective experience of a bat.
        
           | naasking wrote:
           | > I don't understand what you mean by sleight of hand.
           | 
           | He means that it implicitly smuggles in a certain conclusion.
           | For instance, "I think therefore I am" seems logically sound,
           | but actually begs the question in presupposing "I" to then
           | conclude that "I" exists.
           | 
           | Or ask an innocent person a question like, "when did you stop
           | beating your wife?"
           | 
           | > There are many synonymous ways of phrasing the question:
           | e.g., describe the subjective experience of a bat.
           | 
           | If you can describe a subjective experience in a way that is
           | not circularly tied to experiencing it, is it really
           | subjective experience? If you can formulate an objective
           | description, then the subjective experience was an illusion
           | all along, because "subjective" doesn't mean what we think it
           | means, ie. "non-objective".
           | 
           | This is the linguistic game the OP is referring to. Natural
           | language can lead you into all sorts of traps like, thinking
           | there's something there but it's really just a conceptual
           | mirage we've sort of invented.
        
             | jancsika wrote:
             | > He means that it implicitly smuggles in a certain
             | conclusion. For instance, "I think therefore I am" seems
             | logically sound, but actually begs the question in
             | presupposing "I" to then conclude that "I" exists.
             | 
             | That's not correct.
             | 
             | It means if something-- anything-- is in the act of
             | reflecting about thinking-- that is, reflecting about
             | thinking about anything at all, including questioning
             | existence-- then that thing exists _only in that it is an
             | entity capable of reflecting upon its own existence. And
             | only during the act of reflecting on thinking is this true.
             | And, most importantly, this notion is ineluctably cordoned
             | off from any and all evidence-based logic which requires
             | potentially illusory sensory input._
             | 
             | The part in italics came from others who read and critiqued
             | Descartes. In any case, his basic logic is sound. Hume did
             | the clearest job of critiquing it, and even he didn't claim
             | Descartes had made a logical fallacy here.
             | 
              | It's been a while since I've read it, but Descartes
              | probably
             | implied his notion was more powerful than it turns out to
             | be-- i.e., that he could build an epistemology on it.
             | Nevertheless, the basic notion is certainly not a logical
             | fallacy.
        
             | o_nate wrote:
             | I'm afraid this is far too clever for me to understand. I
             | know what I mean by subjective experience, and no amount of
             | linguistic hair-splitting will convince me it doesn't
             | exist.
        
               | naasking wrote:
               | You mean you think you know. If I put an object in your
               | blind spot, you'll also swear up and down there's nothing
               | in front of you.
        
               | arolihas wrote:
               | Are you asserting you don't have any subjective
               | experience? Or that you only feel like you have a
               | subjective experience and it doesn't actually exist?
        
               | naasking wrote:
               | I am arguing that subjective experience is not what we
               | perceive it to be. The qualities that we perceive of it
               | are deceptive, and not necessarily reflective of anything
               | real.
        
               | hackinthebochs wrote:
               | >The qualities that we perceive of it are deceptive, and
               | not necessarily reflective of anything real.
               | 
               | This claim only makes sense given a particular definition
               | of "real", but if (the qualities of) our subjective
               | experiences are outside of that definition, why should we
               | take (the qualities of) subjective experience to not be
               | real, rather than the definition to be impoverished? What
               | is real should encompass every way in which things are or
               | can be. The qualities of subjective experience included.
               | 
               | The problem isn't with taking subjectivity to be real,
               | but with taking everything that is real to be object
               | based. There are no qualia "things" in the world. But we
               | should not see this as implying there are no qualia.
        
               | naasking wrote:
               | Would you take a "day job" to be ontologically real? It
               | is a way in which the aggregate of particles that make up
               | your body regularly behave on a semi periodic schedule.
               | That would seem to fit your definition of "encompassing
               | every way in which things are or can be".
               | 
               | If it is real, isn't there still a need to distinguish
               | ontological primitives from aggregate properties like the
               | above? Why shouldn't this be what we mean by "real"?
        
               | o_nate wrote:
               | I think everyone knows what they mean when they refer to
               | their own subjective experience. That is entirely
               | separate from the question of what that experience
               | corresponds to in the external world. If you put an
               | object in my blind spot, I know that my subjective
               | experience will be of no object. I couldn't make that
               | statement if I didn't know what I meant by subjective
               | experience.
        
         | enkid wrote:
         | The paper is clearly asking what the difference in experience
         | is between humans and bats. Whether a process is a "thing" or
          | not is kind of beside the point. The core question is whether
          | we can articulate what the core experience of being human is
          | like in comparison to the core experience of another creature
          | that perceives the world in a fundamentally different way.
        
           | tokyolights2 wrote:
            | When I have conversations with my philosophically oriented
            | friends, I like to talk about what it would be like to be a
            | starfish--to experience the whole world in 5-way symmetry.
        
         | goatlover wrote:
         | Bats possess a sensory organ we do not. "What it's like" is
         | just a way of saying they may be conscious of a sonar sensation
         | which we utterly lack, similar to a person blind from birth
         | lacking color sensation. To use technical philosophical jargon,
         | bats have sonar qualia that humans do not, assuming bats are
         | conscious. We cannot say what that sensation is, since we don't
         | have it. This places a limit on our knowledge. Don't let "what
         | it's like" trip you up.
         | 
         | It's a legitimate philosophical problem which is spelled out in
         | Nagel's paper. It's not a problem with language, it's rather a
         | limitation on our experience, which also highlights a
         | limitation of our epistemology.
        
         | mensetmanusman wrote:
         | " Just because these sentences make sense in our everyday
         | language, does not mean they are suitable for technical and
         | rigorous philosophical discussions. "
         | 
          | Where would one find the authority to say what is or is not
          | suitable for philosophical discussion, then? This is where
          | schools of thought arise from: some are simply less afraid of
          | the unknowns that emerge under various axiomatic constraints.
          | 
          | In any case, every axiom is itself a mystery of existence.
        
         | hackinthebochs wrote:
         | >a linguistic sleight of hand masking a plain old dualistic
         | standpoint.
         | 
         | These terms are getting at something central to consciousness,
         | the fact that there is a conceptual duality between how we
         | conceive of it from the first-person and how we conceive of it
          | from an objective standpoint. We can't disavow this conceptual
          | duality; a theorist offering an explanation of consciousness
          | that doesn't capture this dual nature of the phenomenon will
          | rightly be regarded as eliminating the explananda.
         | 
         | But a conceptual duality does not imply an ontological duality.
         | In other words, the fact that we conceive of consciousness in
         | these seemingly opposing ways does not imply two separate
         | phenomena. The term dualism has become a shibboleth to be
         | avoided in serious philosophy of mind, but this is a mistake. A
         | satisfying explanation of consciousness must offer some
          | phenomenon that carries a resemblance to our personal datum as
         | experiencers of sensations. This must then be related to the
         | scientific story of how electrical signals are transformed into
         | behavior. This just is the problem of consciousness. Anything
         | less misses the point.
         | 
         | >For one, the "something" in the 2nd formulation [...] is most
         | readily interpreted as an object, closing the door to any
         | process-like interpretation of consciousness.
         | 
         | I agree that the language we use in describing consciousness is
         | unfortunate and has done real damage to what we consider as
         | promising avenues for investigation. We are cognitively biased
         | towards conceptualizing the world in terms of "things" and so
         | we expect our explanations to also be in terms of things. When
         | consciousness isn't found in thing-ness we are tempted to posit
         | a new kind of thing that carries the conscious properties. But
          | we've been led off course by our initial conceptualization.
         | I'm in favor of seeing objects as processes rather than
         | discrete units. Consciousness is likely in the active dynamics
         | rather than any static property.
         | 
         | >I can easily imagine a complete physicalist explanation of
         | consciousness which would still not lead to a valid answer of
         | "what is it like to be a bat" due to quite obvious limitations
         | of language or the plain invalidity of the question.
         | 
          | Yeah, we will never know "what it's like" to experience the
         | existence of another living creature. But this is just a
         | limitation of physical descriptions. This isn't a demerit of
         | physicalism or materialism as a methodology. This is no reason
         | to turn to alternative methodologies that can only hope to
         | offer pseudo-explanations of consciousness at best.
        
           | C-x_C-f wrote:
           | > A satisfying explanation of consciousness must offer some
            | phenomenon that carries a resemblance to our personal datum
           | experiencers of sensations.
           | 
           | What, in your opinion, would make for a satisfying
           | explanation of consciousness? I think another nontrivial
           | piece of the puzzle is that it's hard to even know what we
           | are looking for. There are many philosophers who argue
           | (convincingly IMHO) that it doesn't make sense to posit a
           | hard problem of consciousness in the first place.
        
         | hax0ron3 wrote:
         | >are simply a linguistic sleight of hand masking a plain old
         | dualistic standpoint
         | 
         | Are you assuming that dualism is invalid? If so, why? There is
         | something to the distinction between physical reality and
         | subjective experience that so far no-one has managed to
         | explain.
         | 
         | >I can easily imagine a complete physicalist explanation of
         | consciousness which would still not lead to a valid answer of
         | "what is it like to be a bat"
         | 
         | Then it would not be a complete physicalist explanation of
         | consciousness. A complete physicalist explanation of
         | consciousness, by definition, would have to account for
         | subjective experience/qualia/whatever you want to call it.
        
           | C-x_C-f wrote:
           | I think one can salvage the distinction between physical
           | reality and subjective experience without positing an
           | abstract ("Cartesian") dualism. It just so happens that
           | _most_ experiences can be categorized as being either
           | external or internal, so we are led to believe that _every_
           | experience fits into one and only one bucket. But I think
           | that there are plenty of experiences that are not so easily
           | categorized (e.g. feelings).
           | 
           | Demanding that there _must_ be a perfect partition (i.e.
            | assuming dualism) is an additional requirement, but it's not
           | obvious that it should be a good requirement for a sound
           | philosophical theory. In fact I believe it not to be sound,
           | and I believe many philosophical hard problems come from
           | bending over backwards trying to impose this condition.
        
           | [deleted]
        
         | fsckboy wrote:
         | > phrases "what it is like to be a bat", or "there is something
         | it is like to be a bat",
         | 
         | > are simply a linguistic sleight of hand masking a
         | 
         | > plain old dualistic standpoint
         | 
         | philosophers have differentiated these into three different
         | questions
         | 
         | "What is it like to be?" has to do with the nature of
         | consciousness.
         | 
         | "word games" has to do with any use of language, nothing to do
         | with consciousness. You could make a word-game critique of any
         | statement.
         | 
         | "dualism" comes in a number of forms, not clearly related to
         | one another (physical body vs soul/spirit, earthly realm vs
         | heaven; mind-body, consciousness vs quantum chemistry) but all
         | having a similar problem. Any place there is posited a dualism,
         | then what is the interaction between the two duals, how could
         | one even perceive the other?
         | 
         | but declaring "there is no dualism", while eliminating that
         | problem, does not eliminate the question as to why it was
         | posited in the first place: why (or how) does it feel like
         | anything to be conscious, feel pain, etc. Saying "that's what
         | evolved" is just hand waving. What's the difference between
         | being alive and dead? Do rocks have a little bit of
         | consciousness?
         | 
         | My personal preference (lifelong atheist-science type) is that
         | the "abstract" world is all that exists, there is no physical
         | world. Everything we study in physics and chemistry we arrive
         | at via abstract mathematical values, relationships and
         | computation. I think that is the nature of the universe, and
         | while it doesn't "solve" the consciousness problem, I feel like
         | it moves the goalposts in the right direction.
        
       | throw0101a wrote:
       | See also dreaming to be a butterfly:
       | 
       | > _The image of Zhuangzi wondering if he was a man who dreamed of
       | being a butterfly or a butterfly dreaming of being a man became
       | so well-known that whole dramas have been written on its
       | theme.[22]_
       | 
       | * https://en.wikipedia.org/wiki/Zhuangzi_(book)#%22The_Butterf...
       | 
       | * https://en.wikipedia.org/wiki/Dream_argument
        
       | gizajob wrote:
       | What is it like to be a cricket bat?
        
         | mellosouls wrote:
         | https://www.cs.bham.ac.uk/research/projects/cogaff/misc/like...
        
           | uoaei wrote:
           | This kind of retort is a rookie error in the field of
           | philosophy of mind. It clings too closely to notions of 1)
           | self-awareness as an essential component of awareness per se
           | (it's not) and 2) awareness as an essential component of
           | experience per se (it's not).
           | 
           | Edit: I was referring to the link, not the top-level comment.
           | It reads like an attempt at rebuttal from someone relatively
           | unfamiliar with the field writ large.
        
             | nbramia wrote:
             | This kind of retort is a rookie error in the field of
             | comedy. It clings too closely to notions like 1) there's no
             | place for humor in a serious discussion (there is) and 2)
             | you're smarter than everyone else (you're not)
        
               | gizajob wrote:
               | Yeah my retort about the cricket bat came from someone
               | with a postgrad in philosophy from a top British
               | University. I reckon Wittgenstein would have been tickled
               | by it.
               | 
               | Plus with contemporary metaphysical interest in
               | panpsychism, then "what is it like to be a cricket bat?"
               | isn't even a moot question.
        
       | climb_stealth wrote:
        | It doesn't quite answer this question, but good resources for
        | learning what bats are like are the Batzilla and Megabattie youtube
       | channels [0]. They are Australian bat rescuers and carers. Lovely
       | ladies, short clips, no clickbait or other stupid youtube
       | shenanigans. Just people who genuinely care for little people and
       | try to spread the word. Educational as well. It feels like I know
       | a whole lot about bats just from watching their videos every now
       | and then.
       | 
       | Also, Flying Foxes are beautiful [1].
       | 
       | [0]
       | 
       | https://www.youtube.com/@BatzillatheBat
       | 
       | https://www.youtube.com/@Megabattie
       | 
       | [1]
       | 
       | https://www.weekendnotes.com/im/004/06/greyheaded-flying-fox...
        
       | AlbertCory wrote:
       | I've read this before. That's why I think the question "do
       | animals have consciousness?" is meaningless, because
       | "consciousness" usually implies "like ours."
       | 
       | They have _something_ that probably bears some relationship to
       | ours. Some birds have a  "theory of mind" where they know what
       | you know, e.g. whether you saw them hide the food.
       | 
       | It would be possible (maybe someone's already done it?) to
       | enumerate the N tests of "consciousness," where if an organism
       | has all those N, then it's "conscious." Someone would object "oh,
       | but humans can do so much more than those!" and that's true. So
       | if you increase N enough, only humans are "conscious."
        
         | adamisom wrote:
         | A disturbing extension of that is if there's some M>N such that
         | only _some_ humans possess M (and others N). I think this must
          | be so (I have a Down's syndrome relative); the disturbing
         | question is if there's a distribution of humans on gradients
         | from N to M (probably M close to N imo).
        
           | AlbertCory wrote:
           | Yes. Inevitably, the legal definition would turn out to be
           | "whatever a person not legally brain-dead can experience,
           | maybe" just so we couldn't say any human in a coma is "not
           | possessing consciousness." After all, some comas last for
           | years.
           | 
           | In other words, it's just not ever going to be a scientific
           | concept. There are _components_ of it that are.
        
         | pixl97 wrote:
         | https://en.wikipedia.org/wiki/Sorites_paradox
         | 
         | This is the problem of describing if a gradient has something.
         | Quite often the 'N tests' we make up end up excluding entire
          | classes of humans (you're blind, oops!) because of poor premises
          | in our classifications.
         | 
         | People love black and white/binary classification systems, the
         | problem with reality is it rarely gives a damn about giving us
         | simple systems to classify that way.
        
         | uoaei wrote:
         | Consciousness is only defined in the philosophy of mind as
         | "phenomenological experience", full stop, i.e., "experiencing
         | the color yellow" as something beyond just a certain wavelength
         | incident upon and mechanistic reaction within the organism.
         | 
         | "Consciousness" as defined in colloquial settings, such as the
         | one we inhabit now, is usually substantially more elaborate
          | than the one used by philosophers and includes things like the
         | capacity to develop cognitive models of the outside world and
         | the capacity to reason about their environment having placed
         | themselves within it. I usually reserve the words "awareness"
         | and "sentience" for these two latter concepts to distinguish
         | between the bare experiential aspects which are typically the
         | subject of this kind of discussion and the more familiar
         | everyday (though extremely high-level) experiences we have as
         | intelligent beings.
         | 
         | It's important to maintain the distinction or else discussions
         | very quickly devolve into people talking past each other with
         | differing definitions. It's no surprise people don't know the
         | basics of this when they're not philosophers, and it's only a
         | slight surprise that people on HN will deviate so greatly from
         | these conventions while nonetheless projecting an air of
         | competency.
        
           | AlbertCory wrote:
           | > while nonetheless projecting an air of competency
           | 
           | (Puts nose up in the air and sniffs contemptuously)
        
             | uoaei wrote:
             | There's wading in water, there's treading in water, there's
             | snorkeling, there's diving, there's standing on a boat,
             | there's drifting aimlessly. They are all different ways of
             | interacting with depth.
             | 
             | If you have a problem with the extent to which I've
             | represented my knowledge and how my representation of those
             | specific things I discuss differs from how experts deal
             | with them, you are always free to provide something beyond
             | snark and contempt.
        
               | AlbertCory wrote:
               | Ooh. "snark and contempt"
               | 
               | Project much?
        
         | goatlover wrote:
         | Consciousness means there is "something it's like" to have an
         | experience. It need not be human. That would be needlessly
         | anthropocentric. Animals have a range of sensory organs and
          | body plans which differ from those of humans. Why wouldn't they also
         | have a range of differing conscious experiences? It could be
         | seeing the world in more than three primary colors, hearing
         | frequencies we cannot, detecting the Earth's magnetic field or
         | numerous other things.
         | 
          | We can also imagine making an even wider range of conscious
         | machines someday, if somehow we figured out how to do that, or
         | it was an emergent property of the right sort of architecture.
         | There could be all sorts of conscious experiences we have
         | absolutely no idea about.
        
           | AlbertCory wrote:
           | You're begging the question here, which is:
           | 
           | What defines 'consciousness'?
        
             | naasking wrote:
             | In philosophy it literally means "subjective, qualitative
             | experience". It's almost certain that all animals have it,
             | but of course the qualities they experience will be
             | different.
        
               | AlbertCory wrote:
               | Still begging the question, i.e. assuming that which you
               | need to prove.
               | 
               | How would you prove an animal has a "subjective,
               | qualitative experience"?
        
               | naasking wrote:
               | I think you have it backwards: we would need a reason to
               | think they don't have it, given our shared history and
               | similar biology.
        
               | AlbertCory wrote:
               | ok, so anything with our "shared history and similar
               | biology" is assumed to have consciousness?
               | 
               | How similar is "similar"? Is it just mammals, or just
               | certain orders, or can organisms in the other branches be
               | assumed to have consciousness too?
        
               | naasking wrote:
                | The less like us, the less likely, obviously. Animals
               | with very similar neurology almost certainly experience
               | something very similar to ours.
               | 
               | How similar? Good question. Assume nothing and truth will
               | out.
        
               | uoaei wrote:
               | The question of "where is the line" presumes there even
                | _is_ a line between matter ("objects") that expresses or
               | does not express consciousness, which is also a big and
               | unsubstantiated claim that requires proving. Occam's
               | razor (i.e., our standard scientific apparatus of null vs
               | alternative hypotheses) would seem to indicate it is
               | appropriate to assume there is no difference in kind,
               | only difference in degree, until there is evidence to
               | prove otherwise.
        
               | UIUC_06 wrote:
               | No need to define it or prove it. Just assume it.
        
               | uoaei wrote:
               | What reason do you have to do so? What purpose would
               | doing that serve?
               | 
               | As it stands you are just suggesting complicated,
               | untestable theories. The point above is, that is
               | ultimately pointless.
               | 
               | The simplest possible argument goes as follows: I know I
               | am conscious, and I know I am made of matter. Everything
               | else that is real is made of matter. With no further
               | information, I must assume as the null hypothesis that
               | everything in reality is conscious. An alternative
               | hypothesis may be presented, but it would then need to be
               | proven using reproducible studies and real evidence
               | before we can assume the alternative hypothesis and
               | reject the null hypothesis, per the consensus definition
               | of formalized "science".
        
               | uoaei wrote:
               | That is what David Chalmers calls "The Hard Problem of
               | Consciousness".
        
               | wizofaus wrote:
               | Almost certain that _all_ animals have it? The conjecture
               | that, say, a coral polyp or an earthworm (not to mention
               | something like a trichoplax) might have a subjective
               | qualitative experience of existence seems to be an
                | extraordinary claim requiring extraordinary proof. I don't
                | know exactly how similar a brain has to be to a human
               | brain for us to say with reasonable confidence the owner
               | likely has such an experience but I'd be very surprised
                | if it included even half of all known animal species. It's
               | possibly not even all (adult) mammals, and indeed not
               | even all humans if you include infants and possibly those
               | with severe brain damage etc.
        
       | peter303 wrote:
       | I suggest the human-mammal mind readily adapts to new sense
       | modes. Driving a car or riding a bicycle feels like an extension
        | of the body after you've mastered it. A grid of bump actuators
        | against the skin or tongue is perceived as an image after one uses
       | it for a while. So I reject the idea that bat consciousness is
       | special. We'd perceive sound images had we had high frequency
       | ears and emitters.
        
       | brudgers wrote:
       | One discussion that gained traction,
       | https://news.ycombinator.com/item?id=13998867
        
         | dang wrote:
         | Thanks! Macroexpanded:
         | 
         |  _What Is It Like to Be a Bat? (1974) [pdf]_ -
         | https://news.ycombinator.com/item?id=13998867 - March 2017 (95
         | comments)
        
       | peoplefromibiza wrote:
        | Dennett's _"Animal Consciousness: What Matters and Why"_ talks at
        | length about why, in his opinion, this is the wrong question.
       | 
       | https://ase.tufts.edu/cogstud/dennett/papers/animconc.htm
        
       ___________________________________________________________________
       (page generated 2023-05-03 23:01 UTC)