[HN Gopher] What is it like to be a thermostat? (1996)
       ___________________________________________________________________
        
       What is it like to be a thermostat? (1996)
        
       Author : optimalsolver
       Score  : 52 points
       Date   : 2024-12-30 12:18 UTC (10 hours ago)
        
 (HTM) web link (annakaharris.com)
 (TXT) w3m dump (annakaharris.com)
        
       | GiorgioG wrote:
       | Being a thermostat is fucking exhausting. My wife and I are the
       | equivalent of a thermostat for our type 1 diabetic son's blood
       | sugar. It's in our face 24/7.
        
         | frxhvcdtgf wrote:
         | My son has a dozen or so food allergies. It is exhausting.
         | 
         | Then, I know people with autistic kids. Wow.
        
           | thebruce87m wrote:
           | "Health is a crown that the healthy wear, but only the sick
           | can see."
           | 
           | This applies to so many things, not just health. You only
           | appreciate it when it happens to you, or you get a snapshot
           | of someone else's life and it humbles you.
        
         | Trasmatta wrote:
         | I'm type 1 myself, so I feel for you. People don't realize that
         | the hardest part is NOT needles or anything like that, it's the
         | constant mental overhead of thinking about and managing your
         | blood sugar every moment of every day. Every decision you make
         | is informed by it. You make literally hundreds of micro and
         | macro decisions related to your diabetes every day.
         | 
         | A CGM and insulin pump have made my life easier.
         | 
         | How old is your son?
        
           | GiorgioG wrote:
           | He is 13 and he wears a Tandem X2 w/Dexcom G6 (soon to be
           | G7?). It certainly makes life easier, but like you said
           | there's constant mental overhead. We have SugarPixels all
           | over the house, etc. His pump beeps at him so often he's just
           | become accustomed to it and ignores it until he starts
            | feeling bad (typically when he's going low). But that also
            | limits what he can do: there are only a handful of friends'
            | homes where we trust leaving him alone, because those parents
            | are willing to keep checking on him, etc.
        
             | Trasmatta wrote:
             | Being a caregiver for a kid with t1 is exhausting, so kudos
             | to you and your wife.
             | 
             | The mental burden is SO HIGH for both the person with the
             | condition and their caregivers. At some point if it feels
             | right, it might be worth having him see a therapist or a
             | counselor who specializes in chronic diseases. There's a
             | very high correlation between depression and type 1
             | diabetes (last I read, you're like 300% more likely to
             | experience severe depression as a T1D.) Especially once he
             | goes into his teenage years, that's a really hard time for
             | a lot of us diabetics (and sometimes during the
             | "rebellious" phase, a kid will intentionally stop taking
             | care of their diabetes).
             | 
             | > His pump beeps at him so often he's just become
             | accustomed to it and ignores it
             | 
             | This is a real problem with insulin pumps and CGMs, because
             | alarm fatigue is a real thing. It makes me mad that they
             | insist on putting in all these alarms you can't configure,
              | because eventually you just start ignoring them. For my CGM,
             | I use xDrip which gives me much more customization around
             | my alarms: I just leave on the ones that are important to
             | me.
        
         | cperciva wrote:
         | Closed loop. Seriously, my t1d-related mental exhaustion is 90%
         | reduced now that I'm using a closed loop. 95% if you don't
         | count the "why did androidaps disconnect from my pump and which
         | bit do I need to restart to get it working again" headaches.
         | 
          | In a way it's eliminating _too much_ mental effort; while it's
         | useful as a backup, the fact I sometimes completely forget to
         | take insulin with meals is not ideal, even if the closed loop
         | notices and takes care of it for me (since there's inherently
         | more lag when relying on the loop than if I dialed in the
         | insulin at meal time).
        
           | Trasmatta wrote:
           | Do you still enter carbs, or do you just announce meals? I
           | know some people with closed loops just do the latter. If you
           | don't mind sharing, what's your time in range?
           | 
            | I have the Omnipod 5, which is decent, but its "closed loop"
           | abilities are extremely conservative and a bit disappointing.
           | I've considered going the full way with androidaps, but
           | haven't taken the leap yet.
        
             | cperciva wrote:
             | Androidaps says I'm 85% within 3.9-10.0 (70-180) in the
             | past week. I enter carbs but only to a resolution of 10g;
             | and as I mentioned earlier I sometimes forget to enter them
             | entirely.
             | 
             | Are you using a Dexcom with your omnipod? My impression is
             | that the quality of sensor is the most important factor;
             | with a flaky sensor looping algorithms will be more
             | conservative since they (reasonably) prioritize avoiding
             | lows.
        
               | Trasmatta wrote:
               | > Androidaps says I'm 85% within 3.9-10.0 (70-180) in the
               | past week
               | 
               | Nice!
               | 
               | > I enter carbs but only to a resolution of 10g; and as I
               | mentioned earlier I sometimes forget to enter them
               | entirely.
               | 
               | I'm pretty bad at accurately entering carbs. I eat out
                | enough that I've gotten to the point where I just
               | come up with a rough estimate (which is often wrong).
               | Which means I have to adjust later pretty frequently with
               | snacks or correction doses. Not ideal, but I got so much
               | "carb counting fatigue" over the years.
               | 
               | > Are you using a Dexcom with your omnipod? My impression
               | is that the quality of sensor is the most important
               | factor; with a flaky sensor looping algorithms will be
               | more conservative since they (reasonably) prioritize
               | avoiding lows.
               | 
               | Yeah, I'm on the G6 still. I think the quality of my
               | readings is pretty good, but the Omnipod 5 algorithm is
               | pretty conservative anyway: rather than naturally
               | bringing you down from a high, you usually have to
               | explicitly give yourself a correction dose. Which I
               | suppose makes sense, but I have read that some people
               | have had a lot more success with looping in Androidaps vs
               | the Omnipod 5's algo.
               | 
               | I also wish you could set the "target" a bit lower in the
               | 5. You can't tell it to target below 120: I feel like 100
               | or 110 would be more reasonable, but again, it makes
               | sense that they prefer being conservative.
        
               | cperciva wrote:
               | Ah yes, I have my target set to 100. IIRC the first off
               | the shelf closed loops were hardwired to 120 but newer
               | ones are configurable.
        
         | kouru225 wrote:
         | Let's hope that smart insulin comes to market soon (and is
         | affordable)
        
       | smokel wrote:
       | I find Thomas Nagel's "what it is like to be" [1] concept
       | fascinating. I have spent quite some time trying to imagine what
       | it is like to be a _rock_. Mind you, not from the perspective of
       | a human ( "A rock will probably experience time very quickly,
       | because it erodes, etc. etc"), but from the perspective of the
       | rock itself. That is, it has no senses, no memory, no
       | capabilities for abstraction, no consciousness.
       | 
       | This ruminating has led me to believe that time and logic are
       | human concepts, and are not as universal as is commonly believed.
       | With recent insights into neural networks, I wouldn't even be
       | surprised if the laws of physics follow from the way our brains
       | are wired, instead of the other way around. Perhaps this is
       | simply a modern take on idealism or realism, but I can't find a
       | strand of philosophy with which I feel at home.
       | 
       | Obviously, there is a bootstrapping problem with trying to reason
       | from something that cannot reason. And I am well aware that my
       | brain must exist in some form of reality. To conclusively prove
       | some apparatus for that is way out of the scope of science.
       | Scientifically there is probably very little to learn from this
       | anyway, apart from opening one's mind to some alternative
       | possibilities. It's a fun exercise, though.
       | 
       | However, the entire discussion about what _consciousness_ is,
       | strikes me as less interesting. Is this really more than being
       | able to conjure up memories of past experiences?
       | 
       | [1] https://en.wikipedia.org/wiki/What_Is_It_Like_to_Be_a_Bat%3F
        
         | Trasmatta wrote:
         | > However, the entire discussion about what consciousness is,
         | strikes me as less interesting. Is this really more than being
         | able to conjure up memories of past experiences?
         | 
         | I don't think memory and consciousness are intrinsically
         | linked. Memory is something consciousness can be aware of, but
         | it's not consciousness itself. Someone can have their ability
         | to process and remember memories permanently or temporarily
         | damaged, and yet still have a conscious experience. An AI can
         | have memory, but not have a conscious experience. (Or at least
         | it seems that way - if something like Integrated Information
         | Theory is true, then maybe AI does have some sort of first
         | person conscious experience. I tend to doubt that, but I could
         | be wrong.)
         | 
         | EDIT: although I might be conflating short term and long term
         | memory. I wonder if consciousness requires at LEAST some form
         | of memory processing, even if it's just the past couple of
         | seconds. Perhaps the "Strange Loop" needs at least that to
         | arise. I'm not sure.
        
           | robwwilliams wrote:
           | Yes, consciousness requires some embedding in time
            | (duration). It requires a capacity for recursion. Hofstadter's
            | strange loop is a temporal process.
           | 
           | There is no atemporal "frozen" state we can call
           | consciousness. It is dynamic.
           | 
            | That is what bothers me about the Chalmers piece. There are
            | not three states of a thermostat's "What's it like". He is
            | showboating his writing chops and parroting Nagel's dualism.
        
             | vidarh wrote:
             | > There is no atemporal "frozen" state we can call
             | consciousness. It is dynamic.
             | 
             | That is a huge assumption we just have no way of knowing.
             | For starters, we don't know whether we are in an atemporal
             | "frozen" state or not.
        
               | robwwilliams wrote:
               | It is a highly pragmatic assumption like absolutely
               | everything else, even cogito ergo sum. Tell me what is
               | not an assumption and then maybe we can talk.
               | 
               | My take: Epistemology is metaphysical bs, and "truth" is
               | a convention we agree to within communities of speakers.
        
             | sdwr wrote:
             | Thank you for bringing some sense into the conversation.
             | 
             | Panpsychic "everything is alive" is better than "nothing
             | else is conscious because I'm not them", but only by one
             | degree.
             | 
             | > Where does it hurt?
             | 
             |  _Pokes leg, ow_
             | 
             |  _Pokes arm, ow_
             | 
             |  _Pokes stomach, ow_
             | 
             | > Everywhere!
             | 
             | > I think you sprained your finger
        
             | bglazer wrote:
             | What do you mean by dualism?
             | 
             | From the article, emphasis mine: "But we should not be
             | looking for a homunculus in physical systems to serve as a
             | subject. *The subject is the whole system*, or better, is
             | associated with the system in the way that a subject is
             | associated with a brain."
             | 
             | Is this dualism because it retains the idea of the subject
             | at all? But Chalmers states "the subject is the system",
             | and so seems to reject the notion of mind/body duality. I
             | haven't read about this in much depth, so I don't have a
             | very sophisticated understanding here.
        
         | dmbche wrote:
         | I can recommend Being No One, by Thomas Metzinger, for essays.
         | 
         | For sci-fi, have a look at Blindsight, by Peter Watts (for free
          | on his website: https://www.rifters.com/real/Blindsight.html)
        
           | smokel wrote:
           | Thanks for the recommendations!
           | 
           | I have read Thomas Metzinger's Ego Tunnel (2009), but as far
           | as I understand it, he takes on a naturalist standpoint, and
           | assumes that consciousness arises from that.
           | 
           | I prefer to take a radical agnostic point of view, where
           | consciousness does not even have meaning outside of those who
            | experience it, implying that "meaning" or "reasoning" make no
            | sense universally.
        
             | robwwilliams wrote:
             | You just communicated meaning to me. Defining "outside of
             | those [plural] who experience it" is the tricky part.
        
           | joloooo wrote:
           | Thanks for the recs,
           | 
           | Mindware by Andy Clark is also a great book on these topics.
           | 
           | https://global.oup.com/academic/product/mindware-97801998281.
           | ..
        
         | stronglikedan wrote:
         | > trying to imagine what it is like to be a rock
         | 
          | That is how I perceive meditation to be. At least, that's the
          | end goal, which I have yet to achieve, anyway.
        
           | vidarh wrote:
           | That depends very much on the school of meditation. It sounds
           | like you're focusing on concentration practice.
           | 
           | Mindfulness practice is toward the other side of the
           | spectrum: The goal is _not_ to suppress your own sensations
           | or thoughts, but to be mindful of them and observe them in a
            | detached manner. The best analogy I've come across is to sit
           | at the side of the river and watch the boats go past, instead
           | of jumping on them and racing down. But you're not trying to
           | clear the river of boats.
           | 
           | There's overlap, in that the finer control you want to have
           | over your ability to be mindful of specific aspects of your
           | thoughts, emotions, body etc., the more concentration you
           | need to be able to muster to calm yourself enough.
        
         | TaupeRanger wrote:
         | Scientifically, there is _a lot_ to learn. If we understand
         | alternate forms of consciousness, we can potentially alter our
         | own and open up new avenues of experience.
         | 
         | Your last comment strikes me as strange for someone who seems
         | to be well read on the topic. Saying that consciousness is an
         | ability to recall memories doesn't really describe what it _is_
         | in the natural sense. The memories themselves are composed of
         | conscious experiences, so that definition is circular. An
         | explanation of what consciousness _is_ would include an
         | explanation about why, for example, chocolate tastes the way it
         | does, rather than like vanilla, or some other completely
         | unknown taste. Until we can explain its character (rather than
         | just describe it), we can't explain what it is. It's sort of
         | like dark energy: we can describe the phenomenon but we haven't
         | fully explained what it is.
        
           | smokel wrote:
           | My last comment on consciousness should probably also be
           | interpreted from the rock perspective. It does not make
           | sense, because I assume that a rock has no consciousness to
           | begin with, and no memory to entertain it.
           | 
           | A sister comment suggests that memory and consciousness are
            | less intrinsically linked than I tend to believe. It might be
           | fruitful to come up with a decision tree on what people
           | believe consciousness to be :)
           | 
            | Personally, apart from the rock meditation, I am not so much
            | interested in the _definition_ of consciousness, because I
            | think it is a trap, based on a categorical mistake. I'd
           | rather get away from the anthropocentric viewpoint. Then
           | again, I sometimes doubt that a scientific method will get us
           | there.
        
         | vidarh wrote:
         | > And I am well aware that my brain must exist in some form of
          | reality.
         | 
         | To mess with your head a bit more:
         | 
          | We know of no way to know the flow of time other than
          | indirectly, through memory of the flow of events and sensory
          | inputs.
         | 
         | And so while it seems probable that our brains must exist,
         | consider that e.g. a simulated mind that is paused, and where
         | the _same step_ is executed over and over, with no ability to
         | change state, would have no way of telling that apart from the
         | same step being executed only once, to move on to the next.
         | 
         | In that case it's not clear that there'd need to be any full
         | brain, just whatever consciousness is, in a state that has a
         | notion that it has a memory of a past instant.
         | 
         | Put another way: Your consciousness could be just a single
         | frame of a movie, hinting at a past and future that might not
         | exist.
         | 
         | Forever repeating the same infinitely short instant, and
         | nothing else. Maybe the universe is just a large tableau of
         | conscious instants statically laid out and never changing or
          | interacting with each other. We _wouldn't know any different_.
         | 
         | Of course that is entirely untestable, and so just a fun
         | philosophical question to ponder, mostly as a means to point
          | out just how little we can _know_, and so how willing we need
         | to be to accept that what matters isn't what we can know, but
         | what is likely to be able to affect our observed world.
         | 
         | E.g. I see myself as a materialist (philosophically speaking)
         | not because I believe it is or can be proven, but because it is
         | our _observable_ reality. If that materialist reality is shaped
          | by our minds and only exists in some abstract form, or we're
          | all in a simulation etc., then that is irrelevant unless/until
          | we find a way to pierce the veil and/or those aspects "leak"
          | into our observable reality somehow.
        
         | metaxz wrote:
         | This one resonates very well with me:
         | https://www.organism.earth/library/document/simulation-consc...
         | 
         | You have to give it a chance. He is first building up an
         | argument about why consciousness cannot depend just on the
         | physical substrate itself but rather on the "interpretation" of
         | this. It is very important to understand this part/argument.
          | What follows is something that resonates with you, namely how
          | our consciousness is now 'tuned' to the current physical laws.
        
       | cubefox wrote:
       | 1996
        
         | frxhvcdtgf wrote:
         | I find this interesting, and I think dates should always be
         | included.
         | 
          | But I have this question: why am I interested in the date? What
         | changes in this message when I know the date?
         | 
         | I feel like knowing the date sets up a conflict between "this
         | is out of date" vs. "this is before the era of mass garbage
         | generation" (more generously described as "with time only the
         | classics survive)
        
           | mannykannot wrote:
           | There has been much debate since this was written, but not
           | much movement towards a consensus.
        
           | cubefox wrote:
           | Chalmers also may have changed his view in the past 28 years.
        
             | robwwilliams wrote:
             | I wish.
             | 
             | This kind of showboating in 1996 is what allowed him to
             | achieve his professional goal---to be the modern voice of
             | archaic Cartesian dualism.
             | 
              | There actually is no hard problem. Ask a neuroscientist
              | "What is the hard problem?" and you will get blank stares.
              | They just do not know or, like me, do not care. It is a
              | muddy residue of Cartesian dualism. Boring.
        
               | cubefox wrote:
               | That's not convincing to me. If you say some problem
               | "doesn't exist", I need an argument which makes a
               | compelling case for that, which presents some kind of
               | dissolution.
        
       | HPsquared wrote:
       | Title reminds me of Tim Hunkin's BBC series "The Secret Life of
       | Machines", which he's put on YouTube. There is, funnily enough,
       | an episode on central heating systems:
       | 
       | https://www.youtube.com/watch?v=PnQ9zkBzbYc
       | 
       | EDIT: there is of course a bit about thermostats:
       | https://www.youtube.com/watch?v=PnQ9zkBzbYc&t=1137
        
       | justlikereddit wrote:
       | This is why no one actually likes philosophers
        
       | johann8384 wrote:
       | Wouldn't the thermostat be more closely aligned with a nerve in
        | the overall system, and the control board more closely aligned
        | with the brain? The brain can get signals from multiple
        | thermostats in a system to control the temperature.
        
       | upghost wrote:
       | Does anyone know of a smart thermostat that actually has this
        | function? Every thermostat I've looked at has "heat mode", where
        | it decides whether it should be blasting heat or not, and "cool
        | mode", where it decides whether it should be blasting the AC or
        | not. I have not
       | found the mythical smart thermostat that does the job of "keep
       | the temperature around here" +/- a few degrees.
       | 
        | I live in an area where it's cold at night and hot during the
        | day, and I am bad at remembering to change the thermostat from
        | mode to mode. I haven't found a programmable IoT thermostat I can
        | write a script for. Recommendations welcome!
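        | 
        | For what it's worth, the behavior I'm after is just a dual-
        | threshold hysteresis loop. A minimal sketch in Python, assuming a
        | hypothetical device API (read_temp, heat_on/heat_off,
        | cool_on/cool_off are stand-ins for whatever a scriptable
        | thermostat would actually expose):
        | 
        |     import time
        | 
        |     SET_POINT = 21.0  # target temperature, degrees C
        |     DEADBAND = 1.5    # tolerated drift on either side of target
        | 
        |     def auto_mode(read_temp, heat_on, heat_off, cool_on, cool_off):
        |         # "Keep the temperature around here": heat below the
        |         # band, cool above it, do nothing while inside it.
        |         while True:
        |             t = read_temp()
        |             if t < SET_POINT - DEADBAND:
        |                 cool_off()
        |                 heat_on()
        |             elif t > SET_POINT + DEADBAND:
        |                 heat_off()
        |                 cool_on()
        |             else:
        |                 heat_off()
        |                 cool_off()
        |             time.sleep(60)  # re-check once a minute
        | 
        | The deadband is what keeps the heater and the AC from fighting
        | each other around the set point.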
        
         | ajoberstar wrote:
         | Nest thermostats have a heat and cool mode where you have
         | setpoints for each. On older ones there was a limit on how
         | close the two could be set.
        
         | ewhanley wrote:
         | Ecobee thermostats can run in dual heat/cool mode
        
       | tananan wrote:
        | This kind of panpsychistic talk ends up feeling, to me, closer
        | to a reductive materialism than to what I would first associate
        | Chalmers with ("hey, did you forget you can experience stuff?"),
        | which is probably just my ignorance of his work.
       | 
       | Because yes, you acknowledge "experience", but you make it a
       | function of a physical state described in such and such a way. In
        | the same way that a set of particles at points A, B, C, ...
        | corresponds to such and such a (e.g. electric) field strength at
        | point Z, we now imagine it could also correspond to such and such
        | an experience.
       | 
        | It's just barely "experience" on its own terms, and elicits a
        | kind of epiphenomenalism and powerlessness. The thermostat*, after
        | all, doesn't choose anything, nor does it profess to have any
        | agency. So agency ought to end up as some kind of epiphenomenal
        | "observable" of a system.
       | 
       | But besides being deflationary in this distasteful way, what
       | bothers me with pictures like this is that they make use of
       | entirely subject-made divisions between objects and their
       | environments, and presume that they might correspond to
        | experiences because - why not? Why not think of the bottom and
        | upper halves of the thermostat as corresponding to two fields of
        | experience? Or the quarters, sixteenths, and so on until we get
        | to individual atoms?
       | 
       | The thermostat doesn't "care" if I think of it as the wax and
       | glass separately, or as a single object containing both. But we
       | do have a unified field of experience, and it doesn't matter how
       | another person "cuts us up" in their mind, whether it is as atoms
       | interacting, organs behaving in unison, or just as a "body".
       | 
       | It seems silly to say that between me and Bob having our separate
       | experiences, there is an experience corresponding to "me and
       | Bob", supposedly free-floating somewhere just by virtue of the
       | two of us being cognizable as a physical system.
       | 
       | It turns "experience" and that infamous "qualia" from something
       | that's the most direct and obvious to a weird phantom as the
       | output of a presumed equation which maps some description of a
       | physical state to an "experience".
       | 
       | No wonder you'll find people who'll retort that they don't
       | experience things or that their consciousness is illusory - they
       | have these weird detached notions of experiences to fight
       | against.
       | 
       | * I imagined a thermometer throughout reading this piece, hence
       | the mention of wax and such. It doesn't really change the point
       | so I'm leaving it.
        
         | nuancebydefault wrote:
         | Each time someone tries to explain in a scientific way what
         | they think is consciousness, you see this analysis phase,
         | breaking it down in steps to the bare minimum. From a
         | scientific point of view, this makes perfect sense.
         | 
          | This leads to two points of view: the scientific, leading to
          | reduction, and the more philosophical, which says there's no
          | way to describe it since it is _super-natural_.
          | 
          | I lean toward the more scientific approach: we are no more and
          | no less than the sum of our parts, and each of our parts, at
          | any sub-scale, has some resemblance to a thermostat: some
          | object that reacts to its environment.
        
           | tananan wrote:
            | The whole thing with Chalmers' hard problem is pointing out
           | that reduction doesn't get you very far. But here he
           | formulates a reductive panpsychist proposal (though only "in
           | theory"). What part this piece plays within his broader
           | thought - I am not sure.
           | 
            | Nonetheless, it is far from compelling even as a "weak-
            | problem" hypothesis, and is an abstract angels-on-a-pinhead
            | musing that truly puts experience outside of the bounds of
            | investigation. Because, after all, if experience is an empty
            | epiphenomenon which exists for any, however-delineated,
           | physical system out there, where does that get us? We've made
           | an assumption we cannot prod scientifically, yet it hides
           | behind the scientific veneer of reductivism.
        
           | robwwilliams wrote:
           | You might enjoy anything written by Humberto Maturana. There
           | are some sharp lines he draws in defining a living system--
           | what he refers to as autopoietic (self-building) systems.
           | 
           | https://en.m.wikipedia.org/wiki/Humberto_Maturana
           | 
           | For Hofstadter consciousness requires some form of recursion
           | --what he calls strange loops. Our brains are recursion
           | machines "we" can partly control "ourselves".
        
       | mannykannot wrote:
       | I feel that there is an alternative way of approaching the
       | question: to propose that it is only meaningful to ask what it is
       | like to be an X if the X has certain mental abilities, such as
        | some sort of awareness of itself as a participant in a wider
       | world. How would we go about evaluating and choosing between
       | these two views, and is there room for there being degrees of
       | 'what it is like' and self-awareness? It is almost as if we are
       | trying to write the dictionary definition before we know enough
       | to complete the job (which is not necessarily a bad thing, unless
       | we assume that by making a choice, we have, ipso facto, filled in
       | the previously-incomplete knowledge.)
       | 
       | I definitely take issue with Chalmers' opening sentence of his
       | final paragraph: "... A final consideration in favor of simple
       | systems having experience: if experience is truly a fundamental
       | property, it seems natural for it to be widespread." I feel he is
       | putting the cart before the horse here - something that seems
       | quite common in the philosophy of mind - by first deciding that
       | experience is a fundamental property, and then using it to
       | justify the assumption that it is widespread. This strikes me as
       | almost circular, as it seems one could at least as reasonably
       | justify it being fundamental on account of the arguments for it
       | being ubiquitous.
        
         | robwwilliams wrote:
          | You are so right! Both circular, and if you follow through in a
          | Cartesian mode you end up with an infinite stack of
         | "representations" all the way up toward the two neurons that
         | "represent" your two grandmothers. Both Richard Rorty
         | (Philosophy and the Mirror of Nature) and Daniel Dennett
         | (almost any of his works) did us a big favor by demolishing
         | representational dualism---mind vs brain.
        
       | HarHarVeryFunny wrote:
       | It's a dumb click-bait title (riffing on Nagel's "What is it like
       | to be a bat?"), but the actual question presented a bit further
       | down is:
       | 
       | "Moving down the scale through lizards and fish to slugs, similar
       | considerations apply. There does not seem to be much reason to
       | suppose that phenomenology should wink out while a reasonably
       | complex perceptual psychology persists... As we move along the
       | scale from fish and slugs through simple neural networks all the
       | way to thermostats, where should consciousness wink out?"
       | 
       | The author seems to have succeeded in answering her own question
       | (at least in hand-wavy fashion) at the same time as posing it, as
       | well as implicitly defining consciousness. So, yeah, it's not
       | like anything to be a thermostat.
        
         | robwwilliams wrote:
         | Perfect!
        
         | cubefox wrote:
          | Surely you agree that not only are thermostats not conscious,
          | but simple neural networks aren't either, e.g. a single-layer
          | perceptron. And intuitively it's also not just a matter of the
          | number of layers or neurons.
         | 
         | By the way: The headline is, I assume, by Annaka Harris, while
         | the essay is by David Chalmers.
        
           | HarHarVeryFunny wrote:
           | > Surely you agree that not only thermostats are not
           | conscious, but also simple neural networks are not
           | 
           | Sure - there is a difference between merely being cold and
           | also being aware of being cold. A piece of ice (or a cold
           | thermometer for that matter) can _be_ cold, but to
           | _experience_ being cold - to be aware /conscious of it -
           | requires some minimal level of cognitive apparatus ("a
           | reasonably complex perceptual psychology") to process those
           | sensory inputs and contrast them to the differing sensation
           | of not being cold and be able to think about it!
           | 
           | There is some evidence, such as "blindsight" (ability to see
           | without being aware of it - a loss of visual consciousness)
           | that consciousness may not only require the mental apparatus
           | to process a sensory input, but may also require specific
           | neural pathways (which may be missing, or damaged) to gain
           | access to specific internal neural state in the first place.
           | 
           | It's difficult to know exactly where the line is - which
            | animals do have a sufficiently complex brain to be able to
           | introspect on and experience their own state, but clearly
           | simple neural nets (e.g. anything without feedback paths)
           | don't, and simple animals like insects don't either.
        
       | mensetmanusman wrote:
       | If you believe a thermostat has consciousness, you would also
       | logically believe that the cascading subset of (your body -
        | n*atoms) would be conscious. E.g. your arm.
       | 
       | This is one reason the topic is so slippery.
        
       | anfractuosity wrote:
       | https://consc.net/notes/lloyd-comments.html has some more info
        
       | crabbone wrote:
        | I think that we find the idea of a thermostat having experiences
       | strange because, subconsciously, we think of experiences only
       | being accessible to someone / something that has a "will to live"
       | (in the words of Schopenhauer).
       | 
       | I.e. I don't think thermostats want anything. They don't have a
       | capacity to care whether they fall apart or not, whether anyone
        | is satisfied with their function or not. But life, even in its
        | very simplest form, wants something. Experience is what makes
        | living organisms more effective at doing what they want.
       | 
       | What makes living creatures want something: I have no idea. I
       | remember hearing a hypothesis tying this to better preservation
       | of energy... but I don't know about that. But, if I had to guess,
       | it must be something that's available to micro-organisms, so, it
       | has to have some physical / chemical explanation...
        
         | nuancebydefault wrote:
         | Maybe the property of "wanting" is no more than survivorship
         | bias? Since living things survive, they must have a "will" to
         | survive. Maybe that will is just a label that humans invented
         | to attribute to living things?
        
           | crabbone wrote:
            | Well, wouldn't this bias have happened a while ago, to the very
           | primitive organisms? Because the organisms we see today seem
           | to universally want to live. I mean, even if we somehow
           | mislabeled some more fundamental property, on the face of it,
           | it seems to be pretty consistent. Everything about how these
           | organisms are built is indicative of their "will" to remain
            | functional and to procreate. E.g. regrowing parts of an
            | organism's body in order to overcome damage (that would cause
            | the organism to cease to exist) seems pretty universal.
            | Similarly, trying to move away from hostile environments
            | seems to be a pretty built-in feature of anything alive.
        
         | robwwilliams wrote:
         | This is the book you will want to read once, twice, or three
         | times to have a good answer to your question.
         | 
          | Humberto Maturana and Francisco Varela (1979). Autopoiesis and
         | Cognition: The Realization of the Living. ISBN 90-277-1015-5.
         | 
         | Not an easy book, but one that answers your question.
         | 
         | Terry Winograd loves this book. I am a neuroscientist and only
         | found this gem late in my career. Damn!
        
           | crabbone wrote:
           | Thanks! Much appreciated. I do love the subject, although I'm
           | not connected to it professionally.
        
       | neogodless wrote:
       | Their use of "phenomenal" and "phenomenology" confuses me as a
       | layman, but I'll lay out their (likely relevant) definitions and
       | hope to use that to better understand what is being proposed.
       | 
       | > phenomenal: Known or derived through the senses rather than
       | through the mind.
       | 
       | > phenomenology: A philosophy or method of inquiry based on the
       | premise that reality consists of objects and events as they are
       | perceived or understood in human consciousness and not of
       | anything independent of human consciousness.
       | 
       | So the claim (highlighted especially in paragraph 3) is that,
       | outside of humans, things that are _perceived_ may also exist in
       | _conscious_ thought (of non-humans).
        
       | isoprophlex wrote:
       | Another useful metaphor that doesn't cross the threshold from
       | human experience into the "experiencing" that non-living things
        | do would be chemotaxis.
       | 
        | A bacterium finds food with a simple set of states. At its most
       | basic:
       | 
       | - you are experiencing an increasing concentration of food. you
       | keep swimming straight ahead.
       | 
       | - you are experiencing a decreasing concentration of food. you
       | move in a continuously randomized direction.
       | 
        | This eventually gets them onto a track where they are moving
       | towards food.
       | 
       | Extremely simple like a thermostat, yet effective.
       | 
       | https://en.m.wikipedia.org/wiki/Chemotaxis
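        | 
        | If it helps to make it concrete, here is a minimal run-and-tumble
        | sketch in Python (a toy example of my own, not from the article;
        | the gradient and step sizes are made up):
        | 
        |     import math
        |     import random
        | 
        |     def food(x, y):
        |         # Toy gradient: food source at the origin, falling off
        |         # with distance.
        |         return 1.0 / (1.0 + x * x + y * y)
        | 
        |     def run_and_tumble(steps=2000, step=0.1):
        |         x, y = 5.0, 5.0  # start away from the source
        |         heading = random.uniform(0, 2 * math.pi)
        |         last = food(x, y)
        |         for _ in range(steps):
        |             x += step * math.cos(heading)
        |             y += step * math.sin(heading)
        |             now = food(x, y)
        |             if now < last:
        |                 # Concentration dropped: tumble to a random new
        |                 # heading.
        |                 heading = random.uniform(0, 2 * math.pi)
        |             # Otherwise keep running straight ahead.
        |             last = now
        |         return x, y
        | 
        |     print(run_and_tumble())  # tends to end up near (0, 0)
        | 
        | Two rules and no memory beyond the last reading, and it still
        | climbs the gradient.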
        
         | robwwilliams wrote:
         | No, not like a thermostat. A bacterium is a compact and
          | exceedingly complex living system. It has autopoietic autonomy
          | like we do. The fact that I navigate each morning to the
         | refrigerator to get milk for breakfast is not so different from
         | a bacterium following a gradient toward its food. Bacteria have
         | complex receptors like we do that are their sensorium---a nano-
         | brain in the words of one scientist who studies their behavior.
        
           | primes4all wrote:
            | And yet, a thermostat, too, is doing nothing else but
           | following a gradient towards a desired state (albeit the
           | gradient is discrete and it's moving along the dimension of
           | temperature).
        
       | luxuryballs wrote:
       | anyone know what font that is? (on mobile) reminds me of like an
       | old 70s print
        
       | robwwilliams wrote:
       | Chalmers is a dualist living in a Cartesian past. If you like a
       | lively treatment of dead scholasticism of the mind-vs-brain
       | problem then you can do no better. Ditto Nagel.
       | 
        | In contrast, if you want modern post-Cartesian scientific thought
       | on consciousness then hit Dennett hard for philosophy or Ray W.
       | Guillery if you want hard neuroscience (The Brain as a Tool).
        
       | ruthmarx wrote:
       | For it to be 'like' to be anything, the anything must have some
        | sense of self. Without a sense of self, there is just information
       | processing.
       | 
       | For that reason, it isn't 'like' anything to be, say, most
       | insects, let alone a thermostat.
        
       | mode80 wrote:
        | I remember inspecting the thermostat in my parents' house as a
        | child. It was a coil of something metallic which I assume expands
       | and contracts with temperature and physically pushes electrical
       | contacts together to turn on the heat when needed. Knowing how it
       | works, it's hard for me to imagine that this feels like anything.
       | The whole contraption is just an arrangement of molecules doing
       | what molecules do. But then again, so am I.
        
       | ryandvm wrote:
       | "What is it like to be a bat (or a thermostat)?" is too abstract
       | for anyone to thoughtfully grasp.
       | 
       | Instead try asking yourself "what is it like to be asleep?" or
       | "what is it like to be waking up?" or "what is it like to be
       | heavily sedated?"
       | 
       | We all experience various gradients of consciousness every day as
       | we do things like drift off to sleep or slowly gain consciousness
       | in the morning. You don't have to try to imagine the experience
       | of another primitive life form when you can just recall what
       | there is or isn't to your own conscious experience as you drift
       | between states.
        
       ___________________________________________________________________
       (page generated 2024-12-30 23:01 UTC)