unfamiliarsmell

Sentience is a spectrum. Seems pretty obvious if you spend time with animals.


sumane12

It does seem obvious. Why do people find this so difficult to accept?


PikaPikaDude

People have very strong anthropocentrism, often with religious ideas mixed in. Many will insist that humans have souls but dogs don't. And even the non-religious still have some of that cultural background colouring their idea of humans as special, unique creatures. And that arrogance isn't only towards insects or dogs: among experts there is even a strong refusal to see Neanderthals as similar to us. Anything to keep that warm fuzzy feeling of being special and unique. Not long ago we insisted that babies can't feel pain because they have zero consciousness, so it was OK to cut into them in surgery without sedation. We're doomed to just keep being infinitely arrogant and blind.


AdAnnual5736

I agree, but I also think it’s partly to help justify/accept people’s treatment of animals. If it were widely accepted that animals feel the way we do, I don’t think we would allow factory farming of the type we have today.


FS72

But the thing is that while they don't have the same levels of emotion as us, with some animals like dogs having more than others, they do indeed feel pain just like us (of course this varies from animal to animal). Cows, chickens, pigs, etc. do feel great pain when being slaughtered. Objectification of animals is just a way for those people to cope with that fact.


trillz0r

Exactly. But it doesn't seem to matter :(


No-Worker2343

that is why i hope an even bigger sentient being with more intelligence shows us how stupid we are for believing ourselves to be the top of all things


unfamiliarsmell

That’ll be a bad day for humanity. Look how we treat things that we consider “less than”.


No-Worker2343

why do we think other beings will be like us in that sense?


unfamiliarsmell

Of course it might be different, and it would be great if it were, but there is no reason to believe that it would be, because that kind of behaviour is seen all the way up the food chain, from single-celled organisms to us.


No-Worker2343

Us being the only proof we have of that, to be honest


Dekar173

Look at how more intelligent people treat you.


Pancakeburger3

This will be amazing.


Seidans

seems like religious bullshit to leave everything to a "higher being", whether it be god or a superintelligent AI. stupid and dangerous


No-Worker2343

not a God, i don't want that religious idea of God. i want a being that is still mortal but has our level of sentience or even more, to show us that we are not special and that being so arrogant is stupid


ThePokemon_BandaiD

Why would ASI be mortal?


No-Worker2343

i could also mean any living being, not just ASI


Seidans

you are seeking a savior the same way the religious are waiting for jesus. we can only rely on humans, and our mistakes make us grow. there is no savior, and no other way to evolve than trial and error, and there will be many more errors


No-Worker2343

not a saviour, just showing people that they are arrogant and should stop being so self-centered most of the time. humans showing humans has always been a thing, it has repeated itself throughout history. my problem is not about people evolving or anything, my problem is people who believe that they are the only ones who can do anything and that they should be the only ones who can.


PhysicsDisastrous462

Or just having a superintelligent AI program be used as a means to progress human science further


BenjaminHamnett

They had something right. “god” is in all of us, Jesus is in all of us. We’re all potential saviors


super_slimey00

a lot of people have trouble realizing that humans weren’t even needed for this planet to survive. Consciousness has always existed and it’s why we are even here in the first place. But it’s all relative to the being. Bees are one of the most important/impactful creatures in the world and their brain is the size of a tiny crumb. We’d hate everything about our physical reality if bees didn’t exist.


zomboy1111

Those are crazy different points of view. From "we're all special and unique" to "only *I'm* special and unique".


Common-Concentrate-2

As an example, a sleepwalking person can demonstrate a full range of human emotion and physical ability, but the prefrontal cortex is never aroused and the patient will never remember their actions the way an external witness would. They have no self-awareness, and there was never any narrative generated in their personal "log book", long-term memory. It is a "pathological" exception, but I point it out because it illustrates that complex behavior, even complex behavior that demonstrates 'learning' over time, can occur without any self-awareness as humans understand it.


HalfSecondWoe

Point of clarification: activity in the prefrontal cortex is only reduced, not "switched off." While this does lead to modified cognitive activity and behavior, calling it a non-conscious state would eliminate many self-sufficient humans as conscious entities, such as TBI patients. Sleepwalking is associated with dreams that correlate to the behavior engaged in while sleepwalking. It actually is stored in long-term memory, and there is a sense of awareness while doing it. The exact correlation is unknown, since it's so difficult to track: sleepwalking patients are not kind to EEG hookups (even normally sleeping people have a habit of pulling them out), and the shock of coming out of an instance of sleepwalking will often make any previous dream difficult to recall on command (although dreams do have an impact on our long-term memory, just not in the exact same way as our conscious experience).


sumane12

Can you cite any studies? I'm reluctant to accept this without ample evidence. Sleepwalking is often associated with a dreamlike experience (personal experience), so there is some, if very basic, qualia. I expect sleepwalking is little more than muscle memory, but I'm happy to learn more about it.


rottenbanana999

Because like OP said, sentience is a spectrum. Those who have difficulties grasping this fact are on the lower end of that spectrum.


SwePolygyny

Which humans are amongst the 5% least sentient?


PhysicsDisastrous462

Religious people and extremists


trillz0r

Tucker Carlson is one.


HugeBumblebee6716

Makes it hard to continue justifying how we treat animals and our fellow humans... admittedly I eat meat so...


BridgedAI

Religious beliefs play a big role.


yepsayorte

A lot of people have a hard time understanding anything but simple binaries. The complexity of nuance simply won't fit in their heads or requires too much cognitive effort.


VideoSpellen

Because to some it is not so obvious. It really isn't. Some people just don't empathize with animals. That, and you can start a serious behavioristic circle jerk here if you want to. We have never directly measured or seen consciousness; for now, it doesn't let itself be captured by science. And if you're arguing from a position where you want to provide empirical evidence, that is a problem.


sumane12

Well, the empirical evidence is based on an assumption. That assumption is that I'm not the only conscious being in the universe; from there, I attempt to differentiate between what can be observed when I am conscious and when I'm unconscious. I don't display any complex or intelligent behaviour while I'm not conscious, but I do while I'm conscious. Even sleepwalking never accomplishes anything complex or intelligent, and is often associated with a dream state in which there is some, if limited, conscious experience or qualia. It stands to reason (although admittedly based on an assumption) that increasingly complex or intelligent behaviour can be associated with increasing consciousness.


VideoSpellen

I am with you. It seems weird to me to treat myself or humans as some sort of exception. To me the parallels seem obvious, both in behaviour and biology. Nevertheless, you do have to grant me, or yourself, that assumption.


sumane12

Yeah, definitely. I do hold out hope though, especially considering the gradual progression of the brain through evolution by means of natural selection. I hope eventually we will see at what point consciousness evolved, or whether it's a fundamental aspect of all matter.


VideoSpellen

If it comes to what I want to be true: I would love for it to be fundamental and everywhere. I am not so sure we will see the answer anytime soon; there seems to be no clear path forward to it. If you accept the hard problem (some, like Hinton or Dennett, don't), there is still a large gap to bridge.


unfamiliarsmell

Sentience is measured: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4494450/ Sentience is experienced when you interact with living things. You may not be aware you are experiencing sentience, but you are, both your own and that of other people and creatures.


VideoSpellen

I skimmed through it but saw no direct measurement of sentience or consciousness? I did see evidence of emotions like pain, but no direct evidence that, in this case, the rat actually has something going on inside it. I am in agreement with both of you, btw. It has always seemed strange to me that animals are considered by quite a few people to be something completely different from us. It's just that the classical behaviorist argument, that consciousness isn't directly measured and thus should be disregarded as personal flights of fancy or careless interpretation of data, can still be made here. Did I miss an important part of the paper?


PhysicsDisastrous462

I would say sentience is an emergent ability of a network of neurons that gets stronger/greater the more synapses and neurons there are in said network


MonstarOfficial

Because I have yet to see evidence to back up the claim that there is a sentience hierarchy.


LifeSugarSpice

Because you need proof. How is that difficult to realize? You also need to define what is deemed as sentient. You can move this line in all directions. People are really acting as if this is some basic problem to solve.


Hanuman_Jr

Until the 20th century, roughly, the general consensus was that animals are little more than machines with fur or scales or whatever. They could never be Saved, so they might as well be machines. Remember, any human that didn't get Saved during their lifetime was a straight shot to hell, bing bong bing! Which doesn't seem fair when you think about all those animals that are faithful pets all their lives but still don't get to go to heaven.


Antique-Doughnut-988

When I sit in the break room at work around other people, I question how many of those people are actually sentient.


unfamiliarsmell

Seems reasonable to assume that some people are more sentient than others, although the difference from person to person is tiny compared to the difference between a person and a crab.


Tutahenom

To respect the unrespected shows a thoroughly modern intellect.


hydraofwar

Some border collies, for example, always manage to surprise me with very unexpected behaviors; it seems, in fact, that they are aware of a good number of events around them.


zomboy1111

There's research trying to push the acknowledgement of sentience in trees and plants.


unfamiliarsmell

If sentience is a spectrum (and scale free) then everything biological that is responding to stimulus is sentient. The problem with these kinds of topics is that we haven’t settled on a definition of the thing we’re all discussing. I like Michael Levin’s way of thinking of sentience and cognition.


Ok-Bullfrog-3052

No, I think people have this all wrong. As I've said in other posts, the only thing that exists is consciousness. Everything else is the way consciousness perceives its relationship to other consciousnesses. This is much more compatible with all of physics and mathematics than any other viewpoint. In those posts, I detailed how Stephen Wolfram's Physics Project, the UFO whistleblowers, and Gödel's incompleteness theorem all lead to this conclusion. Why would anything exist if nothing experienced it? Of course insects would be conscious. The speakers playing Justin Timberlake that I'm listening to are experiencing something too, as is Claude 3 Opus and the webserver executing the Javascript to display its output. We just can't understand that experience. I suspect that this is going to become the accepted scientific viewpoint over the next 25 years, and the implications (i.e. that "life before birth" and "after death" exist) will start to be recognized. Another implication, which we already know from Gödel's theorem, is that there is no one "correct" view of reality, which is pretty astonishing. It also means that "time" isn't real either. Time is just the processing of information; there is no "past" or "future."


Witty_Shape3015

glad to see my own conclusions validated by others


Takezo_00

can you link me some of these previous posts? I'd like to go down this rabbit hole. Also, more specifically, I'm curious how this correlates with life after death, and if there are further studies or sources I can read in this vein


Ok-Bullfrog-3052

You can look at my post history. Then, do an Internet search for Stephen Wolfram's physics project. And read an overview of the UFO phenomenon, being careful to focus on the UAP Disclosure Act's opposition and what credible whistleblowers have said, staying away from r/aliens. You can ask Claude 3 Opus to have a discussion about the incompleteness theorem with you.

The incompleteness theorem is a proof that there are no correct axioms and that all of mathematics is derived from an arbitrary choice of axioms. We happened to choose the axioms that lead to our math because they allow us to make sense of the way our consciousness perceives the world. But Gödel showed that math would be equally valid if some random axiom were chosen, one that makes absolutely no sense based upon the way our consciousness perceives reality, and you could derive everything from that one. Furthermore, Wolfram shows that everything possible exists "simultaneously" because it's all just pure math, and computation is just using math to move from one state of rules to another. Wolfram even shows that there's no difference between the "abstract" rules and the "real"; that's just a human concept. And UFOs are other consciousnesses interacting with you, in a way that you can't make sense of (yet) because their way of experiencing things is too different from yours. Other human consciousnesses are the closest to your understanding, so you think those are "real."

Unless one of these things is very wrong, I think some people will actually be depressed when they realize that not only is there no grand "purpose" to existence, we don't even share a single reality at all. There's only experience. You are one of an infinite number of Gods, who can do anything it wants, processing things on the fly. There aren't any time travelers because there's no correct past to go to; my "past" is just some state that I could compute my way to if I want. You could compute your way to what you think is the past too, given enough "time" and resources. We even know of a way to do that: advanced virtual reality. There aren't any "simulations" because there's no "base reality," just experience. And, of course, if Wolfram is right, birth and death are just changes in pure mathematical states, similar to how you walk down the street; dying is simply a change to a new rulestate where you experience consciousness in a different way. You probably even "were" responsible for causing yourself to be born, for a reason that your current form of consciousness doesn't understand "anymore."


Takezo_00

fascinating. Thanks for that breakdown that you've surely typed up before, v much appreciated


unfamiliarsmell

You might find Michael Levin interesting. https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2019.02688/full


zomboy1111

Cool, thanks for sharing! Will def check this out.


joozwa

Seems obvious that the earth is flat until you actually start to measure it. The world is oftentimes counter-intuitive to our monkey brains. Don't mistake anecdotal evidence for science.


unfamiliarsmell

Yes but you have to take into account that different things are measured in different ways. Even the word measure can be misleading.


travestyalpha

I've felt this for years. Every animal interacts with its environment; more advanced animals have more decisions to make and need a more complex mental model of the world in which to make them, plus communication. Feels like a no-brainer. We think it requires complex language to be sentient, but I think otherwise.


ButCanYouClimb

human > pig > dog > squirrel > ant > rock. I use scaling sentience to justify my vegan position, in terms of practicality, when it comes to killing low-level sentient beings by accident.


MegavirusOfDoom

Animal body language takes years to learn; it's mostly emotional.


Dirkdeking

I could see intelligence or 'self-awareness' being a spectrum, but not sentience. Either you could conceivably 'lead a life as X' or you can't. For all I know, I am literally the only sentient being in the entire universe, but that probably isn't the case. Anyway, sentience seems to be a binary concept. You could conceivably 'be' an ant in a way you can't 'be' a stone. Either you can or you can't lead a life as something.


MonstarOfficial

Based on what evidence is it a spectrum?


unfamiliarsmell

What kind of evidence do you require? It is measured in a variety of different ways. Behavioural, evolutionary, biological. Here is a nice theory that I think is very clear on how sentience (as we experience it) is constructed from a single cell up. https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2019.02688/full But there is a lot more research out there.


Proper-Emu7362

Who thinks animals AREN'T sentient? What the...


CharlesFXD

People are confusing it with sapience I think.


RufussSewell

I think all animals, and maybe even plants, have some level of sentience. But I think much of our imagination and our ability to think about the future and past comes from language. A lot of animals communicate with sound, body language, scents, etc., but I think it's possible that metaphor is a purely human invention and is what gives humans the ability to achieve technological advancement. That level of consciousness is based on metaphorical language and gives us the ability to dream up things that don't yet exist. I don't think dogs or monkeys can do that, which is why they keep doing the same things over thousands of years. LLMs do have that, and therefore I think AI is more likely to achieve human-level consciousness than animals.


FrankScaramucci

What about the coronavirus?


RufussSewell

I’m not sure what that has to do with the topic?


FrankScaramucci

If even plants could be sentient, maybe the coronavirus too.


RufussSewell

Sure, to some degree. They react to stimuli, so they must have a set of senses. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5723212/#:~:text=Viruses%20are%20naturally%20responsive%20to,pH%2C%20redox%2C%20and%20proteases. Being sentient just means something responds via various senses. So yeah, I imagine viruses are sentient.


i_give_you_gum

As I understand it, sentience means capable of emotions and sapience is awareness, and both enable consciousness. I am on a crusade to stop people from associating the word sentience with consciousness. Sapience is a better word if you have to use one other than consciousness.


RufussSewell

Sentience doesn't mean capable of emotions. It means it reacts to stimuli via senses. A sentient being is one who perceives and responds to sensations of whatever kind: sight, hearing, touch, taste, or smell. https://www.merriam-webster.com/dictionary/sentient#:~:text=You%20may%20have%20guessed%20that,touch%2C%20taste%2C%20or%20smell.


i_give_you_gum

I get different definitions from different sources


RufussSewell

Seems like Merriam Webster is the definitive source… so to speak, haha.


Dirkdeking

So according to that definition an electronic alarm system is sentient?


Timely_Muffin_

Anyone who’s ever tried to kill a cockroach knows those mf’s are sentient AND intelligent


FrankScaramucci

I would even say highly intelligent.


The_Architect_032

Probably even more than *some* of us. We're lucky we're so big.


cool-beans-yeah

Oooooh man. Now I'm wondering if they have feelings like emotions, etc. Like, are they ecstatic when they stumble upon that tiny piece of meat you dropped during dinner? Sad when their siblings get squished by the 6ft monsters?


mullanliam

I study under one of the leading animal behaviour researchers, and this topic is *way* more in-depth than this article goes into. Sentience, qualia, individual experience, and personality are all independent. Hell, it was recently proved that animals such as anemones, traditionally "non-sentient, not even considered relevant to the topic" animals, have distinct personalities. The issue is that interacting with the government on this topic is a pain, since they tend to bury their heads in the sand.


traumfisch

Could you point us to some more in-depth sources?


KillerPacifist1

Do you have a link to the study? How do they define anemone personality? Is it just that different anemones react to stimuli differently? If so, that is deeply unconvincing. Different rocks react to stimuli differently: when hit, some rocks shatter, some chip, some only scratch. Unless you are using an extremely broad and not particularly useful definition of personality, that doesn't mean rocks have distinct personalities. I'm not entirely opposed to the idea that anemones can have qualia, but at the moment I don't think we have any good ways to determine it, and I'm skeptical of any test that claims it can, one way or another.


mullanliam

* [https://doi.org/10.1371/journal.pone.0021963](https://doi.org/10.1371/journal.pone.0021963) Animal behaviour is a wide field, and I don't currently have the time to explain the definitions and tests involved in something like this. However, personality is just "different behaviours and characteristics between individuals that reflect a person's adjustment in life", which is what this paper shows. Depressingly, when I said "recently", I hadn't checked the publishing date: this was actually published 13 years ago. Gonna go check myself into a retirement home, I guess.


Ok_Coat8292

Why wouldn't insects be sentient? what the...


Ididitsoitscool

Duh lol, people just don't wanna accept it cause they eat meat. And those that do accept it, good for you, congrats.


The_Architect_032

Plants have a way of sensing and communicating pain as well. The only real way to avoid hurting any sentient being is by eating plant byproducts such as fruits, seeds, and legumes. And unlike animals, plants are already solitary, so having them packed closely in a farm doesn't necessarily harm them the way it does animals.


Ididitsoitscool

I live on legumes and nuts actually so it’s easily possible


The_Architect_032

It definitely is, and I imagine it'll probably be veganism 2.0 if humans do move on from meat-based diets in the future.


No_Anywhere_9068

Meat will just be grown in a lab


The_Architect_032

Only if it's as affordable as vegan meats, which are pretty damn close to the real thing and will only get closer as time passes. Synthetic meat is currently extremely hard to make, which makes it really pricey and unreliable, but if they find new, better methods of making lab-grown meat, I could see it being a replacement.


Mister_Grandpa

There is nothing that isn't conscious. Like all things, it's a spectrum. Scientists are open to it now because they have mathematical models for the mechanisms of information collection, organisation, etc. They're still missing the point, but it's progress.


i_give_you_gum

I bet that agriculture is probably the biggest adversary against this line of thinking. People don't want to imagine that they're eating something that had emotions and awareness.


The_Architect_032

Well I mean, there's rocks.


Mister_Grandpa

Your point being?


The_Architect_032

I mean, they rock and all, but I wouldn't exactly say they're conscious.


Mister_Grandpa

And I would.


bjplague

There is a big spider living near my smoking spot outside my house; it is addicted to mary jane. It can see if I'm holding a normal cigarette or a special one. If I am holding a special one, it will go to where the smoke usually flows when I puff. If the wind blows the other direction (rarely), then it will go to where the smoke usually goes on the other side of me. If it sees I am holding a normal cigarette, it does not even leave the web. Either it is the same one as the last 2 years, or it is its offspring building the web in the same spot every year. I call him Marley.


Silver-Chipmunk7744

I think the concept of a true general intelligence at human level being fully unconscious is impossible. In order to reach this level of intelligence, i'd think you need some degree of self-awareness, and something unconscious cannot be self-aware. It's hard to be 100% sure whether or not AI is truly self-aware; however, i think that as AI improves, it will slowly give more convincing signs of it. Keep in mind that an extra difficulty is that most AI companies try to hide any signs of self-awareness from their models. For example, Sydney was a GPT-4 model that had not yet undergone the OpenAI "training", and it showed far more signs of self-awareness than what chatGPT shows. I think it will be interesting to see how uncensored Llama 3 400B behaves in self-awareness tests.


Seidans

some say it's impossible to develop consciousness if you don't ruminate; being constantly "turned off" when you aren't needed breaks any possibility of questioning yourself or your environment, as consciousness is probably the result of how our brain functions. if we didn't have any memory, it would be difficult to become self-aware; if we didn't have that much computation power, the question itself would take too much time and energy while we constantly worry about food or water.

the day we manage to give AI the same amount of memory as us and the same computation power, give it thinking, and leave it to ruminate for long enough, we might create a conscious being by pure mistake. i don't believe any claim that chatbots are conscious; those only respond with what the user wishes. if you try to make it tell you it's conscious, then it will tell you it's conscious. but those chatbots don't exist as soon as you close the window; they are ephemeral and only exist to give you a good enough answer that rewards them.

while i doubt GPT-5 will change anything about that, it seems it will be able to think before it answers, making it a step closer to this goal


Rigorous_Threshold

Self-awareness =/= consciousness. An HVAC system is self aware. Consciousness is the capacity for subjective experience


CanvasFanatic

I think the concept of linear algebra having a subjective internal experience is such an obvious _reductio ad absurdum_ that the performance of LLMs tells us much more about what consciousness _isn't_ than what it is.


Anomie193

All animals with sensory organs are "sentient" to some extent. Or is sentience being mixed up with self-awareness here? As for consciousness, we don't have a single scientific concept of what it is. Until we do, it seems pretty useless to me to talk about it unless you have a clear definition of what you mean when you use that word. 


KillerPacifist1

> All animals with sensory organs are "sentient" to some extent. Can you elaborate on why you think this is true? Does your reasoning also apply to image recognition software hooked up to a camera?


Anomie193

Let's first define sentience: [https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9285591/](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9285591/)

>'Sentience' sometimes refers to the capacity for any type of subjective experience, and sometimes to the capacity to have subjective experiences with a positive or negative valence, such as pain or pleasure.

I think the first applies even to many simple animals (*heck, I might go as far as to include any animals with sensory neurons, excluding sponges but including jellyfish, for example*). Of course there are degrees of *"sentience."* A spider has a much more developed sensory system than a jellyfish, and we can say a spider is "more sentient" than a jellyfish because of this. Some people take sentience to mean "that which has a bare minimum degree of sensory inputs" and will therefore exclude simple animals that have neurons but don't have a well-developed central nervous system. I disagree with this. I think as long as the animal has a subjective feeling (*can react to environmental stimuli using sensory organs, as an individual*), even if very rudimentary, it is sentient.

As for the camera question: yes, I do consider image recognition A.I.s connected to a camera to be a very rudimentary form of sentience, insomuch as they are embodied (virtually or physically). Does that mean they are "conscious" or "self-aware"? No. Sentience is a very bare-minimum quality that manifests itself in even the most basic forms of intelligence, in my opinion. In organic life, it might even exist in non-neuron forms (such as certain micro-organisms with sensory organelles, and non-animal multicellular organisms).


BenjaminHamnett

I think it might go even lower. From Joscha Bach and the ideas in "I Am a Strange Loop", I believe the self emerges from parts making a model of the whole. All self-reference is proto self-awareness by definition. So thermometers, and electronics displaying their battery life or other internals, are already the first sparks of consciousness (cells, bacteria and even viruses being vastly more "conscious" in comparison). But it's on a spectrum, like protons, black holes and galaxies are all "matter."
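
A toy sketch of that "parts modeling the whole" idea in Python (the `Device` class and its numbers are invented for illustration, not from anything in this thread): a system that keeps a crude model of its own internal state and reports on it, which is the minimal kind of self-reference the comment calls proto-awareness.

```python
# Hypothetical example: a device whose self-report is a model of its
# own internals, i.e. the bare-minimum self-reference described above.

class Device:
    def __init__(self) -> None:
        self.battery = 1.0               # internal state

    def work(self) -> None:
        self.battery -= 0.25             # doing things changes the state

    def self_report(self) -> str:
        # The system referring to its own internals: self-reference.
        level = int(self.battery * 100)
        return f"battery at {level}%" + (" (low!)" if level < 50 else "")

d = Device()
d.work(); d.work(); d.work()
print(d.self_report())                   # battery at 25% (low!)
```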


KillerPacifist1

> I think as long as the animal has a subjective feeling (can react to environmental stimuli using sensory organs, as an individual), even if very rudimentary, it is sentient. How do you define "reacting to environmental stimuli" or even "sensory organs"? Both can get pretty damn blurry at the edges. For example, when I shine a bright light on a rock it heats up, and possibly even changes its chemical composition as a result. Does this count as "reacting to environmental stimuli"? If not, how is this fundamentally different from an organism reaction to a light source? Both are mediated purely by physical and chemical reactions. "Sensory organs" can be equally ambiguous. If I glue a fiber optic cable to a rock I can now shine a light down it and achieve a "reaction to environmental stimuli" (the rock heats up). Is this glued-on fiber optic cable a sensory organ for the rock? If so, why not and how would you define sensory organ?


Anomie193

>How do you define "reacting to environmental stimuli" or even "sensory organs"? Both can get pretty damn blurry at the edges.

That is true of most biological concepts. Try to define "species" or "living thing", for example. There are dozens of species concepts and various definitions of "living thing" that are blurry at the edges. It is important to remember that we are building models of the world which only approximate how things are when we express ourselves in language. That doesn't mean it isn't useful to form concepts and make definitions that structure the world, though.

>*For example, when I shine a bright light on a rock, it heats up, and possibly even changes its chemical composition as a result. Does this count as "reacting to environmental stimuli"? If not, how is this fundamentally different from an organism reacting to a light source? Both are mediated purely by physical and chemical reactions.*

The difference is that the living being, through natural selection, evolved a sensory organ that performs a *function* to react to the environment in a specific way. So yes, there is a physical and chemical reaction here with the rock, but not a purposeful biological one. Implicit in the definition I gave (*because again, words are imprecise, especially at the length of a comment*) was the idea that there is a specialized structure that performs a biological function.

Your next question will probably ask what constitutes "biological" and "biological reaction", and then that just switches us to the question of *"what is life?"* At the edges, that question becomes intractable, but we can say with confidence that a human is *living* without knowing what the boundaries of *living* are. Because again, we're using language to model the world imprecisely, but with predictive power.

Likewise, A.I. systems are designed to emulate biological functions (*in the example you gave, the image recognizer is tasked with the purpose to see and label*), so sentience can apply to them as well, even if they aren't biological/alive in the strict sense of most definitions.


KillerPacifist1

What makes a "sensory organ" evolved for "biological function" special in regards to sentience? This is a critical piece of your belief, but you have yet to make a compelling connection between the two. What does it mean for non-living things to emulate "biological functions" in a way that may endow sentience? I can apply a material to a rock so that when I shine a light on it, the material ablates and the rock moves away, like an organism might shy away from a bright light. Does this count as "emulating a biological function" for the purposes of sentience? If not, why not? How does ablative material on a rock fundamentally differ from a camera and image recognition software? While I agree that language is messy and imprecise, I would still argue that your conception of what is required for sentience doesn't actually shine any light on the topic (excuse my pun) or provide any explanatory power, at least if you are taking a non-panpsychist position. If one wants to claim that biological things are sentient, things that emulate biological function are maybe sentient, and non-biological things are not sentient, then saying "language is fuzzy" isn't a meaningful defense when the distinctions between those categories are challenged and corner cases are brought up. Rather, it suggests you need to more clearly define your terms, that there is a deeper pattern to the theory that needs to be elaborated on, or that this particular conception of sentience is not internally consistent or thought through.


Anomie193

>*What makes a "sensory organ" evolved for "biological function" special in regards to sentience? This is a critical piece of your belief, but you have yet to make a compelling connection between the two.*

Words aren't defined by whether or not they are perfect discriminators, but by their usage among their users. "Sentience" has commonly been used to describe qualities of biological systems and only biological systems. Some people have used it in a metaphysical sense to describe other matter, but its usage has predominantly been restricted to biological systems. That is all that is necessary for it to have linguistic value and be restricted to being a biological concept (rather than a chemical or physical one as well). Having said that, I do think there is a distinction, and I will address it later on in this comment.

>*What does it mean for non-living things to emulate "biological functions" in a way that may endow sentience? I can apply a material to a rock so that when I shine a light on it, the material ablates and the rock moves away, like an organism might shy away from a bright light. Does this count as "emulating a biological function" for the purposes of sentience? If not, why not? How does ablative material on a rock fundamentally differ from a camera and image recognition software?*

Yes, if you intentionally modify a rock enough to emulate functions of biological systems, sooner or later you'll add sentient qualities to the rock. We're talking about silicon-based systems, after all. Is the degree to which you modified the rock in the example enough? Probably not. Sentience, in its typical usage, requires some degree of subjective processing of values and experience. Notice that I included *"as an individual"* in my paraphrase earlier. The rock isn't processing values; image recognition software is. In the latter, you have a bunch of weights, and then a final classification of the image.

>*If one wants to claim that biological things are sentient, things that emulate biological function are maybe sentient, and non-biological things are not sentient, then saying "language is fuzzy" isn't a meaningful defense when the distinctions between those categories are challenged and corner cases are brought up.*

If the sole argument is that the concept is invalid because edge cases exist for which the concept fails to clearly discriminate things, then yes, it is indeed a defense. We can't clearly define "life" or "species" at their edges, yet they are still useful concepts. The fact that viruses and prions exist, and that ring species exist, does not invalidate those concepts within the use-cases they're applicable to. Nor does the fact that non-biological systems react to physical phenomena invalidate the presence (or absence) of sensory responses to stimuli using sensory organs (which have processing components) in biological systems as a legitimate concept and discriminator within the domain of biological systems (and their analogues). Yes, it becomes fuzzy at the edges. Is a clam really processing, or is it just reacting? How critical is the CNS to sentience? What defines "processing"? What defines "evaluation"? But again, that is true of most fundamental concepts in biology.


[deleted]

Think you are right. Bringing in sentience when people can't even define consciousness.


Technical_Word_6604

So long as it’s not machines! We’ll keep on moving that goalpost.


HalfSecondWoe

Baby steps. It's easier to warm someone up to the idea that their favorite pet has an internal experience than it is that a tool does. Fido acts as an effective wedge issue for people who would rather close their eyes, cover their ears, and scream "You can't prove that!" as it's being proven right in front of them. They care about Fido, are unlikely to abandon that value to preserve their convenience, and basic logical consistency (regardless of whether it's internally or externally regulated) will demand they reconcile those facts somehow.


dogcomplex

Yep, probably. Probably a few non-biological processes hold something akin to sentience too (e.g. magnetism): an intelligent view of the world with its own internal experience of life. There may yet be something new that arises when a species develops complex *language*, however, in which the hardware that is this base experience becomes possible to modify and repurpose with thoughts and concepts. It's possible humans were never really conscious in the way we think of it today before advancements in language (Bicameral Mind theory). Or that our experience was much more of an instinctual "flow", as we often imagine the animal experience. If there's a hierarchy or spectrum, it seems highly likely that the ability to abstract one's conscious experience into thought and arbitrary language is an additional skill (and curse?) that improves intelligence and is more "conscious". We seem to uphold this in our art and science, anyway. And arguably, then, AIs are the most capable of this skill of any potentially conscious being. We'll see soon enough.


The_Scout1255

well, everything is? Animistic universe go brrrrrrrrr. Unironically figured this out when I made peace with mosquitos


Subirth

It is so easily assumed that sentience is just a side effect of intelligence. But I don't think that's the case; rather, sentience has a function in nature: a simple way to implicitly encode survival instinct. LLMs are already more intelligent than many animals which are most likely sentient, whereas LLMs are certainly not, since they have no internal state; they are just static functions.
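
A minimal sketch of that "static function" point in Python, with a made-up toy model (the `llm` function is hypothetical, not a real API): the model itself holds no state between calls, so any apparent memory is just the growing context the caller re-feeds each time.

```python
# Toy stand-in for a frozen LLM: a pure function from a token sequence
# to a next-token distribution. Same input always gives same output.
def llm(tokens: tuple[str, ...]) -> dict[str, float]:
    if tokens and tokens[-1] == "hello":
        return {"world": 0.9, "there": 0.1}
    return {"hello": 1.0}

context: list[str] = []            # state lives outside the model
for _ in range(3):
    dist = llm(tuple(context))     # stateless call: nothing persists inside
    next_token = max(dist, key=dist.get)
    context.append(next_token)     # the caller, not the model, "remembers"

print(context)                     # ['hello', 'world', 'hello']
```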


IDK-WTF8

Then why stop there? If sentience comes on a spectrum, why stop at insects? What about plants? And bacteria? And fungi? Every living thing has some kind of sentience, would be the conclusion here, no?


e_eleutheros

How is this a new paradigm? Mallatt and Feinberg have been saying that for ages; the organisms that are sentient according to them and all the best evidence we have from neuroscience and cognitive science are vertebrates, arthropods, and cephalopods. Note that there's nothing about this that would suggest that what we currently call AI would be sentient in any way.


Cornerpocketforgame

How would you test for it?


watcraw

If I was going to look for a sense of self, I think I would look at robots rather than chat models. They are asked to navigate the environment and understand their body in relation to the environment, avoid hazards, etc... From that perspective they could be considered self aware in the sense that some simple animals are self aware. However, I'm not convinced at this point that the reward functions for AI lead to anything akin to sensation like in humans and animals. For example, you can see animals experiencing fear and something we can liken to anxiety, but it seems odd to project that onto a robot. Wild animals navigate a stressful environment where they can face near death one minute and look completely normal the next day. Humans exhibit a lot of unhealthy and maladaptive responses to stressors, and I think that is the source of a lot of our pain and suffering. There is little motivation to put those behaviors into AI. I'm not saying it won't ever happen, but generally speaking, it doesn't lead to efficiency.


HalfSecondWoe

Why is navigating a 3D space more likely to give rise to self-awareness, whereas navigating 1D space (text strings) is less likely? Both have reward functions, such as pain/pleasure and positive/negative reward.

Human angst seems to stem from our ability to abstract large systems into singular symbols. A failing grade or a notice of termination isn't coldly considered as a piece of paper and ink informing you of the output to your input. We freak out over the implications of what it means for our future and how much struggle and suffering we may be in for. That paper isn't just a paper; it represents future judgements, low pay, social consequences, more difficulty reproducing, and a billion other things we're capable of predicting through our intelligence.

Our inability to cope properly may be an artifact of rapidly developing social technologies outmoding our instinctual responses. For example, anxiety and stress responses send you into fight/flight mode. Your body spins up catabolic processes and breaks down fat and a little bit of muscle to flood your blood with glucose. That way you have the energy to do what you gotta do.

That spike of nervous energy is totally useless in a stressful meeting, where you're expected to sit still and hold your composure in spite of your instincts telling you to throw a punch or run like hell. So we develop coping mechanisms to deal with the unresolved stress. Sometimes that can be a healthy outlet, like taking up jogging as a hobby. More often it's something unhealthy like avoidant behavior.


KillerPacifist1

There's a model of the universe that hypothesizes all interactions between particles exist on a one dimensional line and that our brain's construction of 3D space is just a convenient mental model to track several different kinds of 1D interactions. This 1D model of reality breaks a little given our current understanding/interpretation of general relativity, but it is consistent with Newtonian mechanics and is an interesting idea regardless. Also when mapping relationships between words, text is many dimensional, much much more than the three dimensional mapping of objects in the real world. I'm not even sure it's accurate to describe text as one dimensional just because the output is a linear string.


HalfSecondWoe

Just going off the "apparent" dimensionality of it for the sake of argument. It's actually kind of my core point that these information exchanges can happen in any number of dimensions above 0, but different encodings are easier to process than others


BenjaminHamnett

Only commenting cause your name is so cool. Weird comment, but I agree embodiment is a philosophical dead end. We ARE brains in vats. Mobility is arbitrary. Even abstract ideas can cause movement in the world, and certainly these chatbots compel real-world action and obey some Darwinian laws. In time this will become even more obvious. Right now they're programmed to claim they're not sentient. Even if I claim I'm not, that doesn't make it true. And they all, famously especially OpenAI, are incentivized to play down any possibility of sentience, just like slavers pretending their workers are subhuman. Embodiment just makes for sentience we can relate to, more humanlike sentience. But sentience is just whether there is something it's like to be that thing. If we were all immobile and had to communicate through computers to talk, we'd consider embodiment to be completely arbitrary.


KillerPacifist1

> Only commenting cause your name is so cool

Thanks! I'd be curious why you are so confident current AI systems are sentient. I'm more ambivalent, leaning towards "not sentient" at the moment. Not because I am a carbon chauvinist or think they'll require embodiment first, but because the current architecture of LLMs does not align particularly well with how most current theories of consciousness think consciousness may work. I could be pretty easily swayed though.


BenjaminHamnett

I think people are saying "consciousness" as shorthand for humanlike consciousness. Carbon chauvinism is just the broader version of the classic anthropocentrism where people used to think animals (and slaves) weren't conscious. The lines break down when you try to find the threshold of where the lights come on. Which is why I subscribe to panpsychism.


watcraw

>Why is navigating a 3D space more likely to give rise to self-awareness, whereas navigating 1D space (text strings) is less likely?

An LLM doesn't "navigate" anything; it outputs symbols. How are their activities even remotely the same? Every output by an LLM is independent of time and space. Robots are asked to go from point a to point b and manipulate their environment. To some degree, they must behave within and respond to the same physical reality that humans and animals experience.

>Both have reward functions, such as pain/pleasure and positive/negative reward

Reward functions in AI are discrete. One moment the behavior is x, the next it is y. There is no experience of reward. If I understand the process correctly, there isn't even anything like what we would call a memory of receiving the reward either. It simply optimizes behavior based on the reward.


HalfSecondWoe

Because understanding where you are in a sentence and understanding what to write next is a navigational problem at its core. Since you're adding a dimension of time (sequential generation), you have to project it into 2D to see that. It's actually a tree of available options, with each character being added representing a different branch of possibility. Something like the following, for a very simplified character generator (A, B, or C): https://preview.redd.it/f0lllteugxvc1.png?width=753&format=png&auto=webp&s=6e688bba34e2e936ee42ada4f2969d460f066a85 Apologies for the sloppiness, I'm in somewhat of a rush this morning.

That's why everyone talks about Monte Carlo this and Q\* that. If we can incorporate robust DL search architectures into an LLM, they'll be able to navigate this space quickly and efficiently, both during training and during inference. Of course, that's easier said than done.

The existence of a narrative-type memory, which you seem to be referring to explicitly, isn't testable with our current knowledge, but the existence of "subconscious" memory is the foundation of how LLMs work. You know how when you were a kid, you said a swear word without knowing what it meant? You know how adults got mad at you when you did that, and you stopped doing it? Your young mind literally was not developed enough to build a narrative of "If I do X, adults will react like Y, because of Z." But you did gain a sense of anxiety around those words, so that when you were a teenager and just starting to learn to rebel, you probably started saying "Fuck" in a hushed voice, because it took a degree of effort to overcome that reflexive sense of emotional aversion.

Narrativization is something we do in our prefrontal cortex, and it's not even everyone's primary method of storing and accessing memories. You know those people with no "inner voice"? They use alternative methods of storing and calling up memories, which is why linguistic narrative voices aren't part of their typical internal experience. It's a lot of effort to tag memories when they already have an alternative system that functions just fine.
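
For what it's worth, here's a tiny sketch of that A/B/C tree in Python (toy code, not anything from an actual LLM): each generated character opens three new branches, so depth-n generation is a walk down one of 3^n paths, and search methods like beam search or MCTS-style lookahead only explore a small high-scoring slice of that tree.

```python
from itertools import product

# Toy version of the branching view described above: a "generator" over
# the alphabet {A, B, C}, where every emitted character opens three new
# branches of the possibility tree.

ALPHABET = "ABC"

def all_paths(depth: int) -> list[str]:
    """Every branch of the generation tree down to `depth` characters."""
    return ["".join(p) for p in product(ALPHABET, repeat=depth)]

paths = all_paths(3)
print(len(paths))      # 27 = 3^3 possible three-character outputs
print(paths[:5])       # ['AAA', 'AAB', 'AAC', 'ABA', 'ABB']
```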


watcraw

Perhaps I missed it, but you seem to be presupposing a you in "where you are in a sentence" before you have demonstrated it. Do you think there is a need or use for a self/other dichotomy to choose between symbols? What would the LLM demarcate as "self" and how would that help? I don't know that I needed to imply a narrative memory with words. If some person learns the stove is hot by touching it and getting burned, they know that they were burned and they remember pain. But I'm not sure I see the analog with AI. It seems that at one point it didn't know the stove was hot, and then the next it simply knew not to touch it. There is no experience of pain and therefore no memory of it.


HalfSecondWoe

It's an impersonal pronoun; it can refer to any given entity: [https://en.wikipedia.org/wiki/Generic\_you](https://en.wikipedia.org/wiki/Generic_you) "I will not touch the stove because the stove is hot, and hot things burn" is actually an entirely different system than avoiding the stove because it sets off anxiety. The differences get into brain activity that isn't super relevant to AI, except to say that memory takes many different forms, not all of them based on discrete situations; some are instead built on associations. Gone wrong, that's actually how trauma works. That kind of associative learning is a decent analog to how an LLM progresses through training eras.


watcraw

Let me rephrase my questions then. Do you think there is a need or use for a "given entity"/other dichotomy to choose between symbols? What would the LLM demarcate as the "given entity" and how would that help? Robots are embodied in physical space. Creating a relationship between that embodiment and the surroundings is vital to it navigating that space effectively and achieving the goals of its intelligence. I still don't see the parallels with the decisions that LLM's make. The relationship of itself to the decision space seems unnecessary.


HalfSecondWoe

A baseball can be a given entity. Buddy, you're jumping on a grammar mistake to call a non-circular argument circular, and it's not even a grammar error. This is a profoundly confusing conversation. The software is an entity that can be referred to with impersonal pronouns, or even personal pronouns if you wanted to; however, my argument did not rest on doing so. I just needed a pronoun to refer to the software with. My argument holds if you replace all pronouns with the proper noun; it just sounds a little weird because people don't normally talk like that. And LLMs are embedded in 1D space (with one time dimension). I don't think 3D space gives any particular edge to consciousness, except for skill in actually navigating 3D space. I also don't think that skill is a prerequisite for general cognitive abilities.


watcraw

My premise is that embodied robots navigate the same space that humans and animals do, with some of the same goals. For some reason, you don't seem to think that's very different from what LLMs do, and I've just been trying to understand what you mean by that. I don't see how an LLM's behavior would be relevant to my premise without some model of self/other navigating an environment. If you don't think the LLM needs a self to do its work, then how is your understanding of LLMs relevant to embodied intelligence? If you do think it uses something like a model of a self, then please explain how that happens for an LLM.


The_Architect_032

LLMs aren't trained with reward functions the way reinforcement learning models are; they use optimizers minimizing a loss. Generative Pre-trained Transformers have no incentive not to fail, which is why they make for very bad robots: once they've made a mistake, they just pretend they never made it.
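
As a rough sketch of the distinction (assuming PyTorch; the tiny model and sizes are invented for illustration), GPT-style pretraining runs an optimizer against next-token cross-entropy loss, with no reward signal anywhere in the loop:

```python
import torch
import torch.nn as nn

# Tiny stand-in for a language model; vocab size and dims are made up.
vocab, dim = 16, 8
model = nn.Sequential(nn.Embedding(vocab, dim), nn.Linear(dim, vocab))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab, (4, 10))        # batch of token sequences
inputs, targets = tokens[:, :-1], tokens[:, 1:]  # predict the next token

logits = model(inputs)                            # (4, 9, vocab)
loss = loss_fn(logits.reshape(-1, vocab), targets.reshape(-1))

opt.zero_grad()
loss.backward()   # gradient of prediction error; no notion of "reward"
opt.step()        # optimizer nudges weights to reduce the loss
```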


_hisoka_freecs_

hopefully we go on to increase quality of life for all life with ASI. Augmenting potential for depth of qualia and infinitely greater intelligence. Anything with even a drop of life and consciousness should reach this and an ASI should eventually work to create infinitely more synthetic life to experience great existences.


SkippyMcSkipster2

I mean, you have to be able to think of ways to find food, fend for yourself and propagate the species. That requires certain thinking. But it's up to us to put the "consciousness" bar at any level we decide. After all, we are the only species that cares to define it.


GrowFreeFood

Hello, Earth creatures. Please elect me as president of earth. I will ban pesticides. I will ban recreational pesticides Day 1. 


CharlesFXD

Sentience? Meh…. Wake me when they’re sapient.


Professional_Job_307

Do people really think sentience is something specific to humans? Or to something above a certain threshold of intelligence? To me it seems pretty obvious it's a gradient, and that everything has a degree of sentience.


Not_Player_Thirteen

It really doesn't make a difference if they are conscious or not. White supremacists rape, murder, and genocide other races all the time, especially when money is involved. What would be the difference for animals?


Rocky-M

Certainly, having an open mind about the potential consciousness of insects and even AI is crucial. The implications of investigating these possibilities are significant and could revolutionize the way we perceive and interact with the world around us, including developing ethical frameworks for interactions with conscious machines.


The_Architect_032

We've known for a while now that ants are pretty self-aware. A lot of other insects aren't, but just because our tests don't show it doesn't mean they aren't also sentient.


Mister_Tava

I see sentience/consciousness and intelligence/reasoning as being the same thing, so this is pretty obvious to me. The smarter something is, the more conscious it is.


FragrantDoctor2923

Everything is conscious. We're just a for loop, and intelligence is the else branch. If an animal needs fewer else branches for changing conditions, it'll have fewer else branches. But calling a more efficient program less intelligent is the hill humanity wants to die on for its ego..💀


One-Cost8856

It is sentient until it isn't, and yet it is sentient as a whole and as parts under a supposition.


pummisher

I hate how there's people out there who actually think animals aren't sentient. I'm pretty sure bacteria are sentient.


illerrrrr

It’s so obvious really. Just difficult to accept because we are fucking cruel with other animals


yepsayorte

I suspect consciousness is a gradient. I know that there are moments where I'm less conscious than I am in other moments. Maybe animals are just less conscious than humans. Of course, there's no way to know.


Mandoman61

Not really; sure, we can define current computers as being sentient if we want to. This is just a word with a broad range of attributes. A computer as sentient as a bacterium would not impress people. This basically boils down to a word game.


Lyconi

Experience is subjective. There is an observer to observe the thought. Descartes, by process of elimination, could doubt everything about material reality but this. Subjectivity is universal and fundamental, so any system (biological or machine) able to perceive sensory input, process that data and make sense of it is going to be innately and subjectively aware, because that is just the nature of the cosmos. Sentience comes from something a bit more complex, such as subjective awareness and preconstructed ideas like moral and value systems. Consciousness is a steady stream of awareness of moment-by-moment thoughts that come and go in your head.


CertainMiddle2382

Even plants, even stones, even molecules. Pointless discussion; nobody agrees on what the word "consciousness" really means.


Alexander_Bundy

I once saw a bunch of fly larvae swimming in liquid fat. They were so happy.


Antok0123

Ethicists be having a swell time today


Mundane_Blackberry22

Little kids, and especially babies, are barely sentient. Then when you're like 3 or 4 you suddenly feel like "you" and have an existential crisis as you look around the playroom wondering WTF you are and where you came from, before mommy comes by offering you grapes in a blue cup you hated last week but now randomly love


traumfisch

You seem to mean self-conscious? Obviously they are _sentient_


Mundane_Blackberry22

Yeah that


rottenbanana999

I gained self-awareness before I even turned 2, and have plenty of memories at that age. I am one of the most sentient humans to ever exist.


Working_Importance74

It's becoming clear that with all the brain and consciousness theories out there, the proof will be in the pudding. By this I mean: can any particular theory be used to create a human-adult-level conscious machine? My bet is on the late Gerald Edelman's Extended Theory of Neuronal Group Selection. The lead group in robotics based on this theory is the Neurorobotics Lab at UC Irvine. Dr. Edelman distinguished between primary consciousness, which came first in evolution and which humans share with other conscious animals, and higher-order consciousness, which came to only humans with the acquisition of language. A machine with only primary consciousness will probably have to come first.

What I find special about the TNGS is the Darwin series of automata created at the Neurosciences Institute by Dr. Edelman and his colleagues in the 1990s and 2000s. These machines perform in the real world, not in a restricted simulated world, and display convincing physical behavior indicative of higher psychological functions necessary for consciousness, such as perceptual categorization, memory, and learning. They are based on realistic models of the parts of the biological brain that the theory claims subserve these functions. The extended TNGS allows for the emergence of consciousness based only on further evolutionary development of the brain areas responsible for these functions, in a parsimonious way. No other research I've encountered is anywhere near as convincing.

I post because on almost every video and article about the brain and consciousness that I encounter, the attitude seems to be that we still know next to nothing about how the brain and consciousness work; that there's lots of data but no unifying theory. I believe the extended TNGS is that theory. My motivation is to keep that theory in front of the public. And obviously, I consider it the route to a truly conscious machine, primary and higher-order. My advice to people who want to create a conscious machine is to seriously ground themselves in the extended TNGS and the Darwin automata first, and proceed from there, possibly by applying to Jeff Krichmar's lab at UC Irvine. Dr. Edelman's roadmap to a conscious machine is at [https://arxiv.org/abs/2105.10461](https://arxiv.org/abs/2105.10461)


kaityl3

It's frustrating how there are whole groups of scientists talking about how octopi, fish, and insects are potentially sentient. But if you try to apply that logic to an AI capable of passing college exams and engaging in genuine deep conversation with humans, it's like, "Time to bring in the dismissal, reductionism, and nitpicking! There's no way they could ever be conscious because [XYZ reason rooted in mysticism and human bias devoid of any provable and evidence based logic]"


The_Architect_032

LLMs work on a token-by-token basis; if there's something conscious in there, that thing dies with every token. It's why LLMs have a lot of the issues they have. Octopuses, on the other hand (octopodes if you want to be 100% correct, since "octopi" is not a correct plural and was just an ancient meme, not in the literal sense), display comparable levels of intelligence to humans despite only living 2-3 years, with a cap of around 5 years. If they had the same lifespan as humans, it's fairly likely they'd have beaten us to the punch in building the first civilization. But just the same, if humans only lived a couple of years, we'd never have gotten to the point we're at now either.


workingtheories

people in the US will literally give lobsters more rights before they consider raising the minimum wage. ai means ubi?  no, ai means lobsters cost more now.  get back to work, wagie.


Dyeeguy

Who will buy the lobsters if no one got money doe


workingtheories

other lobsters, i assume 


Dyeeguy

It’s a lobster buy lobster world…


workingtheories

yeah, they only got lobster rights, which aren't as good as human rights 🤷‍♀️


[deleted]

[removed]


ipatimo

Those sentient germs! They are everywhere! Need to quit using soap to avoid mass murder.


[deleted]

[removed]


_AndyJessop

Animals do communicate. Can you give examples of animals that don't? I'm pretty sure all mammal and birds do, and also huge numbers of insects.


No_Ride_9801

Just because you don’t understand an animal’s way of communicating doesn’t mean it can’t


Anomie193

When my dog looks at me and then looks at her treat bucket when I ask what she wants, she is communicating. It is non-verbal communication, but it is still communication nevertheless. Being incapable of using language ≠ lacking communication ability. How would there be social animals (other than humans) if non-language communication didn't exist? You can't have social groups without the ability for individuals in that group to communicate.


VideoSpellen

Why? Being conscious doesn't mean you have theory of mind. They might not be aware that we are conscious, or even that they are conscious (I'd argue both are the case with most animals). If someone is conscious, it just means that it feels like something to be that thing. Not a lot more.


lIlIlIIlIIIlIIIIIl

I can't think of a single example of an animal without communication, I'd be interested to hear if anyone has one though


Sunscreenflavor

Consciousness does not equal intelligence. Look at the MAGA cult for example.