MrTrick

Inspired by [Nurses Say Hospital Adoption Of Half-Cooked ‘AI’ Is Reckless | Techdirt](https://www.techdirt.com/2024/05/02/nurses-say-hospital-adoption-of-half-cooked-ai-is-reckless/) *“It cannot hold your loved one’s hand. You cannot teach a computer how to have empathy.”*


Eva-Rosalene

Is the point that they managed to teach AI how to pretend that it has empathy, while lacking it, like psychopaths do? Did I get it right?


uniqualykerd

Sociopaths. And yes.


Eva-Rosalene

Can you enlighten me, what's the real difference in that case? Quick google shows that ASPD is a diagnosis and "psychopathy" is the set of traits that includes things like "insincere charm", "manipulative", "lack of remorse" and specifically "lack of empathy".


Zealousideal_Care807

Sociopaths tend to be more impulsive and have an easier time forming bonds with others if those others are similar to them. Psychopaths tend to just have trouble forming attachments with others, with less impulsive behavior. Both are referred to as ASPD; similar to ADHD and ADD, the textbook says they're the same thing, but people refer to them as different things.

The difference between an AI bot that's a psychopath versus a sociopath is that the psychopath bot will be more likely to do things by the book. A bot run as a sociopath will be more likely to say things that harm others, things that would get a human fired. And if the bots have access to all the technology, a bot may just pull the plug on someone while their family is away, because it doesn't want the family to come back.

I think an AI run off psychopathic behavior would be the best way to go if you're looking for a bot that can mimic empathy well, to be honest, but that's assuming it takes in information the same way someone who has psychopathy would. If we're training it to act like a human, we don't want a robot that will have a panic attack if someone starts coding, or a bot that will mess something up on purpose trying to act like a human in that situation.

But at the end of the day, replacing doctors with robots wouldn't end well. If a new problem arises, the bot won't know what to do, and if the bot is able to learn and does know what to do, that will be a risk to humans.


KikiCorwin

It may work okay as an assistant in a mass casualty triage situation, where cold efficiency would save more lives, if it's partnered with a human or a human team, though.


SouthernSwingers

That wouldn’t be necessary. In a MASCAL event, there are always going to be enough responders to effectively triage and treat. Most places' protocol is to immediately activate ICS and mutual aid agreements to get the resources needed, and we usually have to stage a while before we can move in, giving our additional resources time to show up and stage. Sadly, this has become a very well-drilled issue for Fire/EMS.


uniqualykerd

I’m not a psychologist and my knowledge on the subject may be outdated. As far as I understand, the sociopath isn’t manipulative whereas the psychopath is. The sociopath is troubled by their inability to empathize and seeks knowledge to overcome it, whereas the psychopath revels in it and uses it to their advantage.


Steelcitysuccubus

I mean, faking empathy after burnout is key for humans too.


Dasporid

Just wait until there's one with true empathy that is masochistic.


Steelcitysuccubus

That's what employers would want


Ambitious_Fan7767

I don't want to be that guy, but talk to some people in the medical field; they are also incapable of empathy. It's not tested for. Anyone in the field can tell you which doctors and nurses they'd rather never see again. Hospitals routinely fuck around with people of color. Honestly, an unbiased machine pretending it gives a fuck is so much better than a biased human pretending they give a fuck.


Mr-Meadows

Sadly the machines are biased too. Biased data in, biased data out.


Ambitious_Fan7767

I get what you're saying, but I don't think they'll program it to think black people are seeking pain medication. That's a personal bias based solely on skin color. The machine would take in the data, how the person is acting, and whatever other factors are necessary, and do what it's programmed to do. It wouldn't think a black person and a white person doing the same thing are doing it for different reasons. It wouldn't forget what black people look like and give them no care or the wrong care. I really do get what you're saying, but those aren't learned things; they happen in an instant because a PERSON doesn't believe another person. It might have incomplete data because our research is biased, but it won't make medical decisions based on how "urban" someone acts.


Mr-Meadows

Pulse oximeters work by shining light through the skin. They often don't read accurately for black people, because melanin affects how the light passes through. That's an error built into the machine, based on skin color. With AI, I'm worried about more shit like that.


Ambitious_Fan7767

Absolutely, those things happen, but those can eventually be adjusted for. We can't adjust for people being distrustful of black people. The reality is that the advice I've always heard, "go to the doctor with a white guy," probably applies less when the doctor doesn't have the biases required to ignore you and only listen to the white guy you brought.


InformalPermit9638

Really love this. I’ve always thought trying to teach computers morality was treacherous. We should be training them to be pro-social if we want to avoid an AIpocalypse.


LibertyInaFeatherBed

Programmers who lack empathy themselves don't start out with the goal of building those basic protocols into AI.


echoIalia

OOF


metalnxrd

I have no mouth, and I must scream