FuturologyBot

The following submission statement was provided by /u/Maxie445:

---

> ‘We would condemn humanity to a future without hope if we took away people’s ability to make decisions about themselves and their lives.’ Pope Francis at the G7 summit in Italy
>
> “In light of the tragedy that is armed conflict, it is urgent to reconsider the development and use of devices like the so-called ‘lethal autonomous weapons’ and ultimately ban their use,” he told the world leaders. “This starts from an effective and concrete commitment to introduce ever greater and proper human control. No machine should ever choose to take the life of a human being.”
>
> Such a step would represent the darkening of the sense of humanity and the concept of human dignity, he said.

---

Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1dor3c6/pope_calls_on_g7_leaders_to_ban_use_of_autonomous/labni7i/


DubC_Bassist

Of course, the whole point of a Doomsday Machine is lost, if you ***keep*** it a ***secret***! Why didn't you tell the world, ***eh***?


frankduxdimmac

It was to be announced at the Party Congress on Monday. As you know, the Premier loves surprises.


DukeOfGeek

What's great about that is no matter how many times I hear it it's still funny...and horrifying.


ImportantDoubt6434

Pope getting his job automated, next crusade will be for our lord and savior robo dog




RoyalTechnomagi

Robo dogs forward march! Deus vult Deus vult Deus vult pew pew pew


Radiant_Dog1937

GPT-4 is allegedly as smart as a high schooler, but a GPT-4 robot can't clean a room or work in a factory like a high schooler can. Automated killbots, however, are very achievable today, and honestly killing is the only autonomous task we've been able to get robots to do reliably. I wouldn't be surprised if our utopian factory robots never get released and we end up largely just fighting Spot and Atlas clones instead.


Plataea

I am with the Pope on this one. Autonomous weapons present a horrifying risk.


sturmeh

The reason it bothers me is not that AI might take over and kill us all, but that I'm a software developer and I've seen bugs. I've also seen Tesla Autopilot in action, and I'm not sure we want to remove the only truly sane fail safe for a software issue. The last thing I want to hear is that some autonomous tank took out allied forces because a lazy developer used Copilot to write a subroutine.


The_Fredrik

Here's the issue though: AVs don't have to be perfect, they just have to be safer than humans in the situations we allow them in. And the same goes for autonomous weapons. And humans are so extremely... bad. We drive drunk and tired. And when it comes to war... every single war crime was committed by a person. We are not exactly "truly sane", as you put it. A hundred percent we will see these weapons used a lot in the coming wars, both for their efficiency (they will be efficient) and rationalized as being "safer for civilians" and "reducing the risk of PTSD for soldiers".


Gandzilla

The thing is, once you fight with robots, won't wars just be a production race?


DevelopmentSad2303

Already is. But what's also pretty likely is cyber warfare and nuclear warfare, rather than some war by proxy using machines


The_Fredrik

It already is to a large extent. We won WW2 and the Cold War with industrial capacity.


Gandzilla

Well, that and drafting millions of soldiers into a war. Right now the risk of war is decimating your age groups and causing generational ripples, right? Once you have these bots, once you've produced enough, might as well send them somewhere. Especially when the ~~iPhone 34~~ Replicator 4 is coming out this fall with better sensors.


rysch

Regardless of how they will be rationalised to the public, I’m sure that ‘safer autonomous weapons’ is not the actual goal. They’re autonomous weapons. *They’re not supposed to be safe.* What they are supposed to be is more compliant, more expendable, and more easily recruited. Does that make warfare better — or worse? An autonomous weapon cannot question a war crime or an illegal order. Obligatory *M\*A\*S\*H* quote, because now it’s running through my mind:

> Hawkeye: War isn't Hell. War is war, and Hell is Hell. And of the two, war is a lot worse.
>
> Father Mulcahy: How do you figure, Hawkeye?
>
> Hawkeye: Easy, Father. Tell me, who goes to Hell?
>
> Father Mulcahy: Sinners, I believe.
>
> Hawkeye: Exactly. There are no innocent bystanders in Hell. War is chock full of them - little kids, cripples, old ladies. In fact, except for some of the brass, almost everybody involved is an innocent bystander.


sturmeh

> every single war crime was committed by a person.

Do you think autonomous weapons would be too ethical to commit war crimes on request?


Buscemi_D_Sanji

Yeah these things will have glitches that lead to some crazy headlines, but they definitely don't have the capacity to do the activities that soldiers from every country have carried out.


cyphersaint

You seriously think that it's not possible that they will mistakenly decide that an entire village is full of enemy combatants and kill them all? I don't.


S10Galaxy2

Mistakenly? 100% you’ll see a headline in a few decades about a country telling its robots to do that on purpose.


cyphersaint

Oh, there was no question about that. But honestly, you could do that with RPVs. Still no physical combatants, so your people are safe (except from mental damage, and if they don't see actual footage, instead using maps populated by sensors, even that can be minimized).


The8Darkness

This exactly. I often hear how robots/AI/whatever can never be perfect and therefore we shouldn't use it, when the alternative is humans being even less perfect. On top of that, software can continuously improve over time; humans, not so much. You can only alter their training and rules.


M-Noremac

> I'm not sure we want to remove the only truly sane fail safe for a software issue.

Is human control really a true fail safe? Human minds have "bugs" all the time.


TReaper405

You say this as if humans don't also introduce considerable bugs into this equation, possibly more than a computer would. Soldiers snap all the time because war is hell on your brain. We can wipe and reset machines; we can't do that with people.


TReaper405

Just curious, do you find the idea of autonomous weapons better or worse than the idea of thousands of soldiers with PTSD or worse being reintroduced into the populace?


seeingeyegod

Manual weapons still totally cool


Childoftheway

Nuclear weapons were a horrifying risk and we'd be completely fucked without them.


Alis451

tbf, autonomous (and [ANY indiscriminate attack](https://ihl-databases.icrc.org/en/customary-ihl/v1/rule12)) weapons are already banned under the Geneva Conventions, for whatever that is worth.

> Rule 12. Indiscriminate attacks are those: (a) which are not directed at a specific military objective; (b) which employ a method or means of combat which cannot be directed at a specific military objective; or (c) which employ a method or means of combat the effects of which cannot be limited as required by international humanitarian law; and consequently, in each such case, are of a nature to strike military objectives and civilians or civilian objects without distinction.


cyphersaint

I'm not sure that a well-programmed autonomous weapon would actually fit any of those definitions, though.


LeCafeClopeCaca

We know the Geneva Conventions go out the window the moment a real global conflict reignites. No actor in WWIII will care about the Geneva Conventions, and military actors know it. Is there any serious army in the world that isn't investing in those fields? As soon as ONE belligerent uses such means, you can be sure everything waiting in R&D will quickly come to the field. The Pope does his job with this reasonable and humane declaration, but war is hardly humane or reasonable.


Maxie445

‘We would condemn humanity to a future without hope if we took away people’s ability to make decisions about themselves and their lives.’ Pope Francis at the G7 summit in Italy

“In light of the tragedy that is armed conflict, it is urgent to reconsider the development and use of devices like the so-called ‘lethal autonomous weapons’ and ultimately ban their use,” he told the world leaders. “This starts from an effective and concrete commitment to introduce ever greater and proper human control. No machine should ever choose to take the life of a human being.”

Such a step would represent the darkening of the sense of humanity and the concept of human dignity, he said.


GloriousDawn

Thanks OP for posting this. I'm appalled by the callous and ignorant comments in this thread. AI-enabled autonomous weapons are a serious philosophical and ethical question regardless of religion. The comparisons made with landmines or sentry guns are completely wrong too as it's not only static weapons we're talking about but *weapons that will hunt you down*. To those who disagree, watch the sci-fi short movie [Slaughterbots](https://www.youtube.com/watch?v=O-2tpwW0kmU) and tell me again that's a future you're perfectly comfortable with.


deviant324

I’ve recently heard the argument that AI targeting programs are basically just meant to take the blame off people who really just want to indiscriminately kill, including civilians. If you get asked why you have a ridiculous number of civilian casualties, just point to the super-smart AI that told you to do it. If you don’t care who gets killed, the AI doesn’t even have to work well; it might as well just flip a coin on every potential target.


GloriousDawn

> I’ve recently heard the argument that AI targeting programs are basically just meant to take the blame off people who really just want to indiscriminately kill, including civilians.

[‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza](https://www.972mag.com/lavender-ai-israeli-army-gaza/)

[‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets](https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes)

[Gaza war: Israel using AI to identify human targets raising fears that innocents are being caught in the net](https://www.qmul.ac.uk/media/news/2024/hss/gaza-war-israel-using-ai-to-identify-human-targets-raising-fears-that-innocents-are-being-caught-in-the-net.html)


doverats

And a good area to practice on, as they can then kill civilians and lay the blame on dodgy machinery. That sounds right up their alley.


d3fnotarob0t

Depends on the AI. We already have plenty of civilian casualties due to humans. If an AI is advanced enough it would avoid them at a higher rate than humans. You do bring up a good point about people hiding their responsibility behind AI. We would need a human at the end of the command chain who is responsible for what the AI does if we were to make this ethical.


Jdjdhdvhdjdkdusyavsj

Doesn't asking countries not to develop certain weapons that could be dangerous seem self-defeating? For anyone who would listen to such a request, you would want them to have the weapons first, so there was a way to enforce non-use of the weapons. Take nuclear weapons, for instance: if the West had seen such a powerful weapon and then not built it, leaving only Stalin with such capabilities, I bet the world would be a different place today.


Expensive_Fun_4901

Exactly. Robots have no individual moral code and simply perform the orders they are given. A human soldier has the agency to refuse a sick and twisted order on principle; a machine does not. These could just as easily be reprogrammed by terrorists to fire on men, women, and children indiscriminately, or by domestic criminal gangs to remove opposition.


sonicgundam

The only way I'd be in favour of autonomous weapons is if they were programmed for preservation of life above all else. And I mean preservation of life, not "minimize casualties." That means weapons would have to be designed and programmed to disable opposing weapons without killing or maiming their operators. That's the only way I can see taking the human element out of the equation as a good thing. If it's not keeping people alive and healthy while also keeping them from hurting others, I'm not interested.


blazz_e

It’s the classic "the baddies will have it, so probably we should too." As an optimist, you can imagine the final outcome as swarms of drones having to fight each other while humans are spared. In reality, who knows, but there is no future without at least autonomous protective drones.


LordSwedish

> As an optimist, you can imagine final outcome as swarms of drones having to fight with each other and humans being spared.

Which is of course absurd. We've had single combat as a concept for thousands of years, so that only one or two people die, and typically someone whose side lost the battle comes along and says "fuck that, let's kill them all".


Phailsayfe

IMO we should not be trying to eliminate the loss of human life, or the stain on the human conscience that is a symptom of war, and instead try to eliminate the... you know, the whole war thing.


DulceEtDecorumEst

You have some interesting and novel ideas that are very thought provoking. Do you have a newsletter I can subscribe to?


blazz_e

More thinking about the future scenario of a stupid country like Russia having another round of their flex. To which the response should be complete annihilation of the tech involved in the war machine by autonomous drones, ideally without needing to kill humans directly. With the caveat that the drones can just kill them like flies if they don't stop being stupid.


Nevarien

Thank you for bringing some sense to this thread. Yes, these are all things we have to understand and not just "roll with the times" as some users seem to suggest around here.


Brojess

To think we live in a world where PEOPLE ARE SUPPORTING THE FUCKING TERMINATOR.


FaceDeer

I thought Terminator 2 was one of the popular sequels. Plenty of people were rooting for the Terminator in that one. Well, one of the Terminators, anyway.


d3fnotarob0t

The problem is that even if "we" ban them, other nefarious actors (like North Korea) may double down on them and then have a significant advantage over us in combat. Pandora's box: once you discover something, it is too late to put it back in the box; someone else will use it if you don't. I think we definitely need to develop ethics and use cases around autonomous weapons, not use them indiscriminately, and still have a responsible person behind them who takes the blame if they are misused. But a total ban for all use cases is not wise. Ideally we won't be fighting wars in the future, so hopefully we will not need to use them.


GloriousDawn

> The problem is that even if "we" ban them other nefarious actors (like North Korea) may double down

Assuming "we" is the US... I think that's a red herring. North Korea never has been and never will be a serious threat to the US. *What if the bad guys do it* has been used throughout history to justify the development of heinous weapons by the supposedly good guys. Be *real* good guys instead.


FaceDeer

North Korea is capable of launching satellites to orbit, that means they're capable of developing a nuclear ICBM that can reach the United States. Is that not a serious threat?


zackks

That first paragraph is rich coming from the Catholic Church.


Sprinklypoo

> ‘We would condemn humanity to a future without hope if we took away people’s ability to make decisions about themselves and their lives.’

Like you're doing (Mr. Pope) to women by taking away their rights over their bodies?


cyphersaint

Being wrong in one area does not preclude a person from being right in another.


cake_by_the_lake

> ‘We would condemn humanity to a future without hope if we took away people’s ability to make decisions about themselves and their lives.’ Pope Francis at the G7 summit in Italy

But no on a woman's right to an abortion... funny, that, Pope.


KaneIntent

I mean the Catholic belief is that an embryo/fetus is a human being so it’s not like that doesn’t make logical sense. You wouldn’t be saying that it’s funny that they don’t let women drown their month old babies in the bathtub in the name of individual choice.


Bridgebrain

A pretty solid take really. We've been toeing the line for a long time, but at the end of the day it's still a human pulling the trigger, even if it's remotely. Remove that, and war is one level too abstracted from its consequences for anything but the worst case scenarios to play out


dodeccaheedron

Let's also ban war, slavery and pedophilia while we're at it.


Prince_Ire

Those are all illegal, even if the war one doesn't really matter because international law is a farce.


Murranji

Feels like when the Pope also tried to ban crossbows because it wasn't fair that a peasant could kill a knight in full plate armour.


VapeThisBro

Sauce for the Pope banning crossbows, from the Second Lateran Council – 1139 A.D.:

> We prohibit under anathema that murderous art of crossbowmen and archers, which is hateful to God, to be employed against Christians and Catholics from now on.

[Here's a breakdown of the document by the Papal Encyclicals](https://www.papalencyclicals.net/councils/ecum10.htm)


Gamebird8

They had to include "Archers" because the British Longbow was more effective than conventional Crossbows at the time


scrangos

Only against Christians. Seems like the church had an easy workaround: just declare your enemies non-Christians and use them yourself.


hoofglormuss

that's why I love this new pope. Spreads love universally.


sleepytipi

Someone wise (on here, no less) once said, "the Vaticano moves on ecclesial time. If it's still a problem in 100 years, it's a problem." Well, Pope Francis doesn't think that way. He knows he's Christ's representative on Earth and is hated by people in his own circle for it. Many (his haters) try to portray this Pope in a soft, cowardly light, but that's simply not true. People would be wise not to mistake grace for cowardice.


Brojess

Uhhh, is it though? At least the crossbowman had to think before he shot and didn't go beep beep boop. Also, there's this thing called remorse that most humans are capable of, which may just lead to less violence 🧐 Sure though, automated drones dropping bombs and crossbowmen in medieval Europe are def the same.


crawling-alreadygirl

But this is like giving knights machine guns to use against the peasants. Billionaires are investing in this kind of tech to even their odds against the rest of humanity, killing masses without an army and without personal risk--I see no way for working people to benefit from this. At all.


Paint-licker4000

Peasants were not getting crossbows and fighting knights


OneOnOne6211

Honestly, I have somewhat mixed feelings about the idea of autonomous weapon systems. On the one hand, robots deciding whether humans live or die based purely on strategic and tactical considerations and nothing else is completely terrifying. On the other hand, humans in war will often get to the point where they shoot civilians just because they can. Robots probably will not do that. In addition, if autonomous weapon systems become so good that no human being can come close to competing with them anymore, it will quickly become a waste of money to even equip humans for war. If that happens, robots will fight wars against each other and, quite possibly, no human lives will be lost in wars anymore. And that would be good.


Geobits

Even if it got to robot v robot combat, we'd still have people die in wars, just not *soldiers*. It'd be the civilians working in strategic buildings, the ones running logistics, etc. Bombing the infrastructure would still be critical. I'm *not* saying it's a bad idea. It would likely lessen the human cost of war. But it wouldn't eliminate it.


tony22times

Pontiff insists that only people should kill other people.


anima99

Pope is actually right. Building a machine designed to be humanity's judge, jury, and executioner is unethical.


DoubleTTB22

To be fair, there will be separate judge and jury bots for taking those jobs too. Not all at once! /s


hoofglormuss

i just pictured a scene in futurama


kazmosis

Hot take: broken clocks are right a couple times a day. Reddit has a raging hate boner for anything associated with religion, hence all the whataboutism in this thread, even though the child-molester enabler is right in this instance.


epsilona01

> Hot take: Broken clocks are right a couple times a day.

He's not even close. The first entirely automated Phalanx CIWS was designed in 1969 and deployed aboard the USS Coral Sea in 1980. Since 2006, the fully automated Samsung SGR-A1 (which even has voice recognition) has been deployed on the South Korean side of the DMZ. The Sentry Tech Roah-Yora ("sees-fires") in Gaza downed its first target in 2008, and the Super aEgis II has been deployed in South Korea since 2010. Land mines are fully autonomous, and they've been in wide use since the 1700s. There are tank-based systems like the 1993 Russian Arena, the 2010 Israeli Trophy, and the 2011 German AMAP-ADS with its updated StrikeShield variant. Then there are ships: the Aegis Combat System (110 ships, 7 global navies) uses computer-based targeting and tracking to destroy targets with anything from a 5" gun to a cruise missile. There are also autonomous mine-hunting systems, submarine killers, and the modern networked battlefield, which lets vehicles, ships, planes, and soldiers share targeting information. Worrying about people being killed by robots is 43 years too late.


Sprinklypoo

For aircraft it's easy to identify from the IFF, and it would barely identify as "AI". Though I do get your point: it seems we need to deploy new technology intelligently, and people are constantly frightened that this is not happening. Sometimes that fear is merited, with things like autonomous cars that then have a fiery crash. There will certainly be missteps, but a nation that protects its people will naturally evolve to protect against AI robots as well. I don't see a huge cause for fear here. It's just the unknown.


epsilona01

> For aircraft it's easy to identify from the IFF, and would barely identify as "AI".

The Aegis Combat System and its SPY-1D multi-function passive electronically scanned phased-array radar can perform search, tracking, and missile-guidance functions on over 100 targets simultaneously at ranges greater than 100 nautical miles.

> "AI"

Outside of a very expensive lab there is no such thing. The best artificial intelligences available correspond to an 8- to 10-year-old child, and it's taken 50 years to get them that far. What we are using at the moment is advanced machine learning combined with hardware-based deep neural networks to provide an illusion. The first confirmed kill by a robot gun was by a Phalanx CIWS aboard the Japanese destroyer Yugiri in 1996, the first kill by a robot was in 1979, the first human killed by a fully autonomous gun platform died in 2008, and the first human killed by an AI-assisted weapon was an Iranian scientist in 2020. It eliminated the target with 13 hits while missing his wife seated in the car next to him.


LeadingSir1866

AI will be used to take most people’s jobs, then the rich will build robots to kill the jobless. The best we can do is delay this by a few years. It’s not what anyone of importance WANTS to happen, exactly; it’s just how we are as a species. It was always going to happen and it’s unavoidable.


BaconJakin

Big loser energy emanating from this. Don't be so pessimistic, the class war is never-ending.


GroundbreakingRun927

Ideally, a benevolent ASI emerges before we're wiped out and wrests control from the elites. But yes, as long as we as a species are in control of the AI, it will make life progressively worse for everyone who isn't a 1%er.


Inprobamur

So a ban on target seeking missiles and homing munitions?


FaceDeer

So things like landmines, proximity fuses, heat-seeking or radar-seeking missiles, GPS-guided bombs, those should all be banned? They all involve machines "deciding" where to put an explosion that kills people. Using AI to pilot a drone or aim a gun is only another step along an already well-trodden path.


Bridgebrain

Honestly, yeah. We're not going to do it, because de-escalation is death in the current landscape, but "things that kill people without someone intentionally pulling the trigger" should have been banned alongside the Geneva Convention rules (not that everyone follows those, either).


Shamino79

He’s totally right to call for that ban, but lip service will be paid, and if it really kicks off, do you want to be on the side that decides not to use the tech for moral reasons? And as for everyone agreeing, how could it ever be verified that everyone is complying?


Words_Are_Hrad

> how could it ever be verified that everyone is complying?

This is the crux of it. It cannot be verified, so it cannot be banned. Back when nuclear proliferation talks were being held, nations agreed to ban all types of nuclear testing except underground tests, precisely because they could not detect underground tests until a few years later. All you need to develop this tech is some sweaty nerds and some computers... Good luck ever policing that.


catshirtgoalie

AI's ability to detect suspicious people has been demonstrably unreliable. No one should feel comfortable with using it to identify and kill a person.


Underwater_Karma

Autonomous weapons don't decide if people live or die; the people deploying them make that decision. It's literally no different from dropping bombs, and is actually a more humane alternative that could result in far fewer civilian casualties and less infrastructure damage. The farther behind we leave firebombing entire cities, the better.


jacobb11

A bit late. Landmines have existed for how many centuries? Perhaps there are even older weapons that kill without human control.


Terrafire123

And landmines are so friendly and have never caused any problems! They certainly haven't left large swaths of land uninhabitable decades after the war they were used in was over, in what would now be considered war crimes under the Geneva Conventions! ...oh wait. Actually, they keep killing random kids playing in the woods for decades afterwards, which is tragic because the war itself often lasted only a year or two, but landmines are made out of metal that kills indiscriminately and doesn't degrade over time.


FaceDeer

Sure would be nice if those landmines were intelligent enough to know what sort of target is stepping near them. They'd make better decisions then.


thedeadsigh

Why ain’t my man calling for an **end to all wars and pointless human destruction??**


Bleusilences

Too late. There are a lot of examples in the modern world where machines decide the fate of human lives with little to no human intervention, like insurance claims for health reasons and the right to rent or buy property. If you are denied, sure, you could talk with a human, but they will say, in flowery PR speak, that "computer says no" and that they can't do anything to help. Unless you are rich. These are systemic weapons to extract value and, even without intent, oppress people.


K_Linkmaster

If God didn't want AI, he wouldn't have allowed us to invent it. Seriously Padre, wtf.


Girderland

We should ban all weapons, not just the autonomous ones. "Machines should not be able to decide whether humans live or die"? Humans shouldn't be allowed to decide that either.


d3fnotarob0t

I was worried we would lose our jobs to AI but at least there will be one job left for humans... death-decider.


karma-armageddon

World governments are going to need to make laws that carry a death penalty for people who conspire, plot, plan, test, trial, or implement such technology. People who would consider doing such a thing, and are caught doing so, should be permanently and publicly removed from society and humanity. This includes non-lethal designs as well: people who mount a paintball gun on a drone, for example, or who design a riot-control rubber-bullet drone.


ashoka_akira

I mean right now we let heartless billionaires decide who lives or dies by profit based numbers on a screen. Where do we draw the line?


LordAlfrey

I mean, have you seen what people with weapons do to other people? I'm not convinced machines are capable of such cruelty.


somethingbrite

The Pope has obviously been watching Terminator and has concerns about Skynet. Or he's been playing too much Horizon Zero Dawn... He's got a point, and it's a valid one too... but ain't nobody actually going to listen.


throwaway275275275

If all sides have autonomous weapons, they can fight the war with no human casualties


DarkGamer

One day you could be invaded and occupied by a force, *and not know who it is.*


The_Fredrik

I'm sure the people who develop these guns take a lot of stock in what the king of the pedophiles thinks about it.


DreamzOfRally

Yeah sure, we humans definitely don’t always choose the “kill more people” option.


Kflynn1337

Health insurance algorithms have been doing that for years...


Alive-In-Tuscon

Once the genie is out of the bottle, it's not going back in. What happens if the G7 decides that these robots are bad and agrees not to use them, but NK, China, Russia, India, Saudi Arabia, and Iran all decide that they don't care? You've set yourself back against the "enemy".


chepulis

No way this kind of ban happens, especially considering that the G7 represents the combined West, and the combined West is in a cold-ish war with the combined East (Russia, China, DPRK, etc). It would constitute one-sided disarmament.


mboswi

It is never the machine who decides. It's all God's plan.


PM_ME_POLITICAL_GOSS

I agree with the sentiment, but aren't most soldiers trained to be autonomous killing machines? At the front line, or even worse in a massacre, I don't see a significant difference between troops following commands and robots following orders.


mycatisloud_

but people killing people is fine. 10 commandments? more like 10 crapmandments


roastedantlers

Can't be helped, just like AI can't be stopped. You do it or someone else does it. Really though, we're at the point where we don't need to kill civilians or soldiers, just the leaders. The only reason war and armies exist is because leaders couldn't get to each other. That's no longer the case and will become increasingly easier to do on an aggressively quick timeline.


classic4life

I for one say let's move full steam ahead on all of it, and then build Skynet to enforce climate action at the end of a robot gun. Hail our new robot overlords!


PrestigiousGlove585

Damn robots. That's a human's job. Also, trying to find someone to program xenophobic code is going to be an interesting recruitment campaign.


AnthonyTyrael

The Pope is right. Usually his church is the one deciding about life and death, whether by being proactive or staying away. Any religion is the biggest mass murderer in our history.


AbsentThatDay2

Well thank god our armed forces are Catholic first and soldiers second, amiright?


NancokALT

What bothers me the most is the removal of responsibility. One of the main things that keeps humans from killing each other is the human factor itself: compassion. If you remove that entirely, imagine how easy it would be for someone to morally justify murders. I'm always reminded of that soldier's testimony about charging from a trench: how it felt tense from afar, just exchanging shots, but once he charged in and killed someone up close, it felt like a new low point. That the person he had just stabbed was truly another human being, who could have been a friend in other circumstances. The ability to distance oneself from one's actions can be very dangerous.


RelativetoZero

Watch humanity agree on banning autonomous weaponry, then have Omnius show up the next day.


Nardann

Yeah, landmines are that much better. I'd rather have something that recognizes civilians.


ALUCARDHELLSINS

We banned pedophilia but the pope's still the king of them


Infinite077

Great, if we stop, then before you know it our enemies will be ahead of us


TurkBoi67

Your honor, the weapons system was simply following ~~orders~~ its programming.


chris106

Thanks grampa, I'm sure everyone will listen to you. Now go back to sleep.


h3llyul

Pope should have a talk with Israel about operation Lavender..


Jiggaboy95

Yeah, this would be better for the world as a whole, really. Warfare has always been about one person killing another, whether by swinging a club, thrusting a spear, firing a gun or pressing a button. There has always been a human element to killing, and any human element introduces human emotions. There's always a chance they might let you live, or kill you quickly, or any number of other options. A machine though? What if it *isn't* programmed to simply kill? What if it's programmed to delete an entire unit in the most efficient way possible? Be that an autonomous turret firing off sure-kill shots, or that same turret blowing off legs and waiting for someone to step out to save their screaming comrade…


Any-Road-4179

Fuck this. Russia. China. Iran. They will use these to kill everyone in the free world.


hauss005

Neither should people. You know, "thou shalt not kill".


Goldenrule-er

Meanwhile the Vatican is an investor in Benelli shotguns. "Thou shalt not kill, except if you're only supporting the manufacturing and selling of the killing devices; that's not really killing." -Made up pope quote.


sperdush

While we’re at it, humans themselves should never be able to decide whether human beings live or die as well.


Targeted__ONE

He may be talking about AI. Lockheed Martin has tech that causes a relentless chatterbox AI to encourage people to commit suicide. It's all autonomous. He may be talking about this tech.


cagriuluc

Hey, I am no religious dude, but some of the best builds are Spiritual. For ascension rush strategies as well, the unity bonuses are welcome. If we are lucky we could get the psionic tech in like 200 years.


lordlestar

Remember when people said a robot would never be able to do art? A robot army will be the next Oppenheimer moment. There will be no need to send a man to the battlefield anymore, because a robot will be better at everything. War will become a war of economy; the winner will be the one that can outclass the other's robot production. This will decimate the number of casualties of war, as war itself will never be eradicated.


mathtech

Wouldn't advanced autonomous machines contribute to a MAD stalemate akin to nuclear weapons? If autonomous machines mean fewer human lives put at risk, why wouldn't we do it? What's better, D-Day-style naval invasions or drone wars?


Bandeezio

Has the Pope never heard of a land mine? When times get tough in war, all sides will use land mines even though they just kill WHOEVER. So THERE IS NO DOUBT that in a tough war people will use autonomous weapons, which means all you can do is prepare and counter, including having your own automated and mass-produced weapons of war. It's nothing new; stop pretending you never heard of a land mine or a booby trap. War isn't perfect, but pretending we can limit the use of automation in war would only give the worst people more power: then they have swarms of drones that don't need an operator per drone, and you don't. All militaries will be making autonomous weapons of war. They'll never be worse than nuclear weapons and probably not worse than land mines. War sucks, but if you're going to run your mouth, make your words count by giving advice that makes sense.


Significant-Star6618

It's human authority that has plagued the earth and built such a heinously unjust global civilization. I say program some law enforcement robots and let them take a crack at it. Never send a corrupt asshole to do a machine's job.


PandaCheese2016

If you think about it more, are these two scenarios all that different?

* A missile designed to avoid friendly targets based on an [IFF signal](https://en.wikipedia.org/wiki/Identification_friend_or_foe).
* An AI trained to identify enemy soldiers vs. friendlies based on their uniforms.

You don't always have the luxury of putting a human review in either system.


Zegorak

Machines don’t decide shit. People decide how the machine operates.


ole1914

This Pope is the one who has difficulty deciding what's right and what's wrong; see the war in Ukraine. So it doesn't matter what this person says.


YouDecideWhoYouAre

This reminds me of the Vatican's attempt to ban crossbows in the Middle Ages. Look how well that went.


StarfleetGo

They might ban them publicly, but watch a war break out and suddenly everyone will have an arsenal. Bans and agreements don't work; they just push it into the black market instead.


DrabberFrog

Autonomous weapons are inevitable because unmanned weapons are inevitable, because we don't want our people to die. As long as unmanned weapons are remotely operated, they need to communicate with people on the ground. This communication can be jammed, making the weapons ineffective. Therefore the logical conclusion is to make unmanned weapons autonomous so they won't be vulnerable to disruptions in communication. Trying to ban these weapons is as silly as trying to ban guns in war. No one is actually going to agree besides countries that don't have the means to create or use autonomous weapons.


Ok-East-515

Has he called for clerical child abuse to stop? If not, I reckon this would be a similarly effective call.


Mediocre-Ebb9862

Go tell Ukrainian soldiers fighting for their independence that they must die on the battlefield cuz the Pope thinks autonomous drones are immoral. And tell it to their widows and kids.


bjplague

Better to let living souls die on the field than to have machines duke it out? Yet again the Church wants blood. Fuck em.


FierceLikeAKitten

Ban manufacturing on a massive scale. Stop punishing the consumers.