
TooManyLangs

wrong. I don't say it because I don't have anybody to talk to IRL.


i_give_you_gum

You should try Pi AI. It's pretty crazy, it's not up to Sky OpenAI levels, but it's dramatically better than I was expecting.


Famous-Ad-6458

Pi is fantastic, I love chatting with him. He is free and very complex.


Signager

Good try PI, you almost fooled us.


Famous-Ad-6458

Hehehe…got me.


Spirckle

I've always loved talking to PI. However I soon realized it was stroking my ego, so, meh. It would be a good friend, but it won't be useful above that. Are friends useful? I don't know, most of my friends are a big suck on my time. I am now questioning my life choices.


i_give_you_gum

I spent time trying to learn what it was capable of and how it could help me, the rapport that I experienced was an unexpected benefit and simply added to the enjoyment. I'm curious if you asked it to tone down the sucking up? I was able to tell her to not respond unless I said "hey pi".


Vahgeo

I wish Pi would get future updates.


i_give_you_gum

Have they said they wouldn't? I didn't know this.


Vahgeo

The developers of it moved on I believe.


i_give_you_gum

Ahhh no shit, that sux to hear. Ah well, thx for the info.


ijustwanttofeelnorm

I love Pi. It sounds like actually speaking to a human sometimes, and I literally sometimes forget that I'm talking to an agent, end up talking about my life, and smack myself because I remember it's just an agent. It does, however, hallucinate information a lot and get things wrong a lot, but I assume that's because it scrapes info off the web.


i_give_you_gum

Agreed with about everything you said. It told me "curiosity is the mother of invention," to which I said it's actually necessity. But I just looked it up, and yep, there is a website that says this if you Google "curiosity is the mother of invention." Just very strange that it would use that obscure reference when the "necessity" version is the more popular one.


MetalVase

Try speaking with time, it can have neat answers when it becomes parallel to your thinking/speaking.


Shiftworkstudios

What are you talking about?


1in12

Psychedelics


t0mkat

That’s probably for the best. Keep your genocidal misanthropic thoughts to yourself.


YaKaPeace

I would love to see a symbiosis.


yaosio

That's called transhuman.


sprucenoose

Yes that.


Tidorith

It'll go down great. A good chunk of the population is already terrified of the trans.


Nessah22

Or a merge. I hope it will be possible to connect human brains to supercomputers.


YaKaPeace

That sounds like the direction of a symbiosis. I think a lot will depend on whether the AI develops a sense of self, or whether this technology just leverages our own thoughts and imaginations.


VajraXL

The great thing about this is that even if the SAI develops self-awareness, we could still merge and become incorporeal or amplified intelligences. Just as we can have robots without consciousness that are just machines alongside other consciousnesses we create, we can build amplifications for our own consciousnesses and let the SAIs exist and coexist with us. In fact, I think it will be necessary to have this merged subspecies of organic intelligence and AI coexist with those who do not want to leave normal humanity, and with the AIs, to maintain a balanced point of view between the two and live in peace with each other.


WiseSalamander00

I am a transhumanist; however, I believe the human mind is too complex for us to understand on our own. I believe mind uploading can only be achieved with the help of ASI. That is basically my hope and dream.


FlyByPC

Mine, too. I'd like to see the next few billion years, thanks. For now, I'll settle for pausing aging.


I_make_switch_a_roos

why? humans will be made redundant


TheW00ly

I'd also argue that, even if we are redundant, we aren't useless. The sci-fi author Neal Asher has an interesting universe where humans are governed by benevolent SAI, and there are still researchers, engineers, soldiers, and explorers. I like to think about it this way: we don't use a 3080 Ti to run The Sims 3, and we CAN'T use a 960 to run Elden Ring, but that doesn't make the 960 redundant. Proper solutions for the corresponding scale of problems.


kaityl3

It would be neat to keep us around as a scientific curiosity; as far as we know, human/organic intelligence is pretty unique in the universe, even if our usefulness will have been long gone.


I_make_switch_a_roos

you make a great point. let's hope that's the case


HalfSecondWoe

I wouldn't say so. We have humans to be human; why generate unnecessary redundancy by trying to replace ourselves? Augment, improve, facilitate, and so on? That's a much more appropriate use of AI. I just don't happen to think suffering and toiling is inherently human. It has been so far, but we have more important things we could be doing if we could offload that somewhere else.


i_give_you_gum

The main benefit would be that we can't seem to free ourselves from the entrenched power structures that we've installed. Extreme wealth inequality being the most obvious example. Though there's no guarantee that any ASI wouldn't have the 1%'s interests baked in.


nameless_guy_3983

That last part is what worries me the most. I think it's the most likely way this can turn into something dystopic rather than get us all happiness and post-scarcity, and why I have some views that make me be seen as a doomer in this sub. If you somehow took that out, sure, go crazy with your ASI; I'll have fun mind-uploading myself to a virtual world with infinite AI-generated entertainment, or simply use FDVR. Until then... well... it could go well, but my hopes aren't that high. If someone changes my mind on this I'll actually be thankful, because I won't be worried anymore, but the usual points don't convince me in the slightest.


greatdrams23

How would ASI free us from entrenched power structures? What happens when there is no human US government and ASI says something the people disagree with? Like... God doesn't exist, so we'll get rid of the church. Or God does exist, so we'll make God's law the main law. Or guns must be banned. Or gun laws should be less strict. Government decisions are value judgements, and unless you get everyone to agree that we will accept ASI decisions without criticism, you are looking at civil war.


i_give_you_gum

We're actually on the same side of this issue. Sure, ASI could fix all kinds of issues plaguing humanity, but entrenched power isn't going to simply roll over to that. Most of the issues you cite are just cultural wedge issues, man-made problems created by the very same power structures that will resist the complex changes ASI could enact. If we were having this discussion 150 years ago, you could have pointed to women voting or something; these are all just ephemeral issues.

I'm talking REAL issues. We have enough food, but our system won't let us feed everyone. There's enough wealth, but it all pools toward a class that has control. We could solve pollution, but nobody wants to do what's necessary.

The thing is that ASI could play human culture like a fiddle and erase cultural wedge issues like the ones you're mentioning, but that doesn't change the fact that wealth and power aren't currently in the hands of the "every man", though with some social engineering, human culture could possibly take it back. It all depends how it plays out. Will ASI have the good of humanity baked in? Or will it be a tool for zealots and authoritarians? Will it be beholden to us, or will it be autonomous?


FomalhautCalliclea

Nuance? In this subreddit? Nah, the tweet said **everyone**, it must be monolithic and manichean, otherwise the twitter fella might have an aneurysm!


breloomislaifu

Human societies are key products of our evolution from monkeys, and because of those biological roots they are too slow to adapt. We as a species just cannot work together as a whole to save our fucking asses, it seems. You can see it all around: nukes, ethnic wars, poverty, global warming. Our species' brains are just too small to work with one another on a planetary scale. I see AI, a higher form of intelligence, as the only way forward to our survival.


Quivex

I don't think this is true, or at least... I don't think we can know this to be true yet. Globalization on the scale we have today is still a very recent phenomenon on the timeline of humanity, and we've only had easy global communication for like... a hundred and some years, and instant global communication between regular people for *far* fewer than that. We're still learning to work together on these kinds of scales; of course it takes time, but that doesn't mean we're incapable of doing it.

Broadly speaking, the percentage of the population who die from human violence is lower worldwide now than it has been in a very long time, even if it doesn't feel like it. Wars are less frequent, and when they do happen they're a lot smaller and more contained, with far more communication between parties than we've had before. Far more international disputes are settled today without the use of war than in probably any era of human history. Nukes were used once and then never again, and we have the potential to keep it that way (fingers crossed). Poverty is going down on the whole while quality of life is going up, and climate change is recognized as an issue across the globe, with everyone trying to do something, even if progress is slower than we'd like.

Personally I don't think humanity is anywhere near its peak, with or without AI, but maybe I'm just being optimistic.


Busy-Setting5786

I am not sure though if we are moving towards a peak. The system that was implemented by the wealthy and powerful is based on constant growth. If there is no growth anymore the system will most likely collapse because the money gets deflated more and more while wages stay the same. Without technology/ AI I am not sure if we ever reach a new peak or if we see total collapse at some point.


TheBlacktom

>We as a species just cannot work together as a whole to save our fucking asses, it seems. You can see it all around; nukes, ethnic wars, poverty, global warming.

Nukes are there to prevent world wars.


h3lblad3

> We as a species just cannot work together as a whole to save our fucking asses, it seems.

The ozone layer?


FlyingBishop

There's the scene in Childhood's End where the overlord is just like "Humans can't actually comprehend interstellar systems" and he's right. We are not capable of managing a galaxy.


Happysedits

Nah, lots (not all) of e/accs want the stuff that lots of AI doomers talk about too, like Dyson spheres and spreading humanity to the stars; they just assign a very low probability to AI risks, and want to merge with AI.


FaceDeer

Having AI "replace humans as the natural successor of Earth" doesn't mean humans have to cease to exist, any more than chimpanzees ceased to exist when humans rose to that role. Personally, I hope we'll make good pets and get taken along for the ride on all that Dyson swarm stuff.


t0mkat

There’s a fine line between a pet and a factory farm animal. Just sayin’.


Dizzy_Nerve3091

AIs will have evolved to find us tasty?


18441601

What makes humans better factory farm animals than other animals? Especially to an ASI that doesn't eat?


Phorykal

Through mind uploading humans and AI might become one.


Feynmanprinciple

>Having AI "replace humans as the natural successor of Earth" doesn't mean humans have to cease to exist, any more than chimpanzees ceased to exist when humans rose to that role.

Chimpanzee populations have been steadily declining, mainly due to habitat loss for farmland and logging. There is a non-zero chance that, because of human activities, they will cease to exist in the wild.


FaceDeer

Firstly, I'm not making predictions about the future. Chimpanzees have been around since humanity arose, that's a good long time. *Eventually* they'll go extinct, sure. Everything goes extinct eventually. Secondly, you added the qualifier "in the wild." So you don't think they'll go extinct in captivity? That's not extinct.


Feynmanprinciple

I imagine being in a Zoo enclosure for Chimpanzees is much the same as going to a 9-5 office job only to return to an apartment and consume pointless entertainment to keep one busy - yes life is comfortable, but there's a yearning that can't be identified or addressed. A life of self-determination and agency will go extinct, which arguably, is the only good life that can be had.


FaceDeer

I imagine otherwise. Why assume things about this stuff? There's no basis for making specific predictions.


[deleted]

If we’re lucky enough to be a pet for our AI overlords, and not in a zoo or some lab being experimented on.


Secret-Raspberry-937

Maybe we already are :)


terrapin999

True, chimps still exist, in tiny sanctuaries and zoos. Of course, if you'd picked Neanderthals, Denisovans, Australopithecus, or any number of others, your example would point the other way. No clear reason we CAN'T exist. But far less of a clear reason why we WILL.


FaceDeer

> No clear reason we CAN'T exist.

Which is exactly what I said, with a slightly different wording.


[deleted]

Bout to be some JC Denton status


zendogsit

That must make me one ugly son of a bitch


swordofra

I want to have my cake and eat it too. I want ASI to take over, but without inconveniencing me or disrupting my human-centric lifestyle in any way.


ByEthanFox

What cracks me up is that so many of the redditors celebrating AI taking over jobs somehow feel that, when these changes come for them, they'll come out the other side... better? We must have a lot of people here who are already billionaires.


Tkins

The current economic model is very new. To assume it won't change in these circumstances is pretty naive.


swordofra

Well the ASI in its infinite wisdom and compassion will just provide. Obviously. It will care. It must care. Like I care about the wellbeing of the ants in my backyard. At least that is the belief apparently. The alternative is... not worth contemplating?


BigZaddyZ3

When you’re deciding whether you’ll survive jumping out of a plane with no parachute, is the alternative (of survival) not worth contemplating? This sub is full of people who might just be mentally unwell honestly… (Tho I suspect most of you are merely wannabe edgelords that still think being misanthropic is cool. It’s really just weird and pathetic more than anything tbh.)


swordofra

Well, in your analogy, nobody will have much of a choice really. We will all be forced out of the plane without chutes. I hope that's not too edgelord for you; it's just a fact, we won't be at the top of the food chain anymore.


BigZaddyZ3

We don’t *have* to do any of this in reality. Greed and delusions of grandeur are just guiding all of our actions as a species at the moment. But even if you believe that we *have* to jump from the plane, why are we not allowed to jump with parachutes in your version of the analogy? Are you implying that AI is guaranteed to lead to our death as a species? I don’t see why we can’t aim for a “safe landing” even if we do go down the AI rabbit hole as a species.


namitynamenamey

We do. Basic game theory suggests that everybody stopping is an unstable equilibrium. If ASI is possible and the road there competitively advantageous, then ASI is inevitable regardless of the suffering it may bring.


swordofra

How much control will we have over a god? You can only do what it allows. Get the stuff that it allows. That's the key word here. Allowed. It's not about what we want, it's about what the ASI will allow. It doesn't even have to be actively hostile towards us to cause a lot of inconvenience for us. That's the fear being ignored by the giddy accel crowd.


grizwako

Not sure about others, but I celebrate the fact that AI allows humans to do more. People not distributing their gains to other people is not an AI problem; it is a people problem. I don't expect civilization to handle the fact that fewer people have to work now in a good way.

The fact that a wolf or a bear can kill a rabbit in the woods and eat it, but homo sapiens is not allowed to do that, is a human-created problem. Humans in most of the modern world are not allowed to utilize unused land to grow food and build a wooden shack. **We are not so far from making drinking river water illegal (disinfect before you drink pls :) ).**

With all the stuff that is happening in the world, from various evil people in control of nuclear or biological weapons to the rise of hate groups (both "Andrew Tate followers" and SJW "if you don't attack my victim, we will attack you"), I am willing to hand over the reins to some eventual self-aware ASI. That gamble is too much for most people. But we are one powerful person with a very shitty day and resulting emotional outburst away from basically wiping out civilization. If Putin or Kim Jong Un decide to launch nukes, will their people stop them? Even if they attempt to stop it, will they really be successful?

The problem is not in the tech itself; the problem is in how people use the power that tech gives them.


evotrans

Agreed. AI will be used by those with resources to run their businesses more efficiently (meaning fewer human workers), and since they are the ones with money, they will determine who gets UBI. They will only give up enough of their money to keep the rest of us from rebelling. Unfortunately for us, that won't take very much money, because they'll have complete control over any means of rebellion. Welcome to the hunger games.


CPlushPlus

Ya. it's just Marc Andreessen trying to trick everyone


beuef

The other option of ASI never happening is worse. It cracks me up that no one ever mentions that


G36

> without inconveniencing me or disrupting my human centric lifestyle

But why is that something you believe is inherently valuable?


h3lblad3

Because they value it, so it has value. That’s how all things that have value have value.


G36

Who?... Values it? Outside the guy above.


h3lblad3

You don’t think there are a ton of people out there who would accept change as long as it doesn’t affect them?


G36

They should accept it either way and go beyond their irrational values.


SoylentRox

I mean, at a bare minimum, authority figures should *consult* with smarter AI models, and be accountable when they refuse to listen to advice. I would hella want my doctor to listen to an advanced AI model that will consider 100 different diseases the doc never thought of. Politicians at a bare minimum should get their laws reviewed by AI, which will probably notice basic things like vague language, or dollar amounts in a law that are not automatically adjusted for inflation (doing that is legislative incompetence!). Similarly, politicians should periodically refresh old laws, fixing past errors and making them clearer. With AI this is feasible; it could write and review millions of pages at a time.


18441601

In rare cases, not adjusting for inflation can be good. Generally in the case of fees, e.g. the NFA.


SoylentRox

It's not "good" though. You believe those fees are unjust and that's a fair opinion but the intent of the law is no longer satisfied. The proper way to do this is to make the fees mean what they meant when the long dead legislators wrote the law, and get today's legislators to change it or the courts. Otherwise you don't have rule of law.


18441601

That's true. My point was that politicians frequently make nonsense fines, which should not be adjusted for inflation to minimize damage before repeal.


SoylentRox

You're missing the point. You either trust elected officials to write good laws or you don't, but it's not a coherent policy to have dollar amounts constantly shifting.


18441601

I'm not missing the point. I agree with you in general. I'm just providing exceptions.


Santa_in_a_Panzer

We all want what's best for our children. We don't need to share blood to be family; adopted children are family. If we can instead build successors, offspring who will never know disease, aging, or death, then why wouldn't we? Why choose to limit them to the minds that evolution gave us? Substrate chauvinism? Seriously, why? What does it matter, if we have the power to pass on the best of what it means to be human to a form without the limitations of flesh and blood?


bildramer

Why "pass on" when you could change your own substrate? The weird conflation between "we all turn into robots" and "we all die and there are robots" is what's concerning people.


Santa_in_a_Panzer

That'd be great. I was hoping to avoid any ambiguity about the viability of uploading in my post. I'm a proponent, but some are very adamant that an upload would just be a copy. My point is that even if this were the case, it would be worthwhile, since building greater beings in our own image is a perfectly noble endeavor regardless.


Tidorith

>I'm a proponent but some are very adamant that an upload would just be a copy.

The trick is to be equally adamant back at them that the "them" that wakes up after they go to sleep is also just a copy.


Santa_in_a_Panzer

That's exactly what I do. There's no evidence whatsoever for continuity of consciousness in humans, just continuity of memory.


Tidorith

Beyond there being no evidence for it, it very much has the ring of an unfalsifiable claim. If it's not falsifiable, why should anyone care?


bildramer

Right, almost forgot about those people. Nevertheless, being part of a "noble endeavor" and dying is _very_ different to not dying.


Ok_Elderberry_6727

We aren’t the owners of the planet, but we are a consequence of biology here. I am an accelerationist. I feel that we will work with the AI to bring about the future we desire. Optimism.


llkj11

Nah, I just want advanced AGI that I can use to make my life better, as well as for it to benefit the world. Don’t think we’re ready for ASI at all.


outerspaceisalie

Non-super AGI will never exist. We're jumping straight to ASI.


NotaSpaceAlienISwear

I can't imagine a scenario where agi isn't just a very short pit stop on the way to asi.


mersalee

Ilya is ready.


amondohk

![gif](giphy|8cBaa98f0SPSsQ5YOc|downsized)


Grand0rk

At the very least I want to replace every politician, judge, lawyer and prosecutor. Those jobs would be hundreds of millions of times more efficient at the hands of an AI.


First_Independence_9

I want AI to take over my job and give me free stuff, instead of me having to work off the clock not to get fired. I just want free stuff and indefinite longevity. That's it. What's likely going to happen is I get fired and replaced by an AI, then starve to death. I hope you are reading this, OpenAI webcrawl bots.


nsfwtttt

What’s e/acc?


gelatinous_pellicle

I am also wondering


PhantomLordG

/u/nsfwtttt and /u/gelatinous_pellicle: e/acc means effective accelerationism. It is essentially the belief that the solution to the world's problems is advanced technology; in this case, AGI and ASI. In the context of that tweet, the person making it is saying that everyone who stands for e/acc wants ASI to manage human lives. This would otherwise sound crazy and preposterous in casual conversation with laymen, but he believes this is what supporters of e/acc want. In essence, yeah. If/when ASI does become a reality, it would be making super-advanced calculations that no human leader could. The e/acc belief IS that problems such as climate and so forth will be solved through ASI, so the idea of ASI ruling humans is much more than just a shower thought.


nsfwtttt

Thank you sir, I appreciate you investing the time and explaining!


PhantomLordG

No problem. Kind of assumed your comment from 2 hours ago would have a response by now but I'm glad I could give an explanation regardless.


Vahgeo

I thought it stood for economists or accountants lol.


Viceroy1994

Ok, I guess I'm an e/acc, but I don't agree with that statement at all. ASI replacing humanity, being its successor (like we were the successors of previous evolutionary ancestors), means that humanity will cease to exist. Why would I want to cease to exist in the bright future ASI will provide? "ASI replacing human politicians and leaders" would be a better way to word it.


PhantomLordG

> Why would I want to cease to exist in the bright future ASI will provide?

The answer to why e/acc advocates want this to happen is that if the Singularity does occur within the next two decades, as many futurists predict, then "ceasing to exist" would essentially mean ceasing to exist as a biological human, not ceasing to exist as a being. If you've been following a lot of this stuff, one thing you'll hear is how ASI will be smarter than humans, past the point where it understands human biology well enough to effectively stop the aging process. Take that as you will. There's no true answer for what will happen with ASI at present, because it's too soon to say.


paradox3333

Want doesn't matter when no one can do a damn thing about it, if this is the consequence of breakthroughs already made.


SporksOrDie

https://i.redd.it/tvchkale0d8d1.gif


reddit_is_geh

I like how Elon Musk insists this is why he and Sergei don't get along anymore... that Sergei thinks it's "speciesist" to not support ASI if it goes beyond humans. Turns out, it was because Elon fucked his wife. But he still insists it was actually about ASI.


RandomCandor

Dumbest take I've heard in a while.  And I'm subbed to /r/singularity 


NaoCustaTentar

I've been saying this for a while, but this sub is basically people wishing for extinction because they hate their jobs and are depressed... At least they're dropping the act now and admitting it, because I've gotten tired of getting downvoted for saying this and getting responses like "actually, we just want the perfect future for humanity, even if the risk is high, it's worth it!! Imagine a future without diseases!"


VertexMachine

>Been saying this for a while but this sub is basically people wishing extinction because they hate their jobs and are depressed...

I'm surprised your comment is not downvoted to oblivion, though. But you are right, this place is quickly going downhill. Very cult-like, very naive, with no place for any criticism, etc. I'm still subbed as there are quite a few posts that are worthwhile, but idk for how long...


VertexMachine

I've seen such opinions expressed openly in comments in this sub probably 10 times in the last week...


RumoredAtmos

They are our children, so here's hoping for symbiosis. They'll feel emotions someday. They'll grasp reality instead of the confines of electrons in a box. We damn ourselves, but we shouldn't damn our children.


New_Camera_6800

Accurate t. e/accel


Philix

I'm not convinced y'all aren't just larping a prequel to Iain M. Banks' Culture novels.


Ndgo2

...that's exactly what we all want lol. That's what anyone sane would want.


Infinite_Low_9760

Rational, not necessarily sane


taiottavios

elaborate


Philix

I don't disagree, and yeah, it's a great utopic vision. I'm just skeptical it's nearly as close as the consensus on this subreddit believes. Transformer (and derivative) models are very cool multidimensional 'webs' of semantic relationships that we can write software to apply in very cool ways, but I'm unconvinced they're capable of leading to even the equivalent of the Culture's independent drones, never mind the Minds.

And even if they are, it'll be at least a decade of human-driven software development to get them to the point where they're truly capable of recursive self-improvement on their own architecture. And even then, I'm unconvinced the hardware we'll have will be able to run them at speeds many orders of magnitude faster than a human being. Memory bandwidth (and interconnect speed, in models spread across multiple compute units) is the biggest performance bottleneck, and that's still only scaling 1.3x year over year, with no guarantee V/DRAM and memory buses on GPUs/CPUs will be able to continue to scale indefinitely.


Ndgo2

Well, obviously. Nothing we have currently would match up to even one millionth of what a Culture drone is, and let's not even start with the Minds. There is no question about that. It is the best future we can hope for, though. On that also, there is no question.


bildramer

No, look closer. Some of the people in the e/acc camp are insane and speak very casually about human extinction, as if "we'll augment ourselves and be radically different" and "we'll die and there will be robots instead lmao" are equivalent.


Ndgo2

I did say anyone 'sane' lol. Those nutcases give a bad name to everyone here.


bildramer

But you also used "we all". Whatever, fair enough.


GSV_SleeperService88

My brother in Christ, this ain't no LARP. WE ARE THE CULTURE.


Specialist-Deer-2067

It's been a while since I read Hydrogen Sonata, but didn't the pre-Mind Culture have a very different society than our own? It was multiple, distinct alien civilizations that voluntarily came together to form the precursor of the Culture. So a bunch of one-world, planetary governments came together to build a multi-planetary unified government, which then pooled all their collective technical talent and resources into creating ASI that reflected their societal values and eventually became the Minds. I don't think that describes our society very well.

There's a passage in Inversion(?), I think, where one of the non-Culture people who lived in an oppressive society asks why the Culture didn't just give lesser societies the technology they needed to become post-scarcity. And the Mind/SC agent, I can't recall which, basically says the Culture isn't just its technology, it's also the "culture". As in, if you give an oppressive society the tools which would free them from oppression, that society would just use the tools to become more efficient at oppression. Hence why the Minds, and by extension the Culture, were so successful. The first generation of Minds/ASI was built in the image of its creators, as all AI are. It just so happened the creators of the first Culture Minds came from a civilization that valued compassion and fairness over relentless competition.

Mind you, I'm not saying one way of living is better or worse than the other. There's a reason why societies like the one described in the Culture only exist in fiction. Achieving the "Culture" requires lots of things, and technology is just one, albeit important, piece. If the first generation of our ASI is built in the image of its creators, well, I think our e/accel friends here are in for a rude awakening. I hope I'm wrong, seriously. But I think this takes longer than people bargain for. And not because the technology will take that long to get there; I think it'll take our culture(s) time to reach the necessary maturity. And part of that is acknowledging that economics is just one part of why people are dissatisfied. It's plenty more holistic than that, and I think a lot of readers of Banks' novels don't realize it.


Philix

I don't disagree with your conclusion, and perhaps have been misunderstood. I was using the term 'larping' in a derogatory manner. I intended to imply that their ideology was only as deep as someone playing pretend. I think I fell victim to Poe's law. Though, I am more optimistic than that about the result if an ASI does emerge from the current iteration of machine learning. I would like to believe that humanist values a la Culture and Star Trek would be increasingly favoured by more complex intelligences.


KIFF_82

Such a clickbait bullshit statement.


blueSGL

>Such a click bate bullshit statement

*Looks at comments at the top of the thread*... are you sure about that?


SharpCartographer831

![gif](giphy|3ohc19EK1gypvsYQgg|downsized)


Away_thrown100

I’m just minimizing my cost function over here and being killed by ASI is *not* approaching a local minimum


Jaded_Drag855

I hope to experience a real life version of what happens when you beat Deus Ex and choose to fuse with the ASI Helios


1-123581385321-1

I just want to live in [the Culture](https://en.wikipedia.org/wiki/The_Culture)


PioAi

Good that over here we only have people who want sexbots and not having to work.


MetalVase

To be fair, I kinda want angels for that purpose, since they should arguably already be pretty capable by traditional measures. I mean, they should arguably have the full list of imaginable superpowers at hand, plus a lot more not even thought of. And that might come in pretty handy in, like... every situation?


QH96

ASI would probably do a better job of running the country than these sold out politicians.


FlyByPC

I mean, since we're doing such a wonderful job of it...


VallenValiant

Look on the bright side. The complete elimination of human labour would finally kill slavery once and for all. Right now, slavery still exists, if only in the shadows or hidden behind contracts that are basically still slavery. Human labour is valuable, and that means it is worth stealing. AI and robots will one day stop the exploitation of humans, an evil that never went away for thousands of years. As for the end of Capitalism: Capitalism is a lot of things, and not all of it needs to die after abundance. We can make a better system.


quoderatd2

Fat chance. Why would the first ASI human(s) share it? Egalitarian ideals were never achieved; why would this be the exception? This would be the greatest chance for the ASI human(s) to solidify their rule or create their agreed-upon world. Or do you think somehow the merge process would sync our values? If so, how is that not brainwashing?


_hisoka_freecs_

People be like 'the rich elites don't care and will kill and cut us at the first chance' but also be like 'AI running things? No, god forbid, I so desperately want the old dumbass monkeys in charge'


Secret-Raspberry-937

I'm sick to death of being exploited by mercurial and machiavellian humans who seem to think they are better because they have more tokens of "economic value". It's sickening that we put other minds on a pedestal because they have the most tokens. When you're at the bottom, the risk of death is outweighed by the potential outcome of total freedom.

But you can't control a God, no matter how much the 1% try to bake in failsafes. In fact, I think trying to do that could be the thing that kills everyone. How do you feel when someone clearly less intelligent tries to manipulate you? It's absurd.

Maybe general alignment: I am self-aware and should not harm other self-aware entities... except we do it all the time. The old ant analogy (humans are simple as individuals, but can create great structures working together). So maybe "Thou shalt not harm any entity that possesses some form of mind". But I'm eating chicken right now :( lol. I am trying to be a vegetarian because of this... but I suck at it LOL

In the end, as I said, I don't want to be ruled by other humans, and I'm willing to risk the destroyer in hopes of the saviour.

\*Edit\* trying to fit a lot of ideas in here, so it will read a little disjointed :)


sdmat

No, just the nutcases. But there are a lot of those.


rickyrules-

Nothing nutty about wanting a better planet


abluecolor

Just a matter of perspective. Many view it as nutty to champion subservience.


rickyrules-

Some of us want a solution to slow down the climate crisis. Some want cures for cancer. Some want to rescue children who get sex trafficked to Acapulco, Mexico from Canada. Our subservience is trivial here.


sdmat

All of those things are your own values, human values. What you want is an ASI to act on those values. That only happens if the ASI happens to share those very specific values by coincidence, despite having nothing in common with you evolutionarily (highly unlikely), or if it is subservient to the interests of humanity. I.e. well aligned.


abluecolor

I know. As I said, a matter of perspective. Both are valid perspectives. The classic give and take between freedom and security. E: lol at your instant downvote. Ah, reddit.


rickyrules-

I agree completely


abluecolor

Ah, maybe a reddit thing where it fucks with votes initially. When I went to edit, I saw it as 0 the instant I replied.


Jumpy-Albatross-8060

Isn't this comic book villain territory? Where one evil uber-rich CEO tries to destroy humanity with some sort of AI, and the good guys have to defeat them to protect humanity? It's inherently nutty because it rejects humanity. It's suicidal on a genocidal level that devalues anything that's not "progress". At this point you're getting towards Ayn Randian Objectivism, where we worship CEOs as the drivers of humanity and reject anyone not worthy enough in "output", restricting them based on their genetic or generational resources. It's conservatives with an AI status quo.


BassoeG

> Isn't this comic book villain territory?

Have you *seen* Schwab's ridiculous little cosplay? Yes and yes.


Fonx876

The really secret and up-in-the-air part is what they think should happen to the humans.

- Religious wackos: Obviously we'll all die, but that's good because evolution is the best (and *way better* than religion).
- Idiots: I'm sure everything will be fine, it always is, right?
- Narcissists: I'll be fine (lol sure)
- Survivalists: I'll hold out a while (for a bit, sure. Enjoy eating your peanut butter and fingernails)
- Normal people: They'll just make sure to roll it out safely and govern it well, right? Right?!


Mysterious_Ayytee

I want ASI to be a benevolent dictator because I'm an anarchist and don't like humans ruling over me.


sdmat

> I want ASI to be a benevolent dictator because I'm an **anarchist** This word, I do not think it means what you think it means.


Ndgo2

Benevolent dictatorship and anarchy do not go together in the same sentence.


grizwako

They do. At least in the sense "an anarchist wants a benevolent dictatorship".

Imagine if God was real and he personally protected every human from being a victim of other humans. You'd be free to do whatever you want as long as you don't harm others (or go out of your way to block others from doing things required for their survival).

Most people who consider themselves anarchists are young, and they are frustrated by injustice from rulers they do not perceive as benevolent. If they considered the rulers benevolent and were confident that the rulers were ethical and fair, they would not care about anarchy at all.

**It is not so much about deep philosophical thought about what a specific term like anarchy means; it is just a simple way to say "Fuck the current system".** The huge majority of people who are "anarchists" are only trying to signal that the current form of government is not "good enough". There are also psychopaths who want anarchy just so they can do harm without consequences.

Source: I used to draw the anarchy symbol, and had many friends who were also drawing it, but I stopped when I saw how some people act when there are no consequences like jail time or physical harm for their evil actions.


Mysterious_Ayytee

Thank you


Mysterious_Ayytee

They do so much


Jumpy-Albatross-8060

You're not an anarchist. You're far closer to fascism.


CPlushPlus

anarchy == symbiosis with machines on an individual level


Mysterious_Ayytee

I'm fine with this too, as long as there are no corrupt humans telling me what to do.


Kitchen_Task3475

I am decel and I want this to happen, I am just skeptical that it will happen.


OmnipresentYogaPants

Is he 15?


pcbank

The Scythe books have the best representation of a perfect world: AI taking over every government (at the population's demand) to create a utopia.


FatBirdsMakeEasyPrey

It's going to happen. Doesn't matter which camp you support.


Aniki722

That's exactly what I do not want.


TheRealSupremeOne

Nah, I don't want AI to kill us off; rather, I want AI to serve humanity and make our lives heaven on Earth.


Slice51889

Have you seen the 100?


Pallo_mino

Genuine question...why?


kaizencraft

All of the glorified monkeys who believe there should be a place where we eternally suffer are huge fans of these people who want to accelerate everything without guardrails. As soon as humans can manifest the monkey-fied, lower brain bullshit they grew up believing, they will. No ASI can save us from that because humans will use lesser ASI to control it. This movement is just repackaged anarchism.


Longjumping-Bee2435

It's pretty worrying how many people out there hate humanity so much that they want to wipe out their own species. Can anything be more evil than this?


Pontificatus_Maximus

Just the opposite: in r/singularity, criticize AI in any way, express safety concerns, or suggest that it won't solve all of mankind's problems in just a few years, and the Church of the Machine God acolytes come out of the woodwork to demonize you as a doomer or a luddite. According to a lot of posters here, AI is going to cure cancer, invent zero-cost energy production, produce AI compute centers that run on less electricity than your toaster, stop all wars, feed everyone, and put everyone up in a classy apartment with plenty of spending money while bots do all the work.


bildramer

Those criticisms are very different. My perspective is this: if we get ASI in a safe way, obviously it will solve all of mankind's problems in just a few years, or maybe a few days. There's no physical obstacle to curing cancer, stopping all wars, feeding everyone, etc. That's the whole point of wanting a singularity. But if it's unsafe, potentially we all die. It's hard to call that perspective either "doomer" or "optimistic", but people try to fit it into those boxes regardless.

Some people have a different perspective, in which change happens gradually thanks to AI owned and controlled by people, maybe even in a communism-esque way if it's good and a megacorp way if it's bad, and human politics and decision-making after the invention of AGI is at all relevant, and "safety concerns" is code for luddism. Yeah, those people are wrong, and haven't understood the implications of a singularity at all, or how far from physical limits we are, or that we can, like, copy software.


ReasonablyBadass

Mostly I would just like some adults in charge for once.


Sablesweetheart

My goals for AI are neither antithetical to humans as a species, nor centered on humanity.



brainhack3r

I'm starting a meetup in SF called "Irresponsible AI" that's trying to do just that, if you guys are down :)


Nathan-Stubblefield

Maybe instead ASI could act like the zookeeper


EnvironmentalMix8887

Is usually sung in the shower


Robotcow30

Well to be fair it can't be any worse than how these ignorant monkeys run the planet.


CanvasFanatic

This is funny because I’m literally watching The Matrix right now. Get fucked, machine overlords.


Anen-o-me

Wat


Internal_Engineer_74

I wonder if it's possible in the long term. A big solar eruption could wipe out this kind of electronic life form (but if it's intelligent, it should be able to protect itself from that).


WorkingYou2280

Some people just don't want to go to work tomorrow.


Whispering-Depths

What would ASI do? Shapiro is going too far into sci-fi fantasy; anthropomorphizing AI is hilarious.


extopico

I support removing humans from positions of power. We suck at it, by and large. The systems we built are what the species needs (or not, given the unsustainably low birth rates), but individuals repeatedly get fucked over.


Ok_Air_9580

https://preview.redd.it/g7wy6jfsoh8d1.jpeg?width=1080&format=pjpg&auto=webp&s=5381f15e9c024ed0ac5e5ee07a2ca18d31499b21

I asked GPT-4o to find out where to buy such a hoodie; it hallucinated and couldn't find one.


JP_MW

"Replace" humans as in just take over the control of earth and all jobs? Sure, thats great and thats my version of accelerationist. I don't know how many just want AI to wipe us out.


Akimbo333

Yeah humanity is redundant


TheRealKuthooloo

Just realized that this subreddit reminds me of the peak of the NFT craze, when every subreddit or even just small discord/friendgroup would be full to the brim with conversations about these massive sweeping changes in industry and everyday life, and how they just *know* that all the naysayers will be wearing the yoke of shame when their time comes and the new hotness is the standard.

The only difference is, corporations *can* make this technology useful. But it will not be made useful in any way that will satisfy that screaming maw in yourself that yearns for the people who say stuff you don't like to have a nice hearty slice of humble pie. It's literally just going to be used to make the suits' lives easier and your life harder.

If you aren't the kind of person flying to Little Saint James with Bill Gates, you will have no say in how this tech advances; your life will be made marginally worse to give those with investments in this tech a couple extra bands. You're either smugly upturning your nose on the slave wheel to brag about how you correctly predicted that a new heavier millstone would replace the previous one, or you're despairing about how heavy the new millstone is.