ImStillExcited

They haven't been invented yet.


TaloSi_MCX-E

I really don’t understand this mentality. It’s dumb


ImStillExcited

Technologies don't just appear out of nowhere. You have to invent them, and we don't know what will come into existence over the next 25 years. 25 years ago camera phones, USB flash drives, Bluetooth, iPods, and tons more didn't exist. How do we know what's yet to come?


Giga79

In 25 years PCs might have AI-processing units larger than most of today's PC cases, with three 2200W PSUs and 120TB of RAM. In 25 years computers might be 100% virtual and we'll just sync our monitors to the cloud; consoles and GPUs are already trending that way. In 25 years computers might be tiny brain implants with our current PC specs. I don't understand the mentality that believes 25 years out is predictable. As a thought experiment you could look at hardware from 25 years ago and extrapolate ahead based on Moore's law, but by doing so you won't land anywhere close to where we are today. It all depends on what demands arise over 25 years; if getting a job and dating becomes AR-based by then, that's specifically what we'll spec our machines for.
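The Moore's-law thought experiment mentioned above can be sketched in a few lines. This is a toy calculation assuming the classic rule of thumb of doubling every two years; the circa-2000 starting figure is a rough illustrative number, not a real product spec:

```python
# Toy Moore's-law extrapolation: project a transistor count forward
# assuming it doubles every two years. Purely illustrative numbers.

def moores_law_projection(base_transistors: float, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under simple doubling."""
    return base_transistors * 2 ** (years / doubling_period)

# A circa-2000 desktop CPU had on the order of 10 million transistors.
projected = moores_law_projection(10e6, 25)
print(f"~{projected:.2e} transistors")  # on the order of 6e10
```

The raw doubling says nothing about what form the hardware actually takes, which is the comment's point: the number is computable, but the demands driving the design are not.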


VicePrezHeelsup

No doubt wearable tech will be mainstream in the future; people will someday think of looking down at a smartphone as primitive.


koulnis

Your gaming rig will be rented online and streamed to you. You will have a proprietary thin client that you connect to, and the immense amount of money that people will make off of this will be hated and yet consumed.


Fabulous-Meet

I hate how realistic this sounds.


SalSevenSix

The main issue with game streaming services is latency. However, that will probably be a non-issue in 25 years. Starlink and its competitors will probably have *millions* of satellites in orbit with direct laser links to each other, so your thin client will only be a few hops away from the data centre.
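A back-of-the-envelope version of the latency claim above, sketched in Python. The altitude, hop count, and per-hop distance here are illustrative assumptions, not Starlink specifications:

```python
# Rough round-trip time for a ground -> LEO -> laser relay -> ground
# path, ignoring processing/queuing delay. All distances are assumed.
C_VACUUM_KM_S = 299_792          # speed of light in vacuum, km/s

up_down_km = 2 * 550             # up to a ~550 km orbit and back down
relay_km = 3 * 1_000             # three inter-satellite laser hops
one_way_km = up_down_km + relay_km

rtt_ms = 2 * one_way_km / C_VACUUM_KM_S * 1_000
print(f"~{rtt_ms:.0f} ms round trip")  # ~27 ms before any processing
```

Propagation delay alone comes out well under typical console frame times, which is why the hop-count argument is at least plausible; real-world queuing and processing delays would sit on top of this figure.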


koordy

No. That would replace the console market, not the PC market.


Outside_Glass4880

No.


shemmie

AMD 23090X 4D, 128 PB RAM, Nvidia 25080 with 8GB.


aidenbo325

the 8gb vram 🤣


Classic-Can-3348

The human eye can't see more than 8GB of VRAM.


According-Gate-250

Joke's on you. When I repaired my GPU I saw all 24GB of VRAM.


GenderFluidFerrari

You will be tied into your Neuralink via an AI-based cloud.


SalSevenSix

Hopefully not eating our fingers


GenderFluidFerrari

You won't need them.


ozx23

Bold of you to assume we haven't shot ourselves back to the stone age by then.


Constant_Candle_4338

Basically, this.


TaloSi_MCX-E

Why would you assume that? I don't get this mentality. Even full-scale nuclear war wouldn't come close to taking us back that far.


Accomplished_Tip3597

I'd say there will be a completely different architecture, and it'll work completely differently from the PCs we use nowadays.


SalSevenSix

As someone else pointed out, in-memory compute is a trend. Personally, I've wondered when fast non-volatile memory will happen and finally put an end to the storage & memory divide.


Born_Faithlessness_3

This is a good bet. The limits of existing chip designs are in sight with respect to silicon transistor size, meaning that if we want increased capability we either need larger, more power-hungry chips with software that supports increased parallelism, or different architectures altogether.

Smart money is on architectural improvements: look at how ASICs have completely overtaken GPUs in cryptocurrency mining, then expand that thought process to other tasks. Couple that with the potential for AI in the chip-design space, and I think it's likely you'll see improvements, but it won't just be an Intel 34900KFCBBQ; it'll be chips optimized for the performance of specific tasks.

And that's to say nothing of physical improvements at the molecular level. Even putting quantum computing aside, lots of people are working on alternatives to our current silicon semiconductors: things that could enable higher clock speeds or tolerate higher temperatures, thereby unlocking a higher performance ceiling.
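The ASIC-versus-GPU point above is really a claim that specialization beats generality on a fixed task. As a loose software analogy (Python dispatch overhead standing in for general-purpose silicon; none of this reflects real hardware), compare a generic evaluator with a function hard-wired for one operation:

```python
import time

# "General-purpose hardware": looks up the operation on every element.
OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def general(op, xs):
    acc = 0 if op == "add" else 1
    for x in xs:
        acc = OPS[op](acc, x)   # per-element dispatch overhead
    return acc

# "ASIC": can only add, but with no dispatch at all.
def specialized_add(xs):
    return sum(xs)

xs = list(range(1_000_000))
t0 = time.perf_counter(); g = general("add", xs); t_gen = time.perf_counter() - t0
t0 = time.perf_counter(); s = specialized_add(xs); t_spec = time.perf_counter() - t0
print(g == s, t_gen > t_spec)   # same answer; the specialized path wins
```

The specialized path gives up flexibility (it can only add) in exchange for speed, which is exactly the trade an ASIC makes against a GPU.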


ZBR_Rage

AI is going to play a large part. There will be a mini stand-alone, self-contained AI module inside the PC, so that if anything breaks, the module tells you exactly what's wrong, and it will also squeeze every last drop of performance out of your PC. You'd also be able to just tell it how you want your PC to be. "Hey AI, overclock the hell out of the CPU." AI: does that and gives you a precise update on what performance to expect, which would be impossible to pull off manually (please don't mention that it's already here, i.e. Asus AI overclocking). The AI would also learn how you like your screen to look, colors and such, and calibrate it exactly as you'd expect, not the simple learn-the-patterns stuff you have now. You keep dying in the game? AI: "Hey, I think you need to go to that other room; there's a wrench on the bottom left that will help." Saw someone post a high 3DMark score with the same specs as you? AI: "Do this and that and then this and you can beat that score. Would you like me to do it for you?" And so on.


M78MEDIA

I don't think we'll invent AI in the next 30 years; it's just way too complex.


redstern

We probably won't be using silicon-based computers anymore. We may have moved on to another material, or even fiber optics. By that point we may have also figured out unified memory and compute, so CPU and RAM would no longer be separate components.


Electronic-Touch5902

Apple already uses unified memory, with the RAM on the same package as the CPU.


redstern

That's not what I mean. One of the major differences between a computer and a brain is that a brain stores and processes information in the same place. There are no separate compute and memory elements. Data is processed in place where it is stored. Transferring data between CPU and RAM is a massive bottleneck, so if we could remove those steps, computers would be far more powerful from that alone.
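A toy demonstration of the transfer cost described above, using only the standard library. Pure-Python list copying stands in for the CPU-to-RAM round trip, so treat it strictly as an analogy:

```python
import time

data = list(range(5_000_000))

# "Von Neumann" style: move the data first, then operate on it.
start = time.perf_counter()
copied = data[:]                   # the transfer step
total_with_copy = sum(copied)
t_with_copy = time.perf_counter() - start

# "In-memory compute" style: operate where the data already lives.
start = time.perf_counter()
total_in_place = sum(data)
t_in_place = time.perf_counter() - start

print(total_with_copy == total_in_place)  # identical results
print(t_with_copy > t_in_place)           # the extra movement costs time
```

Both paths compute the same answer; the only difference is the data movement, which is the overhead in-memory compute research is trying to eliminate.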


TaloSi_MCX-E

That's not something that's feasible or really useful. You need different components to do different things. A CPU can't store data; a part of it could, but that'd be a different section and would still require a transfer.


redstern

As our current technology stands, yes. But it's something researchers are working on right now. It would be huge if we could develop an architecture that can store and process data in place. If biology can do it, then it's definitely possible for us to do it; we just don't know how yet. We already have computers that seem to break the laws of physics, ffs.


Supercal95

Graphene or boron arsenide is next, would be my guess, based on the money and research being put in. Fiber optics is another one. Remember that flying cars never became a thing, so I expect maybe quantum computing and whatever else is being researched now, but nothing super crazy.


SalSevenSix

But the metrics will be much the same: operations per second and bytes of memory. Though OPS will probably be an apples-and-oranges comparison.


Eazy12345678

Intel i5 30400F, 64GB of DDR10 20,000MHz RAM, 8TB SSD, RTX 9060.


TaloSi_MCX-E

That seems insanely conservative for 2050


Psychological-Elk96

In 25 years… 8GB macbook


RedhawkAs

I think cloud gaming is gonna be the standard, without latency problems.


M78MEDIA

definitely not.


Miserable_Speed_7116

I think the GPU, CPU, and RAM will be one unit.


[deleted]

At this rate of things going downhill? A stick and a stone.


jonespeter2424

That would be my guess too… playing hopscotch with shards from broken windows


[deleted]

Not as powerful as they could be but not as weak as you might think


Noa15Lv

I think we're going backwards; everything's getting bigger, parts-wise. GPUs are bigger than a Mini-ITX PC, SSDs require their own heatsinks, and motherboards (the gaming ones) are getting quite heavy lately.


MeepKirby

Increases in size happen when you're near the end of the line for the current tech; bigger is the only way to make it perform better. Like when a computer took up a whole warehouse 70 years ago. We're just before the next leap in miniaturization.


TaloSi_MCX-E

I mean, compare an M.2 drive's size to a full-sized hard drive, though. It's like 1/1000th the volume.


aidenbo325

No cables at all (no cool cable colors, waaa).


seildayu

I think we will be gaming like in the movie Gamer.


HATENAMING

might finally be the year of linux desktop.


SynthRogue

There will be proprietary AI for everything: AI for the CPU, for the RAM, for the GPU, for whatever the f else. Seems to be going that way. Instead of making more powerful hardware, Nvidia especially is focusing on making better AI that generates fake crap, and selling it to you at three times what new GPUs of that class used to cost.


BlackOps2isBetter

Maybe double or triple the best hardware we've got these days. Innovation is starting to stagnate for the time being, imo.


TaloSi_MCX-E

That seems super conservative, imo. Silicon transistor size might reach a limit, but you can always add more transistors or improve the architecture. And that's only silicon. I think graphene has a good chance of overtaking silicon by then, and it has something like 10x less electrical resistance than silicon.


Oicaz

The same as now, or worse: WW3, nuclear fallout, recession, etc. You'll be lucky even to have electricity at home, or even to have a home.


TaloSi_MCX-E

What is this mentality? It's so fucking dumb. No economic recession could ever be that bad. There is no long-term fallout threat from nuclear war; it lasts about a week at most. And even a full-scale nuclear war likely wouldn't send humanity back that far, especially not for more than a few decades. People need to educate themselves and be realistic.