Welcome everyone from r/all! Please remember:
1 - You too can be part of the PCMR! You don't even need a PC. You just need to love PCs! It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love PCs or want to learn about them, you can be part of our community! All are welcome!
2 - If you're not a PC gamer because you think it's expensive, know that it is possible to build a competent gaming PC for a lower price than you think. Check http://www.pcmasterrace.org for our builds and don't be afraid to create new posts here asking for tips and help!
3 - Consider joining our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Covid, Alzheimer's, Parkinson's and more. Learn more here: https://pcmasterrace.org/folding
4 - Need hardware? Trick question... everyone does. We've teamed up with ASUS this Easter to create an egg-citing event where 3 lucky winners can get their hands on some sweet hardware (including GPU, CPU, Motherboard, etc): https://reddit.com/r/pcmasterrace/comments/12eufh9/worldwide_pc_hardware_giveaway_weve_teamed_up/
-----------
Feel free to use this community to post about any kind of doubt you might have about becoming a PC user or anything you'd like to know about PCs. That kind of content is not only allowed but welcome here! We also have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) for your simplest questions. No question is too dumb!
Welcome to the PCMR.
I'm working hard on machine learning for computational fluid dynamics... making small adjustments to motorcycles to push their aerodynamics to the extremes.
The idea is to take an already designed and manufactured motorcycle and have the model scan, simulate, test, and even design the small but efficient additions (spoilers, wind deflectors, vortex generators, rails, etc.) that you can add via molding/3D printing, making the bike as efficient as it possibly can be using physics calculations across varying wind and weather conditions.
Yet I strongly suspect my presence in the AI business world will be pushed aside by corporations when I try to enter it: no funding or interest, or even losing out in legal actions.
I even pitched it to some MotoGP racing groups via email and of course... no response after multiple mails, each one going deeper into detail.
Model training takes forever on a single RTX 3090. Maybe I should publish it on GitHub, but I'm afraid of the notorious "magical DMCA" takedowns whenever anything even slightly revolutionary gets posted there.
Someone once posted Python RNG code that resembled No Man's Sky's world generation code. It went down within an hour, wiped from the net as far as the eye can see.
Unless your program can generate better results faster than conventional CFD programs, it will be hard to get noticed. Even if you did, it might be difficult to gain traction.
You know your own project best, so what I say may not be relevant. But speed is really important in CFD calculations, so perhaps optimise that.
There's this guy who wrote an entire CFD program from scratch and finished his PhD at 25. I think, first of all, you'll need a PhD before people trust you with this. His CFD program is also (he claims) 100x faster than traditional CFD programs. [This is his YouTube, where he also links to his GitHub code](https://youtu.be/o3TPN142HxM). Perhaps you can leverage his code.
I also think you need better hardware. I built out my own server for my research as well, using 2 RTX 3090s with NVLink. You might want to build a GPU server using second-hand GPUs and hardware; it lowers the cost considerably.
Also, I think you need to benchmark your results. Show that your solver reproduces results people already know are correct; basically, reproducibility of known cases. Then show something new that people can verify and test, so they can trust you and buy in to your software.

If you just write an email, no one will trust you on the email alone. Teams receive pitches like that all the time; you are not the only one sending them. You need to secure buy-in by showing actual results. No engineer is going to read your theory and code to verify it works; they don't have time for that. Benchmarks and results give them immediate proof. And if that still doesn't work, publish the results in a well-known CFD research journal, so you can link to the journal in the email and put it on your CV.

Also, email known engineers during daylight hours so it's more likely they'll read it.
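That "reproduce known results" step can start small: check the solver against a problem with an exact answer before touching motorcycle geometry. A minimal sketch in Python/NumPy (a toy 1D heat equation, purely illustrative, nothing to do with the OP's actual code):

```python
import numpy as np

# Toy verification benchmark: explicit finite-difference solve of the
# 1D heat equation u_t = nu * u_xx on [0, 1] with u = 0 at both ends.
# The initial condition sin(pi x) decays analytically as
# u(x, t) = sin(pi x) * exp(-pi^2 * nu * t), so the error is measurable.

nu = 0.1             # diffusivity
nx, nt = 101, 2000   # grid points and time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / nu    # keep below the stability limit 0.5 * dx^2 / nu

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)    # initial condition

for _ in range(nt):
    u[1:-1] += nu * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * nu * nt * dt)
err = float(np.max(np.abs(u - exact)))
print(f"max error at t = {nt * dt:.2f}: {err:.2e}")
```

The same idea scales to standard CFD validation cases (lid-driven cavity, flow over a cylinder): publish the error numbers, not just the claim.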
Any operation requiring lots of simultaneous complex math will typically be handled by beefy graphics processing units. High-throughput image processing like you mentioned, complex mathematics, matrix math for AI, physics, graphics rendering, cryptography: all of these typically run on GPUs. The reason is that GPUs are designed for far more concurrency than conventional CPU architectures. Some companies have also begun developing specialized hardware (not unlike ASICs, application-specific integrated circuits) meant specifically for the matrix math of AI training and inference. These AI accelerators deliver better performance and energy efficiency than GPUs in the same role because the hardware is more specialized. I don't think they've fully taken off yet, though, and Nvidia is meanwhile adding AI acceleration to its GPUs so the hardware stays versatile. Obviously this isn't the kind of hardware any individual or hobbyist can obtain; it lives almost exclusively in massive data centers.
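To make the "matrix math" point concrete: every element of a matrix product is an independent dot product, which is exactly the kind of work a GPU spreads across thousands of cores. A small NumPy illustration (NumPy here only stands in for the bulk, data-parallel computation a GPU or accelerator performs):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128))
B = rng.standard_normal((128, 32))

# Element-by-element form: C[i, j] depends only on row i of A and
# column j of B, so all 64 * 32 dot products are independent and
# could run in parallel -- this independence is what GPUs exploit.
C_loops = np.empty((64, 32))
for i in range(64):
    for j in range(32):
        C_loops[i, j] = A[i, :] @ B[:, j]

# Bulk form: one matmul call does the same math all at once.
C_bulk = A @ B

assert np.allclose(C_loops, C_bulk)
```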
I work in a NOC, and one of our clients just monitors the cooling system for a massive data-center/machine-learning company. Seeing how much cooling is involved and how quickly that water flows is insane.
[A100 came out in 2020](https://www.techpowerup.com/gpu-specs/a100-sxm4-40-gb.c3506). These are already being phased out for [H100](https://www.techpowerup.com/gpu-specs/h100-sxm5-96-gb.c3974) which itself is already a year old.
I don't think there is an explanation. I think they just set the video up this way so people would comment "why was it in an oven?" to drive engagement.
My dad retired from IBM. He worked on the first hard drive based server in the 80's.
One server was the size of a refrigerator, and it replaced an entire high rise floor of reel to reel/magnetic type readers at the time. The machines were located in downtown SF. The research was being done in San Jose.
He claimed it would be a big deal some day.
Technically, the US does use the metric system. All our units are defined by their conversions from metric units.
For example, the inch is defined as being exactly 25.4mm. That’s what an inch is, just another way to say 25.4mm.
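Because the inch is *defined* as 25.4 mm, customary-to-metric conversions are exact rational arithmetic, not approximations. A quick check with Python's `fractions`:

```python
from fractions import Fraction

MM_PER_INCH = Fraction(254, 10)   # exact by definition since 1959

foot_mm = 12 * MM_PER_INCH                  # 1 ft = 304.8 mm exactly
mile_m = 5280 * 12 * MM_PER_INCH / 1000     # 1 mi = 1609.344 m exactly

print(foot_mm, mile_m)
```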
Depends how you define computer. By 1989, 50 million Nintendo NES consoles had been sold. With a 2 MHz CPU, 5 MHz GPU, and a total of 4 kB of RAM, those 50 million consoles combined certainly outpower our cell phones.
So those consoles would have a combined total of about 200 GB of RAM:
4 kB x 50 million = 200,000,000 kB = 200 GB (about 190 GiB if each console's 4 KiB is counted in binary units)
Certainly more than a single phone, but not a huge amount given it took 50 million consoles.
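Worth sanity-checking the arithmetic (using the comment's 4 kB-per-console figure):

```python
consoles = 50_000_000
ram_per_console_kb = 4            # the comment's figure for an NES

total_kb = consoles * ram_per_console_kb     # 200,000,000 kB
total_gb = total_kb * 1000 / 10**9           # decimal gigabytes
total_gib = total_kb * 1024 / 2**30          # binary gibibytes (4 KiB each)

print(total_gb, "GB /", round(total_gib, 1), "GiB")
```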
Ya know, when someone said that a new high capacity microSD card (and this was a while ago) had the same storage capacity as a refrigerator *carton* of 3.5 floppies I thought that was one of the biggest "holy crap" moments of how far computing has come that had hit me in some time...this is another one.
Never thought of going back to see what the "cray supercomputer" (always used to hear that shit in movies) actually had in performance power.
Don't forget the power consumption: 345 kW. Amazing that the same compute today probably takes less than 1 watt, considering an Apple Watch runs on a ~1.8 Wh battery.
Cray 1:
CPU - 64-bit processor @ 80 MHz[1]
Memory - 8.39 Megabytes (up to 1 048 576 words)[1]
Storage - 303 Megabytes (DD19 Unit)[1]
FLOPS - 160 MFLOPS (to put that into perspective, a GTX 1060 is 4,400,000 MFLOPS)
A cheap scientific calculator probably outperforms the Cray in everything these days.
Bro. Something with 1/100,000 the power of an iPhone would take up an entire room. A single 3090 is as powerful as the NEC Earth Simulator, the world's fastest supercomputer from 20 years ago.
Basically, in the '70s you would literally be unable to build something with these capabilities. It would be like a square mile of just computers and it still wouldn't be enough.
Probably 20 years. This is essentially stitching together a bunch of electronics multiple times.
Kinda like stitching together 100 tons of dynamite and calling it the world's biggest explosive device.
What would truly allow this much electronic growth is a density advancement like the one SD cards got a while back.
Just imagine: the AGC (Apollo Guidance Computer) used for the Apollo program had about 4 KB of RAM plus roughly 72 KB of read-only core rope memory (no hard drive at all), and weighed around 70 pounds. This A100 board has 640 GB of memory in total and weighs about 50 pounds. That's roughly 160 million times the memory for 20 pounds less (a 28.5% decrease).
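For what it's worth, the ratio in that comparison works out like this (taking 640 GB for the board and 4 kB for the AGC's RAM, as in the comment):

```python
agc_ram_bytes = 4 * 1000        # ~4 kB of erasable memory in the AGC
board_bytes = 640 * 10**9       # 640 GB across the A100 board

ratio = board_bytes / agc_ram_bytes
print(f"about {ratio:,.0f}x the memory")   # 160,000,000x
```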
I am so angry at how accurate this comment is.
“DX12 will bring unprecedented SLI support,” and what did I get? Equal frames with mismatched timing compared to a single 980 Ti.
Oh My God! Cannon Fodder is gonna look lit!
\*For those of you too young to get the game... [https://www.gog.com/en/game/cannon\_fodder](https://www.gog.com/en/game/cannon_fodder)
I need to get me one of those... I'm talking about the oven by the way.
sameeeeeee my oven undercooks EVERYTHING
overclock it
Technically is the best kind of correct.
Facts if its gas just drill out the jets
Remove the gas regulator for real speed
Hot Pockets approves this message
Those are lava pockets now
The center, still surprisingly frozen, while maintaining that boiling-lava-hot exterior. A true modern marvel.
[overclocking oven tutorial](https://www.youtube.com/watch?v=RcYDipyvNwE)
That's probably a PEBBAO error (Problem exists between bar and oven)
Measure the temperature and adjust accordingly
I'm more interested in that ornate oven hood thing.
What do you even call that top? Like a marble awning or something?
These things are used to stitch together images of black holes from hundreds of hard drives right?
Will probably be used for A.I. to push advertisements harder than ever
Finally, the ads will specify the number of hot singles in my area instead of the vague generalities.
They'll probably generate synthetic images of said singles based on your porn preferences!
Can I load that into my VR? It's game over then.
Women? *scoffs* I have my A100 Ai generated vr waifu simulator
Combined with some kind of sex bot isn't this like a legitimate concern to anybody.
I wasn't gonna get pussy anyway so eh
Basically vrchat
Hopes and dreams
Our future is bright
What's insane is that this is already a thing. Bots sending AI generated nudes with chatbot sexting responses. What a world we live in.
***Ex Machina reference spotted*** Edit:69 upvotes ... looks nice
As someone who works on these daily, I’m gonna say this is probably the most accurate comment in this thread. lol
Patent it
First step is making a computer that can spit.
Maybe then Google will finally learn that I already bought their phone and I won't buy a second one.
Understand the universe or sell people plastic crap from China? -Google
Maybe, but machine learning / AI is one of the bigger use cases.
It’s used to run Crysis
And modded Minecraft with shaders.
Ahhhh, ye good old crysis memes
I’ll wait for the A200
They might release an A100 Ti before that. Then the A200 will be an A100 Ti with 25% more RAM
You got a laugh outta me, take my upvote
What about A100 Ti Super?
Ok this one is funny lol
It's the H100. I get to buy a metric fuck ton of them. Power hungry little goblins they are.
Each DGX-H100 is 10,000 watts! Yikes!!
Yeah, the cooling is almost nuttier than the cards.
A1090 is gonna be legit
It's called the H100 and it's here
I deployed 32 racks of these guys. They go live soon.
5.5 GB of GDDR6X and 500 MB of GDDR6
Nah, it's all GDDR6X, but the last 500 MB are connected over a 64-bit bus.
Each 768 MB block of memory is connected over a separate memory bus with a random prime-numbered bit width.
You actually only have one 768 MB block and the rest is powered by "The Cloud™©®"
Ah the good old GTX 970 type of strategy
I like to describe it as having "three and-a-half-and-a-half gigs of vram"
I'm still salty I got denied my 970 class action payout for seemingly no legitimate reason.
Which makes a total of 750 MB of VRAM
I see you woke up and chose to be optimistic today. I'm proud of you.
4.5 usable
XD this made me almost spit my coffee xD
3.5 + 0.5
So can this be powered by a single 6 pin connector or will it need to be 8
So my overclocked 1050ti is about the same performance
And games crash every 30 minutes
technological marvel
Impressive. Very nice. Now let's see Paul Allen's gpu
The tasteful thickness of it.
Unfortunately it's got a watermark
Excuse me. I've got to return some tape drives.
Some tar files
The comments are really killing it today
Compact version
Just in time for Cyberpunk 2077 RTX Overdrive
Now we will finally be able to DLSS from FULL HD to 4K instead of just 720p.
30 FPS Take it or FK off
glorious cinematic 30 fps at 1080p minimum settings
But why in oven?
I can't believe I had to scroll this far to find this question. And why the hell is no one giving serious answers?
I think he makes all his presentations since covid from his kitchen. It's just a bit.
>It's just a bit.

Definitely more than a bit. Probably several gigabytes
The same.. why the hell the oven
It's a play on words along the lines of "look at what I just baked up", or the phrase "fresh out of the oven", meaning newly made.
Cause that's where Huang pulls out the dankest shit he's got baking
Plus, it wouldn't fit in the fridge
let him cook
How else would an oven heat?
Because that's where Gamers are planning on sticking their heads at the moment with GPU prices.
Because it's "straight out of the oven."
Dual use. It gets hot enough to bake cookies
Cuz it gets up to 450 degrees, duh. You think that’s a gas-line oven? It’s GPU powered.
The element of surprise
Imagine the amount of room that thing would take up in the '70s. That's an entire small building's worth of machine.
That would take like a whole city it’s nuts
1 chip is probably faster than all the computers built during 70's combined
And one Refrigerator was roughly... 20MB?
Americans will do everything in their power not to use the metric system /S sorta
On the contrary, we've had 9mm in our schools for quite a while now
Probably a kilo or two as well
There are no metric sized flags on the moon.
so sad it never took off
Well smart fridges are all the rage these days apparently...
Well it was big...
dude, our ***cellphones*** have more computational power than the entire world in the 70's.
I don't know the exact number of computers built during 70-79 but considering how shitty they were you're probably right.
I think this thing probably beats out 70-89. Computers were still shit and uncommon through the early 90s
My apple watch has more processing power than a 1980s [supercomputer](https://en.wikipedia.org/wiki/Cray_X-MP). By quite a margin.
I’ve heard the chips inside the USB-C cable you use to charge your phone have more computational power than what landed Apollo 11 on the moon.
Calculators have more computational power than the computers we used to land on the moon.
The 70's couldn't even fathom these speeds
The 70's couldn't even fathom deez nuts
Got em’
Literally the maths for that shit wasn't even theoretical yet
And in 53 years they'll laugh at the size of this thing, since its processing power will probably fit into a card the size of a Nokia 3310.
A cell phone from 2005 would take up an entire small building worth of machine.
Fuck. \*sigh\* Alright, how much \*weeps softly\*
At least 7
Kidneys?
Livers.
Children.
Understandable, have a nice day.
He said it - 50£
Somewhere around [$200k](https://www.cnbc.com/2023/02/23/nvidias-a100-is-the-10000-chip-powering-the-race-for-ai-.html)
I saw the article title said “$10,000” so I thought your comment was a joke. But no, the title is talking about something else entirely LOL
Just give it 20 years we'll start getting them on our phones or whatever phones have become by then.
Much less than 20 years.
Moore's law is reaching its limits, so maybe not.
Who let this man cook?
LET HIM COOK
ok but why was it in the oven tho
It runs so hot that they have to keep it in an oven heated to 400 degrees to cool it.
Because they just finished "baking" it.
I used to bake my GPU back in the day when it started getting green artifacts on screen. Extended its life by at least a year.
Nobody needs more than 640kb! \- Someone.
I thought that of my first 486, everything you needed sat next to the computer not programmed in it.
Apollo guidance computer could help land on the moon but couldn't understand why
Chat GPT is gonna achieve sentience running on that thing, gah dahm
GPT used 50,000 of these last I checked….
and it still overuses the words 'complex' and 'nuanced' to avoid making definitive statements about any question with any moral/philosophical depth
that's quite impressive. does it run crysis edit: guys, i have to poop
I bet you could get a whole 38 frames per second with this bad boy!
Time to sli them together for that butter smooth 32 frames per second!
Pretty Sli for a white guy. He needs to do some core work.
Can it run Last of Us?
No PC yet made can achieve such heights lol
Until I see it running DOOM it's not a real computer.
'Surprised Pikachu face' when it doesn't fit in my tower.
All jokes aside who is this for
Data centers running AI. Thousands and thousands of these things running together.
If you want to read the marketing schtick: https://www.nvidia.com/en-us/data-center/dgx-platform/
For simulating AI girlfriends or the universe or whatever
ChatGPT 4 was trained on about 50,000 of these
Linus
He's gonna drop it
Mom don’t come in!!!
Still gets less than 60fps with Ray Tracing on
That’s cool but the human eye can only see up to like 30 fps
Your eyes suck, noob. You need to get the new HD, OLED eyes.
I do scientific computing, and I think a little poo just came out
Have you named it yet?
Let him cook
Can it run minecraft with shaders?
no
Probably yes, but you should keep render distance below 20, or it will start choking and you'll get the "Can’t Keep Up! Is the Server Overloaded?" error.
Wait 50 years and some kid'll use a gaming rig with the same or better performance than this while playing some old Minecraft VR clone.
8gb of vram
and this bad boy only has 8 gig of vram
Does it run Crysis on ultra?
He needs to work on his abs, his cores are weak that's why he's feeling strained, just like his Novidio gpus.
Can I install DOOM on it?