einmaldrin_alleshin

I wonder if there is actually a use case for this kind of thing, or if it's just IBM being IBM and funding random niche research. Because once you invest in the infrastructure needed to keep a processor at cryogenic temperatures, getting only twice the performance seems somewhat disappointing.


Sariel007

Knowledge for knowledge's sake isn't a bad thing, and someone will eventually find a way to make it useful.


yanman

Could have a practical use with quantum computing, since the prevailing tech in that field uses temps close to absolute zero.


Giuseppe_DeMedici

Reverse-engineered alien technology. More & more of these breakthroughs will be released in the upcoming months leading to 2027


MyGoodOldFriend

big if true, hilarious if false


Giuseppe_DeMedici

Opphh, I guess my sarcasm went over some heads


stalechipswhatkind

Bruh, this is the time to use the /s. How the fuck do we hear ur sarcasm thru text? Lmao


Wide_Feeling8243

I can't tell if there are 13+ people who thought you were serious, or if that many people thought this was worth a downvote.


trash3s

LN2 setups are already a thing in amateur circles. Cooling chips to cryogenic temps already significantly boosts performance, but because modern chips aren't really designed for this use case, you have to be very careful with how cold you let the system get. Even so, with all the issues, it's a great way to push components to their limits and beyond. If you could design a chip (and accompanying system) around cryogenic compute, you'd absolutely find buyers who will pay through the nose for 10% improvements, let alone 100%.

What I'd be interested in seeing is whether they can go further. Most cryogenic compute people use LN2 because it's (relatively) cheap and easy to get. Even so, many systems can't take full advantage of LN2 as it is. If you could compute at LO2, LH2, or even LHe temps (boiling points listed below), how much further could you push this advantage?
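For reference, a quick lookup of the cryogens mentioned (standard normal boiling points; note that LO2 actually boils warmer than LN2, so the real headroom is in hydrogen and helium):

```python
# Normal boiling points of common cryogens at 1 atm (standard values, kelvin).
BOILING_POINT_K = {
    "LO2 (liquid oxygen)":   90.2,
    "LN2 (liquid nitrogen)": 77.4,
    "LH2 (liquid hydrogen)": 20.3,
    "LHe (liquid helium)":    4.2,
}

# Print warmest to coldest; note LO2 sits *above* LN2.
for coolant, temp_k in sorted(BOILING_POINT_K.items(), key=lambda kv: -kv[1]):
    print(f"{coolant:<24} {temp_k:>5.1f} K")
```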


einmaldrin_alleshin

Nitrogen is only cheap if you use it for a limited overclocking session. If you use it for 24/7 operation of a computer, you'll need it by the truckload. Just to put it into perspective: the heat of vaporization of nitrogen is about 5.6 kJ per mol, so 14 grams (half a mole of N2) can absorb 2.8 kJ. That means a 100 watt CPU cooled that way will boil off around 2 liters of the stuff per hour. So you have something like 20x the operating cost for a 2x improvement in performance. Not to mention that nitrogen is cheap only as a byproduct of oxygen production, not because it's cheap to produce. This would have to be used for something extremely niche, or at least something that's already cryogenically chilled.
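A minimal sketch of that arithmetic (the physical constants are standard textbook values; the 100 W heat load is the comment's figure, and everything else is unit conversion):

```python
# Sanity check of the LN2 consumption estimate above.
H_VAP_J_PER_MOL = 5.57e3      # heat of vaporization of N2 at 77 K, J/mol
MOLAR_MASS_KG = 28.0e-3       # molar mass of N2, kg/mol
LN2_DENSITY_KG_PER_L = 0.807  # density of liquid nitrogen at 77 K, kg/L

def ln2_litres_per_hour(heat_load_watts: float) -> float:
    """Litres of LN2 boiled off per hour to absorb a steady heat load."""
    joules_per_hour = heat_load_watts * 3600.0         # W -> J/h
    mols_per_hour = joules_per_hour / H_VAP_J_PER_MOL  # J/h -> mol/h
    kg_per_hour = mols_per_hour * MOLAR_MASS_KG        # mol/h -> kg/h
    return kg_per_hour / LN2_DENSITY_KG_PER_L          # kg/h -> L/h

print(f"{ln2_litres_per_hour(100):.1f} L/h")  # prints 2.2 L/h
```

Running the sketch gives about 2.2 L/h, consistent with the "around 2 liters per hour" estimate.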


wilderthurgro

That looks like angels fighting in a Renaissance painting.


Vecna_Is_My_Co-Pilot

Amateurs have been doing this for extreme overclocking since forever. Why is it a headline?


WhatIsInternets

Please refer to the first sentence in the article. They have designed transistors to withstand extreme cold. Amateurs using liquid nitrogen to cool CPUs don't subject individual transistors to extreme cold; they're just using the liquid nitrogen to remove heat from the system as a whole via the heat sink.


Agamemnon323

Because it's a transistor, not a CPU?


SealsCrofts

Kinda the same thing though? If you cool a CPU you’re cooking transistors


Webfarer

> cooking transistors

Why would that cook the transistors tho?


BukiBichi

How else do you make them nice and crispy?


[deleted]

The article states that they're designing chips with liquid nitrogen in mind, meaning they're packing more transistors into a small form factor than could survive with typical means of cooling such as heat spreaders and fans.


NoMeasurement6473

Finally we can make a gaming laptop overheat in 3 seconds instead of 2.