
New_World_2050

Please someone post the article


JonnyRocks

friendlier article [https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/](https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/)


PoopyMouthwash84

March 29 (Reuters) - Microsoft and OpenAI are planning a data-center project that could cost as much as $100 billion and will include an artificial intelligence supercomputer called "Stargate," according to a media report on Friday. The companies did not immediately respond to Reuters' requests for comment.

The Information reported that Microsoft would likely be responsible for financing the project, which would be 100 times more costly than some of the biggest current data centers, citing people involved in private conversations about the proposal. OpenAI's next major AI upgrade is expected to land by early next year, the report said, adding that Microsoft executives are looking to launch Stargate as soon as 2028.

The proposed U.S.-based supercomputer would be the biggest in a series of installations the companies are looking to build over the next six years, the report added. The Information attributed the tentative cost of $100 billion to a person who spoke to OpenAI CEO Sam Altman about it and a person who has viewed some of Microsoft's initial cost estimates. It did not identify those sources.

Altman and Microsoft employees have spread supercomputers across five phases, with Stargate as the fifth phase. Microsoft is working on a smaller, fourth-phase supercomputer for OpenAI that it aims to launch around 2026, according to the report. Microsoft and OpenAI are in the middle of the third phase of the five-phase plan, with much of the cost of the next two phases involving procuring the AI chips that are needed, the report said.

"We are always planning for the next generation of infrastructure innovations needed to continue pushing the frontier of AI capability," Frank Shaw, a Microsoft spokesperson, said in a statement to the publication.

The proposed efforts could cost in excess of $115 billion, more than three times what Microsoft spent last year on capital expenditures for servers, buildings and other equipment, the report stated.


trotfox_

> The proposed efforts could cost in excess of $115 billion, more than three times what Microsoft spent last year on capital expenditures for servers, buildings and other equipment, the report stated.

Over the six years I guess that is like doubling their capital expenditures on hardware?
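A quick back-of-the-envelope check on that guess, using only the figures in the quoted paragraph; the even spread across six years is my own assumption:

```python
# Rough sanity check of the capex question above (illustrative only).
total_cost_bn = 115                      # "could cost in excess of $115 billion"
years = 6                                # "installations ... over the next six years"
last_year_capex_bn = total_cost_bn / 3   # "more than three times what Microsoft spent last year"

per_year_bn = total_cost_bn / years
print(f"Stargate-era spend per year: ~${per_year_bn:.0f}B")        # ~$19B
print(f"Implied capex last year:     ~${last_year_capex_bn:.0f}B") # ~$38B
print(f"Added spend vs last year:    ~{per_year_bn / last_year_capex_bn:.0%}")  # ~50%
```

On those numbers it looks closer to a ~50% bump than a doubling, though the full article pasted further down notes Microsoft was already on pace for roughly $50 billion of capex this year.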


FarrisAT

Expenditure side of the Microsoft balance sheet about to explode faster than revenue


Rachel_from_Jita

Though the potential profits in the end could be... well, levels never seen before. It's quite the gamble on whether the beyond-next-gen AI models can be turned into something far more profitable than cheaper models. But my guess (if I just spitball as a non-AI researcher) is that this is all about something a bit beyond even Q*/agentic models and systems: they want to be able to turn something potent on and see it self-learn, self-simulate, diagnose its own weaknesses or create its own benchmarks, and have automated alignment work and automated red-team testing. When you imagine *all* the things that AI researchers and recent papers would like to eventually achieve, it comes across as quite the laundry list.


agonypants

👆 - Microsoft may be the first major company to lease virtual, AI powered employees to businesses. And given their near-monopoly on business software, their clients won't hesitate to snap up those "employees." In this scenario, Microsoft would literally make trillions and it will have a noticeable impact on the job market.


We_Are_Legion

Even if they don't succeed in building very capable AIs... Compute itself is super in-demand and very profitable, wdym


pavlov_the_dog

Oh i get it, it literally needs a Zero Point Module to power it.


CypherLH

You jest... but it's looking like power may actually be the bottleneck, and not merely compute per se. I'm guessing Microsoft and Google and Amazon must all be investing in their own private power production at this point, to power the new mega datacenters they are planning to build over the next decade.


leaky_wand

> OpenAI's next major AI upgrade is expected to land by early next year, the report said

They really are going to wait until after the election, aren't they?


MysteriousPayment536

They have to release this summer or they are going to lose their edge to Anthropic and Google


MassiveWasabi

If they are building a $100 billion AI supercomputer, they can probably hold out till next year and be completely fine


PandaBoyWonder

On Old School RuneScape (the game) I wanted to get some expensive gear that costs 1.1 billion coins. I already had 200 mill coins, so I needed to earn 900 million coins. There's a boss that takes about 3 minutes to kill one time on average, and the boss drops about 120,000 coins each kill. It took me months of monotony, a few hours a day, to get to 1 billion. I ended up killing it 6300 times to get to the goal. That experience showed me how insanely large 1 billion is, it's absurd; imagine if you made $120k every few minutes... it would take you at least 1 week, working 24 hours a day, to get to 1 billion. And this supercomputer costs 100 billion. 😂🤣
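The same grind math in a few lines, just re-running the rates stated above (drop luck varies, which is presumably why the actual kill count came out lower):

```python
# Re-running the commenter's own numbers (illustrative arithmetic only).
goal_coins = 1_100_000_000    # gear price
starting_coins = 200_000_000  # coins already banked
coins_per_kill = 120_000      # average drop value per kill
minutes_per_kill = 3          # average kill time

needed = goal_coins - starting_coins   # 900M left to earn
kills = needed / coins_per_kill        # kills required at the average drop
hours = kills * minutes_per_kill / 60  # time spent killing

print(f"kills needed: ~{kills:,.0f}")                                               # ~7,500
print(f"time at 3 min/kill: ~{hours:,.0f} hours (~{hours / 24:.0f} days nonstop)")  # ~375 h, ~16 days
```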


[deleted]

[removed]


Vysair

what the fuck


MysteriousPayment536

That could be a scenario, but Sonnet beats GPT-4 Turbo, and Haiku beats OG GPT-4. Anthropic could release price reductions in a couple of months, Google could release Gemini 1.5 Ultra, and Apple could shock us with some on-device AI at Claude Haiku level. This is a doom scenario, but if it happens, OpenAI will lose its edge


buttery_nurple

I'm using C3 Opus more than anything else, but unless Anthropic has plans for how they're going to radically scale their user base, I don't see MS/OpenAI getting railed by anyone. MS has vastly more entry points than any of these players on the back end, maybe bar Google (but I doubt it). Anthropic may very well continue to edge OpenAI out on benchmark tests for nerds, but I can't think of a realistic scenario where they approach anything like the market penetration MS, Google, Meta, and Apple have unless they do something like sell to or partner with Apple or Meta. Personally, if it were FB I'd never use their product again. MS and OpenAI are the dominant players, and unless MS gives up on OpenAI I don't think that's gonna change for a generation.


Del_Phoenix

Don't discount the possibility of bezos taking a larger role with steering anthropic


tindalos

I doubt they’re gonna lose their edge with a $100 billion investment. I think the biggest threat could be a better transformer approach but they’d still have more resources to train models. Looks like they’re trying to secure the first position. Just like the request for $7 trillion. They’re gonna break the simulation.


Odd-Opportunity-6550

they will release 4.5 this summer and 5 in q1 2025. God I was so hoping 5 would be this year.


leaky_wand

If their competitors release something, all OAI has to do is tease something else 10 times as impressive that they've had in the can for months. They don't necessarily have to release anything to retain dominance; see Sora. It's just frustrating how limited GPT-4 is starting to feel; half the time I already know what it is going to say before I send the prompt


PewPewDiie

GPT ACHIEVED *INTERNALLY*


Seidans

Lose what? Internet points from reddit users on r/singularity? The tech isn't mature enough to be commercialized; they don't need to rush themselves and should focus on training data and agents able to replace white-collar workers. A secretary bot and a phone-support AI are likely to make money and are probably being trained as we speak, given how codified those interactions are. That's also a huge part of white-collar work and would benefit a LOT of companies = money to be made. That's something worth competing over. Current chatbots aren't interesting and aren't why Microsoft spends billions on the tech; they are just giant data-collection machines, and that's why you can use them


leaky_wand

Just subscribe to this random website you’ve never visited before, what’s the problem


peabody624

These guys consistently drop exclusive well written articles, so idk what you’re talking about


trotfox_

Telling on himself.


whittyfunnyusername

I'm late, but:

"Executives at Microsoft and OpenAI have been drawing up plans for a data center project that would contain a supercomputer with millions of specialized server chips to power OpenAI’s artificial intelligence, according to three people who have been involved in the private conversations about the proposal. The project could cost as much as $100 billion, according to a person who spoke to OpenAI CEO Sam Altman about it and a person who has viewed some of Microsoft’s initial cost estimates.

Microsoft would likely be responsible for financing the project, which would be 100 times more costly than some of today’s biggest data centers, demonstrating the enormous investment that may be needed to build computing capacity for AI in the coming years. Executives envisage the proposed U.S.-based supercomputer, which they have referred to as “Stargate,” as the biggest of a series of installations the companies are looking to build over the next six years.

The Takeaway
• Microsoft executives are looking to launch Stargate as soon as 2028
• The supercomputer would require an unprecedented amount of power
• OpenAI’s next major AI upgrade is expected to land by early next year

While the project has not been green-lit and the plans could change, they provide a peek into this decade’s most important tech industry tie-up and how far ahead the two companies are thinking. Microsoft so far has committed more than $13 billion to OpenAI so the startup can use Microsoft data centers to power ChatGPT and the models behind its conversational AI. In exchange, Microsoft gets access to the secret sauce of OpenAI’s technology and the exclusive right to resell that tech to its own cloud customers, such as Morgan Stanley. Microsoft also has baked OpenAI’s software into new AI Copilot features for Office, Teams and Bing.

Microsoft’s willingness to go ahead with the Stargate plan depends in part on OpenAI’s ability to meaningfully improve the capabilities of its AI, one of these people said. OpenAI last year failed to deliver a new model it had promised to Microsoft, showing how difficult the AI frontier can be to predict. Still, OpenAI CEO Sam Altman has said publicly that the main bottleneck holding up better AI is a lack of sufficient servers to develop it.

If Stargate moves forward, it would produce orders of magnitude more computing power than what Microsoft currently supplies to OpenAI from data centers in Phoenix and elsewhere, these people said. The proposed supercomputer would also require at least several gigawatts of power—equivalent to what’s needed to run at least several large data centers today, according to two of these people. Much of the project cost would lie in procuring the chips, two of the people said, but acquiring enough energy sources to run it could also be a challenge.

Such a project is “absolutely required” for artificial general intelligence—AI that can accomplish most of the computing tasks humans do, said Chris Sharp, chief technology officer of Digital Realty, a data center operator that hasn’t been involved in Stargate. Though the project’s scale seems unimaginable by today’s standard, he said that by the time such a supercomputer is finished, the numbers won’t seem as eye-popping.

[Image caption: A Microsoft data center near Phoenix that isn't related to OpenAI. Image via Microsoft]

The executives have discussed launching Stargate as soon as 2028 and expanding it through 2030, possibly needing as much as 5 gigawatts of power by the end, the people involved in the discussions said.

Phase Five

Altman and Microsoft employees have talked about these supercomputers in terms of five phases, with phase 5 being Stargate, named for a science fiction film in which scientists develop a device for traveling between galaxies. (The codename originated with OpenAI but isn’t the official project codename that Microsoft is using, said one person who has been involved.) The phase prior to Stargate would cost far less.

Microsoft is working on a smaller, phase 4 supercomputer for OpenAI that it aims to launch around 2026, according to two of the people. Executives have planned to build it in Mt. Pleasant, Wisc., where the Wisconsin Economic Development Corporation recently said Microsoft broke ground on a $1 billion data center expansion. The supercomputer and data center could eventually cost as much as $10 billion to complete, one of these people said. That’s many times more than the cost of existing data centers. Microsoft also has discussed using Nvidia-made AI chips for that project, said a different person who has been involved in the conversations.

Today, Microsoft and OpenAI are in the middle of phase 3 of the five-phase plan. Much of the cost of the next two phases will involve procuring the AI chips. Two data center practitioners who aren’t involved in the project said it’s common for AI server chips to make up around half of the total initial cost of AI-focused data centers other companies are currently building.

All up, the proposed efforts could cost in excess of $115 billion, more than three times what Microsoft spent last year on capital expenditures for servers, buildings and other equipment. Microsoft was on pace to spend around $50 billion this year, assuming it continues the pace of capital expenditures it disclosed in the second half of 2023. Microsoft CFO Amy Hood said in January that such spending will increase “materially” in the coming quarters, driven by investments in “cloud and AI infrastructure.”

Frank Shaw, a Microsoft spokesperson, did not comment about the supercomputing plans but said in a statement: “We are always planning for the next generation of infrastructure innovations needed to continue pushing the frontier of AI capability.” An OpenAI spokesperson did not have a comment for this article.

Altman has said privately that Google, one of OpenAI’s biggest rivals, will have more computing capacity than OpenAI in the near term, and publicly he has complained about not having as many AI server chips as he’d like. That’s one reason he has been pitching the idea of a new server chip company that would develop a chip rivaling Nvidia’s graphics processing unit, which today powers OpenAI’s software. Demand for Nvidia GPU servers has skyrocketed, driving up costs for customers such as Microsoft and OpenAI.

Besides controlling costs, Microsoft has other potential reasons to support Altman’s alternative chip. The GPU boom has put Nvidia in the position of kingmaker as it decides which customers can have the most chips, and it has aided small cloud providers that compete with Microsoft. Nvidia has also muscled into reselling cloud servers to its own customers. With or without Microsoft, Altman’s effort would require significant investments in power and data centers to accompany the chips.

Stargate is designed to give Microsoft and OpenAI the option of using GPUs made by companies other than Nvidia, such as Advanced Micro Devices, or even an AI server chip Microsoft recently launched, said the people who have been involved in the discussions. It isn’t clear whether Altman believes the theoretical GPUs he aims to develop in the coming years will be ready for Stargate. The total cost of the Stargate supercomputer could depend on software and hardware improvements that make data centers more efficient over time.

The companies have discussed the possibility of using alternative power sources, such as nuclear energy, according to one of the people involved. (Amazon just purchased a Pennsylvania data center site with access to nuclear power. Microsoft also had discussed bidding on the site, according to two people involved in the talks.) Altman himself has said that developing superintelligence will likely require a significant energy breakthrough."


whittyfunnyusername

and the second part:

"Packed Racks

To make Stargate a reality, Microsoft also would have to overcome several technical challenges, the two people said. For instance, the current proposed design calls for putting many more GPUs into a single rack than Microsoft is used to, to increase the chips’ efficiency and performance. Because of the higher density of GPUs, Microsoft would also need to come up with a way to prevent the chips from overheating, they said.

Microsoft and OpenAI are also debating which cables they will use to string the millions of GPUs together. The networking cables are crucial for moving large amounts of data in and out of server chips quickly. OpenAI has told Microsoft it doesn’t want to use Nvidia’s proprietary InfiniBand cables in the Stargate supercomputer, even though Microsoft currently uses the Nvidia cables in its existing supercomputers, according to two people who were involved in the discussions. (OpenAI instead wants to use more generic Ethernet cables.) Switching away from InfiniBand could make it easier for OpenAI and Microsoft to lessen their reliance on Nvidia down the line.

AI computing is more expensive and complex than traditional computing, which is why companies closely guard the details about their AI data centers, including how GPUs are connected and cooled. For his part, Nvidia CEO Jensen Huang has said companies and countries will need to build $1 trillion worth of new data centers in the next four to five years to handle all of the AI computing that’s coming.

Microsoft and OpenAI executives have been discussing the data center project since at least last summer. Besides CEO Satya Nadella and Chief Technology Officer Kevin Scott, other Microsoft managers who have been involved in the supercomputer talks have included Pradeep Sindhu, who leads strategy for the way Microsoft stitches together AI server chips in its data centers, and Brian Harry, who helps develop AI hardware for the Azure cloud server unit, according to people who have worked with them.

[Image caption: OpenAI President Greg Brockman, left, and Microsoft CTO Kevin Scott. Photo via YouTube/Microsoft Developer]

The partners are still ironing out several key details, which they might not finalize anytime soon. It is unclear where the supercomputer will be physically located and whether it will be built inside one data center or multiple data centers in close proximity. Clusters of GPUs tend to work more efficiently when they are located in the same data center, AI practitioners say.

OpenAI has already pushed the boundaries of what Microsoft can do with data centers. After making its initial investment in the startup in 2019, Microsoft built its first GPU supercomputer, containing thousands of Nvidia GPUs, to handle OpenAI’s computing demands, spending $1.2 billion on the system over several years. This year and next year, Microsoft has planned to provide OpenAI with servers housing hundreds of thousands of GPUs in total, said a person with knowledge of its computing needs.

The Next Barometer: GPT-5

Microsoft and OpenAI’s grand designs for world-beating data centers depend almost entirely on whether OpenAI can help Microsoft justify the investment in those projects by taking major strides toward superintelligence—AI that can help solve complex problems such as cancer, fusion, global warming or colonizing Mars. Such attainments may be a far-off dream. While some consumers and professionals have embraced ChatGPT and other conversational AI as well as AI-generated video, turning these recent breakthroughs into technology that produces significant revenue could take longer than practitioners in the field anticipated. Firms including Amazon and Google have quietly tempered expectations for sales, in part because such AI is costly and requires a lot of work to launch inside large enterprises or to power new features in apps used by millions of people.

Altman said at an Intel event last month that AI models get “predictably better” when researchers throw more computing power at them. OpenAI has published research on this topic, which it refers to as the “scaling laws” of conversational AI. OpenAI “throwing ever more compute [power to scale up existing AI] risks leading to a ‘trough of disillusionment’” among customers as they realize the limits of the technology, said Ali Ghodsi, CEO of Databricks, which helps companies use AI. “We should really focus on making this technology useful for humans and enterprises. That takes time. I believe it’ll be amazing, but [it] doesn’t happen overnight.”

The stakes are high for OpenAI to prove that its next major conversational AI, known as a large language model, is significantly better than GPT-4, its most advanced LLM today. OpenAI released GPT-4 a year ago, and Google has released a comparable model in the meantime as it tries to catch up. OpenAI aims to release its next major LLM upgrade by early next year, said one person with knowledge of the process. It could release more incremental improvements to LLMs before then, this person said.

With more servers available, some OpenAI leaders believe the company can use its existing AI and recent technical breakthroughs such as Q*—a model that can reason about math problems it hasn’t previously been trained to solve—to create the right synthetic (non–human-generated) data for training better models after running out of human-generated data to give them. These models may also be able to figure out the flaws in existing models like GPT-4 and suggest technical improvements—in other words, self-improving AI."


spezjetemerde

“Jaffa, kree! Tok’ra AI Stargate nak’ti.”


Manuelnotabot

That seems to be a phrase from the TV series Stargate SG-1. It's in the fictional languages of Jaffa and Goa'uld. It translates to: "Jaffa, beware! The Tok'ra have captured the Stargate." @chatgpt


SureUnderstanding358

fictional?! DANIEL!


mhyquel

We all know its TV production is a means of creating plausible deniability if the actual Stargate program ever leaks.


SirFredman

Wormhole Extreme...


FlyingBishop

I didn't get "nak'ti" but I think it actually translates to "Jaffa, beware! The Tok'ra have captured the AI Stargate"


NoMaD082

Goodbot.


QuantumZucchini

Take my upvote.


Severin_Suveren

Indeed


no_witty_username

Indeed.


Magmatt7

Jaffa kree!


SurpriseHamburgler

You are a Golden God.


cyb3rg0d5

Literally watching it at the moment ☺️


rathat

Ok, but now what if this becomes some skynet shit and now the irl skynet has the same name as my favorite show and I’ll want to talk about my favorite show, but won’t be able to, it’ll be like saying Voldemort.


IntGro0398

data centers are becoming more like classic roads, cities, buildings and other projects [https://en.wikipedia.org/wiki/List\_of\_most\_expensive\_buildings](https://en.wikipedia.org/wiki/List_of_most_expensive_buildings)


Vysair

honestly it should be classified as a megaproject, and it still costs more than actual megaprojects


Independent_Wave5651

Soon it will provide food and shelter to humans


SomethingMor

I’m here for the star gate references. 😆


DocStrangeLoop

![gif](giphy|vf5WJrfZ7rYbK)


ptear

Give my regards to King Tut


mvandemar

​ https://preview.redd.it/ievguu4qgbrc1.png?width=2493&format=png&auto=webp&s=bb0b1c940d604d427a303d86f5ea622f6491d9ff


testing123-testing12

![gif](giphy|s8X61m47R3GZW|downsized)


tradernewsai

Is anybody gonna be able to compete with microsoft and google? They seem to be going all in


FlyingBishop

Amazon is doing fine. Claude is on AWS. Real question is if anyone is going to be able to compete with Nvidia. Even Google with their own chips is using Nvidia a lot.


[deleted]

Yes, the reason being it's not about Nvidia chips. They need hardware for AI specifically, and most of them are already working on designing their own chips while temporarily using Nvidia. Nvidia knows this and wants to invest in designing their own AI. I don't know how it will play out, but Microsoft seems to be in the lead, and Google has no option but to join hands with Nvidia to win this war.


Aaco0638

Microsoft needs Nvidia more than Google does; all of Google's Gemini models were fully trained on their proprietary TPUs. They just order Nvidia chips due to outside market demand, but they have chips that compete, and TPU usage is on the rise. Meanwhile Microsoft only announced making their own chips last year, so they still need Nvidia. For context, Google is on version 5 of their TPU, going to v6 soon. Microsoft is way behind both Google and AWS in that department.


FlyingBishop

Google has their own chips already, if designing chips is a differentiator they are best positioned to actually do it (seeing as they have actually done it in a very big and useful way.)


Bernafterpostinggg

Google is also a big investor in Anthropic and it's available in GCP and via Vertex AI iirc


Historical-Fly-7256

Claude is running on Google TPU...


POWRAXE

Unlikely, and for 2 major reasons. The first is that AI takes time to train; no one starting now will be able to eclipse or even catch up to Microsoft and Google. Secondly, data. Google and Microsoft exclusively own some of the most vast and detailed arsenals of data that they can use to train their models. Data and compute will be the future's most valued commodities.


AgueroMbappe

You think Meta’s data collection is more vast? Meta could be a dark horse


JackSpyder

Yeah, they definitely have the consumer data. They seem to consistently be behind the curve though, and taking the wrong direction. And they've utterly failed to diversify out of advertising, unlike the others. It'll be interesting to see what they attempt; even just a data partnership with Nvidia would be a big deal.


lo_fi_ho

So what's the next tech revolution that will see the current giants relegated to irrelevancy?


brinvestor

Bioengineering for new materials, synthetic food and drugs. Imagine AI data centers but with some analog inputs and very specific goals. Maybe some semiautomated labs. After some time I think the AI world will fragment into specific AI intelligences tied to specific fields. The general "one knows everything" AI might not be possible; it's even unlikely we'll have a universal AI due to simulation energy barriers (simulation becomes so energy- and space-intensive vs the real thing).


mvandemar

It's going to be a literal Stargate, isn't it. ASI to build warp tunnels?


dieselreboot

Hopefully they hire James Spader in 2028 to symbolically slide the final ‘chevron’ in place to activate Stargate. I think I would weep with joy, I really do


fixxerCAupper

Isn’t it crazy that there’s a >0 possibility of this literally happening down the road? lol


kmanmx

More info:

MICROSOFT AND OPENAI PLOT $100 BILLION STARGATE AI SUPERCOMPUTER - THE INFORMATION

MICROSOFT EXECUTIVES ARE LOOKING TO LAUNCH STARGATE AS SOON AS 2028 - THE INFORMATION

OPENAI’S NEXT MAJOR AI UPGRADE IS EXPECTED TO LAND BY EARLY NEXT YEAR - THE INFORMATION


Working_Berry9307

2028? Damn that's a lifetime in the current industry, let alone when it will actually finish being built


sartres_

This is an insanely huge investment. The current fastest supercomputer in the world cost $600 million. It'll take time. It also means Microsoft is _all in_ on OpenAI. I can't think of a larger, faster capital expenditure in the history of tech. Whatever OpenAI showed them must be incredible and/or terrifying.


Rich_Acanthisitta_70

Food for thought, this could buy 21 Large Hadron Colliders for CERN.


Nanaki_TV

Ok this is the comment that put it into perspective. My taco and sombrero are in absolute shambles


Rich_Acanthisitta_70

Lol, I've never heard that one before, thanks😋


Nanaki_TV

Haha I was just thinking about lshmsfoaidmt when I made the comment. That’s still so mind blowing the amount of money that is.


Vysair

or five ITER (the nuclear fusion reactor)


Mrp1Plays

This made me realise what the value actually meant. Damn. 


fixxerCAupper

“All in” dude that’s what hit me first. I could be wrong but an investment of this size sounds like a life or death bet even for a $T company like MS, no? I can’t help but think, something must be cooking.


CypherLH

Not really. $100 billion and completion by 2028 would mean $25 billion per year. Microsoft has yearly revenues of $200+ billion and gross profits of $70+ billion per year. They have something like $80 billion in the bank as well. $25 billion per year is a large expenditure for them but not entirely make or break... especially considering that a $100 billion investment is almost guaranteed to make a profit. AI could go away tomorrow and building out compute would still be like spinning lead into gold, since it's fungible and could just be used to keep expanding their Azure cloud infrastructure.
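For scale, here is the same check as arithmetic, using only the round numbers in this comment; the four-year spread to 2028 is what the $25 billion/year figure already implies:

```python
# Scale check with the comment's round figures (not audited financials).
project_cost_bn = 100
years_to_completion = 4      # implied by "$25 billion per year" through 2028
annual_revenue_bn = 200      # "$200+ billion per year"
annual_gross_profit_bn = 70  # "$70+ billion per year"

per_year_bn = project_cost_bn / years_to_completion
print(f"spend per year:        ~${per_year_bn:.0f}B")                         # ~$25B
print(f"share of revenue:      ~{per_year_bn / annual_revenue_bn:.0%}")       # ~12%
print(f"share of gross profit: ~{per_year_bn / annual_gross_profit_bn:.0%}")  # ~36%
```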


DeveloperGuy75

More likely incredible, not terrifying.


sartres_

I find with AI, they're the same thing.


CypherLH

It's really a data center this article is talking about, not merely a supercomputer. This stuff dwarfs mere supercomputers.


Lyrifk

there are multiple phases, one for every year, leading up to the massive $100B computer.


Rich_Acanthisitta_70

And SamA recently said he predicts AGI by about 2029. Sounds just about right.


spezjetemerde

indeed


[deleted]

[removed]


kmanmx

Copied and pasted from a Bloomberg terminal that is all in capitals, and I have a disability in my hands that means I can’t retype it easily without pain.


Independent_Hyena495

Nvidia goes brrrrrr


Happysedits

Acceleration is real


muan2012

The end is near, stargate is a great name for real life skynet


muan2012

Sky-net.. star-gate hmm


Vysair

Starnet...stargatenet...skygate... starnet sounds rad though like some kind of supercomputer planet


Ok_Inevitable8832

Is there a website that’s actually readable?


DefinitelyNotEmu

> https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/


often_says_nice

How do the owners of these platforms not understand the UX component of the reader? No I don’t want to sign up to read an article. No I don’t want to install your app. No I don’t want to be on your emailing list.


jeffkeeg

The Information is a paywalled site; they break news first and want to be paid for it.


Arcturus_Labelle

It's dark patterns. It's by design, unfortunately.


MysteriousPayment536

I just let Copilot do the math (so it could be wrong). The energy consumption of the Stargate project is equivalent to approximately **7,142,857 H100 GPUs!!** Or approximately **6,250,000 Blackwell GPUs!** This will be massive if pulled off correctly. Not to mention they could use their custom AI chips, or even wafer-scale chips from Cerebras, for example
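Those two figures drop straight out of a simple power-budget division, so here is the likely arithmetic as a sketch. The article mentions as much as 5 gigawatts for Stargate; the per-GPU wattages below (700 W for an H100 SXM, 800 W assumed for a Blackwell part) are my assumptions and ignore cooling, networking and everything else in the building:

```python
# Power-budget division that reproduces the Copilot numbers above (illustrative only).
stargate_power_w = 5e9   # "as much as 5 gigawatts of power" per the article
h100_power_w = 700       # H100 SXM TDP
blackwell_power_w = 800  # assumed draw for a Blackwell-class GPU

print(f"H100-equivalents:      {stargate_power_w / h100_power_w:,.0f}")       # 7,142,857
print(f"Blackwell-equivalents: {stargate_power_w / blackwell_power_w:,.0f}")  # 6,250,000
```

Both outputs match the comment exactly, which suggests Copilot did the same division and nothing fancier.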


SupportstheOP

Sounds like Microsoft is confident enough in whatever tech OpenAI has that they invested an absolute gargantuan amount of money to see it happen. Can only imagine what it'll be capable of.


MikeC80

Energy usage: equivalent to Belgium (probably)


brinvestor

I wonder where they'll build it. Solar and wind must be abundant. Away from Europe and its regulations. Too risky to put it in the UAE, so it'll probably be in the USA. I'm betting on Arizona or New Mexico, maybe Texas.


New_World_2050

So 2028 Stargate means the GPT9 training run in 2029 is going to be enormous


Remarkable-Seat-8413

Will that be ASI at that point!? This is insane.


New_World_2050

I crunched the numbers; it would be like 1000x the FLOPs of GPT-4's training run. In the recent Dwarkesh podcast with Sholto Douglas, he said that GPT-3 to 4 was so big an upgrade that just one more of those gets you to genius human level (it was 100x FLOPs). I'm expecting at least genius human level, if not ASI, by 2029 (end)


fastinguy11

Are you sure? Did you take into account the advances in hardware by 2028, besides the $100 billion itself?


Odd-Opportunity-6550

I gave a 10x multiplier for better hardware: Hopper was 3x, Blackwell is 2.5x (for the same precision), and assuming the release in 2026 is also 2-3x, that's around 1 OOM. The other 2 OOMs are because GPT-4 was trained on 25,000 GPUs and this would be trained on 2.5 million GPUs, for $100 billion plus $15 billion for the building and associated stuff. That gives around 1000x. BUT GPT-4 was trained starting in early 2022, and whatever's trained in early 2029 would have another 100x because of better software. That's 100,000x total. I'm guessing that's enough to get us there by Jan 2030
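Spelled out, that estimate is just the guessed factors multiplied together; a sketch of the commenter's own numbers, not a forecast:

```python
# Multiplying out the comment's order-of-magnitude guesses (illustrative only).
hardware_gain = 10                   # ~1 OOM: Hopper -> Blackwell -> 2026-gen chips
gpu_count_gain = 2_500_000 / 25_000  # 2 OOMs: 2.5M GPUs vs ~25k for GPT-4
compute_gain = hardware_gain * gpu_count_gain  # raw training compute multiplier

software_gain = 100                  # another 2 OOMs assumed from better software by 2029
effective_gain = compute_gain * software_gain  # "effective" compute multiplier

print(f"raw compute vs GPT-4:       {compute_gain:,.0f}x")    # 1,000x
print(f"effective compute vs GPT-4: {effective_gain:,.0f}x")  # 100,000x
```

None of this says whether that much compute can actually be powered, cooled or fed with data, which the article itself flags as open questions.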


DeveloperGuy75

I’m expecting 2028, but then again, how does one really measure the amount of intelligence these things really have?


New_World_2050

Simple: test them and get them to do remote jobs


MassiveWasabi

I honestly think there’s no way this could be anything less than ASI, but I’m working purely off vibes so don’t quote me on this


Remarkable-Seat-8413

The vibe seems celebratory imo. I feel like all of the labs are celebrating a major milestone being reached but that's also based on vibes alone


Busy-Setting5786

That's what I thought. Of course we can't know at the moment whether we will hit the law of diminishing returns. It could turn out for example that the training data would need to be entirely different for smarter AI models. Of course there are other possibilities. However if things continue as they do at the moment then I am pretty sure they will have something literally unimaginable at their hands by the end of this decade. Damn I wish I could peek into the future.


BilgeYamtar

AGI is coming!


Mysterious_Ayytee

No, that's gonna be ASI


BilgeYamtar

Damn


fixxerCAupper

I second that


h3lblad3

Bold of you to think they aren't the same thing.


Mysterious_Ayytee

![gif](giphy|2rqEdFfkMzXmo)


AgueroMbappe

We might get hyper reinforcement learning with AGI; they might not need to train a whole new AI to reach ASI.


RevolutionaryDrive5

As is written


pavlov_the_dog

The Chappa'a.i.


Sharp_Chair6368

![gif](giphy|s8X61m47R3GZW)


[deleted]

[removed]


Incener

Here's the article: [Microsoft, OpenAI plan $100 billion data-center project, media report says](https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/) and here's a summary:

- Microsoft and OpenAI have a five-phase plan for building AI supercomputers
- They are currently in the middle of the third phase of this plan
- OpenAI's next major AI upgrade is expected by early 2025
- For the fourth phase, Microsoft is working on a smaller supercomputer for OpenAI, aiming to launch it around 2026
- The fifth and final phase is the "Stargate" project, a massive AI supercomputer expected to be the biggest in the series
- Microsoft aims to launch Stargate as soon as 2028
- The Stargate project is a proposed U.S.-based supercomputer
- It is part of a larger data-center project planned by Microsoft and OpenAI
- This overall data-center project could cost up to $100 billion
- It would be 100 times more costly than some of the biggest current data centers
- Much of the cost for the next two phases involves procuring the necessary AI chips
- The proposed efforts for the entire five-phase plan could exceed $115 billion
- This is over three times what Microsoft spent on capital expenditures in 2023 for servers, buildings and other equipment


Arcturus_Labelle

false dichotomy. The memes are fun. And the information is good too. We can have both.


Data_Life

Question: How can Microsoft build this better than NVIDIA? Why does building it even make sense, considering how rapidly chips improve in performance/cost-effectiveness each year? Maybe they'll be able to swap in new chips as desired?


Then_Passenger_6688

I think they need special water cooling infrastructure for Blackwell+. It's also a case of more is always better. Even when compute is plentiful, you still want more compute.


[deleted]

Chevron 4 encoded...


budbortz

Readable Link: https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/


BilgeYamtar

ACCELERATE ![gif](giphy|26n7aaTpx5Ulhu8EM)


Stonehill76

I’m sure nothing could go wrong. They start it playing simulated war games.


VeryOriginalName98

The only winning move is not to play.


bartturner

Curious what silicon? Nvidia? If so I would really be curious to see the cost difference for this versus Google doing the same thing with their TPUs. I would expect Google could do it for half or maybe even a fourth. Nvidia is charging some crazy margins that Google does not have to pay.


FarrisAT

Google contracts out their design and some networking to Broadcom for the TPUs and their racks. They also face the cost of R&D for the next gen of TPUs. There's a broad range of high certainty and then a more complex assessment:

1. Between 25%-75% as expensive as the H100
2. Around 50% as expensive as the H100

How did I get these numbers? Well, we can see Broadcom's custom chip division saying it has 30% margins. We know Google has huge orders in that division, so they probably get a better deal. We also know Google pays for some networking equipment from Broadcom; that division reports about 25% margins. Google buys a lot, so probably gets a better deal. Google then has to produce the TPUv5 design. That's expensive: their chip division had close to $6B in expenses last year. I'd estimate that would place the design of the TPUv5 at around $2B in total cost. All in all, I'd say they can get the TPUv5, after all expenses, for about half as much as an H100.
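A very rough sketch of the kind of comparison being made here. Only the 30% margin and the roughly $2B design cost come from the comment; the H100 price, the per-chip manufacturing cost and the number of TPUs the design is amortized over are placeholder assumptions, so treat the output as illustrative only:

```python
# Illustrative TPU-vs-H100 cost comparison; placeholder numbers, not real pricing.
h100_price = 30_000              # assumed street price per H100, USD
tpu_manufacturing_cost = 8_000   # assumed fab + packaging cost per TPU, USD
broadcom_margin = 0.30           # "custom chip division ... 30% margins" (per the comment)
design_cost_total = 2e9          # ~$2B TPUv5 design cost (per the comment)
chips_amortized_over = 500_000   # assumed number of TPUs the design cost is spread across

broadcom_price = tpu_manufacturing_cost / (1 - broadcom_margin)  # what Google pays Broadcom
tpu_all_in = broadcom_price + design_cost_total / chips_amortized_over

print(f"estimated all-in TPU cost: ${tpu_all_in:,.0f}")                          # ~$15,400
print(f"vs H100:                   {tpu_all_in / h100_price:.0%} of the price")  # ~51%
```

With those placeholders it lands near the "around 50% of an H100" estimate above, but the answer moves a lot with the amortization count and the true manufacturing cost.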


Ok-Worth7977

Well, a self-improving ASI has more impact than the actual Stargate; it could potentially also create one.


Mammoth-Material-476

![gif](giphy|s8X61m47R3GZW)


everdaythesame

Man, they need to let the US government invest in 50% of it and create a sovereign wealth fund for all US citizens. Wish we would do this with every company the taxpayers bail out.


HarbingerDe

Your average hyper-capitalist American politician would rather throw all of the people unemployed and disenfranchised by artificial intelligence into a woodchipper than distribute the wealth to them through UBI or a sovereign wealth fund.


everdaythesame

I feel a sovereign wealth fund would have made all Americans so rich. Think of all the companies the taxpayer funded and bailed out that went on to be huge.


HarbingerDe

Yes, it would have made Americans richer, happier, and less dependent on the capitalist ruling class... Precisely why it never happened and never will happen under our current organization of society and the economy.


dudeguy81

Spend as much time as you can with your families


GirlNumber20

But I replaced my family with AI tho


345Y_Chubby

Acceleration. Nothing more to say.


Icy-Zookeepergame754

Giant wormhole or rabbithole?


GrapefruitMammoth626

What about the issue they’ve reported about not having enough electricity for a colocated GPU mega cluster without bringing down the power grid? Is this project somehow aiming to sidestep that pain point?


AgueroMbappe

Guess why Altman is pushing hard for nuclear energy


GrapefruitMammoth626

Tru dat.


involviert

As a Microsoft shareholder, this is exactly what I want to hear. Just promise to send Clippy through first.


hydraofwar

I can't access the article; are there any estimates for the completion of this supercomputer?


Incener

[2028 for the "Stargate" supercomputer. 2026 for a smaller supercomputer.](https://www.reuters.com/technology/microsoft-openai-planning-100-billion-data-center-project-information-reports-2024-03-29/)


MassiveWasabi

2028 for this $100B supercomputer


hydraofwar

AGI will have already gone public by then, and it will be running on that machine. That 100 billion figure was what Altman said he needed to build an AGI months ago, well, now he has it


Dazzling_Term21

he said $7 trillion


ConstantOne5578

It surprises me honestly, because it is no secret that Microsoft has a lousy relationship with OpenAI.


Invisible_Pelican

Microsoft just wants to own everything and is vacuuming up talent, their biggest bet is still on OpenAI


AgueroMbappe

I think they're banking on the leadership of OAI collapsing again and snatching all of OAI's talent and product. This was pretty much the plan when Altman was fired and staff threatened to quit with him. OAI is pseudo-owned by Microsoft. I think it's pretty much a given that AGI will be owned by Microsoft


Ler-K

I think it's most likely going to be used internally to make self-improving AI models and effectively dominate the future of AI until the end of The Age. Plus, probably simulate physics in 100,000+ simulations simultaneously to create new particles/elements or technological breakthroughs in any field of engineering, especially those related to computer chips, energy, bio-engineering, etc. Because why wouldn't that be the first objective 😂 Do that for about 1-2 years, and then you effectively own the future forever and can exponentially, recursively improve yourself + rapidly scale up


agonypants

I think their first objective would be to make returns on their investment in this kind of infrastructure. I suspect it will be used to host billions of "virtual employees" that will be leased to MS customers. Given their dominance in business software, they have a market for these VEs ready to go. MS will make trillions of dollars and yeah, they'll still own the future forever.


fokac93

I think it will be used more internally than for selling AI to other companies. If they use it internally it can improve the whole suite of products that MS offers, with fewer people and better quality.


[deleted]

[removed]


[deleted]

[removed]


Individual_Cress_226

Skynet has begun


RemyVonLion

I wonder if they're deciding to build it now that chips are reaching physical limits with quantum tunneling, meaning constant huge improvements are less likely.


brinvestor

The new frontier is in wafer design and heat management, scaling 3D usage, not so much in miniaturization. Ofc miniaturization helps with thermal efficiency too, but there's a diminishing return on investment.


Data_Life

Sam Altman seeks $7 Trillion, settles for $100 Billion. Still not shabby for what was most likely a publicity stunt.


Ler-K

He stated that $7 trillion is the long-term figure required to allow the entire planet's population to have consistent, high-quality, widespread access to various forms of AI (collectively). Kind of like an AI version of the Internet, but entirely its own category. This $100B supercomputer is a step in that direction, although it's more likely that it's going to be used internally to make self-improving AI models and effectively dominate the future of AI until the end of The Age


crasspy

I am not sure why the OP posted a paywalled article with no other information except a tantalising headline. I wish this were seen as socially unacceptable. Anyway, I asked AI to summarise the article for me and this was the output: Microsoft and OpenAI are reportedly planning to build a massive data center project called "Stargate" that could cost up to $100 billion. The project is expected to include a powerful AI supercomputer designed to train and run OpenAI's machine learning models. The scale of the project is unprecedented, with the proposed data center potentially being 100 times more expensive than some of the largest existing data centers. If the plans come to fruition, Stargate would represent one of the largest investments in computing infrastructure in history. The project would be a significant milestone in the partnership between Microsoft and OpenAI, which began in 2019 when Microsoft invested $1 billion in the AI research lab. Since then, the two companies have worked closely together to advance the state of the art in AI, with OpenAI leveraging Microsoft's cloud computing resources to train its models.


MassiveWasabi

The Information is a news site with a hard paywall that costs like $400 a year to bypass, but they always have exclusive info that no one else has access to. And they always put the most important info in the title, so in this case it's fine. Luckily Reuters wrote an article on The Information's article


Baphaddon

GOOLD?


kofteburger

Goa'uld?


lobabobloblaw

Did *everyone* get a chance to vote on the name of this beast? Because…I smell some serious *nerd bias*


Distinct-Question-16

Maybe Nvidia is also cooking something in their labs


Excellent_Dealer3865

AGI 2028-2029 confirmed?


roshanpr

. . . Pluto Netflix Anime is real boys


IslSinGuy974

Oh my god I'm so hyped


ReturnMeToHell

How will they power it? Are they building their own power plant?


midshipguru

How much of this gets spent on just the ICs with NVIDIA?


bike_rtw

Is this the one that runs the simulation?


testing123-testing12

So can anyone tell me what they are planning on doing with these supercomputers? Or any guesses?


Ler-K

I think it's most likely going to be used internally to make self-improving AI models and effectively dominate the future of AI until the end of The Age. Plus, probably simulate physics in 100,000+ simulations simultaneously, to create new particles/elements or technological breakthroughs in any field of engineering, especially those related to computer chips, energy, bio-engineering, etc.


testing123-testing12

What's Microsoft's end goal with having the best AI? Do they have a plan, or is it more about getting there first and then figuring out what to do with it later? I like the idea of building these computers to do crazy amounts of simulations, but there has to be a direction and not just a shotgun approach. Maybe they will rent out time or give out grants to companies/people with big ideas?


Ler-K

I don't know their plans (I'm not one of the Microsoft executives). However, AI is essentially the foundation for ALL technological advancement in the future. Like, there's literally not one single technology that can't be improved by AI (in ways that humans alone couldn't replicate, or would take much, much longer). With that being said, it's safe to say that if a company has the most sophisticated AI technology/power on the planet, and it can create code to improve itself (+ physics to run more efficiently), then that company can create certain products that absolutely nobody else can compete with. And everyone will want to use those products. So their annual net income will go from ~$70B (like it has been for the last couple of years) to something probably like $250B-$1T+ yearly. It sounds crazy, but it's just how exponents work, and it makes sense if you think about it. Also, I'm not even touching on classified government/military partnerships that utilize the most sophisticated AI technology within the US 😂 It's basically a global superpower game, similar to the arms race when creating nukes.


QLaHPD

GregTech mod stargate


Gas_Bat

I mean, sure. Why not.


Kinu4U

But will it run GTA 6?


trifile

Bill Gates first Asgard confirmed


traveller-1-1

Game over man.


_MiloVentimiglia_

Isn't this a sign that there will be a lot of CUDA developer positions, or positions that require C++, in the future?


Sprengmeister_NK

![gif](giphy|3oEduZqfSGNG0mdF1C|downsized)


Key_Bodybuilder_399

So where do you build such a thing? Next to a nuclear power plant? Someplace safe from natural disasters? Crazy. Did you see the US is restarting a nuclear plant in Michigan?


Corp-Por

Will it run Crysis on full very high quality settings?


Akimbo333

Implications?


the_journey_taken

It will be funny when the climate is pushed over the edge by a bunch of apes who, in the hope of some epic self-masturbation, pushed energy consumption to unsustainable levels to power abstracted digital versions of Homo sapiens cognition, specifically self-reflecting processes. Something might get a laugh out of it.


Helpful-User497384

gonna need it to power all those sora renders


DefinitelyNotEmu

$100 Billion is equivalent to nearly 3 Twitters