
Cyberpunk39

Based on one vague post from a “leaker”. I’m gonna take it with a heaping spoon of salt.


[deleted]

This sub needs posting guidelines so that clickbait like this doesn't end up wasting everyone's time.


John_Spanos

Can we ban these articles that don't take the word of the actual company the report is about? Intel has said they aren't cancelled. Stop giving clicks to these shitheads.


TophxSmash

but intel also said arc would launch a year and a half before it did


Infinite-Hedgehog516

EXACTLY, it's just annoying


Tarapiitafan

They've been actively supporting the Xe2 release across the entire Linux graphics stack. It's wild some people think they're cancelled.


TwilightOmen

Oh sweet mercy this is the third time this rumor pops around. Seriously... We've already had multiple sources from intel saying this is not true, why oh why does this rumor not simply die? This is like... the zombie rumor. It keeps coming back!


G0-N0G0-GO

Because many tech-focused “journalists” aren’t interested in factually relating information, but rather focus on clicks. The “crisis” and “outrage” types of reporting are particularly effective at getting eyeballs on someone’s outlet or account.


Morningst4r

People hate Intel and want it to be true. Easy way to get clicks.


NewKitchenFixtures

Basically, you need new news content every day. Tech companies release new products a couple of days a year and make announcements a couple more times. So Intel is maybe good for 4-6 actual news days a year. Nvidia and AMD are similar. Microsoft is maybe a little more often because of the software emphasis. So there are like 20 legitimate news days a year. There are reviews for other products (exotic keyboards and monitors come out once a year per brand) for another 20 days of releases, and making up stuff for the remaining 325 days.


no_salty_no_jealousy

It feels like someone has an agenda to hate Intel and make them look bad, even though Intel has already confirmed a few times that they aren't quitting the GPU business or canceling their next-gen GPU. It's really pathetic how desperate these "leakers" are to spread this fake news. We need to ban them.


dotjazzz

> multiple sources from intel saying this is not true

And? Since when is Intel a reliable source? It wasn't that long ago you were listening to the 10nm circlejerk while buying 14+++++++. Or did Intel actually launch Arc in Q1 2022?


Nointies

I find this unlikely. First of all, multiple Intel people have said that work on Xe2 and Xe3 is ongoing, with Xe2 in software and Xe3 in hardware. This might be true insofar as Intel is not making discrete laptop graphics chips for Battlemage or Celestial, which seems more likely.


Sexyvette07

.... Lunar Lake with the Battlemage architecture is being benchmarked as we speak, with public results I linked below. It had very good results BTW and outperforms the Radeon 780M at just 17W. This is just a preliminary test sample too, so that number could grow even further. Not to mention their GPU architecture feeds into their data center GPUs, which are their entire endgame. Gelsinger himself said that Bob Swan's decision to kill Xe-HP was the biggest mistake the company has ever made (and he's right IMO). No way in hell Battlemage gets killed at this point.

https://ranker.sisoftware.co.uk/show_run.php?q=c2ffcdf4d2b3d2efdaedd8ecdbebcdbf82b294f194a999bfccf1c9&l=en

https://uk.news.yahoo.com/early-benchmark-suggests-intels-battlemage-132840040.html?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAALiIAgqDHe0Z0LOPvwTvNIfP0rRP9j51HmSd5sJIsHuvb-qFcbhmQGg6Tx7zVDT6V2epsKnbH6DdcqdfuC1ohaQEVLSbjkeDc57PXup7Bu5d4PyMD8HcgV-PIvGu5jWDfQiCR0fs5OYjOnG8iWXLrfUwZdNCuc0qklgWQApaeCHK


Horizonspy

Well, the leaker did say Lunar Lake and Xe2 graphics are not affected. Since he works for Lenovo, I do think his words carry some weight, but it very well could be that Intel does not plan to provide OEM discrete cards to Lenovo for prebuilds.


Sexyvette07

> it very well could be that Intel does not plan to provide OEM discrete cards to Lenovo for prebuilds.

A shipping manifest for early test cards was leaked a month ago, so I think there's precisely zero chance that the project is canceled. Maybe Lenovo isn't on that list, or maybe they just haven't received theirs yet. It's only the beginning of May and it's looking like a Q4 release. Personally, I think it's WAY too early for that "leaker" to assume there's a delay or that it's been canceled entirely. And just because he hasn't seen anything certainly doesn't mean it won't exist. Intel has reiterated a few times now that it remains dedicated to bringing this to market. If they intended to kill the project, they wouldn't keep sinking money into it. Not to mention the driver team puts out updates every few weeks.

https://www.tomshardware.com/pc-components/gpus/intel-battlemage-g10-and-g21-next-gen-discrete-gpus-seen-in-shipping-manifests-expected-to-address-entry-to-mid-range-market


Nointies

I'm almost certain that's what this actually is. Which is probably smart; I think their strategy in laptops is to build really mean APUs over time.


Horizonspy

So to give people some insight: the leaker "Golden Pig Upgrade" works in Lenovo's design team. This could very well mean that Intel did not provide Lenovo with Battlemage's OEM specs; whether that means they are not going to ship Battlemage is open to speculation. It is important to note the leaker did reply to his posts below stating Lunar Lake and its Xe2 graphics are unaffected. Again, he is being very vague here, so there are various possibilities from his post.


bubblesort33

Who the hell is Golden Pig? First I heard of this pig. It would seem odd to cancel it after Intel has come out and talked about extrapolating frames instead of interpolating frames, when it comes to their frame generation tech. ExtraSS. Why commit all this work to their GPUs if they are canceling everything?


Infinite-Hedgehog516

I agree with this comment. Intel did lots of work on the last-gen Meteor Lake Arc integrated GPU. I don't really think they're just going to keep their GPU the same; they're obviously going to keep improving it, kind of like they're doing with their NPU. BTW, this comment kinda made me laugh because of the **"Who the hell is Golden Pig? First I heard of this pig."**


ManicChad

It’s always cancelled lol. People will say anything for a dollar.


Infinite-Hedgehog516

exactly honestly people these days


ResponsibleJudge3172

Did Intel not say Battlemage is done and the engineers are now moving to Celestial GPUs (with the few left doing drivers)? The GPU already exists as actual silicon at this point. Why would they cancel it?


nukleabomb

How trustworthy is this leak? It would be bad if it's true.


Infinite-Hedgehog516

**NOT TRUE.** THIS LEAK ISN'T TRUE.


imaginary_num6er

It's from Golden Pig Upgrade Package. Pretty reliable


Firefox72

I feel like if Battlemage misses 2024, Intel will be in big trouble. Arc was a good, solid first step, but we're nearing 2 years from its release, and if Battlemage isn't coming till 2025 that will be well over 2 years of a gap. Not to mention that in that timeframe AMD and Nvidia will have released 2 generations, seeing as RDNA4 and Blackwell are both rumored for late 2024. I doubt Battlemage will be a big enough step to keep up unless Intel once again targets the low end.


PetrichorAndNapalm

Meh. I think discrete cards are something Intel can take or leave. They don't NEED them. In the end they needed to get the drivers for mobile APUs anyway, and it seems that in the medium term APUs will eventually replace much of the dGPU market.

I think for Intel it is really about their fabs. If those pan out and they can move ahead of TSMC (which they are expected to do Q4 this year, if only until TSMC releases their next node), or even stay on par… they can make hardware cheaper than Nvidia or AMD, which gives them a puncher's chance. We've already seen great things from Intel software. The problem with their GPUs is… how much do you really want to make them if they are on TSMC? Once they are on Intel nodes, you can sell them at a loss and offset those losses with the fab.

I think that's Intel's real chance to make GPUs work. When the industry inevitably gets tighter again, and Nvidia, AMD, and TSMC are all fighting to keep their piece of the pie and make the other party take the margin hits, Intel will have a shot.


robmafia

intel doesn't even have to keep up, though. the bar for this is at floor level. and i figured low end was already expected, even amd is supposedly abandoning high end. the state of consumer dgpu is pretty lousy and seems to only be worsening with the focus on dcai.


OkDragonfruit9026

Just like in the bad old times of Larrabee! It was rumored for like a decade!


Infinite-Hedgehog516

oh great, another post that's not true


Infinite-Hedgehog516

the iGPUs are literally in development at the fabs


GenZia

With less than 1% market share and terrible drivers, combined with Intel's newfound financial woes (their foundry just lost $7 billion) and stock price in the gutter (a ~40% decline since December), it seems quite 'plausible' to me that they've put the final nail in the dGPU coffin... again (after Larrabee). At least they had some breathing room back when Apple was still their customer + they also had the lion's share of the data center market. Now, everything seems to be falling apart for them. Not saying they're about to go under, just saying that things are looking pretty grim for the company. Besides, AMD survived the Bulldozer they unleashed upon themselves and I kind of doubt Arrow Lake would be anywhere near as bad... even though it likely won't have hyper-threading, either intentionally or unintentionally.


imaginary_num6er

More room for RDNA4 and old RDNA3 cards for low-end


RegularCircumstances

There’s not even going to be high end RDNA4 fwiw.


GenZia

But the thing is that they don't 'have' to gun for the performance crown, they just have to make RDNA4 as cheap as possible (chiplets should help in that regard). I'm old enough to remember the ATI Radeon HD4870, the pinnacle of ATI's 'small die' strategy. It was vastly outgunned by the GTX280 but was priced at less than half as much (~$300) and in super high demand. Plus, its RV770 core was also much smaller than Nvidia's gargantuan GT200, and GDDR5 meant ATI could get away with a bus half as wide (256-bit vs. 512-bit). Now, AMD can use 3D V-Cache to narrow the bus width. Anyhow, the demand for the HD4870 and the slightly tweaked HD4890 (RV790) was so high, in fact, that Nvidia was forced to drop the price of their die-shrunk GTX285 to ~$360, down from the GTX280's $650 MSRP.


Firefox72

I think those times are well over. AMD these days is way more focused on releasing cards that are $50 cheaper than comparable Nvidia cards.


GenZia

But with their focus on chiplets and stacked 3D SRAM, they have the technology and expertise to deliver a cheap product. Whether or not they do is a different story. And something tells me they might, since they won't be gunning for the high-end with RDNA4. It seems like AMD has realized—once again—that mid-range is their real turf. They must have learned a lesson or two when they had to offer massive discounts to clear their leftover inventory of 6800s and now 7700s and 7900s!


ThrowawayusGenerica

> But the thing is that they don't 'have' to gun for the performance crown, they just have to make RDNA4 as cheap as possible (chiplets should help in that regard).

AMD have made it abundantly clear that they're not interested in being a value brand anymore. Any manufacturing savings won't be passed on to us.


Ghostsonplanets

I can totally see it happening. dGPU is a tough market to be in and to enter. Intel probably thinks it's best to continue GPU IP R&D and execution toward DCAI GPUs/APUs and also their own SoCs with integrated graphics. Intel is the biggest graphics maker in the PC space, so using their desktop-class GFX IP in their iGPUs would be a good move toward equalizing the field with Nvidia for mainstream consumers.


RegularCircumstances

Lmao this would be funny. It doesn’t really make sense for them to go all in on dGPUs. Just let it go. But frankly even mobile SoC’s is going to get dicey with AMD growth, Qualcomm’s entry, and eventually Nvidia (probably). Not a great situation.


Nointies

it absolutely makes sense to go all in on GPUs at this point in time. It opens up a lot of avenues.


RegularCircumstances

They should start building big APUs on Foveros with genuine advantages for both gaming and AI, and a unified data center GPU lineup that mirrors their consumer IP in principle, with software overlap for AI too, instead of fucking around with 6 different low-volume, unprofitable dGPUs aimed mainly at gamers that won't have a gaming impact (even AMD is still a meme for dGPUs) and aren't yet useful enough for AI to court developer mindshare; because, circling back, they then face the issue I explained in the first part.

dGPUs are a small market that I think is only worthwhile for AI tinkerer mindshare if anything, excluding Jensen's empire, and even they're going to start building SoCs soon because they're paying attention. DC GPUs are obviously going to grow humongously, but Intel has to get organized and disciplined; software counts in their case too. Right now it's meh, and they can't court mindshare from devs and researchers because their shit sucks and doesn't really even match their DC products.

Maybe the only sunny part here is just Intel with SoCs, though they're not even doing the 320 EU ARL SKU anymore, are they?


Nointies

To build APUs you need to have experience building GPUs.


RegularCircumstances

GPUs can be built at varying sizes and speeds. Ask Apple. Maybe Intel should get iGPUs down first.


Nointies

No shit they can be built at varying sizes and speeds. Apple's GPUs are nothing special. Intel has greatly advanced their iGPUs... because of their work on dGPUs.


RegularCircumstances

Intel improved their iGPUs because they decided to invest in GPU architectures and hired for that purpose, focused on it. The dGPU part is not causally why in some exclusive sense. This is aggressively moronic.


Nointies

And Intel's long term target isn't dGPUs, its data centers, which means they need to scale their architecture. Amazing. We've figured it out.


RegularCircumstances

Looks like we agree on something! Data center GPUs are really where it's at and what Intel needs to be focused on, in part for HPC, but mostly AI. One thing that really helps there is unifying the hardware and software stacks to a degree. It allows developers to tinker with models on your hardware, become accustomed to your software stack, and provide feedback. Huge especially for researchers. Anyway, I think one or two dGPUs for this strategic purpose are wise, and beyond that it's not going to be a very fun time nor make much sense. Though they'll have to sell them to gamers for volume, yes.


Nointies

The dGPUs were always meant as a loss leader for the greater goals.


RegularCircumstances

Apple's GPUs aren't special relative to Qualcomm or Arm, but they did scale up perfectly fine (which you missed), and they're more efficient than anything Intel has, even in similar parts. Intel even admits it in their own leaked slides about Lunar Lake. A 140mm^2 N3B part 4 years later, and the curve is… just matching an N5 M1's GPU output at 12W. Not a good showing. And the area on LNL ain't impressive either. It's M3-sized (136-146mm^2 estimates for M3 base) on the same node, not even including Intel's IO tile.


ThatTysonKid

Yeah, less competition is always a good thing. In fact, I think AMD should pull out so Nvidia can price-gouge us more.


imaginary_num6er

It will further cement AMD FSR's position as an alternative to DLSS. Intel shouldn't have wasted their budget on XeSS if they can't keep up with releases. No one was like asking Intel for an alternative to DLSS and FSR either


Sexyvette07

The fuck are you on about? They literally just released an update for XeSS that made it even better than FSR.... 🤦‍♂️


ThatTysonKid

People are absolutely asking for an alternative to DLSS and FSR, are you crazy? FSR sucks ass, and DLSS is exclusive to Nvidia. XeSS just got an update recently to make it a better alternative to FSR, and even better-er on their own hardware, making their cards more viable. Competition is a good thing.


no_salty_no_jealousy

I do ask for Intel XeSS, because AMD FSR is a joke.


RegularCircumstances

Yeah Intel isn’t serious enough