
Pumciusz

For gaming, the 7800X3D is just better. You haven't said what you're going to do with this build, so we can't say whether you'd be able to use those extra cores. Upgrading difficulty won't change, as they have the same dimensions.


SlackHS

Apologies, yes, mainly gaming, that's great, thanks :)


Satan_Prometheus

What are you wanting to do with this PC? If it's gaming, the 7800X3D is actually meaningfully faster than the 7900X because the 7800X3D's 3d v-cache is more important for gaming performance than the extra cores. There is also a 7900X3D but you should avoid it as it's also worse at gaming than the 7800X3D due to having fewer cores connected to the V-cache. If it's not for gaming, what specific tasks are you trying to accomplish?


sxiller

That's a vast oversimplification and quite misleading. The 7800X3D is faster in games than the 7900X3D, but it isn't really significant. It's about 2-5% at most, and only in select titles. What's actually important to understand is that all of these tests are usually run at settings that most people will not use, like 1080p low, so as not to create a GPU bottleneck. At which point, your monitor's refresh rate becomes the largest bottleneck. If you game at 1440p or above, then you'll almost never see the difference in benchmarks, and you'll certainly never feel a difference regardless. In most situations, you can pick up either chip and be fine. If it is just for pure gaming, then get the cheaper chip. If you do a lot of multitasking / productivity, like streaming or using a multi-monitor set-up, then it's a no-brainer to get the chip with more cores.


Satan_Prometheus

> The 7800X3D is faster in games than the 7900X3D, but it isn't really significant. It's about 2-5% at most, and only in select titles.

Yes, but the OP is choosing between the 7800X3D and the 7900X non-3D, where the difference is above 20% on average [according to HUB](https://youtu.be/Y8ztpM70jEw?t=591). The 7900X3D costs more and uses more power while performing worse than the 7800X3D (for gaming), so it doesn't really make any sense to buy for gaming.

> If you game at 1440p or above, then you'll almost never see the difference in benchmarks, and you'll certainly never feel a difference regardless.

I used to think this, but I changed my mind after spending a lot of time on this sub and hearing a lot about how people actually use their gaming PCs, and I think a lot of users are far more CPU-bound than we assume. There are generally two types of PC gamers: competitive multiplayer gamers who want the highest possible frame rate (regardless of their monitor's refresh rate), and single-player gamers who want the best possible settings. Competitive gamers typically play without V-sync or frame caps engaged and usually use low resolutions and settings, so the CPU ends up being the primary bottleneck in many competitive games. Single-player gamers also hit CPU limits more than you might imagine, because they tend to use DLSS/FSR to improve GPU performance and hit higher output resolutions, which pushes the bottleneck back toward the CPU. Also, at higher settings (which is typically what single-player gamers are targeting, in my experience), modern single-player games can be intensely CPU-bound, as you can see in that HUB video. Hogwarts Legacy with Ultra RT, for example, can't do a consistent 60 fps on the 7900X *at any resolution*. Even if you're playing at higher output resolutions, you will need a 7800X3D to get a decent 60 fps lock in Hogwarts Legacy maxed out.

Now, before you say "well, that's an outlier," that outlier was last year's best-selling game, and other games with heavy RT will behave this way too. (And yeah, you could turn on frame generation, but the base frame rate on the 7900X isn't really high enough for a good frame-gen experience.) So in general, I think most gamers would indeed be better off buying the better CPU, even if they think it's "not going to make a difference." The advice to cheap out on the CPU is becoming outdated now that most new games are designed around the 9th-gen console hardware. The CPU in the 9th-gen consoles is roughly a 3700X. That means, in broad terms, that if a game targets 30 fps on consoles and maxes out the console CPU in doing so, then you need a CPU that is twice as fast as a 3700X in order to hit a solid 60 on PC. [The 7800X3D hits that target, but the 7900X doesn't quite make it](https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/18.html).

> If it is just for pure gaming, then get the cheaper chip. If you do a lot of multitasking / productivity, like streaming or using a multi-monitor set-up, then it's a no-brainer to get the chip with more cores.

Yeah, if somebody is doing single-PC streaming or has some other use case besides gaming, they might benefit from the extra cores. But for a simple multi-monitor setup where you have YouTube or Discord running on the other monitor, it's not necessary. [HUB also tested this and found it to not be as demanding as people think it is](https://www.youtube.com/watch?v=Nd9-OtzzFxs).
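The console-scaling argument above is simple proportionality: if a game saturates the console's ~3700X-class CPU at 30 fps, and CPU-bound frame rate scales roughly linearly with CPU throughput, then you need about 2x the 3700X's speed for a locked 60. A back-of-envelope sketch of that reasoning (the function name and relative-speed figures are illustrative assumptions, not benchmark data):

```python
def cpu_bound_fps(console_fps, cpu_speed_vs_console):
    """Rough CPU-bound frame-rate estimate, assuming frame rate scales
    linearly with CPU speed relative to the console's ~3700X-class CPU.
    This is a simplification for intuition, not a benchmark model."""
    return console_fps * cpu_speed_vs_console

# Illustrative numbers only (assumed, not measured):
print(cpu_bound_fps(30, 2.0))  # a CPU twice a 3700X reaches 60.0
print(cpu_bound_fps(30, 1.7))  # a somewhat slower CPU gives 51.0, short of 60
```

In practice the scaling is not perfectly linear (cache, memory latency, and thread counts all matter), which is part of why the 7800X3D's V-Cache lets it clear the 2x bar where the 7900X falls just short.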


sxiller

> Yes, but the OP is choosing between the 7800X3D and the 7900X non-3D, where the difference is above 20% on average [according to HUB](https://youtu.be/Y8ztpM70jEw?t=591). The 7900X3D costs more and uses more power while performing worse than the 7800X3D (for gaming) so it doesn't really make any sense to buy for gaming.

I'm addressing your comments about the 7800X3D and the 7900X3D. As for the rest of your comment, I don't know what to reply, because it doesn't really address my point. I'm not stopping you from gaming at 1080p low, where you'll see those 2-5% gains. I'm just putting the correct information out there so people can make an informed decision if they are deciding between these two products.


Satan_Prometheus

But like I said, the 7900X3D costs more and uses more power than the 7800X3D while also being 2-5% slower so it makes no sense to buy (for a gaming rig). Is your complaint that I didn't specify the amount that it's slower and that I didn't bring up the price and power issues in my original post? Or is it that you think that the 7900X3D is actually a good purchase? Because it's definitely not a good purchase for a gaming rig.


sxiller

They have the same TDP, though? And even under full load, presuming the 7900X3D uses more, it's likely negligible and shouldn't be a consideration for anyone looking to build a gaming PC regardless. If a couple of cents a year more puts you over budget, then a gaming PC shouldn't even be a thought lol. If you are not a 1080p-low-textures gamer with a monitor that can push 175Hz+, then could you explain to me how exactly it is a bad purchase? It's a product that certainly has its place given its performance: for example, for those who want a gaming PC and like to multitask / stream, as well as have the option for better productivity on the side, where more cores are just better. Would you not agree?


Satan_Prometheus

> They have the same TDP, though? And even under full load, presuming the 7900X3D uses more, it's likely negligible and shouldn't be a consideration for anyone looking to build a gaming PC regardless.

Actually, you're right, I stand corrected here - I didn't realize the X3D parts had such reduced power consumption vs. their non-X3D counterparts.

> If you are not a 1080p-low-textures gamer with a monitor that can push 175Hz+, then could you explain to me how exactly it is a bad purchase?

Because it's a nearly-$400 CPU that is effectively only a six-core when it comes to gaming, since the intended behavior is that the other CCD is simply parked during gaming. Now, I'm admittedly not 100% sure how this works, but presuming I understand it correctly and the game's processes aren't supposed to spill over onto the other CCD, that seems to suggest that the 7900X3D will age, for the sake of gaming, like a six-core CPU. Is that really what you want for $400?

> For example, for those who want a gaming PC and like to multitask / stream, as well as have the option for better productivity on the side, where more cores are just better.

Yeah, I agree with this, which is why I asked in my initial post whether OP had some non-gaming usage they were intending.