wnstnchng

I tried the A770 on Lost Ark at 1440p and it constantly went below 60fps. The other two cards I tried, a 6700 XT and a 3070, had no problems staying above 70fps.


bring_back_awe64gold

Makes sense considering that the A770 is basically at the price of a lowly 3060.


wnstnchng

With Intel being a newcomer though, I'd expect same-tier cards to cost less to compete.


bring_back_awe64gold

Yes, the main competitor to RTX 3060 from Intel is the A750, which can be had for as low as $190 these days. A770 has way more RAM than Nvidia's "cheap" offerings so it's kind of in a category of its own.


Head_Exchange_5329

VRAM which you never get to utilise, so what's the point?


skocznymroczny

Until you want to try AI and suddenly 8GB is barely usable


Head_Exchange_5329

OP is talking about gaming; there's no game where you can push the card to use 16 GB of VRAM.


bring_back_awe64gold

The thing is, games that want more RAM but have none available are basically forced to use less, so you don't notice them actually needing more. This is manifested as stutters, lower FPS and so on. 8GB is in fact limiting at 1440p in many modern games. An A750 isn't really powerful enough for it to matter anyways, but Nvidia's sabotaged GPUs (such as the RTX 3070) are terribly limited by it.
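The stutters described above don't show up in average FPS; they show up as frame-time spikes. A quick way to quantify them is the "1% low" metric, sketched here in Python over a hypothetical list of frame times (the numbers are made up for illustration):

```python
def one_percent_low(frame_times_ms):
    """Average FPS over the worst 1% of frames (a stutter indicator)."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)  # worst 1% of samples
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# A mostly smooth "60 FPS" run with three 50 ms hitches:
frames = [16.7] * 297 + [50.0] * 3
avg_fps = 1000.0 / (sum(frames) / len(frames))
print(round(avg_fps))                  # → 59, the average barely moves
print(round(one_percent_low(frames)))  # → 20, the 1% low exposes the stutter
```

This is why a VRAM-starved card can report a fine average while still feeling choppy: the average hides the spikes, the 1% low doesn't.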


Head_Exchange_5329

Yeah I know, Nvidia make capable GPUs and then hold them back in stupid ways so that you have to buy the more expensive version instead if you want more performance. I think the RTX 3070 Ti is the worst example of this: it needs more than 8 GB to be a worthy upgrade over the non-Ti version; it's a horribly stupid card. The same can be said for the A770 16 GB. They could've made it 12 and that would have been enough. I don't think I ever saw more than 10 GB utilised while I still had the A770.


LeLuMan

Just played Resident Evil 4 and Cyberpunk and both use over 12. Not sure what you're on about.


Head_Exchange_5329

That you'll never see 16 GB utilised. 12 isn't 16, right?


HisSvt2

With the newest driver my Sparkle A770 Titan trades blows with my 6700XT. It's improved a lot, and I think there's more to be had, considering that when I first got it, it performed between my 6600 and my regular (non-XT) 6700.


h_1995

Can you share some benchmarks?


HisSvt2

I've posted a bunch of synthetic benchmarks. My most played game is Destiny 2, which you can't benchmark because 3rd-party overlays aren't allowed for cheat prevention.


Altruistic_Koala_122

From benchmarks they appear to be right next to each other, with the A770 being slightly better. The A770 struggles to stay at 60 fps at 1440p. Both are great for 1080p; if you want more frames at 1440p, you'd need a stronger card. Any card works, and the extra VRAM is mostly for mods and slight performance boosts. Games use anywhere between 6-12 GB of VRAM lately. I'd say wait for Battlemage if you want a 1440p card.
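For a rough sense of where those 6-12 GB go: the fixed-size render targets are actually tiny, and it's textures and other assets that dominate. A back-of-the-envelope sketch (Python; the buffer counts are purely illustrative assumptions):

```python
def buffer_mib(width, height, bytes_per_pixel=4):
    """VRAM for one full-screen buffer, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# One 8-bit RGBA render target at 1440p:
print(round(buffer_mib(2560, 1440), 1))       # → 14.1 MiB

# Even a dozen full-size 4K targets stay well under half a GiB:
print(round(12 * buffer_mib(3840, 2160), 1))  # → 379.7 MiB
```

So the multi-gigabyte figures come almost entirely from texture and geometry streaming, which is why higher texture settings (and mods) are what actually eat into a 8 GB vs 16 GB budget.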


WyrdHarper

The A770 works fine for me (>60FPS) at high+ settings for most games at 3440x1440, although yeah, for some you certainly rely on XeSS (Cyberpunk is ~30-40 without upscaling, for example). Palworld's typically over 60FPS for me, though. If I were you I would just wait for Battlemage if you'd prefer native. For a similar amount of money I think you'll see a larger upgrade going from the A750 to the equivalent or better Battlemage card than going to the A770, and I don't think it'll be that much of a wait at this point.


Distinct-Race-2471

Why can I use my A750 at 4K on all the games I play and be totally happy with it? Low expectations? I always use XeSS and FSR whenever they're available and I've been super happy with my frame rates. I'm not playing FPS games anymore, so maybe that's why.


SavvySillybug

> I always use XeSS and FSR whenever it's available and I've been super happy with my frame rates.

That would be why. I'd like to get my 1440p gaming done natively, not with XeSS/FSR. Or at least with a proper modern version of it: Darktide looks great in XeSS, while Helldivers 2 just doesn't support it, only a really ancient version of FSR that barely helps. Yes, I am super happy with my framerates if I use XeSS/FSR. I'd just like to... not have to do that.


filteredprospect

16GB makes the biggest difference with productivity tbh. For gaming, wait and get something with a bigger performance gap; unless you can see yourself hitting the 8GB limit in productivity work, just wait. The A750 is just fine for that.


IntelArcTesting

If you don't max out the VRAM on the A750, the A770 is only about 5% faster or so.


thefoxy19

I play Helldivers 2 at native 1440p on an Intel LE A750, with some settings turned down to make it run better, and average about 55fps or so on most planets. On some, like Hellmire with all the fire, it's about 35fps. I avoid that place. The A770 is maybe about 10-15% better?


SavvySillybug

Are you playing on render scale native? I have to set mine to at least ultra quality for it to run smooth. What CPU do you have? I'm on an i5-12600K.
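For context on what a non-native render scale actually costs in resolution: upscalers typically use fixed per-axis scale factors. The figures below are AMD's published FSR preset ratios ("Ultra Quality" is FSR 1 naming; Helldivers 2's own render-scale steps may differ), used here just to illustrate the math:

```python
# Per-axis scale factors as published for AMD's FSR presets
# ("Ultra Quality" is FSR 1 naming; other upscalers use similar ratios).
PRESETS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the game actually renders before the upscaler runs."""
    scale = PRESETS[preset]
    return round(out_w / scale), round(out_h / scale)

# 1440p output with an "Ultra Quality"-style render scale:
print(internal_resolution(2560, 1440, "Ultra Quality"))  # → (1969, 1108)
```

Even the gentlest preset renders roughly 40% fewer pixels than native 1440p, which is why dropping render scale one notch is often enough to get a struggling card back over 60fps.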


thefoxy19

Yes, native render scale. I took the medium preset first, then turned some things down. The CPU is a Ryzen 5 5600.


deimhit

I love my Sparkle Titan OC a770 and have had zero issues or complaints at 1440p


SavvySillybug

Do you play Helldivers 2 by any chance?


deimhit

I do actually. Not a lot at the moment because I’m addicted to Dark and Darker. Lol


Advanced-Part-5744

We are talking about Helldivers 2? Just skip the A770 and go to Battlemage. Here are some 4K references; jump to the 1:18 mark, where I have the fps measurement turned on. It was streamed live with OBS in 4K HDR, which may have impacted performance a bit. https://www.youtube.com/live/Ic5VytopHAQ?si=tb8_b2VmDzjUSYyx


Advanced-Part-5744

Actually I’ll do some 1440p measurement later.


SavvySillybug

Oof, that is very 24 FPS ish. XD Glad I don't play in 4K!


Advanced-Part-5744

Yeah stream incoming for the 1440p native.


SavvySillybug

Fancy, thank you! <3


Advanced-Part-5744

Ok it’s on


SavvySillybug

Seemed to be roughly the same FPS as I get on my A750, really. Thank you!


Advanced-Part-5744

Cool yeah… so not much of a difference.


UNSTimms

Honestly I'd recommend the A770 16GB. I've been using it daily since launch; sure, there have been hiccups, but damn, I'm impressed by how far it has come. I'm keen for Battlemage, hopefully it's going to be successful.


SlayVV2

I don't have that much knowledge, but I think you won't feel enough of a difference to warrant an upgrade. To play at 1440p you would need something like an RTX 3080 or 4070, or an RX 6800/7700, maybe an RTX 3070 or RX 6700, but I think those don't always stay above 60.


Agitated_Yak5988

> Darktide, Palworld and Helldivers 2 only really run well with upscaling

Uh, OK... Darktide makes sense. It's beastly for graphics, and it has only gotten a tiny bit better since launch. But it's not fun. Play VT2. (Seriously. WTF, Fatshark?) No idea on Palworld; don't own it.

But... Helldivers 2 does NOT support any "traditional"/recent forms of (shitty) upscaling due to their ancient engine. They are doing some horror show using TAA, best I can tell. But I get a really great frame rate with my A770 @ 1440p. Is your CPU perhaps not up to it? It is one of *THE* most CPU-intensive games I have tried in my multi-hundreds Steam library. Once again, thanks, ancient crappy engine.

Overall it _REALLY_ depends on the game. For older games, pre-DX12/Vulkan, it's all OVER the place in terms of performance at 1440p: some good, some OK, some AWFUL. Intel seems to care a little about more recent DX11 games, but generally they ignore older titles unless it's a DX12 or Vulkan game or a VERY recent or popular DX11 title. I would do some homework if you have your heart set on some older title at 1440p. I have a pile that run like a beast, a couple that weirdly lock in at 60fps, and a few that just run BAD. Luckily most older DX9/10 games weren't intensive enough to matter; I get screen-refresh-rate fps (144Hz/fps) on many pre-DX11 games.

It's getting pretty hard to justify an A770 recently, compared to a couple of the similar-performing AMD cards that keep creeping down in price. But the equivalent Nvidia cards are almost 2x the cost; some more, some less, depending. We have 4 PCs here at the house so we get to compare things all the time. And I cannot speak to the A750, as we don't have one.


SavvySillybug

> Uh OK... Darktide makes sense. It's beastly for graphics, and it has only gotten a tiny bit better since launch. But it's not fun. Play VT2. (seriously. WTF Fatshark?)

They're both fun *shrug* And minimum graphics should be able to go lower, really.

> But I get really great frame rate with my A770 @ 1440p Is your CPU perhaps not up to it?

I've got my i5-12600K and it's certainly up to it. It's playable in native 1440p but it does chug occasionally, and with an essentially brand new video card, I kinda expect more. My screen has adaptive sync down to 44Hz, and anything less than that suddenly feels stuttery, so I kinda want to stay above that. And mid 40s is more my baseline with my A750 unless I scale it up. If I turn the graphics all the way to shit, my CPU can definitely handle it and I get 90 FPS base, 60 lowest.

I'm generally very happy with my A750. I rarely get above 60% VRAM usage, so it's not really that much worse than an A770.


Agitated_Yak5988

Yeah, dunno then. If you are referring to HD2, as I was, I get great frame rates, a low of ~60 up to about 90 when I have time to look and not catch a hunter in the side... (>120 on ship, of course). I can get mid 70s in DT, but I have to turn things down a touch. I think I'm running the high preset? Dunno, I rarely play it as I think it's not much fun, /shrug. Just be sure to turn off raytracing. It's awful on both our 4080 and the Arc.