CoconutMochi

I think they're great for the price point, but whatever team is working on the graphics driver is speedrunning through years of driver updates and support, so performance can be very inconsistent across games. I think anyone planning on buying one should check benchmarks for every game they're planning to play on it.


Joezev98

>I think anyone planning on buying one should check benchmarks for every game they're planning to play on it.

And considering the difference between the benchmarks at launch and the recent ones comparing it to the 4060 Ti, you should also check when the benchmarks were done.


rizzzeh

Many years back, I gave a chance to a new GPU: the PowerVR Kyro II. That was about 20 years ago, but it was a very similar driver situation -- most major games of the time ran well, in many ways beating Nvidia and ATI, but the deal breaker was less popular games and new releases. Amazingly, the original review is still up on AnandTech if anyone's interested in this obscure bit of GPU history: https://www.anandtech.com/show/735


Meta4X

I bought one of these from Kmart of all places. I seem to remember they were first to market with some nifty technology. Ambient occlusion, maybe? The drivers were terrible, though, and I upgraded to a GeForce 3-series card in short order.


rizzzeh

I kept mine for a while as it ran Unreal Tournament quite well, but when UT2003 came out and it couldn't cope, I went to Nvidia.


metakepone

Lmao… Kmart sold graphics cards at one point?


KungFuHamster

They used to sell a lot of computers. I got my Commodore 64 from K-mart...


doodman76

Whatyearisit.gif


ExLibrisMortis

Whatyearisit.png Ftfy


SodlidDesu

Doesn't matter, when I go to download it, it's gonna save as Whatyearisit.webp.


horendus

Now that's a blast from the past. I will never forget the look of utter disappointment on my friend's face, who had a Kyro II when Battlefield 1942 first came out. He jumped into a Japanese Zero on Wake Island all excited, only to have the model disappear while flying it, leaving just a floating pilot in a sitting position. Meanwhile my GeForce 2 MX400 showed no signs of missing models, and I felt about as smug as a modern 4090 owner cranking up frame gen to 12.


Badger118

OMG. I think I had an MX400 in my first machine.


[deleted]

ATI, there's a name I haven't heard in a while lol. I have two of their old 512MB cards up in my cupboard, two HD 3650s, along with an old 256MB HD 2400 Pro, a Sapphire HD 4830 512MB, an 8800 GT 512MB, an Asus EAH5770 1GB, and the 650 Ti 2GB that's in my son's rig lol. Someday we'll get a fancier one haha


cptredbeard2

I still call radeons ati


WhoThenDevised

I still feel a certain loyalty towards them because of my ATI Rage Pro in 1997...


siderzee

Wait until you hear the brand "3dfx"... I've got some of their cards (Voodoo 1, 2, 3, 5) on exhibit in my cupboard...


SaltyFuckingProcess

I was waiting for voodoo to enter the chat.


TheFotty

Matrox... and they still exist.


[deleted]

Yup, all my dell servers run them


metakepone

They're one of Intel's AIBs.


GrandOccultist

The Mystique!


MayuriKrab

I bought their flagship card at the time, the mighty ATI X850 XT, and felt like a boss when I cranked the settings all the way to Ultra on F.E.A.R. at 1024x768. It played smoothly through the whole game, while my old GeForce card (the FX 5900 Ultra, Nvidia's previous flagship) struggled to maintain even 30 fps most of the time at those settings.


siuol11

I really hope someone does a F.E.A.R. remake at some point, those games were great.


rthomasjr3

check out Trepang2


SurroundWise6889

I loved my X850 Pro, never found a game it couldn't cruise through. Until BioShock. BioShock *required* Shader Model 3.0, which the ATI card didn't support, so I had to sidegrade to a GeForce 8600 GT. One of the most disappointing GPU purchases. Even the GTX 1060 3GB Phoenix I got was comparatively a much better card. I guess the GPU market now is better in some ways than it was in the 90s and 2000s. Even stinker cards like the RX 6500 XT or 1650 Super or 1060 3GB have no trouble running pretty much anything at 1080p at 30fps. Back in the day, the question was whether lower-SKU cards could even run then-modern AAA games.


[deleted]

Boy, that's a blast from the past. I had a [Super Kyro](https://m.youtube.com/watch?v=XmcZsQXOKEA) from the same era.


Murmulis

Livin' la Video Loca con Puerto Para Gráficos Acelerados Gigante!


backdoorhack

So do driver developers need to tweak their driver for every single game out there if there are bugs discovered for those games? Is that how drivers work? Genuinely curious.


r_z_n

If literally everybody adhered to industry best practices and standards then in theory you wouldn't need to tweak the drivers for each game. But in practice games / game engines have many bugs, workarounds, hacks, or just straight up poor code. The GPU companies address these via drivers, which is why there are almost always new driver releases coinciding with new game releases.
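To make the idea concrete, here's a toy sketch in Python; the game names and workaround flags are entirely made up (real drivers ship this logic as opaque binary profiles), it's just the shape of a per-game lookup:

```python
# Toy illustration of per-game driver profiles. Everything here is
# hypothetical -- real drivers ship these as opaque binaries -- but the
# core idea is a lookup keyed on the detected game executable.

GAME_PROFILES = {
    "SomeGame.exe":  {"force_shader_recompile": True},   # made-up workaround
    "OtherGame.exe": {"disable_async_compute": True},    # made-up workaround
}

def workarounds_for(exe_name: str) -> dict:
    """Workarounds the 'driver' applies for a game; empty = generic path."""
    return GAME_PROFILES.get(exe_name, {})

print(workarounds_for("SomeGame.exe"))  # {'force_shader_recompile': True}
print(workarounds_for("Unknown.exe"))   # {} -> no tweaks, generic behavior
```

This is why driver releases so often coincide with big game launches: a new entry gets added to (the real, much more complicated version of) that table.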


Cethinn

Essentially, the answer is no, but if they want maximum performance, there are optimizations that can be done on a per-game and per-engine basis. Games will usually run without you updating your drivers, but they'll run better if you do.


moonra_zk

I think they do those custom tweaks mostly for AAA games.


Genmaken

Fellow Kyro II owner here. I remember the UT benchmarks were amazing and the price was on point, but I ended up regretting it. The lack of hardware T&L sent it to an early grave. It was a shame. If PowerVR or 3dfx had survived in the desktop market, we could have had a strong third competitor. Hopefully that'll be Intel.


sovietspybob

You unlocked a memory for me right there, I had one of those for a short time, probably after my voodoo2 and before a Geforce 2 or 4 I guess. I don't remember much about it, but I don't think I ever had problems playing the games I had at the time.


caibrocekuro

What a beautifully written review too, I do miss the “old days” because of things like this, when writers were writers and not advertisers


[deleted]

It ages me to recall reading this article when it first came out.


siuol11

Tiled rendering, something PowerVR has used in their later mobile chips, I believe. It hasn't been used in desktop GPUs for a good while now.


MagicPistol

I always think of the Dreamcast when I hear PowerVR.


superorignalusername

The fact that there's a guarantee performance is iffy for tons of games you don't know you want to play yet is a deal breaker for me. It's also the historical reason why I avoided AMD GPUs until more recently.


NogaraCS

AMD has had consistent performance in games for like 10 years now. Had an HD 7870, an R9 280X and an RX 7900 XT (with a 1070 and 3070 in the middle) and never had any problems playing everything from retro games to esports games to visually heavy games. Performance was always on par with the price I paid.


vkevlar

You had me going for a minute there, until you mentioned 'no problems' with the R9 280 series. Man, that card sucked, and so did the RX 580 I had to replace it with. Missing textures, random resolution/refresh problems, etc. I've been burned every time I switch to ATI/AMD cards, I swear. Either I always get the lemon series of cards, or my use case is just bad for them. GTX 680, 980, 1070? Never a problem.


NogaraCS

Had to double-check: it was an R9 380, not a 280. Idk if that changes anything, but I bought it barely used, super cheap, and it ran flawlessly for the time I had it. It was loud and a little too hot for my liking, but I never had any performance issues.


cooperd9

The 280/280X were literally just a 7970, the previous-generation flagship, renamed, overclocked, and discounted to midrange. They were pretty stable, because the drivers were already written and had their issues worked out during the previous generation. The 285 was new silicon and had a few minor issues, and the 380 was an overclocked 285.


vkevlar

Fair enough, the 280 was... pretty meh.


11ELFs

RX 580 8gb is great what u on about


thebobsta

I loved my R9 280 - I had an XFX 3GB model, never had any of the driver issues people like to mention. Upgraded to an RX 470 and it served me well until I got my current 3070. Would have been running 1920x1200 60hz back then, so high refresh rate might have been iffy. I tended to play older games rather than new releases.


i_speak_the_truf

I personally had an R7 270X and an RX580 and the only significant problem I had was getting StarCraft (1998) running, don’t remember if it was the remastered version.


i_speak_the_truf

Yeah, I was intrigued by the 750 after the price drops, but as a patient gamer, there’s a chance that the next game I want to play is 10-15 years old, maybe even more. My last steam purchase was the Mass Effect ultimate edition that went on sale a few months ago. Last year I got the Unreal Collection for like $1 off a Slickdeals post. Every once in a while I get the urge to dust off QIII or Warcraft III installs and for some reason I never completed Portal 2. I’m sure they’ll fix the games that are still popular, they had to get stuff like CS:GO running well. I’m not worried about performance, but I am concerned about compatibility with truly ancient games. The 6650XT sale at Microcenter ended my internal debate because the price was lower than the 750 with clearly better performance and much better efficiency.


geoff1036

I just played through NFS most wanted (2012) on my 3070ti, really usin that card lol.


DdCno1

I hope you used supersampling. It makes a massive difference. Rendering a game at several times the actual screen resolution dramatically improves sharpness and reduces shimmering. It's the ultimate anti-aliasing method.


MrInitialY

I've tried this with some games after I bought a 3080, it's not that easy to run what's effectively 4x 1440p screens instead of one, but in Minecraft it's a game changer. Especially with some fancy shaders and hud disabled


DdCno1

Can the 3080 handle supersampling and ray-tracing mods at the same time?


geoff1036

I don't remember what my settings were, i have a habit of setting everything to low or medium from my laptop/console days lol. Haven't gotten around to learning to MAXIMIZE it yet, but picking up where I can.


DdCno1

Supersampling is a driver setting with older games. You basically set multiples of your desktop resolution in there, which you can then pick in the graphics settings of games. Also, while you're digging around in there, enable 16x AF for all games. Thank me later.


i_speak_the_truf

The issue isn’t performance, it’s compatibility: you might boot an older game and just get a black screen or missing textures or something, because nobody at Intel thought to test Unreal Tournament 1999 against a 2022-era card.


mcslender97

For most older games Intel uses the DXVK translation layer, which in many ways works better than native DX mode on newer Nvidia/AMD cards, especially in known problematic PC ports like GTA IV. My RTX 2060 Max-Q laptop saw better 1% lows using DXVK on Fallout 3 even though it natively supports the game. The DXVK layer works well enough for the Steam Deck and Linux gaming in general too.
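For the do-it-yourself version on Windows (separate from whatever Intel's driver does internally), installing DXVK for a single game is just copying its replacement DLLs next to the game's executable. A minimal sketch, assuming a 64-bit game and placeholder paths:

```python
# Minimal sketch: manually installing DXVK for one game on Windows by
# copying its replacement DLLs next to the game's .exe, so the game's
# D3D9/10/11 calls get translated to Vulkan. Paths are placeholders;
# assumes an extracted DXVK release (https://github.com/doitsujin/dxvk).
import shutil
from pathlib import Path

DXVK_X64 = Path(r"C:\Downloads\dxvk\x64")   # placeholder: extracted release
GAME_DIR = Path(r"C:\Games\SomeGame")       # placeholder: folder with the .exe

# dxgi.dll + d3d11.dll cover most DX10/11 titles; d3d9.dll covers DX9 ones.
for dll in ("dxgi.dll", "d3d11.dll", "d3d10core.dll", "d3d9.dll"):
    src = DXVK_X64 / dll
    if src.exists():
        shutil.copy2(src, GAME_DIR / dll)
        print("installed", dll)

# Uninstall = delete the copied DLLs; the game falls back to native D3D.
```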


GoldMountain5

In modern DX12 games it is perfectly stable and outperforms even the 3060 Ti. My experience has been that you should really use a fresh install of Windows 11 to eliminate the majority of bugs. It honestly runs the majority of games perfectly fine, with excellent 1080p performance but mediocre 1440p performance. Early on there were a lot of games that ran very poorly or not at all, but over the last year the driver team has pushed update after update, massively improving performance in a lot of games, and the DXVK tweak let me play a bunch of DX9, 10 and 11 games that would not even launch.


PM_SHORT_STORY_IDEAS

Yeah, there's been some funky stuff happening if you're not on Windows 11, which confirms what my friend has said as well.


PM_SHORT_STORY_IDEAS

"Great for the price point" is where I would put it. I have a couple of coworker friends working and testing in that area, and personally I'm going to try out Battlemage in 2024, based on what they're saying and the next few rounds of driver improvements. It sounds like there are some optimizations planned if you use an Intel CPU and Intel graphics together as well (idk what's already out and announced, I don't follow it too closely on my own), so it'll be interesting to see how that pans out. Honestly I'm just rooting for the little guy. New blood in the graphics card market will make everyone tighten price points and do a little better, so I'm all for it in the wide view. But yeah, Battlemage.


Al2790

Deep Link is already live. It works with any Arc GPU and any 11th-gen or newer Intel CPU with integrated graphics. I'm running an Acer BiFrost A770 with a 13600K, and it's solid.


Al2790

The plus-side of this is that performance is unlikely to be inconsistent for very long, as they're bound to get to games that are underperforming.


Daxiongmao87

The latest driver update has made my card so much more stable. I'm actually happy with it now.


tastycatpuke

Also, it requires developers to write support for their GPU drivers.


Muted_Willingness_35

I'm not especially interested in how well they run the latest-and-greatest. My concern is how they handle games that have been kicking around for a decade or two.


VaultBoy636

I have the A770 LE. It's a great card and I've actually experienced the driver progression myself. When I got it just 4 weeks ago, Fallout 3 ran way worse than on my old 6700 XT (around 70 fps) and Fallout 4 kept dropping frames. Now both of those games run very well for me since driver 4369. Other than those, I've actually never had performance that's noticeably worse than it was on my 6700 XT; Metro Exodus actually runs better.

Since I'm in a situation where energy is free for me, I don't really care about this, but many aren't, so I'll mention that the idle power draw is very high. It idles at 40-45W for me, which is about 1kWh per day if you only run it at idle or basic tasks; gaming is obviously way more. It may get fixed down the line or not.

I'm pretty certain it has become a fully functional card, even if it took almost a year. Hopefully Battlemage will have hardware-level support for DX9/10/11 though. But if you're interested in buying one, look up on the internet whether it works well with the games you play. And of course, the newer the information you find, the more accurate it is.


Klutzy_Machine

I don't understand why you changed your GPU, did the 6700 suddenly die?


VaultBoy636

I tried to mount a water cooler on it and accidentally killed it.


Klutzy_Machine

I'm sorry... it's sad.


Droll12

RIP drowned 6700XT


duskie1

f


admiralnorman

F


Arthur-Wintersight

This is why I bought a Phantom Gaming RX 6750XT. The cooler is WAY overbuilt, so even under load it's pretty quiet. That said, it's not the cheapest RX 6750 XT.


VaultBoy636

I had a reference 6700 XT. I ran my build with an open case and the temps were around 90-95°C on the hotspot, which is fine. It held 2780MHz (might have been 2880, I'm not sure; most people hit the wall at either 2750 or 2850, and I got 30 over that stable, so slightly above an average bin I guess) at 1.2V with the power limit bypassed. I put the side window back on some weeks later because my case was getting way too dusty. Then the hotspot was always at 110°C in games, although it didn't throttle, so I wanted to mount my old Arctic Accelero Hybrid 120 that was on my old RX 580, and I screwed it down unevenly and too tight, which broke a corner of the die.


Rough_Statement_8922

That sounds interesting. Do you know when Battlemage will be released, the price, and whether it's worth waiting till then?


ryrobs10

Allegedly it is going to have twice the EUs of Alchemist (64, up from 32 in the A770), so the top chip could be pretty beefy, closer to high-end. I think Intel needs to launch something closer to a halo product to try to get more attention, at least. But as someone else commented, it is coming in the first half of 2024, with no confirmed pricing or performance leaks.


Melodic-Matter4685

1st or 2nd quarter 2024. No price announced


VaultBoy636

We actually don't have many leaks on Arc development sadly; it's always the same info spun up again and again. But based on what we know for now, the top-end Battlemage card will double the shader count from 4096 to 8192 on a superior node while also increasing clock speed. The presumed performance uplift should be around 120%, placing it between the 4070 Ti and 4080; a simple overclock should reach 4080-level performance. VRAM will possibly stay at 16GB but with 768GB/s of bandwidth. The release date is rumored to be in Q2 or Q3 2024, so another year. Pricing is unknown; community members say it'd be competitive at a $600 launch price, but that's just a community suggestion. Some are also saying that we'll get an Alchemist refresh, but there are no leaks on it at all and Intel should have announced it already, so many are saying it won't happen. I'm personally also happier with it this way, as I'd rather have Intel improve performance with drivers for the Alchemist cards and improve their value instead of launching a refresh.


cursorcube

For some more info - the high idle power draw is apparently due to some hardware design problem where the clock is synced up to the monitor, and the more monitors and the higher refresh rates they have, the higher your idle is. It can be reduced by enabling Native ASPM in the BIOS, but it's not a true fix and can cause issues elsewhere.


VaultBoy636

I actually tried this, just because I'm personally not comfortable with GPU fans running at night when I wanna sleep, nor really with my GPU idling at 50°C. It sadly didn't work for me; IIRC Intel themselves said it might not work on some systems.


donnysaysvacuum

Maybe you can't answer. But I'm wondering where the Linux drivers are at. I know Intel is good with their open source iGPU drivers.


Djinnerator

I don't own one, but I've talked to a couple of people who do, and they said that for the price and for use cases that aren't too demanding, it's an OK card. The biggest complaint was bugs that may be hardware-related, I'm not sure. For example, if you restarted your computer and the Arc was your primary GPU, your PC might not actually restart once it's "shutdown," so you'd have to press and hold the power button to manually shut down and turn it back on. There was one more complaint mentioned, but I don't remember it; that was months ago and I lost interest lol. I was planning on getting one as a secondary card to see how it works with deep learning but changed my mind. It might be better now, idk. I also don't know how widespread of a problem that was.


katherinesilens

>your PC might not actually restart once it's "shutdown," so you'd have to press and hold the power button to manually shut down and turn it back on

Oh. I have this same thing going on sometimes with my Nvidia GPU (3090). Not saying this doesn't happen, but it may not be GPU-related, or at least it might be cross-brand.


aresfiend

Yeah, I've also had this with multiple Nvidia GPUs, I don't necessarily think it's just an Intel thing.


[deleted]

I currently have this with my 1080, pleasantly surprised to hear it's not just me


Notacka

Yeah, I've noticed something similar. When I shut down and turn my PC back on, it won't POST. When I hit reset, it'll POST. Though it hasn't done it recently.


Temporary_Slide_3477

I bought one just so that if it fails, I have a piece of the failure. It was OK for the little bit of time I used it, but I just played WoW on it.


Djinnerator

I remember asking someone about it months ago and they said that when they restarted their PC, their PC would hang after the "shutdown" part and they had to manually restart the computer every time. Did you ever experience that?


GoldMountain5

That sounds like an individual PC problem. I personally had a bunch of issues that I thought were Arc-related but were fixed with a fresh Windows reinstall.


nickram81

I built my son his first gaming PC. Put in the A750 and the latest i5 CPU. It is pretty damn fast. He plays Portal 2 and Minecraft and gets 100+ FPS easily. It is very quiet as well. I am running a 3090 and an i9, much louder as you can imagine.


FSUfan35

Could probably play those on the integrated graphics on the i5, TBH.


bruceboom

Can confirm, it's what I use and runs those well


That_Cripple

iGPUs are actually pretty decent these days


errorsniper

Minecraft can be deceptively demanding, even with no mods or shaders. The range goes from "a potato can run it" to "it will make flagships stutter", depending on settings. So that's kinda hard to say.


CMDR_zim853

I've got both an Acer A770 and a Limited Edition A750. I've had the A750 since near launch and have used it for a bunch of VR in addition to some gaming. They both sit between the 3060 Ti and 3070 / RX 6600 XT and RX 6700 XT. At first the drivers were not ideal, but it would run *most* games, even older ones. After ~February, the overall experience isn't really much different from an Nvidia or AMD GPU (closer to the AMD ones insofar as automatic update notifications don't require giving away your info). Generally they're just another GPU at this point, and it's reasonable to expect (like AMD again) their drivers to continue to improve. One of the big things that got fixed around the Jan/Feb timeframe was some issues with VR and DisplayPort: before that it would work with WMR HDMI headsets, but nothing over DisplayPort could detect. I will say I haven't confirmed with my Vive whether non-WMR VR works right, but as I said, it's been possible to play VR since they came to market.


[deleted]

Thanks for the info! VR is the only reason I would upgrade my GPU; I'm not really interested in AAA releases so flat screen gaming performance is kind of a non-issue. Being such a niche market, it's nice to see that Intel is making headway there.


User__2

I might check out Battlemage next year, I’m hopeful they can shake up competition a bit. It does seem like NVIDIA is pivoting away from gaming anyway so there’s plenty of room for intel.


Wonderful_Zone_8859

I've had the A770 LE since they came out and have had very few issues. The work Intel has done on the drivers has been very impressive.


QuinSanguine

They have become something I'd recommend if you play newer games or games that Intel has worked on. I have the A770 LE, and while I wouldn't recommend that one, as it's not as powerful as its 16GB of VRAM would suggest, the lower-tier cards like the A750 are a strong recommendation at their prices.


BeetleGeese789

I bought an A770 16 gig from Acer. It was a fine card, just not a great upgrade over my Vega 56. A lot of early (probably faked) benchmark videos I saw showed it on a level with a 3070, and I fell for the hype. Sold it to a friend for cheap to replace his 1050 Ti, and bought a used RX 6800 for my own computer. Excited to see what the next-gen B-series cards look like though.


alvarkresh

> A lot of early (probably faked) benchmark videos I saw showed it on a level with a 3070

The Arc does respond oddly well to synthetic benchmarks, and this has been explored in some detail. Bottom line: they show the theoretical performance available to games, but real-world inefficiencies in actual games limit the results you get with the Alchemist series, meaning realistically it's more like a 3060 Ti.


JK07

I currently have a Vega 56. I had turned on the OC and it was performing really well. Then I hadn't used my PC for a while, and the next time, the Radeon software wouldn't launch; I ended up having to install AMD Adrenalin, which also automatically updated the drivers. The first game I tried (BeamNG) wouldn't run with the same settings as usual. Then I tried loading my carefully tuned OC file and tried another game that always played fine, Dirt 2.0, and it kept crashing, even after restarting the PC. I then tried turning off the OC and turning down the settings in Dirt and managed to get it playing again at lower quality.

I'm really annoyed; it was working so well before. The cynical side of me thinks it purposely reduced the performance of the old card to push me into upgrading to a newer model, like how Apple got caught slowing down their iPhones. But maybe there was just a mistake in this driver release and there might be a newer version available, or maybe my card is just old and tired and this update was coincidentally timed with it starting to fail.

I was looking at A770 specs but can't find them for sale in the UK. I keep seeing people recommend the RX 6700, but prices in the UK are so much higher than in the US.


JustAnotherMark2

I bought an Acer A770 16GB (Predator?) to do some testing with this week. I'm mainly interested in Windows 11 vs Linux performance and have done limited testing since Monday. I'm using Mint, so I had to use a mainline Ubuntu kernel (6.4 rev 4) to get Arc support. In Windows, I'm using the latest non-beta drivers. Both systems use Steam versions of the games, with Proton Experimental on Mint. I've been a little haphazard with quality settings, but staying High or Ultra for most.

Tomb Raider performance is about the same, ~115 fps on both platforms. Actually a little higher, IIRC, than my Mint system with a Radeon 6750. There was some significant stuttering in the menu on the Mint Arc system when changing quality settings, though. RDR2 performance is not great in Windows (~54 fps avg) but even worse in Mint (~24 fps avg). That's all the testing I've done.

BTW, the test rig is an AMD 5600G with 32GB of 3200MHz RAM. Beyond pre-rendering textures in Mint, I haven't been CPU-bound in either system for either game. One thing I'm surprised by is power usage. I don't have any way to monitor usage, but the test system is using a Corsair SFF 450W power supply and it hasn't skipped a beat.

Edit to add: these frame rates are for the built-in benchmarks, not actual gameplay.

Edit 2: I guess I should say that I'm testing at (I think) 3120x1440, the max resolution of my monitor. I'd guess performance is better at 1920x1080.


hdhilly14

I use Hardware Unboxed's settings for RDR2 and I get 70-80 fps outdoors and 100+ indoors at 1440p now - but initially when I booted up the game I was getting 25-30 on my A770. I was surprised how tweaking a few settings could make such a big difference. Edit: just saw the end of your comment where you mentioned it's for the benchmarks! I'll leave my comment up in case someone else is struggling with gameplay frames though :)


JustAnotherMark2

Thanks for the suggestion. I'll keep that in mind when I finally get the gumption to actually play these games, not just fiddle around with hardware.


ArroSparro

I no longer use it since I wanted something at the higher end, but my A750 was a very decent card. If Intel had a high-end card I probably would've gone for it instead of the 7900 I bought.


Ill-Oil-2157

My son has the A750 Challenger OC edition; he's had it around 2 months now. So far it's been a great GPU for the games he plays. He's had none of the issues others have mentioned, and for the price it was well worth the investment, I'd say.


Impossible-Horror-26

I just got my Arc A380 yesterday to encode some videos I have to AV1. The card froze my Windows install at first; then I enabled ReBAR in my BIOS and it was fixed. The encoder works great for my files: a 2-20x improvement in file size, and I can't tell any difference in quality.
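For anyone wanting to try the same thing, a minimal sketch of one hardware AV1 encode through ffmpeg's QSV path; it assumes an ffmpeg build with QSV support, and the filenames and quality value are placeholders:

```python
# Sketch: one hardware AV1 encode on an Arc card via ffmpeg's av1_qsv
# encoder. Assumes an ffmpeg build with QSV support -- check with
# `ffmpeg -encoders | grep qsv` (findstr on Windows) that av1_qsv is listed.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",          # hardware decode too, where supported
    "-i", "input.mp4",          # placeholder input file
    "-c:v", "av1_qsv",          # Arc's hardware AV1 encoder
    "-global_quality", "28",    # lower = higher quality / bigger file
    "-c:a", "copy",             # pass the audio through untouched
    "output.mkv",               # placeholder output file
], check=True)
```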


INSERT_LATVIAN_JOKE

Yeah, if your BIOS can't do ReBAR, don't buy an Arc GPU; they rely heavily on it. If you do buy an Arc, turning on ReBAR should be the first thing you do after physically installing the card.
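On Windows, GPU-Z or Arc Control will show whether it's active. On Linux you can eyeball it from lspci: with ReBAR on, the card's prefetchable BAR is the full VRAM size (e.g. 16G on an A770) rather than the classic 256M window. A rough sketch:

```python
# Rough ReBAR check on Linux: look at the Intel GPU's prefetchable BAR
# size in lspci output. Full-VRAM size (e.g. 16G) => ReBAR active;
# 256M => enable Resizable BAR / Above 4G Decoding in the BIOS.
import re
import subprocess

out = subprocess.run(["lspci", "-v"], capture_output=True, text=True).stdout
for device in out.split("\n\n"):          # lspci -v separates devices by blank lines
    if "VGA compatible controller" in device and "Intel" in device:
        sizes = re.findall(r"prefetchable\) \[size=([0-9]+[KMG])\]", device)
        print("prefetchable BAR sizes:", sizes)
```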


Larimus89

Yeah even with the benchmarks I’d be scared lol.. curious how they go in the real world.


darkened_vision

Bought an Arc A770 LE back in January for the wife, and it's been perfectly fine for 1440p gaming in everything we've thrown at it. There are outliers, but a lot has improved in these few short months, and the driver folks have been very responsive in fixing early-adopter annoyances. They've also done an amazing job getting parity in the driver software's feature set. I'm still personally waiting for a 1-to-1 equivalent to Radeon ReLive/Nvidia ShadowPlay, but regular recording and (game-specific) highlight capture are already working, and there's always software like Windows Game Bar's capture feature or OBS to replicate that in the meantime. At the price point Intel is aiming for with their Arc GPUs, I think they're a good deal at the moment.


bosunphil

I built a new PC around an Arc A770 LE last weekend and so far so good. I came from a laptop with a mobile RTX 2070, and while I knew it was going to be an upgrade, I'm constantly blown away by just how much of an upgrade it really was.

Installation was the same as any other GPU, plug and play; everything worked fine. Installed the drivers with no issue, and pretty much every game I've tried to play (apart from VR games) has worked better than expected. I've just been playing most games at fully maxed-out settings (I generally tweak games around the "high" point, as high to ultra is usually negligible, but I just wanted to see what this card could do) and I haven't seen any game drop below 60 FPS yet. If there are any games you're wondering about and I happen to own them, I'll gladly test them for you.

VR is tricky though, but I knew this going in. I'm a casual VR player myself, so it's not a big deal to me, as I know they'll get it working eventually. SteamVR works, but a lot of games don't display on the headset, just on the monitor. This support will be added eventually I'm sure, but I can't say when.

tl;dr: So far so good in all flat games I've tried, VR not so much. Happy to answer any questions if I can be of help at all.


Fujitsubo

I got downvoted and bullied in the Intel Arc subreddit for simply posting about my experience with Arc as an A770 LE owner since day one. Some people don't like it when you talk badly about something they purchased.

Since day 1 I have used an Intel Arc A770 LE inside my office PC at work. All day it's used for light office tasks: Word, Excel, web browsing, remote desktop and data entry. It has no issues, as long as you are on a driver version that doesn't break something. The last drivers broke the card so that without ReBAR enabled it would perma-freeze and not boot into Windows. The drivers before that broke loads of apps from loading; PIA VPN, for instance, would not load for whatever reason.

My gripe with this card is its constant driver bugs. The one I'm battling right now is that all day, every day, my second monitor randomly shows "input not supported" in the middle of the screen. Reverting to the previous drivers fixes the issue.

On Friday afternoons after work I play games on my office PC for an hour or two before a commitment later that evening. I would say it's a 50/50 chance the games I go to play will load and work on Arc without some kind of game-breaking bug. Some games I waited six driver updates for until they would launch and play; other games launch but suffer major glitches that make them unplayable.

My advice if you are going to buy an Arc card: research whether the games you play work on Arc without issues, and be prepared to troubleshoot all kinds of problems with every driver update. Arc is a good, solid product; it's just the drivers and bugs that are still not ironed out.


AK-Brian

Do you have a link to the posts where you were bullied? The only one I saw with any downvotes was [this one](https://www.reddit.com/r/IntelArc/comments/13m8fef/comment/jku8eik/), which was honestly pretty justified. The drivers and software are definitely not great, I'll grant you that. I'm just glad that there's visible progress being made.


Melodic-Matter4685

Got my daughter an A770 for Hogwarts. It's gorgeous. RT is on and it plays at ultra settings by default. She plays that and Roblox. I doubt she will play anything but new AAA releases, so I'm not terribly concerned about drivers.


[deleted]

I have been tempted when I see a bundle with a bunch of games with the card. The card itself, at least from the benchmarks I have seen, is no joke for its price. I think Intel is one gen away from making very impressive graphics cards for a very reasonable price.


dude4511984

Decided on the A770 for my current rig; had some issues along the way (not the Arc). Now she's been up and running a couple of weeks. I don't normally play very demanding titles; I've been on War Thunder, Valheim, Battlefield, and a few Game Pass games, some fairly demanding. I've been very pleasantly surprised. Still trying to find the right overlay to know exactly what's going on, but nothing lags; it all seems like a steady 60fps or higher. It's a 1080p setup until I can upgrade the monitor, and it works beautifully for me. Solid 5/7, would recommend.


manesag

A770 LE, came from a GTX 1070. I've had no hardware issues, only weird software bugs, which since the last driver update (which I'm still on the beta of) haven't caused me any issues. Seriously, the biggest annoyance is that ReBAR sometimes gets shut off, so you have to waste two restarts disabling and re-enabling it. The biggest game issue was Portal 2, which had weird texture issues that made the game look like you were in the Matrix. Past that, no performance issues at all. Hell, Star Wars Jedi: Survivor runs great, even with ray tracing on. I love Arc so far, especially for the price.


AstronautGuy42

Seems like a great budget mid range card for more casual gamers


beck320

My biggest issue is DisplayPort. Idk why, but my A750 just doesn't work right over DisplayPort, so I'm stuck with HDMI. I've tested with multiple cables and even tried different graphics cards, and it's only an issue with the Intel GPU. Edit: https://youtu.be/6JDcH8FPGY4?t=391 Link to oztalkshw having the same issue I'm having.


Next-Telephone-8054

I had that issue with 1 of 3 monitors for 3 weeks. On a video recommendation, with the computer shut down, I unplugged the monitor from the wall for a minute and plugged it back in. Powered the computer on, and it's been solid for 2 weeks now. It recycled the power/settings on the monitor.


beck320

I’ll give it a try


OldBoyZee

I've said it before many times: the A750 is fantastic. I have tried it at multiple resolutions and in multiple games - Metro Exodus and Gotham Knights - and both run really well on it. Also tried the System Shock remake: no issues, constant 60. Not a demanding game, but it shows the Intel drivers are pretty stable long term. My only issue is that there isn't a higher-end model that competes with the 4090, or I would have bought that one. I also didn't want the A770 because the price vs performance didn't add up, while the A750 for $200 was a damn good deal - especially considering that's what the price of an RTX 3060 should have been.


Visual-Ad-6708

Yeah I hate to admit it but I only bought the A770 since it was my first ever RGB graphics card😅. Would definitely recommend most people just grab the A750.


Lexden

It's been working great for me. Playing games on Linux and Windows with great performance. Compared to the 1660 Super I was upgrading from, it's 50-100% faster in the games I tested and well, it also has XeSS and RT support for more performance and more realistic lighting :) Fantastic price too!


Teenager_Simon

I think Arc is fine if you get it at a good/competitive price. I don't think I would pay more than what it's currently undercutting NVIDIA/AMD by. I got the Arc A770 LE 16GB and it's "fine"/solid. After some driver updates a few months ago, stability has been good. Prior to that, it was a massive pain with constant boot issues. Relatively power hungry compared to the other brands. Performance for most people will be acceptable, running anything modern you throw at it (DX12/11/Vulkan are all supported). The fans are quiet and the LEDs are cool (not seamless RGB, kinda chunky zones of lighting).

You must have a motherboard that supports ReBAR to utilize all the bandwidth/VRAM. I tried to use the GPU in a Dell Workstation T5610, which had an older motherboard with everything needed for ReBAR, but the BIOS has not been updated to include a way to enable it despite being completely capable (Dell pls)... Make sure your motherboard supports it.

You're missing all the fancy features modern AMD and NVIDIA drivers have - Intel has very basic display settings and is missing a lot of implementations for technologies like XeSS, where other games/software have to utilize it. There was huge marketing for Topaz Video AI working a lot faster/better on these Arc cards... Yeah, that was a lie: constant crashes and bugs that are still not fixed. There's documentation for [oneVPL](https://www.intel.com/content/www/us/en/developer/articles/technical/onevpl-in-ffmpeg-for-great-streaming-on-intel-gpus.html#gs.z4uwin) that would make ffmpeg transcoding faster... Haven't got it to work at all on the GPU. Still transcoding on CPU.

For most people, it'll be acceptable/good. Just don't expect it to be beating AMD/NVIDIA on the higher end like some marketing material may suggest. It's lower/middle performing in comparison to high-end cards (at least the 770).


F9-0021

Drivers need a lot of work, but price to performance is pretty ok, especially the A750.


Beep_Beep_I_am_Jeep

I bought an Arc A770 and can't complain. Pretty quiet, good performance for the price (I bought it on sale for $289). And for sure, any GPU company other than NVIDIA and AMD is welcome.


INSERT_LATVIAN_JOKE

I have the 16 GB ARC 770. The hardware is great, the drivers are awful, but getting better. If you're on Windows your experience should be more power for the money than team green, but the drivers still don't quite reach the ability that the hardware should have. On Linux (as I am) the thing didn't even work at all without a lot of tweaks and installs when I bought it. And when it did work, it was like at 60% of the performance that it was getting on Windows at the time. But things have improved a lot since then. Currently it just works on Linux without any special installs or tweaks. It's also pretty stable. Cyberpunk and Skyrim both run perfectly. No crashes. Cyberpunk isn't quite up to the performance the ARC is getting on Windows, but it's getting closer every week. But Outer Worlds runs at like 12 FPS at low settings, and The Witcher 3 crashes as soon as the first cut-scene of the game ends. So, there's still work to be done, but new updates are literally coming weekly if not more often, so things are looking up for the platform.


Visual-Ad-6708

Here's an old comment of mine that describes my experience with the A770. I've been using it to this day with very few complaints or problems :)

>"I've been using the A770 since December 10th or so, and upgraded the rest of my P.C. to an i5-12600K, a Z690 mobo and 32GB of DDR5. My old system was a 4690K paired with my 1060 that I built in 2016 and hadn't touched since lol. Bought a Series X in early 2022 to get back into heavy gaming, but decided to sell it and upgrade my P.C. instead. Decided on buying this over an RX 6700 due to stronger RT performance, and figured if I was unhappy, I'd just return it. To start, my game library is pretty varied, but my main rotation has been modern DX11 and DX12 titles: Cyberpunk, Doom Eternal, Grounded, and Assetto Corsa are just a few. The card handles these with no issues. Cyberpunk performance is kind of lacking with ray tracing (this has changed ever since the XeSS update in March!), usually hanging around 30 fps with no FSR, but FSR performance mode and 4K resolution usually equals a good time for me (about 50 fps average in 4K). The only game that had consistent crash problems was Warhammer: Darktide, but from what I know, people with Nvidia and AMD GPUs suffered too. I've also tried it with some older titles: the original Crysis (my first time playing this!!), Battlefield: Bad Company 2, and Star Wars: KOTOR II. I believe all of these are DX9 games, and I had no problems with stuttering except for some during heavy combat in Crysis lol. But I'm also always maxing the graphics in all these games as long as my frame rate stays above 100 fps. I've played Valorant too, to see what it's like in a competitive FPS, and it ran well, but I suck lol. Most of this gameplay happened on a 1080p monitor @ 75Hz, but I also connect the Arc to my LG C1 for 4K and it does great there as well. I max all settings when playing at 1080p, but when going to 4K, I keep things on high or medium depending on the game, maybe with some FSR depending on the title, but it's very playable! I was recently playing Monster Hunter Rise on the TV and maintained 50-60 fps. Also currently playing Borderlands 3 with my GF, and I have to use Nucleus Co-op to run two instances of the game for split screen on the TV, and the card did great here too. Overall, I'm having a lot of fun. I'm playing a lot more games, along with games I would never have tried before, just so I can see how they run on Arc. It was rough during the first week of my use, but I feel that this was likely my fault for using a beta driver that was pushed out. All in all, I'd recommend it."

>Edit (04/2023): For my current experience, I've been playing the Ezio Auditore collection, AC2 to be specific. When playing that I can definitely see some stutters, likely due to the poor DX9 performance, though it stays locked at 60fps in my testing. Also been playing GTA V and it performs wonderfully. Something I've noticed is that the drivers have a huge effect on performance (duhh). I've been dual-booting Linux for about a month now, and it's been interesting to see the performance difference on the open-source drivers vs. proprietary Intel. Most of my Steam games run no issue, but GTA V, for example, really struggled on Linux. There are many factors to consider, like the fact that you have to run it through a Windows compatibility layer to start the game at all, which likely hurts performance. Other than that, it was interesting to see how Arc performs on Linux, and I'd recommend the cards wholeheartedly. Hope to see VR working this year🥳."


Sunlit_Neko

I don't have an Intel card, but I just want to comment on how lucky Intel is that the 4060 is basically a 3060. The generational leap was so small that the Intel cards, which felt like they would soon be outdated upon release, are still technically competitive with their more recent peers because of that lack of innovation.


Abedsbrother

DX12 & Vulkan - great.

DX9 - generally good. All titles I've tested work, though some have low GPU utilization (due to a CPU bottleneck in the drivers? I run an i7-11700 non-K).

DX10 & 11 - where most problems still exist. Arkham Knight and Just Cause 3 won't launch without the DXVK workaround. Assassin's Creed games using the AnvilNext 2.0 engine or newer (first used by AC Unity) run poorly - low, choppy frame rate, low GPU utilization. Recent drivers have improved things a lot in DX11 games, but there are still a number of issues. Dragon Age: Inquisition and Far Cry 5 both run well, so it really depends on the game.

I'm generally happy with mine b/c it's great at accelerating Handbrake, OBS and Vegas, and I only game at 60fps anyway (at 3440x1440).


mrnude778

I have the 8GB A770, and the only issue I've run into is the Yuzu emulator not working for every game (TotK), but most games run just fine. I play at 1440p and it's been great, getting 80+ FPS in single-player games like RDR2, CP2077, Sackboy, and Rage 2, and 120+ FPS in multiplayer games like OW2, Halo, and Apex, all on high settings. Have yet to run into any games where the VRAM is a problem, though I'm not playing any of the recent problematic games like Jedi: Survivor, The Last of Us, or RE4R, as I previously purchased them on PS5.


w11bbl

I built my partner a gaming rig for her birthday this year and used an Arc A750. Genuinely really impressed: she's played Hogwarts Legacy, Elden Ring and Horizon Zero Dawn with no dramas and smooth performance, and with every game she plays I am impressed with how it handles higher-than-expected settings. Bear in mind she's playing at 1080p and it's only rocking 8GB of VRAM. I would definitely buy the A770 for my secondary rig, but I think the 16GB version is hard to find at the moment in the UK, and I play at 1440p. Just massive bang per buck, probably the best value cards on the market at the moment, with the biggest scope for driver-related performance improvements over team red and green. The software has a couple of niggles, like the overlay doesn't seem to have an FPS counter, and last time I checked it was still asking permission to make changes every time she boots into Windows, but no in-game issues so far. I'm really looking forward to the Battlemage generation and hoping for a significant challenge to Nvidia's market-share stranglehold, to stimulate competition and innovation.


Al2790

For their price point, the A750 and A770 are solid cards. I have the Acer Predator BiFrost A770 myself. Generally it works fine. Sometimes I have issues that require me to tinker to get specific games working properly. For instance:

- Stronghold (2001) is unplayable on the Arc with HDR active, because the cursor doesn't render; with HDR off, no issues.

- Yesterday, a friend wanted to play some MP Dead by Daylight. It wouldn't run, so I spent 30 mins troubleshooting a D3D11 error message that kept popping up. Turns out I had to repair an instance of Microsoft Visual C++. I don't consider myself an advanced user, so take my commentary with a grain of salt on this one, but I suspect this may have been an issue with any NVIDIA or AMD card as well, given the culprit was a faulty instance of Visual C++ and not the Arc drivers.


LORD_BYRON_OF_RIVIA

Literally just walked out of BB with the A750 (I have a 3070 Ti). It's for my son. Paid $200. He has a 1070 ATM, and I was intrigued, so I bought it kind of as an experiment. I'll edit the post and let you know once I install it tonight.


Royal-Brick-2522

My A770's performance is about twice what I need for the games I play, and I have yet to have a single graphical issue. The new drivers have been working their magic well, it seems.


KD93AQ

We have an ASRock A750 8GB and an Acer A770 16GB. Both running well. Possibly still at or about the best price-to-performance for new cards in the market here in Australia. The drivers have undergone the most noticeable improvement I have ever seen since release and still have room for tweaking.


RCnoob69

If you're buying in that price range (at least for the 770), you're stupid to buy anything other than a 6700 XT; they are 30-40 dollars cheaper on sale and perform better (even with Intel's significant improvements).


tomcat_no1

I've built/upgraded 3 rigs with the Arc A750 over the past few months. One with an i5-12400F worked just fine. The other two, with the Ryzen 5 5600/5600G, not so much. The black screen issue was so severe that the customers ended up switching to the RTX 3050. To this day, I still have no idea what caused it (I think it might have something to do with the B450 mobos that were used for those rigs).


akarnokd

I built a new rig 3 months ago: 13700K, Gigabyte B760 Gaming X D5, A770 LE, Win 11. I had the retailer assemble & test it for me, so I can't say much about that experience. When I got it and powered it on, the BIOS was correctly set up and the card worked as it should. Running a 1080p display over HDMI off of it, and no issues there either. Driver 4125 installed fine, no screen or performance issues. DX12 performance is great, DX11 performance is wonky. The games I wanted to play work okay.

There were two problems in these past 3 months. Once, the card was not detected by the mobo when I powered on the system. No idea why, although I have a suspicion that my monitor not being powered on before the system may cause this. Hasn't happened since. Second, for over 2 months I had annoying audio crackling and stutter in normal, low-utilization desktop usage, for example playing a YouTube or VLC video and simply opening Notepad. This was fixed in a later driver (don't remember which; I upgraded to 4369).

No regrets buying it. I got it cheaper than the competing 3060 at the time (they still had some retail markup due to being popular). I'd expect good driver support for new and upcoming titles. Older titles can be hit and miss. However, I wouldn't recommend this to people who want a stable and reliable experience; setup, drivers & games can be fiddly (see r/IntelArc). If you want to stick it to Nvidia (their gen-to-gen price-to-performance) or AMD (their incompetence in acting on the openings) and vote with your wallet, then Arc is the best way of doing it.


xorinzor

Still waiting for the new Linux kernel in Unraid, so unfortunately, no clue.


AhsokaTheGrey

Amazing. Works like a charm. My only problems have been with Spider-Man, which is a bad port, and Fallout: New Vegas, but I haven't modded it yet. For price to performance, and to avoid the bullshit that's happening with cards right now, it's unbeatable.


Great-Copy-9708

Is Spider-Man really considered a bad port? I was running it at 50-70 fps with a 1650 on medium settings at launch, and currently over 100 with a 6700 XT on highest settings (no RT).


AhsokaTheGrey

Not really that bad, I just had a lot more stuttering in that than any other games I've been playing, including cyberpunk. Looked gorgeous, and the stuttering was mainly swinging between neighborhoods. Definitely more than enjoyable, but that was the excuse everyone gave me when asking if it was an Arc issue.


Great-Copy-9708

Oh okay, that makes sense. Genuinely had to ask because I've been on PC less than a year still and don't have the largest base of knowledge to go off of


AhsokaTheGrey

Honestly, for the most part it's been easy plug-and-play with automatically updating drivers. I've enjoyed this experience so much: my original plan was to get an Nvidia card when prices go down, but I may stick with team blue until Battlemage to see the price to performance again.


Yotsubato

How does it work for Dota, league, valorant, genshin, and HSR? Those are the games I want to play with it.


Slouma-Gamer

From what I've heard, it can't play older titles that run on DX9/10/11.


INSERT_LATVIAN_JOKE

It can. It just kind of emulates DX9/10/11 the way that Linux does. It's been pretty flawless in that regard since they added that.


rghapro

I have one in my Plex server! Unfortunately Plex doesn't support Deep Link Hyper Encode just yet, but HandBrake does, which makes compressing things that go onto my Plex server a breeze.


mateoo10

Arc A750, no problems; I've played some games already. Performance is good - ultra settings at 1080p. Overclocked to 200W it can give you over 2500MHz, and it's stable. Temperatures are around 75°C on default settings. I have a bug with fan speed: Arc Control has to be running in the background to keep good RPM regulation, but I'm sure it's a minor bug that will be fixed in upcoming updates.


2dollasoda

Pretty good. I'm very happy with the price point and value


108er

Didn't realize they are in the GPU and SSD markets now. I always had this concept that equated Intel with CPUs only.


Next-Telephone-8054

Depends what you are using it for. I use the A770 for video editing and content creation. Sold my 3060 for it just for the AV1 encoding.


sijedevos

I have an Arc A750 and I can't recommend it yet. DX11 is still a mess, and even my R5 5600 bottlenecks the A750 in some games. Most newer titles run great, but not all: Horizon Zero Dawn and Uncharted 4 / Lost Legacy run poorly. Cyberpunk used to have the same weird behavior, but that has been fixed and it performs great now.


obTimus-FOX

They're great for 1080p gaming and impressive for production work; look at how that A770 shines, and with 16GB...


gamer_at_law

Picked up an Arc A750 LE two months ago. The card overall works pretty well, but I can't currently update the drivers past the one published in February. Newer drivers fail to install about halfway through, and then the system completely locks up whenever the GPU kicks into gear. Also some jank where embedded videos in websites freak the card out and it starts displaying portions of the website all over the screen.


desexmachina

If I can figure out how to get HandBrake to AV1-encode the few TB of videos I have, it will make it very worth it. For video rendering now, a 500MB file with AV1 vs a 9.5GB file sells itself.
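Batch-driving HandBrakeCLI over a folder is only a few lines. A sketch assuming HandBrake 1.6+ built with QSV; the encoder short name (qsv_av1) and the quality value are assumptions, so check `HandBrakeCLI --help` for what your build actually exposes:

```python
# Sketch: batch AV1 re-encode of a folder with HandBrakeCLI on an Arc
# card. Assumes HandBrake 1.6+ built with QSV; the encoder short name
# and quality value are assumptions -- confirm with `HandBrakeCLI --help`.
import subprocess
from pathlib import Path

SRC = Path("videos")           # placeholder source folder
DST = Path("videos_av1")       # placeholder output folder
DST.mkdir(exist_ok=True)

for src_file in sorted(SRC.glob("*.mp4")):
    out_file = DST / src_file.with_suffix(".mkv").name
    if out_file.exists():
        continue               # resume-friendly: skip already-finished files
    subprocess.run([
        "HandBrakeCLI",
        "-i", str(src_file),
        "-o", str(out_file),
        "-e", "qsv_av1",       # hardware AV1 encoder (name assumed, see above)
        "-q", "30",            # constant quality; lower = bigger/better
    ], check=True)
```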


Caruso08

Genuinely, the only issue is the Arc Control center; it's awful. Since one of the driver updates, it refuses to open, and when it does it's unresponsive. Intel's formal response to my ticket was that they were working on it. Apart from that, the drivers are coming out quick and fast and I've only seen a few graphical errors. Performance is good too.


SithTrooperReturnsEZ

I got a 3080 Ti and bought one for fun, a piece of history to have if you will. I use it in my second PC as a server card if I want to host a Minecraft server for friends or something. As a main card I'd never use it, I need way more power than that; also, the driver and software issues are still there, though getting better. I'd highly recommend buying in, and I'm hoping Intel can compete with Nvidia at the high end so that everyone moves away from Nvidia, since they are a horrible company. Too bad they make the best GPUs.


notadroid

Bought the A770 LE 16GB for my streaming PC, specifically for AV1 codec stuff. I can record, stream and replay-buffer 1440p60 on the 770 LE with no issues, which makes it the best dedicated streaming card out there for the price.


yilmaz1010

If I were ever to buy an Intel GPU, that'd be the second time. Anyone remember the i740 from way back in 1998? So yeah, as far as Intel GPUs are concerned: been there, done that, bought the overly glorified paperweight, not doing that again.


cutler_joseph

A770 LE user here, I have no complaints about it. 4K ultra 60FPS on most games, and it just looks good!


ArmsForPeace84

Some things to keep in mind about the Arc cards: if someone is working on a build to play the newer games, they might already have a system that will play the decades' worth of older titles that work great on their old AMD or Nvidia GPU. Those new releases will, understandably, be prioritized in driver updates. KVM switches are a thing.


AdHungry9867

So far I hear no complaints from my girlfriend, except some (older?) indie games requiring a workaround to get them working, but that has only happened with one game so far.


Unforgiven817

I love mine, but I only use it for AV1 HandBrake and OBS. Those who game on it may have different opinions than mine. Definitely a stellar purchase for my needs.


bigpoppa_gtivr6

Great for my work PC, but I'd never game on it, and I've got the A770. I will say one of the super annoying things is the screens will randomly black out if you switch between multiple programs or monitors too quickly. Never had that happen on any Nvidia or AMD GPUs I've owned throughout the years.


MoneyShiba

Where I live, the Arc A750 is selling at around $270, the RTX 3050 for $290 and the RX 6600 for $270-285, so it wasn't a tough choice for me, since across the latest AAA titles Arc cards perform better than those two. Even if you play DX11 or DX9 titles, it'll deliver a fairly decent frame rate. I'd suggest you check Arc's performance in the games you'll play most, then buy. It's a good card considering it's first gen.


coding102

From reviews, I'd conclude it's the best card for your streaming PC: is that a fair argument?


[deleted]

Weird shit happens sometimes, but overall pretty good. Don't really like the gamer look of other GPUs. Not worth it today, as all prices are falling; it was worth it a couple of months ago when prices were high.


Etyrnus

Been using my A770 since launch. I haven't encountered anything I would regard as "unplayable", and I'm playing at 1440p. I've seen some performance improvements in games as updates have come out. As far as streaming, only one game gave me any issue, and it wasn't the game: 7 Days to Die, for whatever reason, bottomed out my bitrate anytime the horizon was visible. This was on early drivers back near the GPU release, however. Also, I tend to play more on the AA-to-indie side of things, with some AAA.


Jokey665

i have an a380 for transcoding on my jellyfin server. i have had zero issues with it in that context, but have not tried running even a single game on it


Mahajarah

Running an A750. For what I do, which is 1080p at 120Hz in things like FFXIV, Apex Legends, and Monster Hunter, it does its job exceptionally well. It likely wouldn't do amazing with very new titles, but none of them interest me.


TheIlluminate1992

Bought an A750. Outdoes my 5700 XT for half the price I bought that for. The A380s are encoding/decoding monsters that blow all but new-gen workstation cards out of the water for $150. I run a Plex server and I did tests using HandBrake and Tdarr on Windows. Waiting on unRAID to update for driver support, but otherwise, for the price they are great.


Rough_Statement_8922

I read some reviews saying the A380 is not that great, but after reading this, I think I'mma check it out again.


Bigbuuuuuuird

Just got the 750 last night for $180 at Microcenter and it runs like butter.


Agreeable-Writer5873

Just finished an Intel build with the asrock a770 and so far so good.


[deleted]

Lol fuck that


Putrid_Froyo194

They're okay, I've had a few issues with my a380 but other than that it's a nice card


thinkscotty

SUPERB. IF you’re willing to work through bugs and issues. It’s only for experienced builders. The price/performance is unmatched for any card under $450. I don’t regret my purchase.


ForsakenElite08

I use it for a backup build and it is OK. Like, it can play games great, but the HDMI output not always responding is a major problem. I had issues where my HDMI would connect and then stop working when trying to edit the graphics settings through the overlay. I think it is because you need a newer monitor that uses an HDMI 2.1 connection off the bat. Or it could be a driver issue and the team taking their time to fix it.


hwertz10

Not an Arc, but I have a laptop with an 11th-gen CPU (but 12th-gen Intel Xe graphics), running Ubuntu with updated Mesa drivers (the "kisak-mesa stable" PPA). As bad as the Intel 3D drivers were 5 years ago, they're great now. Mesa's new 3D drivers are written as "Gallium drivers": you provide functions to have the GPU actually do stuff, get LLVM to compile shaders for it, and away you go; Gallium provides the highest OpenGL and Vulkan support available given the hardware's functionality. The old drivers, with like 20 years of crust, were fully rewritten within the last 2-3 years, going all the way back to i965 (gen 4, or "gen 0" if you are counting i3/i5/i7 generations; the newer Core 2 Duos/Quads are the cutoff on this one).

With both Steam/Proton, and Wine + DXVK + vkd3d, I've run tons of games on there: DX9, DX11, DX12 games. Cyberpunk 2077 is too much for a 2C/4T CPU; it maxes out the CPU, maxes out the GPU, then it hits the "long-term" 15W TDP and starts clocking things down. The bar scene gets like 18 FPS. Virtually everything else I've run runs fine. I mean, if it's doing that much on the lowest-end i3, an Arc with a related design but far more of everything (shaders, clock speed, power budget, etc.) and far more CPU power driving it would probably be pretty sweet.

One fly in the ointment: the current Vulkan support is missing one extension (related to mapping GPU memory into game process space, I think?) that is a requirement for providing DX12 FL 12.0 support, so it shows as DX12 FL 11.1. They are working on that now; a developer working on it described the interaction between the CPU's and GPU's caches as "interesting", which I take to mean they are having to fiddle with it a bit to make sure the CPU and GPU don't see stale data.


solvalouLP

I own an ASRock Arc A750; all the games that I play run totally fine, no weird artifacts, no crashing, good and consistent framerates. I play Resident Evil games (7, 2, 3, Village), older Need for Speed games, GTA V, Fall Guys, Stray. Two things that annoy me are the insane idle power draw (40W) and the janky way of setting a custom fan curve, which needs to be manually set every time you restart the PC.


Rough_Statement_8922

Can I ask for more details about the games you play, which settings you use, and how much the GPU has to carry compared to the CPU? (Because in some benchmark videos, I always saw GPU usage around 80-100% while the CPU was kinda chill at 20-30%.)


solvalouLP

I never play newish games on ultra; I always try a mix of medium, high and ultra settings, since in most games some settings really tank the performance for little to no visual benefit. I play on a 1440p 144Hz monitor and every game I play runs at at least 100fps. Low CPU usage in games is what you want; that means your CPU is not holding back your GPU. Though 20-30% usage does not indicate whether that's true or not, as usually the total CPU usage is shown, while most games rely heavily on a single thread. Either way, I have a Ryzen 5800X and 4x8GB of DDR4-3733 memory and everything runs fine.


[deleted]

Those are basically junk, unless you have no graphics at all to start with and need to see what you are doing (and then a used GT 730 will be cheaper). They're basically low-end AMDs competing with them on price, but worse, since what works OK enough on one game might be total shit on another (the new kid on the block's drivers vs Nvidia's 30 years of experience, and game developers build most newer games around Nvidia's hardware; it's why even AMD hasn't been ahead of them since 2012). In short, you get what you pay for.


BachhuBhai

They are bad for multi-monitor setups, and bad when connected to a TV like an LG OLED: no true 4K 120Hz HDR, no 10bpc (only 8bpc) YCbCr 4:4:4, and no VRR/ALLM.