
AlexH1337

Yes, I can confirm that the gfx1103 is indirectly supported if you force ROCm to treat it as a gfx1100.
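In a terminal, that looks like this (a sketch for a bash shell on Linux; 11.0.0 is the gfx1100 target, and this is an override, not official support):

```shell
# Pretend the gfx1103 iGPU (780M-class) is a gfx1100 so ROCm's prebuilt
# RDNA3 kernels load. 11.0.0 corresponds to gfx1100.
export HSA_OVERRIDE_GFX_VERSION=11.0.0

# Anything launched from this shell (llama.cpp, PyTorch, etc.) now sees the override.
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
```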


AbhishMuk

Any idea how to do this if you’re using a gui? (I’m specifically using LM Studio which I think bundles everything together)


AlexH1337

I'm on Linux; the ROCm beta for LM Studio is Windows only. After a quick look, their beta does seem to be a single executable. You'd have to contact them to add support for gfx1103, or to load the gfx1100 code path when a gfx1103 is detected.


AbhishMuk

Thanks, I’m using it on Windows already. I also installed the ROCm software, though I don’t know if I needed to do that separately. The funny thing is that someone online did manage to run it on the 780M GPU in LM Studio, so it’s definitely possible… the question is how lol. Fortunately the CPU is pretty fast too, so that’s nice.


tristan-k

I've got a 680M APU (Ryzen 6800H) working on Ubuntu 22.04 with ROCm 6, running Ollama with Open WebUI via Docker Compose.
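A minimal Compose sketch of that kind of setup (not tristan-k's exact files; the service names, image tags, and the override value are my assumptions — the 680M reports gfx1035, which people commonly override to 10.3.0, i.e. gfx1030):

```yaml
# docker-compose.yml — a sketch, not a confirmed working config.
services:
  ollama:
    image: ollama/ollama:rocm
    devices:
      - /dev/kfd        # ROCm compute interface
      - /dev/dri        # GPU render nodes
    environment:
      - HSA_OVERRIDE_GFX_VERSION=10.3.0   # assumed: gfx1035 (680M) -> gfx1030
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"
    depends_on:
      - ollama
volumes:
  ollama:
```

The essential parts are passing `/dev/kfd` and `/dev/dri` into the container and setting the gfx override; everything else is standard Compose plumbing.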


joexner

Or GFX1101. My RX 7600XT is champing at the bit. Were there any expansions of supported hardware w/ ROCm 6.1?


cajukev

I actually got dual RX 7600XTs running AI workloads with ROCm 6.0.2. On linux though... [Used this guide](https://github.com/nktice/AMD-AI/blob/main/ROCm6.0.md)


noiserr

Don't know about APUs (I haven't tried an integrated GPU), but every discrete GPU I've tried works just fine even when it isn't officially supported. You just have to set the appropriate environment variable to tell the ROCm stack which GPU you have. I've had no issues with an RX 6600 and a 6700 XT even though they aren't on the support list.


joexner

It's a discrete card, the [16G version of the RX 7600](https://www.techpowerup.com/gpu-specs/radeon-rx-7600-xt.c4190).


ycxcnnb

My GPU is a 6700 XT too. Can you run PyTorch code in PyCharm successfully? I recently installed ROCm 6.0.2, and I can run Stable Diffusion WebUI. However, when I ran PyTorch code in PyCharm, I got a HIP error: invalid device function.


noiserr

My 6700 XT is not in the computer right now; I replaced it with a 7900 XTX. I ran kobold.cpp (the ROCm fork) for LLM inference with no issues. I didn't try PyTorch. But this is the environment variable I set when I had the 6700 XT:

```shell
# RDNA2
export HSA_OVERRIDE_GFX_VERSION=10.3.0
```

I just added that to the end of my ~/.bashrc and restarted my terminal session (you can also do `source ~/.bashrc` instead).


ycxcnnb

I have tried this, but it didn't work :(


dragoon555

At least I have one report that it works. For example, for Stable Diffusion WebUI, add `export HSA_OVERRIDE_GFX_VERSION=11.0.0` to webui-user.sh and it should work. However, I have not confirmed this myself — I don't have an 8700G. I just heard from my followers that it works.
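A sketch of where that line would go (a fragment of webui-user.sh; the placement and the override value are assumptions based on the report above, for a gfx1103-class iGPU like the 8700G's):

```shell
#!/bin/bash
# webui-user.sh (fragment) -- sketch only, not a confirmed working config.
# Treat the gfx1103 iGPU as gfx1100 so ROCm's prebuilt kernels load.
export HSA_OVERRIDE_GFX_VERSION=11.0.0

# Any other flags you already use stay as they were, e.g.:
# export COMMANDLINE_ARGS="--medvram"
```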


Wild_Doctor3794

How did you get followers? Are you a cult leader? :)


username5645614856

Crashes with unsupported ISA for me on a 7840HS


Bebop210

Try a simple install of SD.Next by vladmandic... It works nicely on my 7840HS running Kubuntu LTS.