And it will keep popping up. Enjoy the rest of your life.
No, GFX1030 is still supported.
This link is misleading. For example, the Radeon RX 6800 IS supported because it is the same chip (gfx1030) as one of the Radeon Pro cards. Many others are too…though support does not go very far back.
Look into llamafile. It makes things so easy.
Also you’re welcome!
No special compiling. You just need to download the ROCm drivers from AMD and install the ROCm build of PyTorch.
The install wasn’t too hard. I mean, it wasn’t like just running a batch file on Windows, but if you have even a tiny bit of experience with the Linux shell and installing Python apps, you will be good. You mostly just need to make sure you’re using the correct (ROCm) build of PyTorch. Happy to help any time (best on evenings and weekends, EST). Please DM.
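If you want to double-check that part after installing, a quick sanity check with the ROCm build of PyTorch looks something like this (just a sketch):

```python
# Quick sanity check that the ROCm build of PyTorch actually sees the card.
import torch

print(torch.__version__)              # ROCm wheels usually show a "+rocmX.Y" suffix
print(torch.version.hip)              # set on ROCm builds, None on CUDA/CPU builds
print(torch.cuda.is_available())      # True once the HIP device is detected
print(torch.cuda.get_device_name(0))  # should print the Radeon card
```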
I ran these last night, but I didn’t have the correct VAE, so I’m not sure if that affects anything. 512x512 was about 7.5 it/s; 1024x1024 was about 1.3 s/it (IIRC). I used somebody else’s prompt, which used LoRAs and embeddings, so I’m not sure how that affects things either. I’m not a professional benchmarker, so consider these numbers anecdotal at best. Hope that helps.
Edit: formatting
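If you want a rough way to get your own number, here’s a minimal sketch using the diffusers library (not the exact setup I used, and the model ID below is just an example checkpoint):

```python
# Very rough it/s estimate from wall-clock time; includes VAE decode and other
# overhead, so treat it as a ballpark, not a proper benchmark.
import time
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # the ROCm build still exposes the device as "cuda"

steps = 20
start = time.time()
pipe("a photo of an astronaut riding a horse",
     num_inference_steps=steps, height=512, width=512)
elapsed = time.time() - start
print(f"~{steps / elapsed:.1f} it/s at 512x512 (rough)")
```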
Does the resolution or steps or anything else matter?
I use a ton of different ones. I can test specific models if you like.
Sorry, not trying to come at you, but I’m just trying to provide a bit of fact checking. In this link, they tested on Windows, which means they would have been using DirectML, which is super slow. Did Linus Tech Tips do this? Anyway, the cool kids use ROCm on Linux. Much, much faster.
You can run CUDA apps on ROCm via HIP. It’s easy.
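For example, with the ROCm build of PyTorch the torch.cuda API is backed by HIP, so code written for an NVIDIA card runs unchanged (a minimal sketch; for native CUDA C++ there are AMD’s hipify tools):

```python
# Code written for an NVIDIA card runs as-is on the ROCm build of PyTorch,
# because torch.cuda is mapped onto HIP there.
import torch

device = torch.device("cuda")   # HIP device on a ROCm build
x = torch.randn(4096, 4096, device=device)
y = x @ x                       # matmul runs on the Radeon GPU
print(y.device, torch.version.hip)
```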
I’ve been using a 6800 XT on Linux for several months and I’m super happy with it. There hasn’t been anything I haven’t been able to do. AMA.
You’ve just given them more views
Yes, it seems so according to this person’s testing: https://youtu.be/HPO7fu7Vyw4
The memory bandwidth stinks compared to a discrete GPU. That’s the reason. It’s still possible, though.
“Do or do not, bruh”
-Grogu
Unfortunately can’t get away from it at work