RamaLama can't discover RX 5700 XT #2197
Comments
What image are you on? The version we're shipping is on the -dx image, and mine pulled the rocm container for my 5700XT and that worked out of the box.
Bluefin-dx stable image.
Can you try with the one on the image?
@castrojo I tried:
My system info:
Also, I didn't realize that ramalama was already installed. Where can I see a list of the installed non-standard apps?
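On an image-based (bootc/ostree) system like Bluefin, a rough way to answer this yourself is to check whether the binary comes from the base image, a layered package, or a user-level pip install. A sketch (the exact output depends on your image; none of these commands are specific to this thread):

```shell
# Where does the ramalama binary live? (/usr/bin => shipped in the image,
# ~/.local/bin => likely a pip --user install)
which ramalama

# Packages layered on top of the base image, if any
rpm-ostree status

# User-level pip installs that could shadow the image's copy
pip list --user
```

If `which ramalama` points at `~/.local/bin`, the pip-installed copy is shadowing the one shipped on the image.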
Hi, can you run it with the --debug flag? Additionally, try removing the podman image it pulled and try again; maybe there was an issue when the container was pulled. Also, I recommend making an issue in the upstream repo, they might know how to debug it better.
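The two suggestions above can be sketched as follows (the model name is illustrative, and the exact image tag on your system may differ — check `podman images` first):

```shell
# 1. Run with debug logging to see how RamaLama probes for the GPU
ramalama --debug run llama3

# 2. Remove the pulled rocm container image so it is fetched fresh on the next run
podman images                          # find the rocm image's name/tag
podman rmi quay.io/ramalama/rocm       # assumed image name; use what podman lists
```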
Sure:
I'm using 0.5.2 from Bluefin. Their most probable advice will be to update to at least 0.5.5.
Yeah we're working on getting updated versions into Bluefin asap. |
Describe the bug
I installed RamaLama through pip (as the official documentation describes) and expected it to recognize my GPU (RX 5700 XT).
Yes, it is not officially supported by ROCm, but according to user @RealVishy it works on Aurora-dx.
Crossref: ollama/ollama#2503
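For context, the crossref above is about the same class of card: the RX 5700 XT is gfx1010, which ROCm does not officially support, and the workaround commonly reported there is to spoof a supported architecture via an environment variable. A hedged sketch (not from this thread; whether RamaLama forwards the variable into its container may depend on the version):

```shell
# Commonly reported workaround for gfx1010 (RDNA1) cards: pretend to be
# gfx1030, for which ROCm ships kernels. Unsupported and not guaranteed.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
ramalama run llama3   # model name is illustrative
```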
What did you expect to happen?
I expect RamaLama to recognize my GPU (RX 5700 XT).
Output of `bootc status`
Output of `groups`
Extra information or context
No response