
[BUG] #730

Open
3 tasks done
lugangqi opened this issue Feb 2, 2025 · 1 comment
Labels
bug Something isn't working

Comments

lugangqi commented Feb 2, 2025

OS

Windows

GPU Library

CUDA 12.x

Python version

3.12

PyTorch version

2.6

Model

No response

Describe the bug

Does exllamav2 support the P104-100 8 GB graphics card? (see #729)

Reproduction steps

Running a 4060 Ti 16 GB together with a P104-100 8 GB.
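For context, a minimal sketch of how a manual two-GPU split would typically be attempted with exllamav2's Python API. The model path is a placeholder, and the `gpu_split` argument to `model.load` and the per-GPU gigabyte figures are assumptions based on the project's example scripts, not a confirmed working configuration for this card:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer

# Placeholder path to an EXL2-quantized model directory
model_dir = "/models/some-exl2-model"

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)

# Manual split (values in GB): reserve roughly 15 GB on the 4060 Ti (device 0)
# and 7 GB on the P104-100 (device 1), leaving headroom for cache/activations.
model.load(gpu_split=[15, 7])

cache = ExLlamaV2Cache(model)
tokenizer = ExLlamaV2Tokenizer(config)
```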

Expected behavior

Confirmation of whether exllamav2 supports the P104-100 8 GB graphics card. (see #729)

Logs

No response

Additional context

No response

Acknowledgements

  • I have looked for similar issues before submitting this one.
  • I understand that the developers have lives and my issue will be answered when possible.
  • I understand the developers of this program are human, and I will ask my questions politely.
lugangqi added the bug (Something isn't working) label Feb 2, 2025
mindkrypted commented

This is not a bug; please close it.
There are "help wanted" and "question" tags available.
