
Support for RTX 2060 Super 8GB VRAM? #159

Open
krigeta opened this issue Mar 8, 2025 · 9 comments

Comments

@krigeta

krigeta commented Mar 8, 2025

Does this support second-generation NVIDIA RTX cards, like the RTX 2060 Super, which has 8 GB of VRAM? Will it work with less than 8 GB of VRAM?

@nitinmukesh

We currently support only NVIDIA GPUs with architectures sm_86 (Ampere: RTX 3090, A6000), sm_89 (Ada: RTX 4090), and sm_80 (A100). See #1 for more details.

@krigeta
Author

krigeta commented Mar 8, 2025

> We currently support only NVIDIA GPUs with architectures sm_86 (Ampere: RTX 3090, A6000), sm_89 (Ada: RTX 4090), and sm_80 (A100). See #1 for more details.

It seems I have missed a great opportunity to use Flux yet again.

@nitinmukesh

Does your GPU support CUDA 12.6 drivers and torch 2.5.1+cu124? If it does, let me know.
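A quick way to answer this locally is to compare the card's compute capability against the architectures the maintainers listed (sm_80, sm_86, sm_89). The sketch below is an assumption about how one might check; the `is_supported` helper and `SUPPORTED_ARCHS` set are illustrative names, not part of nunchaku. Turing cards such as the RTX 2060 Super report sm_75.

```python
# Supported architectures per the maintainers' comment above (hypothetical helper).
SUPPORTED_ARCHS = {(8, 0), (8, 6), (8, 9)}  # A100, Ampere, Ada

def is_supported(capability):
    """Return True if a (major, minor) compute capability is on the supported list."""
    return tuple(capability) in SUPPORTED_ARCHS

# With PyTorch installed and a CUDA GPU visible, the card can be queried:
#   import torch
#   print(torch.__version__, torch.version.cuda)  # e.g. "2.5.1+cu124", "12.4"
#   cap = torch.cuda.get_device_capability(0)     # RTX 2060 Super -> (7, 5)
#   print(is_supported(cap))

print(is_supported((7, 5)))  # RTX 2060 Super (Turing) -> False
print(is_supported((8, 9)))  # RTX 4090 (Ada) -> True
```

Note that driver/torch compatibility (the question asked here) is separate from kernel architecture support: a Turing card can run torch 2.5.1+cu124 and still lack the sm_80+ kernels this project builds.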

@lmxyy
Collaborator

lmxyy commented Mar 8, 2025

I am not sure if 20-series GPUs work. According to the NVIDIA Turing architecture whitepaper (https://images.nvidia.com/aem-dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf), they do have INT4 tensor cores. I will do a quick check.

@nitinmukesh

nitinmukesh commented Mar 8, 2025

You can also check using this repo: https://github.com/newgenai79/sd-diffuser-webui. If it works for you (i.e., you are able to generate a few images), I will integrate the INT4 and GGUF versions of Flux.

@krigeta
Author

krigeta commented Mar 9, 2025

> Does your GPU support CUDA 12.6 drivers and torch 2.5.1+cu124? If it does, let me know.

Yes, it does; I am using ComfyUI and it shows the same pair at startup.

@Dahvikiin

> Does your GPU support CUDA 12.6 drivers and torch 2.5.1+cu124? If it does, let me know.

Yes, RTX 2000-series cards (Turing, sm_75) are supported, even in recent releases such as NVIDIA CUDA 12.8 Update 1.

@nitinmukesh

> Does your GPU support CUDA 12.6 drivers and torch 2.5.1+cu124? If it does, let me know.

> Yes, it does; I am using ComfyUI and it shows the same pair at startup.

If you are using ComfyUI, you are set; there are many low-VRAM workflows.

@krigeta
Author

krigeta commented Mar 9, 2025

> Does your GPU support CUDA 12.6 drivers and torch 2.5.1+cu124? If it does, let me know.

> Yes, it does; I am using ComfyUI and it shows the same pair at startup.

> If you are using ComfyUI, you are set; there are many low-VRAM workflows.

But my question was: how can I use nunchaku with an RTX 2060?
