From e5377a419f8955147baa75d696fba601d9b13eb6 Mon Sep 17 00:00:00 2001
From: LunaMidori5
Date: Mon, 5 Feb 2024 07:54:44 -0800
Subject: [PATCH] Added system requirements guesses

---
 content/models/onsite_models/_index.md | 32 +++++++++++++++++---------
 1 file changed, 21 insertions(+), 11 deletions(-)

diff --git a/content/models/onsite_models/_index.md b/content/models/onsite_models/_index.md
index f33af4f..79b1f84 100644
--- a/content/models/onsite_models/_index.md
+++ b/content/models/onsite_models/_index.md
@@ -7,12 +7,12 @@ weight = 154
 
 All models are highly recommened for newer users as they are super easy to use and use the CHAT templ files from [Twinz](https://github.com/TwinFinz)
 
-| Model Size | Description |
-|---|---|
-| 7b | CPU Friendly, small, okay quality |
-| 2x7b | Normal sized, good quality |
-| 8x7b | Big, great quality |
-| 70b | Large, hard to run, significant quality |
+| Model Size | Description | Links |
+|---|---|---|
+| 7b | CPU Friendly, small, okay quality | https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-GGUF |
+| 2x7b | Normal sized, good quality | https://huggingface.co/TheBloke/laser-dolphin-mixtral-2x7b-dpo-GGUF |
+| 8x7b | Big, great quality | https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GGUF |
+| 70b | Large, hard to run, best quality | https://huggingface.co/TheBloke/dolphin-2.2-70B-GGUF |
 
 | Quant Mode | Description |
 |---|---|
@@ -23,9 +23,19 @@
 | Q8 | Extremely large, extremely low quality loss, hard to use - not recommended |
 | None | Extremely large, No quality loss, super hard to use - really not recommended |
 
-The models used by this program as of right now are:
+The rough minimum RAM and VRAM requirements for each model size are:
 
+- 7b:
+  - System RAM: 10 GB
+  - VRAM: 2 GB
+
+- 2x7b:
+  - System RAM: 25 GB
+  - VRAM: 8 GB
+
+- 8x7b:
+  - System RAM: 55 GB
+  - VRAM: 28 GB
-- 7b: [TheBloke/dolphin-2.6-mistral-7B-GGUF](https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-GGUF)
-- 2x7b: [TheBloke/laser-dolphin-mixtral-2x7b-dpo-GGUF](https://huggingface.co/TheBloke/laser-dolphin-mixtral-2x7b-dpo-GGUF)
-- 8x7b: [TheBloke/dolphin-2.7-mixtral-8x7b-GGUF](https://huggingface.co/TheBloke/dolphin-2.7-mixtral-8x7b-GGUF)
-- 70b: [TheBloke/dolphin-2.2-70B-GGUF](https://huggingface.co/TheBloke/dolphin-2.2-70B-GGUF)
\ No newline at end of file
+- 70b:
+  - System RAM: 105 GB
+  - VRAM: dedicated AI card or better
\ No newline at end of file
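
The system RAM minimums this patch adds lend themselves to a quick self-check. A minimal sketch of that idea (Python is an assumption here, not part of the project; the GB values are copied from the patch, and `total_ram_gb` relies on POSIX `sysconf`, so it only works on Linux/macOS):

```python
import os

# Rough minimum system RAM per model size, in GB, copied from the list
# added by the patch above. VRAM is not checked here.
MIN_SYSTEM_RAM_GB = {"7b": 10, "2x7b": 25, "8x7b": 55, "70b": 105}


def total_ram_gb() -> float:
    """Total physical RAM in GB via POSIX sysconf (Linux/macOS only)."""
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024**3


def models_that_fit(ram_gb: float) -> list[str]:
    """Model sizes whose estimated system RAM minimum fits in ram_gb."""
    return [size for size, need in MIN_SYSTEM_RAM_GB.items() if ram_gb >= need]


if __name__ == "__main__":
    ram = total_ram_gb()
    print(f"{ram:.1f} GB RAM -> candidate sizes: {models_that_fit(ram)}")
```

For example, a 32 GB machine would pass the check for the 7b and 2x7b sizes but not 8x7b or 70b.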