@@ -11,23 +11,23 @@ MODEL=large-v2 LANGUAGE=ru docker compose up
```
## Step by step
- #### 1. Build CUDA image (single run)
+ ### 1. Build CUDA image (single run)
```
docker compose build --progress=plain
```
- #### 2. Download models (single run)
+ ### 2. Download models (single run)
You may want to run it manually in order to see the download progress
```
./models/download.sh large-v2
```
- This script is a plain copy of [download-ggml-model.sh](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh)
+ This script is a plain copy of [download-ggml-model.sh](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh).
You may find additional information and configurations [here](https://github.com/ggerganov/whisper.cpp/tree/master/models)
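If you script around this step, you can first check whether the model file is already present. The sketch below assumes the script saves models as `./models/ggml-<name>.bin` (the upstream whisper.cpp naming convention) and only echoes the download command rather than invoking it:

```shell
# Sketch: check for a previously downloaded model before fetching it again.
# Assumes models land in ./models/ggml-<name>.bin (upstream whisper.cpp naming).
MODEL="large-v2"
FILE="./models/ggml-${MODEL}.bin"

if [ -f "$FILE" ]; then
  STATUS="cached"
  echo "model already present: $FILE"
else
  STATUS="missing"
  echo "model not found; fetch it with: ./models/download.sh $MODEL"
fi
```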
- #### 3. Prepare your files
+ ### 3. Prepare your files
Place all the files in the ```./volume/input/``` directory
- #### 4. Run the docker compose
+ ### 4. Run the docker compose
```
docker compose up
```
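The model and language can be overridden per run through environment variables, as in the one-liner at the top of this README. A minimal sketch of how such defaults are typically resolved (the variable names `MODEL` and `LANGUAGE` come from the examples in this README; the fallback values here are assumptions):

```shell
# Sketch: resolve MODEL and LANGUAGE with fallback defaults, the way the
# compose file presumably does. The fallback values are assumptions.
MODEL="${MODEL:-large-v3}"
LANGUAGE="${LANGUAGE:-ru}"
CMD="MODEL=$MODEL LANGUAGE=$LANGUAGE docker compose up"
echo "would run: $CMD"
```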
@@ -42,5 +42,5 @@ LANGUAGE=ru \
| model | base, medium, large, [other options](https://github.com/ggerganov/whisper.cpp/blob/master/models/download-ggml-model.sh#L25) | large-v3 |
| language | en, ru, fr, etc. (depends on the model) | ru |
- #### 5. Result
- You can find the result in the ```./volume/output/``` directory
+ ### 5. Result
+ You can find the result in the ```./volume/output/``` directory
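A quick way to inspect what a run produced (a sketch; output file names depend on your input files):

```shell
# Sketch: count and list whatever the container wrote to ./volume/output/.
# mkdir -p only makes sure the path exists on a fresh checkout.
OUT_DIR="./volume/output"
mkdir -p "$OUT_DIR"
COUNT=$(ls -1 "$OUT_DIR" | wc -l)
echo "files in $OUT_DIR: $COUNT"
ls -1 "$OUT_DIR"
```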