Commit c4177d2

Updated README Getting Started instructions
Provides guidance to avoid an error when downloading the pre-trained model
1 parent d72c7fe commit c4177d2

File tree

2 files changed (+16 -3 lines)


.gitignore (+2 -1)

@@ -160,4 +160,5 @@ cython_debug/
 #.idea/
 
 data/
-wandb/
+wandb/
+.idea/

README.md (+14 -2)

@@ -37,6 +37,18 @@ First, we have to install all the libraries listed in `requirements.txt`
 ```bash
 pip install -r requirements.txt
 ```
+
+If you see this error:
+
+> OSError: bigcode/starcoder is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
+If this is a private repository, make sure to pass a token having permission to this repo with `use_auth_token` or log in with `huggingface-cli login` and pass `use_auth_token=True`.
+
+it means that you need to authenticate to the Hugging Face API to download the model: sign up for an account and accept the [T&C to use BigCode](https://huggingface.co/bigcode/starcoder); then [obtain an API token](https://huggingface.co/settings/tokens) from HF and use it to authenticate in the CLI:
+
+```shell
+huggingface-cli login
+```
+
 ## Code generation
 The code generation pipeline is as follows
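The same stored credential can also be created from Python instead of the shell; a minimal sketch using the `login` helper from `huggingface_hub` (the token string is a placeholder, not a real credential):

```python
# Programmatic equivalent of `huggingface-cli login`.
# Requires the huggingface_hub package, which ships alongside transformers.
from huggingface_hub import login

# Paste an access token generated at https://huggingface.co/settings/tokens.
login(token="hf_xxx")  # "hf_xxx" is a placeholder
```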

@@ -46,7 +58,7 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 checkpoint = "bigcode/starcoder"
 device = "cuda" # for GPU usage or "cpu" for CPU usage
 
-tokenizer = AutoTokenizer.from_pretrained(checkpoint)
+tokenizer = AutoTokenizer.from_pretrained(checkpoint, use_auth_token=True)
 # to save memory consider using fp16 or bf16 by specifying torch_dtype=torch.float16 for example
 model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)
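Taken together, the patched lines slot into a complete generation snippet roughly as follows. This is a sketch, not the README verbatim: it assumes a transformers release that still accepts `use_auth_token` (newer versions rename it to `token`), passes the token on the model load as well, and uses an illustrative prompt:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder"
device = "cuda"  # for GPU usage or "cpu" for CPU usage

# use_auth_token=True reuses the token stored by `huggingface-cli login`
tokenizer = AutoTokenizer.from_pretrained(checkpoint, use_auth_token=True)
# fp16 roughly halves memory relative to fp32, per the README's own comment
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, use_auth_token=True, torch_dtype=torch.float16
).to(device)

# Illustrative prompt; any code prefix works
inputs = tokenizer("def print_hello_world():", return_tensors="pt").to(device)
outputs = model.generate(inputs["input_ids"], max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```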

@@ -60,7 +72,7 @@ or
 from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 checkpoint = "bigcode/starcoder"
 
-model = AutoModelForCausalLM.from_pretrained(checkpoint)
+model = AutoModelForCausalLM.from_pretrained(checkpoint, use_auth_token=True)
 tokenizer = AutoTokenizer.from_pretrained(checkpoint)
 
 pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, device=0)
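A usage sketch for the pipeline variant; the prompt and generation arguments are illustrative, not part of the commit:

```python
# `pipe` is the text-generation pipeline built above; device=0 targets the first GPU.
result = pipe("def fibonacci(n):", max_new_tokens=32, do_sample=False)
print(result[0]["generated_text"])
```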
