diff --git a/intermediate_source/dynamic_quantization_bert_tutorial.rst b/intermediate_source/dynamic_quantization_bert_tutorial.rst
index 1ea6ea46dd..e515f53a1d 100644
--- a/intermediate_source/dynamic_quantization_bert_tutorial.rst
+++ b/intermediate_source/dynamic_quantization_bert_tutorial.rst
@@ -79,7 +79,7 @@ Mac:

 .. code:: shell

-    yes y | pip uninstall torch tochvision
+    yes y | pip uninstall torch torchvision
     yes y | pip install --pre torch -f https://download.pytorch.org/whl/nightly/cu101/torch_nightly.html

@@ -206,7 +206,7 @@ in `examples `_.

+We provide the fine-tuned BERT model for MRPC task `here `_.
 To save time, you can download the model file (~400 MB) directly into your local folder ``$OUT_DIR``.

 2.1 Set global configurations
@@ -273,7 +273,7 @@ We load the tokenizer and fine-tuned BERT sequence classifier model

 2.3 Define the tokenize and evaluation function
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-We reuse the tokenize and evaluation function from `Huggingface `_.
+We reuse the tokenize and evaluation function from `HuggingFace `_.

 .. code:: python