
Commit f37a1f2

Upgrade to python 3.11 (#10711)
* create conda env with python 3.11
* recommend to use Python 3.11
* update
1 parent: 8f45e22

217 files changed, +319 -319 lines changed


docs/readthedocs/source/doc/LLM/Overview/install_cpu.md

+3 -3

@@ -17,7 +17,7 @@ Please refer to [Environment Setup](#environment-setup) for more information.
  .. important::
- ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11; Python 3.9 is recommended for best practices.
+ ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11; Python 3.11 is recommended for best practices.
  ```

  ## Recommended Requirements
@@ -39,10 +39,10 @@ Here list the recommended hardware and OS for smooth IPEX-LLM optimization exper
  For optimal performance with LLM models using IPEX-LLM optimizations on Intel CPUs, here are some best practices for setting up environment:
- First we recommend using [Conda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.9 enviroment:
+ First we recommend using [Conda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.11 enviroment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option
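
For reference, a minimal sketch of following the updated CPU instructions end to end and confirming that the environment really resolves to Python 3.11 (the env name `llm` and the `--pre` nightly install come from the diff above; the version check line is an added assumption):

```bash
# Sketch only: mirrors the updated install_cpu.md steps, plus a quick version check.
conda create -n llm python=3.11
conda activate llm
python --version                            # expect "Python 3.11.x" inside this env
pip install --pre --upgrade ipex-llm[all]   # latest ipex-llm nightly build with 'all' option
```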

docs/readthedocs/source/doc/LLM/Overview/install_gpu.md

+12 -12

@@ -22,18 +22,18 @@ To apply Intel GPU acceleration, there're several prerequisite steps for tools i
  * Step 4: Install Intel® oneAPI Base Toolkit 2024.0:
- First, Create a Python 3.9 enviroment and activate it. In Anaconda Prompt:
+ First, Create a Python 3.11 enviroment and activate it. In Anaconda Prompt:
  ```cmd
- conda create -n llm python=3.9 libuv
+ conda create -n llm python=3.11 libuv
  conda activate llm
  ```
  ```eval_rst
  .. important::
- ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices.
+ ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.11 is recommended for best practices.
  ```
  Then, use `pip` to install the Intel oneAPI Base Toolkit 2024.0:
@@ -111,7 +111,7 @@ pip install --pre --upgrade ipex-llm[xpu]
  ```eval_rst
  .. note::
- All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp39`` with ``cp310`` or ``cp311``, respectively.
+ All the wheel packages mentioned here are for Python 3.11. If you would like to use Python 3.9 or 3.10, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp311`` with ``cp39`` or ``cp310``, respectively.
  ```
  ### Runtime Configuration
@@ -164,7 +164,7 @@ If you met error when importing `intel_extension_for_pytorch`, please ensure tha
  * Ensure that `libuv` is installed in your conda environment. This can be done during the creation of the environment with the command:
  ```cmd
- conda create -n llm python=3.9 libuv
+ conda create -n llm python=3.11 libuv
  ```
  If you missed `libuv`, you can add it to your existing environment through
  ```cmd
@@ -399,12 +399,12 @@ IPEX-LLM GPU support on Linux has been verified on:
  ### Install IPEX-LLM
  #### Install IPEX-LLM From PyPI
- We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.9 enviroment:
+ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) to create a python 3.11 enviroment:
  ```eval_rst
  .. important::
- ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended for best practices.
+ ``ipex-llm`` is tested with Python 3.9, 3.10 and 3.11. Python 3.11 is recommended for best practices.
  ```
  ```eval_rst
@@ -422,7 +422,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
  .. code-block:: bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm
  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -439,7 +439,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
  .. code-block:: bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm
  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
@@ -461,7 +461,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
  .. code-block:: bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm
  pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
@@ -470,7 +470,7 @@ We recommend using [miniconda](https://docs.conda.io/en/latest/miniconda.html) t
  .. code-block:: bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm
  pip install --pre --upgrade ipex-llm[xpu_2.0] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
@@ -530,7 +530,7 @@ If you encounter network issues when installing IPEX, you can also install IPEX-
  ```eval_rst
  .. note::
- All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp39`` with ``cp310`` or ``cp311``, respectively.
+ All the wheel packages mentioned here are for Python 3.11. If you would like to use Python 3.9 or 3.10, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp311`` with ``cp39`` or ``cp310``, respectively.
  ```
  ### Runtime Configuration
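
To make the wheel-name note above concrete, here is a hedged illustration of the ABI-tag substitution it describes; the wheel filenames below are hypothetical placeholders, not the exact names from the installation tables:

```bash
# Hypothetical filenames for illustration only; take the real wheel names/URLs
# from the installation table. The CPython ABI tag must match your environment:
#   cp311 -> Python 3.11 (the new default), cp310 -> Python 3.10, cp39 -> Python 3.9
pip install torch-<version>-cp311-cp311-linux_x86_64.whl
pip install intel_extension_for_pytorch-<version>-cp311-cp311-linux_x86_64.whl

# For a Python 3.10 environment, the same wheels would instead be named:
#   torch-<version>-cp310-cp310-linux_x86_64.whl
#   intel_extension_for_pytorch-<version>-cp310-cp310-linux_x86_64.whl
```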

docs/readthedocs/source/doc/LLM/Quickstart/continue_quickstart.md

+1 -1

@@ -28,7 +28,7 @@ This guide walks you through setting up and running **Continue** within _Visual
  Visit [Run Text Generation WebUI Quickstart Guide](webui_quickstart.html), and follow the steps 1) [Install IPEX-LLM](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm), 2) [Install WebUI](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-the-webui) and 3) [Start the Server](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#start-the-webui-server) to install and start the Text Generation WebUI API Service. **Please pay attention to below items during installation:**
- - The Text Generation WebUI API service requires Python version 3.10 or higher. But [IPEX-LLM installation instructions](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm) used ``python=3.9`` as default for creating the conda environment. We recommend changing it to ``3.11``, using below command:
+ - The Text Generation WebUI API service requires Python version 3.10 or higher. The [IPEX-LLM installation instructions](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/webui_quickstart.html#install-ipex-llm) use ``python=3.11`` as default for creating the conda environment, which meets this requirement; the environment is created with the below command:
  ```bash
  conda create -n llm python=3.11 libuv
  ```

docs/readthedocs/source/doc/LLM/Quickstart/install_linux_gpu.md

+1 -1

@@ -144,7 +144,7 @@ You can use `conda --version` to verify you conda installation.
  After installation, create a new python environment `llm`:
  ```cmd
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  ```
  Activate the newly created environment `llm`:
  ```cmd

docs/readthedocs/source/doc/LLM/Quickstart/install_windows_gpu.md

+1 -1

@@ -57,7 +57,7 @@ Visit [Miniconda installation page](https://docs.anaconda.com/free/miniconda/),
  Open the **Anaconda Prompt**. Then create a new python environment `llm` and activate it:
  ```cmd
- conda create -n llm python=3.9 libuv
+ conda create -n llm python=3.11 libuv
  conda activate llm
  ```

docs/readthedocs/source/doc/LLM/Quickstart/llama_cpp_quickstart.md

+1 -1

@@ -26,7 +26,7 @@ Visit the [Install IPEX-LLM on Windows with Intel GPU Guide](https://ipex-llm.re
  To use `llama.cpp` with IPEX-LLM, first ensure that `ipex-llm[cpp]` is installed.
  ```cmd
- conda create -n llm-cpp python=3.9
+ conda create -n llm-cpp python=3.11
  conda activate llm-cpp
  pip install --pre --upgrade ipex-llm[cpp]
  ```

python/llm/example/CPU/Applications/autogen/README.md

+1 -1

@@ -11,7 +11,7 @@ mkdir autogen
  cd autogen

  # create respective conda environment
- conda create -n autogen python=3.9
+ conda create -n autogen python=3.11
  conda activate autogen

  # install fastchat-adapted ipex-llm

python/llm/example/CPU/Applications/hf-agent/README.md

+1 -1

@@ -10,7 +10,7 @@ To run this example with IPEX-LLM, we have some recommended requirements for you
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/Applications/streaming-llm/README.md

+1 -1

@@ -10,7 +10,7 @@ model = AutoModelForCausalLM.from_pretrained(model_name_or_path, load_in_4bit=Tr
  ## Prepare Environment
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all]

python/llm/example/CPU/Deepspeed-AutoTP/README.md

+1 -1

@@ -2,7 +2,7 @@
  #### 1. Install Dependencies
- Install necessary packages (here Python 3.9 is our test environment):
+ Install necessary packages (here Python 3.11 is our test environment):
  ```bash
  bash install.sh

python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/AWQ/README.md

+1 -1

@@ -34,7 +34,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a AWQ
  We suggest using conda to manage environment:

  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install autoawq==0.1.8 --no-deps

python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GGUF/README.md

+1 -1

@@ -25,7 +25,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Advanced-Quantizations/GPTQ/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Llam
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila/README.md

+1 -1

@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/aquila2/README.md

+1 -1

@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/baichuan2/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Baic
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/bluelm/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Blue
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm/README.md

+1 -1

@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm2/README.md

+2 -2

@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option
@@ -80,7 +80,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/chatglm3/README.md

+2 -2

@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Chat
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option
@@ -81,7 +81,7 @@ In the example [streamchat.py](./streamchat.py), we show a basic use case for a
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/codellama/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Code
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell/README.md

+1 -1

@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/deciLM-7b/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Deci
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek-moe/README.md

+1 -1

@@ -16,7 +16,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9 # recommend to use Python 3.9
+ conda create -n llm python=3.11 # recommend to use Python 3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/deepseek/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Deep
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper/README.md

+1 -1

@@ -12,7 +12,7 @@ We suggest using conda to manage the Python environment. For more information ab
  After installing conda, create a Python environment for IPEX-LLM:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install --pre --upgrade ipex-llm[all] # install the latest ipex-llm nightly build with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v1/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/dolly_v2/README.md

+1 -1

@@ -9,7 +9,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Doll
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option

python/llm/example/CPU/HF-Transformers-AutoModels/Model/falcon/README.md

+1 -1

@@ -10,7 +10,7 @@ In the example [generate.py](./generate.py), we show a basic use case for a Falc
  ### 1. Install
  We suggest using conda to manage environment:
  ```bash
- conda create -n llm python=3.9
+ conda create -n llm python=3.11
  conda activate llm

  pip install ipex-llm[all] # install ipex-llm with 'all' option
