BuildingForLinux
To install the software requirements, please follow the instructions below. This manual is written for Ubuntu 20.04; the steps may differ on other operating systems.
DLI supports several inference frameworks:
- Intel® Distribution of OpenVINO™ Toolkit.
- Intel® Optimization for Caffe.
- Intel® Optimization for TensorFlow.
- TensorFlow Lite.
- ONNX Runtime (C++ and Python APIs).
- MXNet (Python Gluon API).
- OpenCV DNN (C++ and Python APIs).
- PyTorch.
Install Python tools (Python 3.8 is installed by default on Ubuntu 20.04).

```sh
sudo apt install python3-pip python3-venv python3-tk
```
Create and activate a virtual environment.

```sh
cd ~/
python3 -m venv dl-benchmark-env
source ~/dl-benchmark-env/bin/activate
python3 -m pip install --upgrade pip
```
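To confirm that activation works as expected, you can run a quick check in a throwaway environment (the `/tmp/check-env` name below is purely illustrative, not part of the setup):

```shell
# Create a disposable venv and verify that, once activated, the shell
# resolves python3 to the interpreter inside that environment.
python3 -m venv --without-pip /tmp/check-env
. /tmp/check-env/bin/activate
command -v python3        # expected: /tmp/check-env/bin/python3
deactivate
rm -rf /tmp/check-env
```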
Clone the repository.

```sh
sudo apt install git
git clone https://github.com/itlab-vision/dl-benchmark.git
```
Install the requirements.

```sh
pip install -r ~/dl-benchmark/requirements.txt
```
If you would like to infer deep models using the Intel® Distribution of OpenVINO™ Toolkit (Python API), please install the `openvino-dev` package using `pip`.

```sh
pip install openvino-dev[caffe,mxnet,onnx,pytorch,tensorflow]==2022.1.0
```
Note: the `tensorflow` and `tensorflow2` packages cannot be installed into a single virtual environment, so to convert `tensorflow2` models, please create another virtual environment and install the `openvino-dev` package with `tensorflow2` support:

```sh
pip install openvino-dev[tensorflow2]==2022.1.0
```
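As a quick sanity check of the OpenVINO Python install (a sketch, assuming `openvino-dev==2022.1.0` is installed in the active environment), you can list the inference devices the runtime detects; the fallback message is printed if the package is missing:

```shell
# Print OpenVINO's available inference devices (e.g. CPU), or a note
# if the package is not installed in the current environment.
python3 - <<'EOF'
try:
    from openvino.runtime import Core
    print(Core().available_devices)
except ImportError:
    print("openvino is not installed in this environment")
EOF
```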
If you would like to infer deep models using the Intel® Distribution of OpenVINO™ Toolkit (C++ API), please install the OpenVINO™ toolkit from sources or download a pre-built package, and follow the Benchmark C++ Tool build instructions to get the OpenVINO C++ benchmark app built.
If you prefer Intel® Optimization for Caffe to infer deep neural networks, please install Miniconda or Anaconda and the corresponding `caffe` package from the `intel` channel.

```sh
conda install -c intel caffe
```
If you would like to infer deep models using Intel® Optimization for TensorFlow, please install the `intel-tensorflow` package using `pip`.

```sh
pip install intel-tensorflow
```
To infer deep learning models using the TensorFlow Lite framework, please install the `tensorflow` Python package.

```sh
pip install tensorflow
```
To infer deep learning models using ONNX Runtime (Python API), please install the corresponding package.

```sh
pip install onnxruntime
```

To use the ONNX Runtime C++ API, ONNX Runtime must be built from sources along with the dedicated benchmark tool. Please refer to the build instructions to build the binaries.
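To verify the Python package (a sketch, assuming `onnxruntime` is installed in the active environment), you can list the available execution providers; the fallback message is printed if the package is missing:

```shell
python3 - <<'EOF'
try:
    import onnxruntime
    # Execution providers determine which backend runs the model,
    # e.g. CPUExecutionProvider on a default pip install.
    print(onnxruntime.get_available_providers())
except ImportError:
    print("onnxruntime is not installed in this environment")
EOF
```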
If you would like to infer deep models using MXNet, please install the `mxnet` Python package.

```sh
pip install mxnet
```
[TBD]
OpenCV DNN (C++ API) must be built from sources along with the dedicated benchmark tool. Please refer to the build instructions to build the binaries.
To infer deep learning models using PyTorch, please install the following packages:

```sh
pip install torch torchvision torchaudio
```
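A minimal smoke test for the PyTorch install (a sketch, assuming `torch` is installed in the active environment) multiplies two small matrices on the CPU; the fallback message is printed if the package is missing:

```shell
python3 - <<'EOF'
try:
    import torch
    # Tiny smoke test: a 2x3 by 3x2 matrix product yields a 2x2 result.
    y = torch.ones(2, 3) @ torch.ones(3, 2)
    print(tuple(y.shape))
except ImportError:
    print("torch is not installed in this environment")
EOF
```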