
Commit 267500e

Optimum pipelines (#2343)
* optimum pipelines * fix * fix ci tests * test * test * fix ? * test * fix * add ort model test * test * test
1 parent 0342fd1 commit 267500e

File tree: 10 files changed, +630 −483 lines

.github/workflows/test_exporters_common.yml

Lines changed: 2 additions & 1 deletion

```diff
@@ -36,7 +36,8 @@ jobs:
         run: |
           pip install --upgrade pip
           pip install --no-cache-dir torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
-          pip install .[tests,exporters]
+          pip install optimum-onnx@git+https://github.com/huggingface/optimum-onnx.git
+          pip install .[tests]
 
       - name: Test with pytest
         run: |
```

.github/workflows/test_pipelines.yml

Lines changed: 45 additions & 0 deletions (new file)

```yaml
name: Optimum Pipelines / Python - Test

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

concurrency:
  group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
  cancel-in-progress: true

env:
  UV_SYSTEM_PYTHON: 1
  UV_TORCH_BACKEND: auto
  TRANSFORMERS_IS_CI: true

jobs:
  build:
    strategy:
      fail-fast: false
      matrix:
        python-version: [3.9]
        runs-on: [ubuntu-22.04]

    runs-on: ${{ matrix.runs-on }}

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          pip install --upgrade pip uv
          uv pip install --no-cache-dir optimum-onnx[onnxruntime]@git+https://github.com/huggingface/optimum-onnx.git
          uv pip install --no-cache-dir .[tests]

      - name: Test with pytest
        run: |
          pytest tests/pipelines -vvvv --durations=0
```
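
The workflow runs `pytest tests/pipelines`. A minimal sketch of the kind of test that suite might contain, assuming the `optimum.pipelines.pipeline` factory keeps its documented `accelerator="ort"` argument; the test name and tiny checkpoint below are illustrative, not copied from this commit:

```python
# Illustrative only: a minimal test in the spirit of tests/pipelines.
# The checkpoint and assertions are assumptions, not code from this commit.
from optimum.pipelines import pipeline


def test_text_classification_pipeline_with_ort():
    # Build a text-classification pipeline backed by ONNX Runtime
    pipe = pipeline(
        "text-classification",
        model="hf-internal-testing/tiny-random-bert",
        accelerator="ort",
    )
    outputs = pipe("ONNX Runtime pipelines are easy to use.")
    # The pipeline should return the usual list of {"label", "score"} dicts
    assert isinstance(outputs, list)
    assert "label" in outputs[0] and "score" in outputs[0]
```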

docs/source/quicktour.mdx

Lines changed: 0 additions & 36 deletions

````diff
@@ -129,42 +129,6 @@ To train transformers on Habana's Gaudi processors, 🤗 Optimum provides a `Gau
 
 You can find more examples in the [documentation](https://huggingface.co/docs/optimum/habana/quickstart) and in the [examples](https://github.com/huggingface/optimum-habana/tree/main/examples).
-
-#### ONNX Runtime
-
-To train transformers with ONNX Runtime's acceleration features, 🤗 Optimum provides a `ORTTrainer` that is very similar to the 🤗 Transformers [Trainer](https://huggingface.co/docs/transformers/main_classes/trainer). Here is a simple example:
-
-```diff
-- from transformers import Trainer, TrainingArguments
-+ from optimum.onnxruntime import ORTTrainer, ORTTrainingArguments
-
-  # Download a pretrained model from the Hub
-  model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
-
-  # Define the training arguments
-- training_args = TrainingArguments(
-+ training_args = ORTTrainingArguments(
-      output_dir="path/to/save/folder/",
-      optim="adamw_ort_fused",
-      ...
-  )
-
-  # Create a ONNX Runtime Trainer
-- trainer = Trainer(
-+ trainer = ORTTrainer(
-      model=model,
-      args=training_args,
-      train_dataset=train_dataset,
-+     feature="text-classification", # The model type to export to ONNX
-      ...
-  )
-
-  # Use ONNX Runtime for training!
-  trainer.train()
-```
-
-You can find more examples in the [documentation](https://huggingface.co/docs/optimum/onnxruntime/usage_guides/trainer) and in the [examples](https://github.com/huggingface/optimum/tree/main/examples/onnxruntime/training).
-
 ## Out of the box ONNX export
 
 The Optimum library handles out of the box the ONNX export of Transformers and Diffusers models!
````
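
The surviving quicktour section advertises out-of-the-box ONNX export. A minimal sketch of what that looks like, assuming the ONNX Runtime model classes remain importable from `optimum.onnxruntime` (they are now provided through the separate `optimum-onnx` package installed in the workflows above); the checkpoint is illustrative:

```python
# A minimal sketch, not taken from this commit: export a Transformers model
# to ONNX on the fly and run it with ONNX Runtime.
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True converts the PyTorch checkpoint to ONNX while loading
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)

inputs = tokenizer("Optimum exports models to ONNX out of the box.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # (1, 2) for this binary sentiment model
```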

optimum/configuration_utils.py

Lines changed: 2 additions & 1 deletion

```diff
@@ -342,6 +342,7 @@ def to_dict(self) -> Dict[str, Any]:
         output["transformers_version"] = transformers_version_str
         output["optimum_version"] = __version__
 
-        self.dict_torch_dtype_to_str(output)
+        if hasattr(self, "dict_torch_dtype_to_str"):
+            self.dict_torch_dtype_to_str(output)
 
         return output
```
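
The guard makes `to_dict` tolerant of config classes that do not inherit `dict_torch_dtype_to_str`. A minimal sketch of the same defensive pattern with a hypothetical config class (the class and field names are illustrative, not from the optimum codebase):

```python
# Illustrative sketch of the hasattr-guarded serialization pattern.
# MinimalConfig is hypothetical and does not define dict_torch_dtype_to_str.
from typing import Any, Dict


class MinimalConfig:
    def __init__(self, torch_dtype: str = "float32"):
        self.torch_dtype = torch_dtype

    def to_dict(self) -> Dict[str, Any]:
        output = dict(self.__dict__)
        # Only call the helper when the mixin-provided method exists,
        # so configs that skip that mixin still serialize cleanly.
        if hasattr(self, "dict_torch_dtype_to_str"):
            self.dict_torch_dtype_to_str(output)
        return output


print(MinimalConfig().to_dict())  # {'torch_dtype': 'float32'}
```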

optimum/exporters/utils.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -26,8 +26,8 @@
 
 from ..utils import (
     DIFFUSERS_MINIMUM_VERSION,
-    check_if_diffusers_greater,
     is_diffusers_available,
+    is_diffusers_version,
     logging,
 )
 from ..utils.import_utils import _diffusers_version
@@ -38,7 +38,7 @@
 
 
 if is_diffusers_available():
-    if not check_if_diffusers_greater(DIFFUSERS_MINIMUM_VERSION.base_version):
+    if is_diffusers_version("<", DIFFUSERS_MINIMUM_VERSION.base_version):
         raise ImportError(
             f"We found an older version of diffusers {_diffusers_version} but we require diffusers to be >= {DIFFUSERS_MINIMUM_VERSION}. "
             "Please update diffusers by running `pip install --upgrade diffusers`"
```
optimum/pipelines/__init__.py

Lines changed: 278 additions & 6 deletions
Large diffs are not rendered by default.
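
The bulk of the commit reworks `optimum/pipelines/__init__.py` (+278/−6), which backs the `optimum.pipelines.pipeline` factory. A minimal usage sketch, assuming the entry point keeps its documented signature with an `accelerator` argument; the task and checkpoint are examples only:

```python
# Minimal usage sketch for the optimum pipelines entry point.
# The accelerator="ort" path needs onnxruntime, installed via
# optimum-onnx[onnxruntime] in the new workflow above.
from optimum.pipelines import pipeline

classifier = pipeline(
    task="text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    accelerator="ort",  # export to ONNX and run with ONNX Runtime
)

print(classifier("Optimum pipelines mirror the transformers pipeline API."))
```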
