rebase to main #446

Open · wants to merge 14 commits into base: hf-model-ci
12 changes: 12 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,12 @@
version: 2
updates:

# Check for updates to GitHub Actions
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
groups:
github-actions:
patterns:
- "*"
64 changes: 32 additions & 32 deletions .github/workflows/test_e2eshark.yml
@@ -211,7 +211,7 @@ jobs:
CACHE_DIR: ${{ matrix.cache-dir }}
steps:
- name: Checkout Test Suite
uses: actions/checkout@v2
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
repository: nod-ai/SHARK-TestSuite
path: test-suite
@@ -311,17 +311,17 @@ jobs:
python utils/find_duplicate_models.py -s -r ./test-onnx -o reports/duplicates.json
working-directory: ./test-suite

- uses: actions/upload-artifact@master
- uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: ci_reports_${{ matrix.backend }}_${{ matrix.test-file }}_onnx_md
path: ./test-suite/alt_e2eshark/reports/${{ matrix.test-file }}.md

- uses: actions/upload-artifact@master
- uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: ci_reports_${{ matrix.backend }}_${{ matrix.test-file }}_onnx_json
path: ./test-suite/alt_e2eshark/reports/${{ matrix.test-file }}.json

- uses: actions/upload-artifact@master
- uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: ci_reports_${{ matrix.backend }}_${{ matrix.test-file }}_duplicates_json
path: ./test-suite/alt_e2eshark/duplicates.json
@@ -346,12 +346,12 @@ jobs:
AZ_PUBLIC_KEY: ${{ secrets.SHARKPUBLIC_AZ_PUBLIC_KEY }}
steps:
- name: Checkout Test Suite
uses: actions/checkout@v2
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
repository: nod-ai/SHARK-TestSuite
path: test-suite
- name: Checkout repo
uses: actions/checkout@v2
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
repository: nod-ai/e2eshark-reports
ref: main
@@ -366,107 +366,107 @@ jobs:
pip install -r ./test-suite/alt_e2eshark/iree_requirements.txt
pip install --no-deps -r ./test-suite/alt_e2eshark/torch_mlir_requirements.txt
pip install --pre --upgrade iree-base-compiler iree-base-runtime -f https://iree.dev/pip-release-links.html
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_shark-test-suite_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_shark-test-suite_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_shark-test-suite_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_shark-test-suite_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard1_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard1_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard1_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard1_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard2_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard2_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard2_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard2_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard3_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard3_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard3_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-hf-cnn-fp32-shard3_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard1_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard1_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard1_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard1_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard2_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard2_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard2_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard2_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard3_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard3_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard3_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-int8-p0p1-shard3_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-vision-int8_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-vision-int8_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_vai-vision-int8_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_vai-vision-int8_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_migraphx_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_migraphx_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_migraphx_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_migraphx_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard1_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard1_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard1_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard1_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard2_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard2_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard2_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard2_onnx_json
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard3_onnx_md
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard3_onnx_md
- uses: actions/download-artifact@master
- uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
with:
name: ci_reports_${{ matrix.backend }}_nlp-shard3_onnx_json
path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_nlp-shard3_onnx_json
# - uses: actions/download-artifact@master
# - uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
# with:
# name: ci_reports_${{ matrix.backend }}_onnxrt-iree-ep_onnx_md
# path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_onnxrt-iree-ep_onnx_md
# - uses: actions/download-artifact@master
# - uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4.1.8
# with:
# name: ci_reports_${{ matrix.backend }}_onnxrt-iree-ep_onnx_json
# path: ./e2eshark-reports/ci_reports_${{ matrix.backend }}_onnxrt-iree-ep_onnx_json
54 changes: 23 additions & 31 deletions alt_e2eshark/e2e_testing/backends.py
@@ -4,13 +4,16 @@
# See https://llvm.org/LICENSE.txt for license information.
# SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
import abc
import onnxruntime as ort
from typing import TypeVar, List
from e2e_testing.storage import TestTensors, get_shape_string
from e2e_testing.framework import CompiledOutput, ModelArtifact, CompilerOptions, RuntimeOptions
from onnx import ModelProto
import os
from pathlib import Path
from typing import TypeVar, List

from onnx import ModelProto
import onnxruntime as ort

from e2e_testing.framework import CompiledOutput, ModelArtifact, CompilerOptions, RuntimeOptions
from e2e_testing.logging_utils import run_command_and_log
from e2e_testing.storage import TestTensors, get_shape_string

Invoker = TypeVar("Invoker")

@@ -26,10 +26,6 @@ def load(self, artifact: CompiledOutput, func_name: str, extra_options : Runtime
"""loads the function with name func_name from compiled artifact. This method should return a function callable from python."""


from iree import compiler as ireec
from iree import runtime as ireert


def flag(arg : str) -> str:
if arg.startswith("--"):
return arg
@@ -51,6 +50,7 @@ def __init__(self, *, device="local-task", hal_target_backend="llvm-cpu", extra_
]

def compile(self, module, *, save_to: str = None, extra_options : CompilerOptions):
from iree import compiler as ireec
test_specific_args = list(extra_options.common_extra_args)
if self.hal_target_backend in extra_options.backend_specific_flags.keys():
test_specific_args += list(extra_options.backend_specific_flags[self.hal_target_backend])
@@ -68,6 +68,7 @@ def compile(self, module, *, save_to: str = None, extra_options : CompilerOption
return b

def load(self, artifact, *, func_name="main", extra_options : RuntimeOptions):
from iree import runtime as ireert
config = ireert.Config(self.device)
ctx = ireert.SystemContext(config=config)
vm_module = ireert.VmModule.copy_buffer(ctx.instance, artifact)
@@ -102,25 +103,17 @@ def __init__(self, *, device="local-task", hal_target_backend="llvm-cpu", target
]

def compile(self, module_path: str, *, save_to : str = None, extra_options : CompilerOptions) -> str:
compile_command = ['iree-compile', module_path, f'--iree-hal-target-backends={self.hal_target_backend}']
compile_command.extend(self.extra_args)
# add test-specific flags
test_specific_args = list(extra_options.common_extra_args)
if self.hal_target_backend in extra_options.backend_specific_flags.keys():
test_specific_args += list(extra_options.backend_specific_flags[self.hal_target_backend])
compile_args = self.extra_args + [flag(arg) for arg in test_specific_args]
compile_command.extend([flag(arg) for arg in test_specific_args])
# set output path
vmfb_path = os.path.join(save_to, "compiled_model.vmfb")
arg_string = f"--iree-hal-target-backends={self.hal_target_backend} "
arg_string += ' '.join(compile_args)
detail_log = os.path.join(save_to, "detail", "compilation.detail.log")
commands_log = os.path.join(save_to, "commands", "compilation.commands.log")
script = f"iree-compile {module_path} {arg_string} -o {vmfb_path} 1> {detail_log} 2>&1"
with open(commands_log, "w") as file:
file.write(script)
# remove old vmfb if it exists
Path(vmfb_path).unlink(missing_ok=True)
os.system(script)
if not os.path.exists(vmfb_path):
error_msg = f"failure executing command: \n{script}\n failed to produce a vmfb at {vmfb_path}.\n"
error_msg += f"Error detail in '{detail_log}'"
raise FileNotFoundError(error_msg)
compile_command.extend(['-o', vmfb_path])
run_command_and_log(compile_command, save_to, "compilation")
return vmfb_path

def load(self, vmfb_path: str, *, func_name=None, extra_options : RuntimeOptions):
@@ -129,16 +122,15 @@ def load(self, vmfb_path: str, *, func_name=None, extra_options : RuntimeOptions
if self.hal_target_backend in extra_options.backend_specific_flags.keys():
test_specific_args += list(extra_options.backend_specific_flags[self.hal_target_backend])
run_dir = Path(vmfb_path).parent
def func(x: TestTensors) -> str:
script = f"iree-run-module --module='{vmfb_path}' --device={self.device} "
for arg in test_specific_args:
script += f'{flag(arg)} '
def func(x: TestTensors) -> List[str]:
command = ["iree-run-module", f"--module='{vmfb_path}'", f"--device={self.device}"]
command.extend([flag(arg) for arg in test_specific_args])
if func_name:
script += f"--function='{func_name}' "
command.append(f"--function='{func_name}'")
torch_inputs = x.to_torch().data
for index, input in enumerate(torch_inputs):
script += f"--input='{get_shape_string(input)}=@{run_dir}/input.{index}.bin' "
return script
command.append(f"--input='{get_shape_string(input)}=@{run_dir}/input.{index}.bin'")
return command
return func


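Note: the refactored `compile` and `load` above delegate shelling out to `run_command_and_log`, imported from `e2e_testing.logging_utils` but not shown in this diff. A minimal sketch of what such a helper might look like, assuming it keeps the existing `commands/` and `detail/` log layout and raises when the command fails — the name comes from the import above; everything else here is an assumption, not the suite's actual implementation:

```python
import os
import subprocess
from typing import List


def run_command_and_log(command: List[str], save_to: str, stage: str) -> None:
    """Hypothetical helper: run `command`, record it under commands/, and
    capture its stdout/stderr under detail/."""
    commands_log = os.path.join(save_to, "commands", f"{stage}.commands.log")
    detail_log = os.path.join(save_to, "detail", f"{stage}.detail.log")
    os.makedirs(os.path.dirname(commands_log), exist_ok=True)
    os.makedirs(os.path.dirname(detail_log), exist_ok=True)
    script = " ".join(command)
    # Record the exact command line, mirroring the old script log.
    with open(commands_log, "w") as f:
        f.write(script + "\n")
    # Run through the shell so quoted arguments like --module='...' behave
    # as they did under os.system; stdout and stderr go to the detail log.
    with open(detail_log, "w") as f:
        result = subprocess.run(script, shell=True, stdout=f, stderr=subprocess.STDOUT)
    if result.returncode != 0:
        raise RuntimeError(
            f"command failed with exit code {result.returncode}: {script}\n"
            f"see '{detail_log}' for details"
        )
```

Joining the argument list and running it through a shell keeps the quoted `--module='...'`-style arguments built in `load()` working unchanged; a stricter variant would drop the quoting and pass the list directly to `subprocess.run`.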
22 changes: 15 additions & 7 deletions alt_e2eshark/e2e_testing/framework.py
@@ -26,7 +26,7 @@ class ImporterOptions(NamedTuple):
param_gb_threshold : Optional[float] = None

class CompilerOptions(NamedTuple):
"""Specify, for specific iree-hal-target-backends, a tuple of extra compiler flags.
"""Specify, for specific iree-hal-target-backends, a tuple of extra compiler flags.
Also allows backend-agnostic options to be included."""
backend_specific_flags : Dict[str, Tuple[str]] = dict()
common_extra_args : Tuple[str] = tuple()
@@ -64,6 +64,14 @@ def __init__(
self.extra_options = ExtraOptions()
self.update_extra_options()


def update_model_without_ext_data(self):
"""For large models, which fail opset_version updating, use this method to update without loading external data.
This will also trace the graph and copy the external data references which gets wiped out otherwise.
"""
update_no_ext(onnx_model_path=self.model, opset_version=self.opset_version)


def forward(self, input: Optional[TestTensors] = None) -> TestTensors:
"""Applies self.model to self.input. Only override if necessary for specific models"""
input = input.to_numpy().data
Expand All @@ -81,7 +89,7 @@ def forward(self, input: Optional[TestTensors] = None) -> TestTensors:
return TestTensors(model_output)

def update_dim_param_dict(self):
"""Can be overridden to modify a dictionary of dim parameters (self.dim_param_dict) used to
"""Can be overridden to modify a dictionary of dim parameters (self.dim_param_dict) used to
construct inputs for a model with dynamic dims.
"""
pass
@@ -119,7 +127,7 @@ def construct_inputs(self) -> TestTensors:
def apply_postprocessing(self, output: TestTensors):
"""can be overridden to define post-processing methods for individual models"""
return output

def save_processed_output(self, output: TestTensors, save_to: str, name: str):
"""can be overridden to provide instructions on saving processed outputs (e.g., images, labels, text)"""
pass
Expand All @@ -131,7 +139,7 @@ def get_signature(self, *, from_inputs=True, leave_dynamic=False):
if not os.path.exists(self.model):
self.construct_model()
if not leave_dynamic:
self.update_dim_param_dict()
self.update_dim_param_dict()
return get_signature_for_onnx_model(self.model, from_inputs=from_inputs, dim_param_dict=self.dim_param_dict, leave_dynamic=leave_dynamic)

def load_inputs(self, dir_path):
@@ -154,7 +162,7 @@ def load_golden_outputs(self, dir_path):
"""computes the input signature of the onnx model and loads golden outputs from bin files"""
shapes, dtypes = self.get_signature(from_inputs=False)
return TestTensors.load_from(shapes, dtypes, dir_path, "golden_output")

def update_opset_version_and_overwrite(self):
if not self.opset_version:
return
@@ -167,7 +175,7 @@ def update_opset_version_and_overwrite(self):
og_model, self.opset_version
)
onnx.save(model, self.model)

def get_metadata(self):
model_size = os.path.getsize(self.model)
freq = get_op_frequency(self.model)
@@ -177,7 +185,7 @@ def get_metadata(self):


# TODO: extend TestModel to a union, or make TestModel a base class when supporting other frontends
TestModel = OnnxModelInfo
TestModel = OnnxModelInfo
CompiledArtifact = TypeVar("CompiledArtifact")
ModelArtifact = Union[Module, onnx.ModelProto]
CompiledOutput = Union[CompiledArtifact, ort.InferenceSession]
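The new `update_model_without_ext_data` method calls `update_no_ext`, whose implementation is also outside this diff. A rough sketch of how an opset bump without loading external data could be done with the stock `onnx` APIs, assuming only the keyword arguments used in the call above; the real helper's handling of external-data references may well differ:

```python
import onnx
from onnx import version_converter


def update_no_ext(onnx_model_path: str, opset_version: int) -> None:
    """Hypothetical sketch: bump a model's opset in place without
    materializing its external weight files in memory."""
    # load_external_data=False keeps initializers as references to the
    # external data files rather than loading the tensors themselves.
    model = onnx.load(onnx_model_path, load_external_data=False)
    converted = version_converter.convert_version(model, opset_version)
    # The converter can drop the external-data bookkeeping, so copy the
    # references back from the original initializers by name (a simplified
    # version of the "copy the external data references" step described in
    # the docstring above).
    originals = {init.name: init for init in model.graph.initializer}
    for init in converted.graph.initializer:
        src = originals.get(init.name)
        if src is not None and src.data_location == onnx.TensorProto.EXTERNAL:
            init.data_location = onnx.TensorProto.EXTERNAL
            del init.external_data[:]
            init.external_data.extend(src.external_data)
    onnx.save(converted, onnx_model_path)
```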