
Commit 8ca1507

Fix QNN CI job (#9774)
Summary:
1. `setup-linux.sh` already calls `install_executorch.sh`, so remove the redundant call.
2. Migrate the legacy `executorch.extension.llm.tokenizer` code in `llama.py` to `pytorch_tokenizers`.
1 parent 97bca05 commit 8ca1507

File tree

3 files changed: +3 −7 lines

.ci/scripts/test_qnn_static_llama.sh (+1 −2)

@@ -5,7 +5,7 @@
 # This source code is licensed under the BSD-style license found in the
 # LICENSE file in the root directory of this source tree.

-set -exu
+set -euxo pipefail

 source "$(dirname "${BASH_SOURCE[0]}")/utils.sh"

@@ -56,4 +56,3 @@ if [ $exit_code1 -ne 0 ] || [ $exit_code2 -ne 0 ]; then
 else
   exit 0
 fi
-set -e
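As a side note on the flag change above: a minimal sketch (not part of the commit) of why `set -euxo pipefail` is stricter than the old `set -exu` — without `pipefail`, a pipeline's exit status is that of its last command, so a failing producer is silently masked.

```shell
#!/usr/bin/env bash
# Illustrative only: contrast pipeline error handling with and without pipefail.
set -u

# pipefail off (old `set -exu` behavior): `false | cat` exits with cat's status, 0.
set +o pipefail
if false | cat; then
  echo "masked: pipeline reported success despite the failing producer"
fi

# pipefail on (new behavior): the same pipeline reports the failure of `false`.
set -o pipefail
if ! false | cat; then
  echo "caught: pipefail surfaced the non-zero exit status"
fi
```

The `-e` (errexit) flag then aborts the script on such a failure, which is why the stray trailing `set -e` at the end of the script became unnecessary and was removed.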

.github/workflows/pull.yml (−1)

@@ -573,7 +573,6 @@ jobs:
       BUILD_TOOL="cmake"

-      ./install_requirements.sh --use-pt-pinned-commit
       PYTHON_EXECUTABLE=python bash .ci/scripts/setup-qnn-deps.sh
       PYTHON_EXECUTABLE=python bash .ci/scripts/build-qnn-sdk.sh

examples/qualcomm/oss_scripts/llama/llama.py (+2 −4)

@@ -75,10 +75,8 @@
 from executorch.exir.passes.memory_planning_pass import MemoryPlanningPass
 from executorch.extension.llm.custom_ops import model_sharding
 from executorch.extension.llm.export.builder import DType
-from executorch.extension.llm.tokenizer.tokenizer import (
-    Tokenizer as SentencePieceTokenizer,
-)
-from executorch.extension.llm.tokenizer.utils import get_tokenizer

+from pytorch_tokenizers import get_tokenizer
+from pytorch_tokenizers.llama2c import Llama2cTokenizer as SentencePieceTokenizer

 from torch.ao.quantization.observer import MinMaxObserver
 from torch.ao.quantization.quantize_pt2e import convert_pt2e, prepare_pt2e
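The migration above swaps ExecuTorch's in-tree tokenizer helpers for the `pytorch_tokenizers` package while keeping the call sites unchanged (both expose a `get_tokenizer` factory and a SentencePiece-style class). As a rough illustration of that factory pattern — with hypothetical stand-in classes and an extension-based heuristic, not the real `pytorch_tokenizers` implementation — a `get_tokenizer`-style dispatcher could look like:

```python
# Hypothetical sketch of a get_tokenizer-style factory; the real
# pytorch_tokenizers.get_tokenizer has its own artifact-detection logic.

class Llama2cTokenizer:
    """Stand-in for the SentencePiece-backed llama2.c tokenizer."""
    def __init__(self, model_path: str):
        self.model_path = model_path

class TiktokenTokenizer:
    """Stand-in for a BPE tokenizer loaded from a JSON/text artifact."""
    def __init__(self, model_path: str):
        self.model_path = model_path

def get_tokenizer(model_path: str):
    # Dispatch on the artifact's file extension (illustrative heuristic only).
    if model_path.endswith((".model", ".bin")):
        return Llama2cTokenizer(model_path)
    return TiktokenTokenizer(model_path)

tok = get_tokenizer("tokenizer.model")
print(type(tok).__name__)  # prints "Llama2cTokenizer"
```

Because the factory hides the concrete class behind one entry point, callers such as `llama.py` only need their import lines updated, not their tokenizer-handling logic.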
