**TESTING IN PROGRESS**
Built on the `apache/airflow:2.10.2-python3.11` base image, updating the `requirements-composer-2.11.1-airflow-2.10.2.txt` file with the appropriate dependencies. Build the test image with `docker build . -t calitp-airflow-test`.
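For context, a minimal sketch of what the test image's Dockerfile might look like. Only the base image tag and the requirements filename come from this PR; the destination paths and the `pip install` invocation are assumptions, not the repo's actual Dockerfile:

```dockerfile
# Base image tag from this PR
FROM apache/airflow:2.10.2-python3.11

# Requirements filename from this PR; the /tmp destination is an assumption
COPY requirements-composer-2.11.1-airflow-2.10.2.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt
```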
As of 2025 Feb 6, the following packages in the `requirements.txt` of the airflow image need to be updated to work with Python 3.11:

- `boto3==1.36.15`, which requires botocore 1.29.165, which in turn requires `urllib3<1.27`, which is too old for the newer Composer images.
- `platformdirs<3,>=2.5`, whereas the current Composer image requires 4.3.6. Oddly, the previous Composer image we were using required `platformdirs==3.2.0`, so I'm not actually sure how that was working without conflict (except that our requirements were installed after the Composer requirements).
- `pydantic==1.9`, because of typing extension conflicts, which should no longer be an issue, as we are using Python 3.11 in the new Composer image.

Additionally, the following package version requirements have been loosened in calitp-data-infra:
- `pydantic = ">1.9"`
- `pendulum = ">2.1.2"`
- `google-cloud-secret-manager = ">2.16.4"`
Replaced all imports from `pydantic` with the `pydantic.v1` compatibility layer provided by Pydantic 2+ (found with the regex `pydantic(?!\.v1)`).

In order to test the updated calitp-data-infra package, I had to copy the packages folder into the airflow folder, add `COPY ./packages/calitp-data-infra/ /tmp/calitp-data-infra/` to the airflow Dockerfile, and then use `calitp-data-infra @ file:///tmp/calitp-data-infra`
in the requirements file.

As a follow-on, we should consider upgrading our use of Pydantic to the latest version. The `bump-pydantic` tool should help with that.

## Description
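As a quick sanity check of the `pydantic(?!\.v1)` negative-lookahead pattern used during the migration above, here is a minimal sketch. The import strings are illustrative examples, not lines taken from the repo:

```python
import re

# Negative lookahead: match "pydantic" only when it is NOT followed by ".v1",
# i.e. imports that still need to be migrated to the compatibility layer.
PATTERN = re.compile(r"pydantic(?!\.v1)")

unmigrated = "from pydantic import BaseModel"   # still points at Pydantic 1-style API
migrated = "from pydantic.v1 import BaseModel"  # already on the v1 compatibility layer

print(bool(PATTERN.search(unmigrated)))  # True  -> needs migration
print(bool(PATTERN.search(migrated)))    # False -> already migrated
```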
Describe your changes and why you're making them. Please include the context, motivation, and relevant dependencies.
Resolves #[issue]
## Type of change

## How has this been tested?
Include commands/logs/screenshots as relevant.
If making changes to dbt models, please run the command `poetry run dbt run -s CHANGED_MODEL` and `poetry run dbt test -s CHANGED_MODEL`, then include the output in this section of the PR.

## Post-merge follow-ups
Document any actions that must be taken post-merge to deploy or otherwise implement the changes in this PR (for example, running a full refresh of some incremental model in dbt). If these actions will take more than a few hours after the merge or if they will be completed by someone other than the PR author, please create a dedicated follow-up issue and link it here to track resolution.