
Commit 8a3a958

Merge pull request #346 from tillahoffmann/fetch-requirements
Add script to fetch requirements from GitHub Action artifacts.
2 parents c0f051b + bd81fc7 commit 8a3a958

13 files changed: +1031 −48 lines

.github/PULL_REQUEST_TEMPLATE/new_container.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,7 +2,7 @@ You have implemented a new container and would like to contribute it? Great! Her
 
 - [ ] Create a new feature directory and populate it with the package structure [described in the documentation](https://testcontainers-python.readthedocs.io/en/latest/#package-structure). Copying one of the existing features is likely the best way to get started.
 - [ ] Implement the new feature (typically in `__init__.py`) and corresponding tests.
-- [ ] Add a line `-e file:[feature name]` to `requirements.in` and run `make requirements`. This command will find any new requirements and generate lock files to ensure reproducible builds (see the [pip-tools documentation](https://pip-tools.readthedocs.io/en/latest/) for details). Then run `pip install -r requirements/[your python version].txt` to install the new requirements.
 - [ ] Update the feature `README.rst` and add it to the table of contents (`toctree` directive) in the top-level `README.rst`.
 - [ ] Add a line `[feature name]` to the list of components in the GitHub Action workflow in `.github/workflows/main.yml` to run tests, build, and publish your package when pushed to the `main` branch.
 - [ ] Rebase your development branch on `main` (or merge `main` into your development branch).
+- [ ] Add a line `-e file:[feature name]` to `requirements.in` and open a pull request. Opening a pull request will automatically generate lock files to ensure reproducible builds (see the [pip-tools documentation](https://pip-tools.readthedocs.io/en/latest/) for details). Finally, run `python get_requirements.py --pr=[your PR number]` to fetch the updated requirement files (the build needs to have succeeded).
```
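The `python get_requirements.py --pr=[your PR number]` step added above must locate the one CI run that produced the lock-file artifacts for a commit. A minimal sketch of that selection step, written as a pure function over the JSON that GitHub's `/actions/runs` endpoint returns (the sample payload below is hypothetical):

```python
def pick_requirements_run(workflow_runs: list) -> int:
    """Select the unique successful run of requirements.yml for a commit."""
    runs = [r for r in workflow_runs if r["path"].endswith("requirements.yml")]
    if len(runs) != 1:
        raise RuntimeError(f"could not identify unique workflow run: {runs}")
    return runs[0]["id"]


# Hypothetical payload resembling GitHub's "list workflow runs" response.
sample = [
    {"id": 101, "path": ".github/workflows/main.yml"},
    {"id": 102, "path": ".github/workflows/requirements.yml"},
]
print(pick_requirements_run(sample))  # 102
```

Insisting on exactly one match guards against ambiguous results when several workflows ran for the same head commit.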

.github/workflows/requirements.yml

Lines changed: 1 addition & 0 deletions

```diff
@@ -8,6 +8,7 @@ on:
 jobs:
   requirements:
     strategy:
+      fail-fast: false
       matrix:
         runtime:
           - machine: ubuntu-latest
```

.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -72,3 +72,4 @@ venv
 .DS_Store
 .python-version
 .env
+.github-token
```
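The `.github-token` entry is ignored because `get_requirements.py` can cache the GitHub access token in that file. Its lookup order (explicit `--token` flag, then the cached file, then an interactive prompt) can be sketched as follows; the function name and the `None` return for "prompt the user" are illustrative:

```python
import pathlib


def resolve_token(cli_token, cache=pathlib.Path(".github-token")):
    """Return the token from the --token flag, else from the cached
    .github-token file, else None (the script then prompts the user)."""
    if cli_token:
        return cli_token
    if cache.is_file():
        return cache.read_text().strip()
    return None


print(resolve_token("abc123"))  # abc123
```

Keeping the cache file out of version control matters because the token grants `public_repo` access.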

Makefile

Lines changed: 1 addition & 9 deletions

```diff
@@ -1,7 +1,6 @@
 PYTHON_VERSIONS = 3.7 3.8 3.9 3.10 3.11
 PYTHON_VERSION ?= 3.10
 IMAGE = testcontainers-python:${PYTHON_VERSION}
-REQUIREMENTS = $(addprefix requirements/ubuntu-latest-,${PYTHON_VERSIONS:=.txt})
 RUN = docker run --rm -it
 # Get all directories that contain a setup.py and get the directory name.
 PACKAGES = $(subst /,,$(dir $(wildcard */setup.py)))
@@ -43,7 +42,7 @@ ${UPLOAD} : %/upload :
 	fi
 
 # Targets to build docker images
-image: requirements/${PYTHON_VERSION}.txt
+image: requirements/ubuntu-latest-${PYTHON_VERSION}.txt
 	docker build --build-arg version=${PYTHON_VERSION} -t ${IMAGE} .
 
 # Targets to run tests in docker containers
@@ -63,13 +62,6 @@ doctest : ${DOCTESTS}
 ${DOCTESTS} : %/doctest :
 	sphinx-build -b doctest -c doctests $* docs/_build
 
-# Targets to build requirement files
-requirements : ${REQUIREMENTS}
-${REQUIREMENTS} : requirements/%.txt : requirements.in */setup.py
-	mkdir -p $(dir $@)
-	${RUN} -w /workspace -v `pwd`:/workspace --platform=linux/amd64 python:$* bash -c \
-		"pip install pip-tools && pip-compile --resolver=backtracking -v --upgrade -o $@ $<"
-
 # Remove any generated files.
 clean :
 	rm -rf docs/_build
```

README.rst

Lines changed: 1 addition & 8 deletions

```diff
@@ -108,11 +108,4 @@ Testcontainers is a collection of `implicit namespace packages <https://peps.pyt
 Contributing a New Feature
 ^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-You want to contribute a new feature or container? Great! You can do that in six steps.
-
-1. Create a new feature directory and populate it with the [package structure]_ as described above. Copying one of the existing features is likely the best way to get started.
-2. Implement the new feature (typically in :code:`__init__.py`) and corresponding tests.
-3. Add a line :code:`-e file:[feature name]` to :code:`requirements.in` and run :code:`make requirements`. This command will find any new requirements and generate lock files to ensure reproducible builds (see the `pip-tools <https://pip-tools.readthedocs.io/en/latest/>`__ documentation for details). Then run :code:`pip install -r requirements/[your python version].txt` to install the new requirements.
-4. Update the feature :code:`README.rst` and add it to the table of contents (:code:`toctree` directive) in the top-level :code:`README.rst`.
-5. Add a line :code:`[feature name]` to the list of components in the GitHub Action workflow in :code:`.github/workflows/main.yml` to run tests, build, and publish your package when pushed to the :code:`main` branch.
-6. Rebase your development branch on :code:`main` (or merge :code:`main` into your development branch).
+You want to contribute a new feature or container? Great! You can do that in six steps as outlined `here <https://github.com/testcontainers/testcontainers-python/blob/main/.github/PULL_REQUEST_TEMPLATE/new_container.md>`__.
```

get_requirements.py

Lines changed: 94 additions & 0 deletions

```python
import argparse
import io
import pathlib
import requests
import shutil
import tempfile
import zipfile


def __main__() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--owner", default="testcontainers")
    parser.add_argument("--repo", default="testcontainers-python")
    parser.add_argument("--run", help="GitHub Action run id")
    parser.add_argument("--pr", help="GitHub PR number")
    parser.add_argument("--branch", default="main")
    parser.add_argument("--token", help="GitHub authentication token")
    args = parser.parse_args()

    # Get an access token.
    if args.token:
        token = args.token
    elif (path := pathlib.Path(".github-token")).is_file():
        token = path.read_text().strip()
    else:
        token = input("we need a GitHub access token to fetch the requirements; please visit "
                      "https://github.com/settings/tokens/new, create a token with `public_repo` "
                      "scope, and paste it here: ").strip()
        cache = input("do you want to cache the token in a `.github-token` file [Ny]? ")
        if cache.lower().startswith("y"):
            path.write_text(token)

    headers = {
        "Authorization": f"Bearer {token}",
    }
    base_url = f"https://api.github.com/repos/{args.owner}/{args.repo}"

    if args.run:  # Run id was specified; use it directly.
        run = args.run
    else:
        if args.pr:  # PR was specified; get its most recent commit.
            print(f"fetching most recent commit for PR #{args.pr}")
            response = requests.get(f"{base_url}/pulls/{args.pr}", headers=headers)
            response.raise_for_status()
            response = response.json()
            head_sha = response["head"]["sha"]
        else:  # Nothing was specified; get the most recent commit on the branch.
            print(f"fetching most recent commit for branch `{args.branch}`")
            response = requests.get(f"{base_url}/branches/{args.branch}", headers=headers)
            response.raise_for_status()
            response = response.json()
            head_sha = response["commit"]["sha"]

        # List all completed runs and find the one that generated the requirements.
        response = requests.get(f"{base_url}/actions/runs", headers=headers, params={
            "head_sha": head_sha,
            "status": "success",
        })
        response.raise_for_status()
        response = response.json()

        # Get the requirements run.
        runs = [run for run in response["workflow_runs"] if
                run["path"].endswith("requirements.yml")]
        if len(runs) != 1:
            raise RuntimeError(f"could not identify unique workflow run: {runs}")
        run = runs[0]["id"]

    # Get all the artifacts.
    print(f"fetching artifacts for run {run} ...")
    url = f"{base_url}/actions/runs/{run}/artifacts"
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    response = response.json()
    artifacts = response["artifacts"]
    print(f"discovered {len(artifacts)} artifacts")

    # Get the content for each artifact and save it to the requirements directory.
    for artifact in artifacts:
        name: str = artifact["name"]
        name = name.removeprefix("requirements-")
        print(f"fetching artifact {name} ...")
        response = requests.get(artifact["archive_download_url"], headers=headers)
        response.raise_for_status()
        with zipfile.ZipFile(io.BytesIO(response.content)) as archive, \
                tempfile.TemporaryDirectory() as tempdir:
            archive.extract("requirements.txt", tempdir)
            shutil.move(pathlib.Path(tempdir) / "requirements.txt",
                        pathlib.Path("requirements") / name)

    print("done")


if __name__ == "__main__":
    __main__()
```
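The per-artifact handling at the end of the script (strip the `requirements-` prefix from the artifact name, unzip the contained `requirements.txt`, and move it into the `requirements/` directory) can be exercised without touching the network. The helper name and the sample pin `requests==2.28.1` below are illustrative; the zip archive is built in memory to stand in for a downloaded artifact:

```python
import io
import pathlib
import shutil
import tempfile
import zipfile


def extract_requirements(zip_bytes: bytes, artifact_name: str,
                         dest_dir: pathlib.Path) -> pathlib.Path:
    """Mimic the script's artifact handling: strip the 'requirements-' prefix
    and unpack the contained requirements.txt into dest_dir under that name."""
    name = artifact_name.removeprefix("requirements-")
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive, \
            tempfile.TemporaryDirectory() as tmp:
        archive.extract("requirements.txt", tmp)
        target = dest_dir / name
        shutil.move(pathlib.Path(tmp) / "requirements.txt", target)
    return target


# Build an in-memory artifact with hypothetical content to exercise the helper.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as archive:
    archive.writestr("requirements.txt", "requests==2.28.1\n")

with tempfile.TemporaryDirectory() as dest:
    path = extract_requirements(buf.getvalue(),
                                "requirements-ubuntu-latest-3.10.txt",
                                pathlib.Path(dest))
    print(path.name)  # ubuntu-latest-3.10.txt
```

The prefix stripping is what maps a CI artifact such as `requirements-ubuntu-latest-3.10` onto the lock-file name the Makefile's `image` target expects.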
