Migrate to Hatch and add a _legacy dependencies_ flavour #54

Open · wants to merge 8 commits into base: main
8 changes: 6 additions & 2 deletions .dockerignore
@@ -6,13 +6,17 @@
# Now un-exclude:
#
!src
!compile-protos.sh
!compile_protos.py
!dynamic_dependencies.py
!dataclay-common
!requirements.txt
!requirements-legacydeps.txt
!requirements-dev.txt
!pyproject.toml
!README.md
!MANIFEST.in
!tests
!tox.ini
.dockerignore

# Reexclude protos
src/dataclay/proto
Member:

Won't it prevent dataClay from running out of the box without having to generate the protos first?

Member Author:

In theory, the [tool.hatch.build.hooks.custom] section of pyproject.toml is responsible for calling the compile_protos script at build time.

This "build time" should be:

  • when doing a pip install -e . for development
  • when doing a build and generating the wheel
  • when doing a pip install git+https:// (also development-adjacent)

Am I wrong? Could you confirm that's the case?
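A rough sketch of the pyproject.toml wiring being described here (the section and option names follow Hatchling's custom-hook conventions; this is an assumed reconstruction, not a copy of the PR's actual file):

```toml
# Sketch (assumed) of how the two hook scripts plug into Hatchling.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
# Dependencies are computed by the metadata hook below.
dynamic = ["dependencies"]

[tool.hatch.build.hooks.custom]
# Runs compile_protos.py's CustomBuildHook at build time, i.e. on
# `pip install -e .`, on wheel builds, and on `pip install git+...`.
path = "compile_protos.py"

[tool.hatch.metadata.hooks.custom]
# Picks requirements.txt or requirements-legacydeps.txt via LEGACY_DEPS.
path = "dynamic_dependencies.py"
```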

20 changes: 15 additions & 5 deletions .github/workflows/docker-publish.yml
@@ -63,14 +63,24 @@ jobs:
type=edge,suffix=-py${{ matrix.python-version }},branch=main
type=edge,enable=${{ matrix.python-version == '3.10' }},branch=main

# Extract metadata (tags, labels) for Docker
- name: Extract Docker metadata
id: legacy-deps-meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=edge,suffix=-legacydeps-py${{ matrix.python-version }},branch=main
Member:


This will create all tags with the -legacydeps suffix. Is that correct?

Member Author:


There are two metadata sections, id: meta and id: legacy-deps-meta, and two build steps, build-and-push and build-and-push-legacy-deps. If I didn't do anything wrong, they should produce separate images: we should get both the historical regular tags and the -legacydeps ones that configure the legacy dependencies. At least, that is what I intended.

type=edge,suffix=-legacydeps,enable=${{ matrix.python-version == '3.10' }},branch=main

# Build and push Docker image with Buildx (don't push on PR)
- name: Build and push Docker image
id: build-and-push
uses: docker/build-push-action@v6
with:
context: .
platforms: linux/amd64,linux/arm64
build-args:
build-args: |
PYTHON_VERSION=${{ matrix.python-version }}-bullseye
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
@@ -84,12 +94,12 @@ jobs:
uses: docker/build-push-action@v6
with:
context: .
file: Dockerfile.legacy-deps
platforms: linux/amd64,linux/arm64
build-args:
build-args: |
PYTHON_VERSION=${{ matrix.python-version }}-bullseye
LEGACY_DEPS=True
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}-legacydeps
labels: ${{ steps.meta.outputs.labels }}
tags: ${{ steps.legacy-deps-meta.outputs.tags }}
labels: ${{ steps.legacy-deps-meta.outputs.labels }}
cache-from: type=gha
cache-to: type=gha,mode=max
1 change: 1 addition & 0 deletions .gitignore
@@ -24,3 +24,4 @@ tests/mock/client.properties
/.vscode
.coverage*
coverage.xml
/src/dataclay/proto/
4 changes: 3 additions & 1 deletion Dockerfile
@@ -5,8 +5,10 @@ ARG PYTHON_VERSION=3.10-bookworm
# install dataclay
FROM python:$PYTHON_VERSION
COPY . /app

ARG LEGACY_DEPS=False
RUN python -m pip install --upgrade pip \
&& python -m pip install /app[telemetry]
&& python -m pip install --config-settings=LEGACY_DEPS=$LEGACY_DEPS /app[telemetry]

# prepare dataclay storage dir
RUN mkdir -p /data/storage;
4 changes: 3 additions & 1 deletion Dockerfile.dev
@@ -5,8 +5,10 @@ ARG PYTHON_VERSION=3.10-bookworm
# install dataclay
FROM python:$PYTHON_VERSION
COPY . /app

ARG LEGACY_DEPS=False
RUN python -m pip install --upgrade pip \
&& python -m pip install -e /app[telemetry,dev]
&& python -m pip install --config-settings=LEGACY_DEPS=$LEGACY_DEPS -e /app[telemetry,dev]

# prepare dataclay storage dir
RUN mkdir -p /data/storage;
22 changes: 0 additions & 22 deletions Dockerfile.legacy-deps

This file was deleted.

16 changes: 0 additions & 16 deletions Dockerfile.legacy-deps.dev

This file was deleted.

16 changes: 16 additions & 0 deletions PUBLISH.md
@@ -68,6 +68,22 @@
docker buildx build --platform linux/amd64,linux/arm64 \
-t ghcr.io/bsc-dom/dataclay:$VERSION-py3.13-bookworm \
--build-arg PYTHON_VERSION=3.13-bookworm --push .
# Repeat for Python 3.9 and 3.10 with the _legacy dependency flavour_
# Build and push Python 3.9 bookworm
docker buildx build --platform linux/amd64,linux/arm64 \
-t ghcr.io/bsc-dom/dataclay:$VERSION-legacydeps-py3.9-bookworm \
--build-arg PYTHON_VERSION=3.9-bookworm \
--build-arg LEGACY_DEPS=True \
--push .
# Build and push Python 3.10 bookworm
docker buildx build --platform linux/amd64,linux/arm64 \
-t ghcr.io/bsc-dom/dataclay:$VERSION-legacydeps-py3.10-bookworm \
-t ghcr.io/bsc-dom/dataclay:$VERSION-legacydeps \
--build-arg PYTHON_VERSION=3.10-bookworm \
--build-arg LEGACY_DEPS=True \
--push .
```

6. Publish the release distribution to PyPI:
7 changes: 0 additions & 7 deletions compile-protos.sh

This file was deleted.

72 changes: 72 additions & 0 deletions compile_protos.py
@@ -0,0 +1,72 @@
#!/usr/bin/env python3

import inspect
from importlib import resources
from typing import Any

try:
from hatchling.builders.hooks.plugin.interface import BuildHookInterface
except ModuleNotFoundError:
if __name__ != "__main__":
# If we are not being run interactively, then that is an error
raise
BuildHookInterface = object


def run_protoc():
# Here because during the build process, CustomBuildHook will be imported
# *before* knowing the dependencies of the hook itself.
import grpc_tools.protoc

grpc_tools_proto = (resources.files("grpc_tools") / "_proto").resolve()
grpc_tools.protoc.main(
[
"grpc_tools.protoc",
"--proto_path=dataclay-common",
"--python_out=src",
"--grpc_python_out=src",
"dataclay-common/dataclay/proto/common/common.proto",
"dataclay-common/dataclay/proto/backend/backend.proto",
"dataclay-common/dataclay/proto/metadata/metadata.proto",
f"-I{grpc_tools_proto}",
]
)


def find_config_settings_in_hatchling() -> dict[str, Any]:
# Terrible workaround (their words, not mine) given by @virtuald
# https://github.com/pypa/hatch/issues/1072#issuecomment-2448985229
# Hopefully this will be fixed in the future
for frame_info in inspect.stack():
frame = frame_info.frame
module = inspect.getmodule(frame)
if (
module
and module.__name__.startswith("hatchling.build")
and "config_settings" in frame.f_locals
):
return frame.f_locals["config_settings"] or {}

return {}


class CustomBuildHook(BuildHookInterface):
def initialize(self, version: str, build_data: dict[str, Any]) -> None:
run_protoc()

def dependencies(self):
if find_config_settings_in_hatchling().get("LEGACY_DEPS", "False").lower() in (
"true",
"on",
"1",
"y",
"yes",
):
return ["grpcio-tools==1.48.2"]
else:
return ["grpcio-tools==1.67.1"]


if __name__ == "__main__":
run_protoc()
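The LEGACY_DEPS handling in `dependencies()` above accepts several truthy spellings. A standalone sketch of that logic (the `is_legacy` helper name is illustrative; the PR inlines the check):

```python
# Sketch of the LEGACY_DEPS flag handling in CustomBuildHook.dependencies().
# The helper name `is_legacy` is illustrative; the PR inlines this check.

def is_legacy(config_settings: dict[str, str]) -> bool:
    """Interpret the LEGACY_DEPS config setting as a boolean flag."""
    return config_settings.get("LEGACY_DEPS", "False").lower() in (
        "true", "on", "1", "y", "yes",
    )

def grpcio_tools_requirement(config_settings: dict[str, str]) -> str:
    # Old pin for the legacy-deps flavour, current pin otherwise.
    if is_legacy(config_settings):
        return "grpcio-tools==1.48.2"
    return "grpcio-tools==1.67.1"

print(grpcio_tools_requirement({"LEGACY_DEPS": "True"}))  # grpcio-tools==1.48.2
print(grpcio_tools_requirement({}))                       # grpcio-tools==1.67.1
```

Passing `--config-settings=LEGACY_DEPS=True` to pip, as the Dockerfiles in this PR do, is what populates `config_settings` here.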
81 changes: 81 additions & 0 deletions dynamic_dependencies.py
@@ -0,0 +1,81 @@
import inspect
import os
import re
from typing import Any

from hatchling.metadata.plugin.interface import MetadataHookInterface
from packaging.requirements import Requirement
from packaging.utils import canonicalize_name

# The helper functions in this file are heavily inspired by the approach laid
# out by the hatch plugin `hatch-requirements-txt`.


COMMENT_RE = re.compile(r"(^|\s+)#.*$")
PIP_COMMAND_RE = re.compile(r"\s+(-[A-Za-z]|--[A-Za-z]+)")


def find_config_settings_in_hatchling() -> dict[str, Any]:
# Terrible workaround (their words, not mine) given by @virtuald
# https://github.com/pypa/hatch/issues/1072#issuecomment-2448985229
# Hopefully this will be fixed in the future
for frame_info in inspect.stack():
frame = frame_info.frame
module = inspect.getmodule(frame)
if (
module
and module.__name__.startswith("hatchling.build")
and "config_settings" in frame.f_locals
):
return frame.f_locals["config_settings"] or {}

return {}


def parse_requirements(requirements: list[str]) -> tuple[list[Requirement], list[str]]:
comments = []
parsed_requirements: list[Requirement] = []

for line in requirements:
if line.lstrip().startswith("#"):
comments.append(line)
elif line.lstrip().startswith("-"):
# Likely an argument to pip from a requirements.txt file intended for pip
# (e.g. from pip-compile)
pass
elif line:
# Strip comments from end of line
line = COMMENT_RE.sub("", line)
if "-" in line:
line = PIP_COMMAND_RE.split(line)[0]
req = Requirement(line)
req.name = canonicalize_name(req.name)
parsed_requirements.append(req)

return parsed_requirements, comments


def load_requirements(filename: str) -> list[str]:
if not os.path.isfile(filename):
raise FileNotFoundError(filename)
with open(filename, encoding="UTF-8") as fp:
contents = fp.read()
# Unfold lines ending with \
contents = re.sub(r"\\\s*\n", " ", contents)
parsed_requirements, _ = parse_requirements(contents.splitlines())

return [str(r) for r in parsed_requirements]


class DynamicDependenciesMetaDataHook(MetadataHookInterface):
def update(self, metadata):
if find_config_settings_in_hatchling().get("LEGACY_DEPS", "False").lower() in (
"true",
"on",
"1",
"y",
"yes",
):
metadata["dependencies"] = load_requirements("requirements-legacydeps.txt")
else:
metadata["dependencies"] = load_requirements("requirements.txt")
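The requirements parsing above follows a few rules: full-line comments and pip flags are skipped, trailing comments are stripped, and backslash continuations are unfolded. A compact re-creation for a sanity check (this sketch only normalizes text; the real hook additionally validates each line with `packaging.requirements.Requirement` and canonicalizes the name):

```python
import re

# Same trailing-comment regex as dynamic_dependencies.py.
COMMENT_RE = re.compile(r"(^|\s+)#.*$")

def parse(contents: str) -> list[str]:
    """Minimal re-creation of the hook's requirements-file parsing."""
    # Unfold lines ending with a backslash, as load_requirements() does.
    contents = re.sub(r"\\\s*\n", " ", contents)
    reqs: list[str] = []
    for line in contents.splitlines():
        stripped = line.lstrip()
        if not stripped or stripped.startswith("#") or stripped.startswith("-"):
            continue  # blank lines, full-line comments, pip flags
        reqs.append(COMMENT_RE.sub("", line).strip())
    return reqs

sample = "# pinned\ngrpcio==1.67.1  # runtime dep\n-r extra.txt\nprotobuf \\\n>=4.0\n"
print(parse(sample))  # ['grpcio==1.67.1', 'protobuf  >=4.0']
```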
22 changes: 5 additions & 17 deletions noxfile.py
@@ -3,11 +3,11 @@
# Define which Python versions to test with
PYPROJECT = nox.project.load_toml("pyproject.toml")
PYTHON_VERSIONS = nox.project.python_versions(PYPROJECT)
Member:

I get the following error when running nox:

AttributeError: module 'nox.project' has no attribute 'python_versions'

Member Author:

That's a feature that was added in October, lol.

wntrblm/nox@040a93c

Upgrade your nox. Given that nox is for developers only, I think we can afford to be a little bit of an early adopter on that 🙃

DEFAULT_PYTHON = "3.10" # Arbitrary decision, choose a reliable version
DEFAULT_PYTHON = "3.10" # Modern-ish version compatible with the legacy-deps

# Default sessions (these will be executed in Github Actions)
# Maintain a clear separation between code checking and code altering tasks (don't add format)
nox.options.sessions = ["lint", "tests"]
nox.options.sessions = ["lint", "tests", "legacy_deps_tests"]
# nox.options.reuse_existing_virtualenvs = True # TODO: Check if necessary


@@ -22,21 +22,9 @@ def tests(session):
@nox.session(python=["3.9", "3.10"], tags=["citests"])
def legacy_deps_tests(session):
"""Run the test suite with legacy dependencies."""
session.install("grpcio-tools==1.48.2", "pytest", "pytest-asyncio", "pytest-docker", "pytest-cov", "-r", "requirements-legacydeps.txt")
session.run(
# See compile-protos.sh, it should be the same command
"python3",
"-m",
"grpc_tools.protoc",
"--proto_path=dataclay-common",
"--python_out=src",
"--grpc_python_out=src",
"dataclay-common/dataclay/proto/common/common.proto",
"dataclay-common/dataclay/proto/backend/backend.proto",
"dataclay-common/dataclay/proto/metadata/metadata.proto",
)

session.install(".", "--no-deps")
session.install("pytest", "pytest-asyncio", "pytest-docker", "pytest-cov")

session.install("--config-settings=LEGACY_DEPS=True", ".")
session.run("pytest", "--disable-warnings", "--cov", "--cov-report=term-missing", "--build-legacy-deps", "tests/functional")

