Commit
Merge remote-tracking branch 'origin/main' into 782-add-rbac-to-workflows
Sparrow1029 committed Feb 26, 2025
2 parents c72f8b4 + 06ed403 commit 43aeacb
Showing 90 changed files with 1,314 additions and 296 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 2.10.0rc1
current_version = 3.0.0rc1
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(rc(?P<build>\d+))?
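As a sketch of what the `parse` pattern above accepts, it can be exercised directly with Python's `re` module (the version strings here are just illustrative):

```python
import re

# The bumpversion parse pattern from .bumpversion.cfg
PARSE = re.compile(r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(rc(?P<build>\d+))?")

# A release-candidate version such as the new 3.0.0rc1 matches with a build number
match = PARSE.fullmatch("3.0.0rc1")
print(match.group("major"), match.group("minor"), match.group("patch"), match.group("build"))
# → 3 0 0 1

# The rc suffix is optional, so plain releases match too (build is None)
print(PARSE.fullmatch("2.10.0").group("build"))
# → None
```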
71 changes: 71 additions & 0 deletions .github/workflows/run-codspeed-tests.yml
@@ -0,0 +1,71 @@
name: CodSpeed

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  codspeed:
    name: Run benchmarks
    runs-on: ubuntu-latest
    container:
      image: python:3.11
      options: --privileged
    services:
      postgres:
        image: postgres:15-alpine
        # Provide the password for postgres
        env:
          POSTGRES_PASSWORD: nwa
          POSTGRES_USER: nwa
        # Set health checks to wait until postgres has started
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      redis:
        # Docker Hub image
        image: redis
        # Set health checks to wait until redis has started
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      # Download a copy of the code in the repository before running CI tests
      - name: Check out repository code
        uses: actions/checkout@v3

      - name: Install dependencies
        run: |
          apt update
          apt install curl git build-essential libpq-dev libffi-dev -y
          python -m pip install --upgrade pip
          pip install flit
          flit install --deps develop --symlink
          echo "GIT_COMMIT_HASH=\"test\"" > orchestrator/version.py
        env:
          FLIT_ROOT_INSTALL: 1

      # Prevent error "repository path is not owned by the current user"
      - name: Fix git owner
        run: git config --global --add safe.directory "*"

      # Add the cargo binary directory to the PATH, since CodSpeed's installer script does not do this itself
      - name: Add $HOME/.cargo/bin to PATH
        run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"

      - uses: CodSpeedHQ/action@v3
        with:
          run: CACHE_URI=redis://redis DATABASE_URI=postgresql://$POSTGRES_USER:$POSTGRES_PASSWORD@$POSTGRES_HOST/$POSTGRES_DB pytest test/unit_tests --codspeed
          token: ${{ secrets.CODSPEED_TOKEN }}
        env:
          POSTGRES_DB: orchestrator-core-test
          POSTGRES_USER: nwa
          POSTGRES_PASSWORD: nwa
          POSTGRES_HOST: postgres
          ENVIRONMENT: TESTING
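The `DATABASE_URI` handed to pytest in the step above is assembled from the job's env block at runtime. Expanded with the values from this workflow, it is equivalent to the following sketch:

```python
# Values as set in the workflow's env block for the CodSpeed step
env = {
    "POSTGRES_DB": "orchestrator-core-test",
    "POSTGRES_USER": "nwa",
    "POSTGRES_PASSWORD": "nwa",
    "POSTGRES_HOST": "postgres",  # the postgres service container's hostname
}

# Same shape as the shell expansion in the workflow's run: line
database_uri = "postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}@{POSTGRES_HOST}/{POSTGRES_DB}".format(**env)
print(database_uri)
# → postgresql://nwa:nwa@postgres/orchestrator-core-test
```

The `postgres` hostname resolves inside the job because GitHub Actions networks service containers by their service name.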
7 changes: 0 additions & 7 deletions .github/workflows/run-linting-tests.yml
@@ -29,13 +29,6 @@ jobs:
python -m pip install --upgrade pip
pip install flit
flit install --deps develop --symlink
- name: Check formatting
run: |
black --check .
- name: Lint with ruff
run: |
# stop the build if there are Python syntax errors or undefined names
ruff check .
- name: Check with mypy
run: |
mypy .
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -3,7 +3,7 @@ default_language_version:
exclude: ^test/unit_tests/cli/data/generate/.*\.py|orchestrator/vendor.*
repos:
- repo: https://github.com/psf/black
rev: 24.8.0
rev: 25.1.0
hooks:
- id: black
# - repo: https://github.com/asottile/blacken-docs
@@ -14,12 +14,12 @@ repos:
# Disabling since this cannot parse bit shift operators how we've overloaded them and you can't ignore lines.
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.6.1
rev: v0.9.6
hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix, --show-fixes]
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
rev: v5.0.0
hooks:
- id: trailing-whitespace
exclude: .bumpversion.cfg
7 changes: 7 additions & 0 deletions README.md
@@ -191,3 +191,10 @@ You can do the necessary change with a clean, e.g. every change committed, branc
```shell
bumpversion patch --new-version 0.4.1-rc3
```

### Changing the Core database schema
To change the core database schema, execute the following steps:

- Create the new model in `orchestrator/database/models.py`
- `cd orchestrator/migrations`
- `alembic revision --autogenerate -m "Name of the migration"`
14 changes: 14 additions & 0 deletions docs/contributing/guidelines.md
@@ -6,6 +6,20 @@ free to raise an issue in the project. We will strive to reply to your enquiry A
## Documentation
We use [**MKDOCS**](https://www.mkdocs.org) as a documentation tool. Please create a PR if you have any additions or contributions to make. All docs can be written in MD or html. Full guidelines on how to set this up can be found [here](development.md).

## Pre-commit hooks
We use pre-commit hooks to ensure that the code is formatted correctly and that the tests pass. To install the
pre-commit hooks, run the following command:

```shell
pre-commit install
```

To run the pre-commit hooks manually, run the following command:

```shell
pre-commit run --all-files
```

## Orchestrator release
The `orchestrator-core` has no release schedule but is actively used and maintained by the workflow orchestrator group.
Creating a new release is done by the developers of the project and the procedure is as follows.
18 changes: 18 additions & 0 deletions docs/migration-guide/3.0.md
@@ -0,0 +1,18 @@
---
hide:
- navigation
---
# 3.0 Migration Guide

In this document we describe the steps that should be taken to migrate from `orchestrator-core` v2 to v3.

## About 3.0

In this release, deprecated imports from the `orchestrator.types` module are removed; these types are now provided by `pydantic_forms.types` instead. Import statements in your implementation of the orchestrator will have to be updated as well.

## Steps

To update the import statements you may have in your implementation of Workflow Orchestrator, we offer a migration
script that can be run as follows: `python -m orchestrator.devtools.scripts.migrate_30 <dir>` where `<dir>` points to
your orchestrator implementation.
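The `migrate_30` script itself is authoritative; purely as an illustration of the kind of rewrite involved, the import change can be sketched with a stdlib regex (the assumption that a whole `from orchestrator.types import …` line moves wholesale is a simplification):

```python
import re

# Matches deprecated imports such as: from orchestrator.types import JSON, State
OLD_IMPORT = re.compile(r"^(\s*)from orchestrator\.types import (?P<names>.+)$", re.MULTILINE)

def rewrite_imports(source: str) -> str:
    """Rewrite deprecated orchestrator.types imports to pydantic_forms.types."""
    return OLD_IMPORT.sub(r"\1from pydantic_forms.types import \g<names>", source)

print(rewrite_imports("from orchestrator.types import JSON, State"))
# → from pydantic_forms.types import JSON, State
```

This mirrors the change visible in this commit's own diff of `orchestrator/api/api_v1/endpoints/processes.py`, where `JSON` and `State` now come from `pydantic_forms.types`.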
4 changes: 3 additions & 1 deletion mkdocs.yml
@@ -189,7 +189,9 @@ nav:
# - Workflow Lifecycles: reference-docs/workflows/workflow-lifecycles.md
- Callbacks: reference-docs/workflows/callbacks.md
- Websockets: reference-docs/websockets.md
- Migration guide: migration-guide/2.0.md
- Migration guides:
- 2.0: migration-guide/2.0.md
- 3.0: migration-guide/3.0.md

- Workshops:
# - Beginner:
35 changes: 0 additions & 35 deletions mutmut_config.py

This file was deleted.

2 changes: 1 addition & 1 deletion orchestrator/__init__.py
@@ -13,7 +13,7 @@

"""This is the orchestrator workflow engine."""

__version__ = "2.10.0rc1"
__version__ = "3.0.0rc1"

from orchestrator.app import OrchestratorCore
from orchestrator.settings import app_settings
27 changes: 24 additions & 3 deletions orchestrator/api/api_v1/api.py
@@ -19,12 +19,15 @@
from orchestrator.api.api_v1.endpoints import (
health,
processes,
product_blocks,
products,
resource_types,
settings,
subscription_customer_descriptions,
subscriptions,
translations,
user,
workflows,
ws,
)
from orchestrator.security import authorize
@@ -34,14 +37,32 @@
api_router.include_router(
processes.router, prefix="/processes", tags=["Core", "Processes"], dependencies=[Depends(authorize)]
)
api_router.include_router(
subscriptions.router,
prefix="/subscriptions",
tags=["Core", "Subscriptions"],
dependencies=[Depends(authorize)],
)
api_router.include_router(processes.ws_router, prefix="/processes", tags=["Core", "Processes"])
api_router.include_router(
products.router, prefix="/products", tags=["Core", "Product"], dependencies=[Depends(authorize)]
)
api_router.include_router(
subscriptions.router,
prefix="/subscriptions",
tags=["Core", "Subscriptions"],
product_blocks.router,
prefix="/product_blocks",
tags=["Core", "Product Blocks"],
dependencies=[Depends(authorize)],
)
api_router.include_router(
resource_types.router,
prefix="/resource_types",
tags=["Core", "Resource Types"],
dependencies=[Depends(authorize)],
)
api_router.include_router(
workflows.router,
prefix="/workflows",
tags=["Core", "Workflows"],
dependencies=[Depends(authorize)],
)
api_router.include_router(
2 changes: 1 addition & 1 deletion orchestrator/api/api_v1/endpoints/processes.py
@@ -55,7 +55,6 @@
)
from orchestrator.services.settings import get_engine_settings
from orchestrator.settings import app_settings
from orchestrator.types import JSON, State
from orchestrator.utils.enrich_process import enrich_process
from orchestrator.websocket import (
WS_CHANNELS,
@@ -64,6 +63,7 @@
websocket_manager,
)
from orchestrator.workflow import ProcessStatus
from pydantic_forms.types import JSON, State

router = APIRouter()

56 changes: 56 additions & 0 deletions orchestrator/api/api_v1/endpoints/product_blocks.py
@@ -0,0 +1,56 @@
# Copyright 2019-2020 SURF.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from http import HTTPStatus
from uuid import UUID

from fastapi.param_functions import Body
from fastapi.routing import APIRouter

from orchestrator.api.error_handling import raise_status
from orchestrator.db import db
from orchestrator.db.models import ProductBlockTable
from orchestrator.schemas.product_block import ProductBlockPatchSchema, ProductBlockSchema

router = APIRouter()


@router.get("/{product_block_id}", response_model=ProductBlockSchema)
def get_product_block_description(product_block_id: UUID) -> ProductBlockTable:
    product_block = db.session.get(ProductBlockTable, product_block_id)
    if product_block is None:
        raise_status(HTTPStatus.NOT_FOUND)
    return product_block


@router.patch("/{product_block_id}", status_code=HTTPStatus.CREATED, response_model=ProductBlockSchema)
async def patch_product_block_by_id(
    product_block_id: UUID, data: ProductBlockPatchSchema = Body(...)
) -> ProductBlockTable:
    product_block = db.session.get(ProductBlockTable, product_block_id)
    if not product_block:
        raise_status(HTTPStatus.NOT_FOUND, f"Product_block id {product_block_id} not found")

    return await _patch_product_block_description(data, product_block)


async def _patch_product_block_description(
    data: ProductBlockPatchSchema,
    product_block: ProductBlockTable,
) -> ProductBlockTable:
    updated_properties = data.model_dump(exclude_unset=True)
    description = updated_properties.get("description", product_block.description)
    product_block.description = description
    db.session.commit()
    return product_block
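The PATCH helper above relies on pydantic's `model_dump(exclude_unset=True)` so that only fields the client actually sent are applied, falling back to the stored description otherwise. A stdlib-only sketch of that merge semantics (the dict here stands in for the dump of `ProductBlockPatchSchema`):

```python
def apply_patch(current_description: str, updated_properties: dict) -> str:
    # Mirrors the handler: exclude_unset drops fields absent from the PATCH
    # body, so .get() falls back to the existing description when none was sent.
    return updated_properties.get("description", current_description)

# Client sent a new description: it replaces the stored one
print(apply_patch("old description", {"description": "new description"}))
# → new description

# Client sent an empty PATCH body: the existing value is kept
print(apply_patch("old description", {}))
# → old description
```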