This repository was archived by the owner on Apr 2, 2025. It is now read-only.

Update build and deps #143

Merged
merged 4 commits into from
Feb 24, 2025
83 changes: 60 additions & 23 deletions .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -1,29 +1,66 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
- repo: local
Contributor
I know we've liked local pre-commit hooks for dev, but they don't fully play nice with some popular tooling (e.g., VSCode), so I think it could be easier for new contributors to avoid dealing with them.

Contributor
@tylanderson Could you give a bit more context on the problems with a local repo for pre-commit? I feel like I've experienced issues with local pre-commit in the past (I also use VSCode), but I haven't had a problem with that in a while, so I'd benefit from a reminder on what they are 🙂. I do remember problems where the tool versions specified in the pre-commit yaml (via remote repos) were out of sync with the same packages in my requirements-dev.txt: I'd end up in a state where formatting with the version of ruff installed in my venv did something different than the version of ruff specified in the pre-commit yaml. I remember this because I still stumble across this problem in repos that do not use a local repo for pre-commit.

Contributor Author

Avoiding a version difference between the local venv (which is what VSCode, and I assume other tools, use) and the pre-commit env was the intention here. I haven't had any problems with this approach and VSCode, so I'm also interested to hear about the problems.

Contributor

@pjhartzell I believe this issue tracks the problem I had seen in the past; it's a bit non-obvious from what I remember.

I do agree that the benefit of catching formatting issues, etc., early is nice.

Contributor Author

I think we should document that issue in the README. As I see it, having a consistent environment that doesn't have to be independently maintained (and can't drift into a different default configuration) is worth more than making things easier for a single IDE with broken behavior.

hooks:
- id: check-merge-conflict
- id: end-of-file-fixer
- id: mixed-line-ending
- id: ruff
name: Format and lint with ruff
entry: ./bin/run-ruff.bash
language: system
types: [python]
pass_filenames: false
verbose: true
- id: mypy
name: Check typing with mypy
entry: ./bin/run-mypy.bash
language: system
types: [python]
pass_filenames: false
verbose: true
- id: pymarkdown
name: Markdownlint
description: Run markdownlint on Markdown files
entry: pymarkdown scan
language: python
files: \.(md|mdown|markdown)$
exclude: ^.github/pull_request_template.md$
- id: check-added-large-files
name: Check for added large files
entry: check-added-large-files
language: system
- id: check-toml
name: Check Toml
entry: check-toml
language: system
types: [toml]
- id: check-yaml
name: Check Yaml
entry: check-yaml
language: system
types: [yaml]
- id: mixed-line-ending
name: Check mixed line endings
entry: mixed-line-ending
language: system
types: [text]
stages: [pre-commit, pre-push, manual]
- id: end-of-file-fixer
name: Fix End of Files
entry: end-of-file-fixer
language: system
types: [text]
stages: [pre-commit, pre-push, manual]
- id: trailing-whitespace
name: Trim Trailing Whitespace
entry: trailing-whitespace-fixer
language: system
types: [text]
stages: [pre-commit, pre-push, manual]
- id: check-merge-conflict
name: Check merge conflicts
entry: check-merge-conflict
language: system
- id: no-commit-to-branch
name: Check not committing to main
entry: no-commit-to-branch
language: system
args: ["--branch", "main"]
pass_filenames: false
- repo: https://github.com/charliermarsh/ruff-pre-commit
rev: v0.7.4
hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix]
- id: ruff-format
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.13.0
hooks:
- id: mypy
args: [] # override default of [--strict, --ignore-missing-imports]
files: src/
additional_dependencies:
- types-pyRFC3339~=1.1.1
- pydantic~=2.10.1
- returns~=0.23.0
- fastapi~=0.115.0
- geojson_pydantic~=1.1.1
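The version-drift problem discussed in the review thread comes down to where the tool version lives. A minimal contrast, illustrative only (hook ids and the ruff pin mirror those in the diff):

```yaml
# Remote repo: pre-commit installs its own ruff in an isolated env,
# which can drift from the ruff pinned in requirements-dev.txt.
- repo: https://github.com/charliermarsh/ruff-pre-commit
  rev: v0.7.4 # must be bumped in lockstep with requirements-dev.txt
  hooks:
    - id: ruff

# Local repo: the hook runs whatever ruff the active venv provides,
# so the editor and the hook always agree.
- repo: local
  hooks:
    - id: ruff
      name: Lint with ruff
      entry: ruff check --fix
      language: system
      types: [python]
```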
5 changes: 3 additions & 2 deletions CHANGELOG.md
@@ -5,7 +5,7 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [unreleased]
## [v0.6.0] - 2025-02-11

### Added

@@ -169,7 +169,8 @@ Initial release
- Add links `opportunities` and `create-order` to Product
- Add link `create-order` to OpportunityCollection

[unreleased]: https://github.com/stapi-spec/stapi-fastapi/compare/v0.5.0...main
<!-- [unreleased]: https://github.com/stapi-spec/stapi-fastapi/compare/v0.5.0...main -->
[v0.6.0]: https://github.com/stapi-spec/stapi-fastapi/tree/v0.6.0
[v0.5.0]: https://github.com/stapi-spec/stapi-fastapi/tree/v0.5.0
[v0.4.0]: https://github.com/stapi-spec/stapi-fastapi/tree/v0.4.0
[v0.3.0]: https://github.com/stapi-spec/stapi-fastapi/tree/v0.3.0
58 changes: 50 additions & 8 deletions adrs/constraints.md
@@ -1,30 +1,73 @@
# Constraints and Opportunity Properties

Previously, the Constraints and Opportunity Properties were the same concept/representation. However, these represent distinct but related attributes. Constraints represents the terms that can be used in the filter sent to the Opportunities Search and Order Create endpoints. These are frequently the same or related values that will be part of the STAC Items that are used to fulfill an eventual Order. Opportunity Properties represent the expected range of values that these STAC Items are expected to have. An opportunity is a prediction about the future, and as such, the values for the Opportunity are fuzzy. For example, the sun azimuth angle will (likely) be within a predictable range of values, but the exact value will not be known until after the capture occurs. Therefore, it is necessary to describe the Opportunity in a way that describes these ranges.
Previously, the Constraints and Opportunity Properties were the same
concept/representation. However, these represent distinct but related
attributes. Constraints represents the terms that can be used in the filter sent
to the Opportunities Search and Order Create endpoints. These are frequently the
same or related values that will be part of the STAC Items that are used to
fulfill an eventual Order. Opportunity Properties represent the expected range
of values that these STAC Items are expected to have. An opportunity is a
prediction about the future, and as such, the values for the Opportunity are
fuzzy. For example, the sun azimuth angle will (likely) be within a predictable
range of values, but the exact value will not be known until after the capture
occurs. Therefore, it is necessary to describe the Opportunity in a way that
describes these ranges.

For example, for the concept of "off_nadir":

The Constraint will be a term "off_nadir" that can be a value 0 to 45.
This is used in a CQL2 filter to the Opportunities Search endpoint to restrict the allowable values from 0 to 15.
The Opportunity that is returned from Search has an Opportunity Property "off_nadir" with a description that the value of this field in the resulting STAC Items will be between 4 and 8, which falls within the filter restriction of 0-15.
The Opportunity that is returned from Search has an Opportunity Property
"off_nadir" with a description that the value of this field in the resulting
STAC Items will be between 4 and 8, which falls within the filter restriction of 0-15.
An Order is created with the original filter and other fields.
The Order is fulfilled with a STAC Item that has an off_nadir value of 4.8.
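The off_nadir walk-through above might look like the following in JSON Schema terms. This is a sketch only: the top-level keys `constraints` and `opportunity_properties` are illustrative groupings, not names fixed by the spec.

```json
{
  "constraints": {
    "off_nadir": { "type": "number", "minimum": 0, "maximum": 45 }
  },
  "opportunity_properties": {
    "off_nadir": {
      "type": "number",
      "minimum": 4,
      "maximum": 8,
      "description": "Predicted range for the fulfilled STAC Item"
    }
  }
}
```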

As of Dec 2024, the STAPI spec says only that the Opportunity Properties must have a datetime interval field `datetime` and a `product_id` field. The remainder of the Opportunity description is proprietary and up to the provider to define. The example given in this repo for `off_nadir` is of a custom format with a "minimum" and "maximum" field describing the limits.
As of Dec 2024, the STAPI spec says only that the Opportunity Properties must
have a datetime interval field `datetime` and a `product_id` field. The
remainder of the Opportunity description is proprietary and up to the provider
to define. The example given in this repo for `off_nadir` is of a custom format
with a "minimum" and "maximum" field describing the limits.

## JSON Schema

Another option would be to use either a full JSON Schema definition for an attribute value in the properties (e.g., `schema`) or individual attribute definitions for the properties values. This option should be investigated further in the future.
Another option would be to use either a full JSON Schema definition for an
attribute value in the properties (e.g., `schema`) or individual attribute
definitions for the properties values. This option should be investigated
further in the future.

JSON Schema is a well-defined specification language that can support this type of data description. It is already used as the language for OGC API Queryables to define the constraints on various terms that may be used in CQL2 expressions, and likewise within STAPI for the Constraints that are used in Opportunity Search and the Order Parameters that are set on an order. The use of JSON Schema for Constraints (as with Queryables) is not to specify validation for a JSON document, but rather to well-define a set of typed and otherwise-constrained terms. Similarly, JSON Schema would be used for the Opportunity to define the predicted ranges of properties within the Opportunity that is bound to fulfill an Order.
JSON Schema is a well-defined specification language that can support this type
of data description. It is already used as the language for OGC API Queryables
to define the constraints on various terms that may be used in CQL2 expressions,
and likewise within STAPI for the Constraints that are used in Opportunity
Search and the Order Parameters that are set on an order. The use of JSON Schema
for Constraints (as with Queryables) is not to specify validation for a JSON
document, but rather to well-define a set of typed and otherwise-constrained
terms. Similarly, JSON Schema would be used for the Opportunity to define the
predicted ranges of properties within the Opportunity that is bound to fulfill
an Order.

The geometry is not one of the fields that will be expressed as a schema constraint, since this is part of the Opportunity/Item/Feature top-level. The Opportunity geometry will express both uncertainty about the actual capture area and a “maximum extent” of capture, e.g., a small area within a larger data strip – this is intentionally vague so it can be used to express whatever semantics the provider wants.
The geometry is not one of the fields that will be expressed as a schema
constraint, since this is part of the Opportunity/Item/Feature top-level. The
Opportunity geometry will express both uncertainty about the actual capture area
and a “maximum extent” of capture, e.g., a small area within a larger data strip
– this is intentionally vague so it can be used to express whatever semantics
the provider wants.

The ranges of predicted Opportunity values can be expressed using JSON in the following way:

- numeric value - number with const, enum, or minimum/maximum/exclusiveMinimum/exclusiveMaximum
- string value - string with const or enum
- datetime - type string using format date-time. The limitation with this is that these values are not treated by JSON Schema as temporal, but rather as a string pattern. As such, there is no formal way to define a temporal interval that the instance value must be within. Instead, we will repurpose the description field as a datetime interval in the same format as a search datetime field, e.g., 2024-01-01T00:00:00Z/2024-01-07T00:00:00Z. Optionally, the pattern field can be defined if the valid datetime values also match a regular expression, e.g., 2024-01-0[123456]T.*, which, while not as useful semantically as the description interval, does provide a formal validation of the resulting object, which might be useful in some way.
- datetime - type string using format date-time. The limitation with this is that
these values are not treated by JSON Schema as temporal, but rather as a string
pattern. As such, there is no formal way to define a temporal interval that the
instance value must be within. Instead, we will repurpose the description field
as a datetime interval in the same format as a search datetime field, e.g.,
2024-01-01T00:00:00Z/2024-01-07T00:00:00Z. Optionally, the pattern field can be
defined if the valid datetime values also match a regular expression, e.g.,
2024-01-0[123456]T.*, which, while not as useful semantically as the description
interval, does provide a formal validation of the resulting object, which might
be useful in some way.
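The repurposed-description convention for datetimes can be interpreted client-side along these lines. This is a sketch only; the function names are illustrative, not part of the spec:

```python
from datetime import datetime


def parse_rfc3339(value: str) -> datetime:
    # Treat a trailing "Z" as UTC; fromisoformat on older Pythons
    # does not accept the literal "Z" suffix.
    return datetime.fromisoformat(value.replace("Z", "+00:00"))


def within_interval(value: str, interval: str) -> bool:
    """Check a datetime string against a 'start/end' search-style interval."""
    start, end = interval.split("/")
    return parse_rfc3339(start) <= parse_rfc3339(value) <= parse_rfc3339(end)


interval = "2024-01-01T00:00:00Z/2024-01-07T00:00:00Z"
print(within_interval("2024-01-03T12:00:00Z", interval))  # True
print(within_interval("2024-02-01T00:00:00Z", interval))  # False
```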

```json
{
@@ -82,7 +125,6 @@ The ranges of predicted Opportunity values can be expressed using JSON in the fo

The Item that fulfills an Order placed on this Opportunity might look like:


```json
{
"type": "Feature",
6 changes: 6 additions & 0 deletions bin/run-mypy.bash
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

set -Eeuo pipefail
# set -x # print each command before executing

MYPYPATH=src mypy src/ tests/
7 changes: 7 additions & 0 deletions bin/run-ruff.bash
@@ -0,0 +1,7 @@
#!/usr/bin/env bash

set -Eeuo pipefail
# set -x # print each command before executing

ruff check --fix # lint python files
ruff format # format python files