This repository was archived by the owner on Apr 2, 2025. It is now read-only.

Commit ff2a923

Update build and deps (#143)

* release prep for v0.6.0
* refactor build and deps
* update poetry dependencies to be >= rather than caret

1 parent 96bcc8f · commit ff2a923

12 files changed: +813 −396 lines

Diff for: .pre-commit-config.yaml (+60 −23)

```diff
@@ -1,29 +1,66 @@
 repos:
-- repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v5.0.0
+- repo: local
   hooks:
-  - id: check-merge-conflict
-  - id: end-of-file-fixer
-  - id: mixed-line-ending
+  - id: ruff
+    name: Format and lint with ruff
+    entry: ./bin/run-ruff.bash
+    language: system
+    types: [python]
+    pass_filenames: false
+    verbose: true
+  - id: mypy
+    name: Check typing with mypy
+    entry: ./bin/run-mypy.bash
+    language: system
+    types: [python]
+    pass_filenames: false
+    verbose: true
+  - id: pymarkdown
+    name: Markdownlint
+    description: Run markdownlint on Markdown files
+    entry: pymarkdown scan
+    language: python
+    files: \.(md|mdown|markdown)$
+    exclude: ^.github/pull_request_template.md$
+  - id: check-added-large-files
+    name: Check for added large files
+    entry: check-added-large-files
+    language: system
   - id: check-toml
+    name: Check Toml
+    entry: check-toml
+    language: system
+    types: [toml]
   - id: check-yaml
+    name: Check Yaml
+    entry: check-yaml
+    language: system
+    types: [yaml]
+  - id: mixed-line-ending
+    name: Check mixed line endings
+    entry: mixed-line-ending
+    language: system
+    types: [text]
+    stages: [pre-commit, pre-push, manual]
+  - id: end-of-file-fixer
+    name: Fix End of Files
+    entry: end-of-file-fixer
+    language: system
+    types: [text]
+    stages: [pre-commit, pre-push, manual]
   - id: trailing-whitespace
+    name: Trim Trailing Whitespace
+    entry: trailing-whitespace-fixer
+    language: system
+    types: [text]
+    stages: [pre-commit, pre-push, manual]
+  - id: check-merge-conflict
+    name: Check merge conflicts
+    entry: check-merge-conflict
+    language: system
   - id: no-commit-to-branch
-- repo: https://github.com/charliermarsh/ruff-pre-commit
-  rev: v0.7.4
-  hooks:
-  - id: ruff
-    args: [--fix, --exit-non-zero-on-fix]
-  - id: ruff-format
-- repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v1.13.0
-  hooks:
-  - id: mypy
-    args: [] # override default of [--strict, --ignore-missing-imports]
-    files: src/
-    additional_dependencies:
-    - types-pyRFC3339~=1.1.1
-    - pydantic~=2.10.1
-    - returns~=0.23.0
-    - fastapi~=0.115.0
-    - geojson_pydantic~=1.1.1
+    name: Check not committting to main
+    entry: no-commit-to-branch
+    language: system
+    args: ["--branch", "main"]
+    pass_filenames: false
```

Diff for: adrs/constraints.md (+50 −8)

```diff
@@ -1,30 +1,73 @@
 # Constraints and Opportunity Properties
 
-Previously, the Constraints and Opportunity Properties were the same concept/representation. However, these represent distinct but related attributes. Constraints represents the terms that can be used in the filter sent to the Opportunities Search and Order Create endpoints. These are frequently the same or related values that will be part of the STAC Items that are used to fulfill an eventual Order. Opportunity Properties represent the expected range of values that these STAC Items are expected to have. An opportunity is a prediction about the future, and as such, the values for the Opportunity are fuzzy. For example, the sun azimuth angle will (likely) be within a predictable range of values, but the exact value will not be known until after the capture occurs. Therefore, it is necessary to describe the Opportunity in a way that describes these ranges.
+Previously, the Constraints and Opportunity Properties were the same
+concept/representation. However, these represent distinct but related
+attributes. Constraints represents the terms that can be used in the filter sent
+to the Opportunities Search and Order Create endpoints. These are frequently the
+same or related values that will be part of the STAC Items that are used to
+fulfill an eventual Order. Opportunity Properties represent the expected range
+of values that these STAC Items are expected to have. An opportunity is a
+prediction about the future, and as such, the values for the Opportunity are
+fuzzy. For example, the sun azimuth angle will (likely) be within a predictable
+range of values, but the exact value will not be known until after the capture
+occurs. Therefore, it is necessary to describe the Opportunity in a way that
+describes these ranges.
 
 For example, for the concept of "off_nadir":
 
 The Constraint will be a term "off_nadir" that can be a value 0 to 45.
 This is used in a CQL2 filter to the Opportunities Search endpoint to restrict the allowable values from 0 to 15
-The Opportunity that is returned from Search has an Opportunity Property "off_nadir" with a description that the value of this field in the resulting STAC Items will be between 4 and 8, which falls within the filter restriction of 0-15.
+The Opportunity that is returned from Search has an Opportunity Property
+"off_nadir" with a description that the value of this field in the resulting
+STAC Items will be between 4 and 8, which falls within the filter restriction of 0-15.
 An Order is created with the original filter and other fields.
 The Order is fulfilled with a STAC Item that has an off_nadir value of 4.8.
 
-As of Dec 2024, the STAPI spec says only that the Opportunity Properties must have a datetime interval field `datetime` and a `product_id` field. The remainder of the Opportunity description proprietary is up to the provider to define. The example given this this repo for `off_nadir` is of a custom format with a "minimum" and "maximum" field describing the limits.
+As of Dec 2024, the STAPI spec says only that the Opportunity Properties must
+have a datetime interval field `datetime` and a `product_id` field. The
+remainder of the Opportunity description proprietary is up to the provider to
+define. The example given this this repo for `off_nadir` is of a custom format
+with a "minimum" and "maximum" field describing the limits.
 
 ## JSON Schema
 
-Another option would be to use either a full JSON Schema definition for an attribute value in the properties (e.g., `schema`) or individual attribute definitions for the properties values. This option should be investigated further in the future.
+Another option would be to use either a full JSON Schema definition for an
+attribute value in the properties (e.g., `schema`) or individual attribute
+definitions for the properties values. This option should be investigated
+further in the future.
 
-JSON Schema is a well-defined specification language that can support this type of data description. It is already used as the language for OGC API Queryables to define the constraints on various terms that may be used in CQL2 expressions, and likewise within STAPI for the Constraints that are used in Opportunity Search and the Order Parameters that are set on an order. The use of JSON Schema for Constraints (as with Queryables) is not to specify validation for a JSON document, but rather to well-define a set of typed and otherwise-constrained terms. Similarly, JSON Schema would be used for the Opportunity to define the predicted ranges of properties within the Opportunity that is bound to fulfill an Order.
+JSON Schema is a well-defined specification language that can support this type
+of data description. It is already used as the language for OGC API Queryables
+to define the constraints on various terms that may be used in CQL2 expressions,
+and likewise within STAPI for the Constraints that are used in Opportunity
+Search and the Order Parameters that are set on an order. The use of JSON Schema
+for Constraints (as with Queryables) is not to specify validation for a JSON
+document, but rather to well-define a set of typed and otherwise-constrained
+terms. Similarly, JSON Schema would be used for the Opportunity to define the
+predicted ranges of properties within the Opportunity that is bound to fulfill
+an Order.
 
-The geometry is not one of the fields that will be expressed as a schema constraint, since this is part of the Opportunity/Item/Feature top-level. The Opportunity geometry will express both uncertainty about the actual capture area and a “maximum extent” of capture, e.g., a small area within a larger data strip – this is intentionally vague so it can be used to express whatever semantics the provider wants.
+The geometry is not one of the fields that will be expressed as a schema
+constraint, since this is part of the Opportunity/Item/Feature top-level. The
+Opportunity geometry will express both uncertainty about the actual capture area
+and a “maximum extent” of capture, e.g., a small area within a larger data strip
+– this is intentionally vague so it can be used to express whatever semantics
+the provider wants.
 
 The ranges of predicted Opportunity values can be expressed using JSON in the following way:
 
 - numeric value - number with const, enum, or minimum/maximum/exclusiveMinimum/exclusiveMaximum
 - string value - string with const or enum
-- datetime - type string using format date-time. The limitation wit this is that these values are not treated with JSON Schema as temporal, but rather a string pattern. As such, there is no formal way to define a temporal interval that the instance value must be within. Instead, we will repurpose the description field as a datetime interval in the same format as a search datetime field, e.g., 2024-01-01T00:00:00Z/2024-01-07T00:00:00Z. Optionally, the pattern field can be defined if the valid datetime values also match a regular expression, e.g., 2024-01-0[123456]T.*, which while not as useful semantically as the description interval does provide a formal validation of the resulting object, which waving hand might be useful in some way waving hand.
+- datetime - type string using format date-time. The limitation wit this is that
+these values are not treated with JSON Schema as temporal, but rather a string
+pattern. As such, there is no formal way to define a temporal interval that the
+instance value must be within. Instead, we will repurpose the description field
+as a datetime interval in the same format as a search datetime field, e.g.,
+2024-01-01T00:00:00Z/2024-01-07T00:00:00Z. Optionally, the pattern field can be
+defined if the valid datetime values also match a regular expression, e.g.,
+2024-01-0[123456]T.*, which while not as useful semantically as the description
+interval does provide a formal validation of the resulting object, which waving
+hand might be useful in some way waving hand.
 
 ```json
 {
@@ -82,7 +125,6 @@ The ranges of predicted Opportunity values can be expressed using JSON in the fo
 
 The Item that fulfills and Order placed on this Opportunity might be like:
 
-
 ```json
 {
   "type": "Feature",
```
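The range conventions described in the constraints ADR above can be sketched in Python. This is a hypothetical illustration, not part of the STAPI spec or this commit: the schema fragment and helper names are invented, and the JSON Schema `minimum`/`maximum` keywords and the datetime-interval-in-`description` convention are checked by hand rather than with a schema validator.

```python
from datetime import datetime

# Hypothetical Opportunity Properties fragment following the ADR: numeric
# ranges use minimum/maximum, and the datetime property repurposes the
# `description` field to carry a search-style interval.
OPPORTUNITY_SCHEMA = {
    "off_nadir": {"type": "number", "minimum": 4, "maximum": 8},
    "datetime": {
        "type": "string",
        "format": "date-time",
        "description": "2024-01-01T00:00:00Z/2024-01-07T00:00:00Z",
    },
}


def in_numeric_range(schema: dict, value: float) -> bool:
    """Check a value against JSON Schema minimum/maximum keywords."""
    return schema.get("minimum", float("-inf")) <= value <= schema.get("maximum", float("inf"))


def _parse_rfc3339(value: str) -> datetime:
    # datetime.fromisoformat() does not accept a trailing "Z" on older
    # Pythons, so normalize it to an explicit UTC offset first.
    return datetime.fromisoformat(value.replace("Z", "+00:00"))


def in_datetime_interval(schema: dict, value: str) -> bool:
    """Check an RFC 3339 value against the interval stored in `description`."""
    start, end = (_parse_rfc3339(part) for part in schema["description"].split("/"))
    return start <= _parse_rfc3339(value) <= end


# The fulfilling Item's off_nadir of 4.8 falls inside the predicted 4-8 range.
print(in_numeric_range(OPPORTUNITY_SCHEMA["off_nadir"], 4.8))  # True
print(in_datetime_interval(OPPORTUNITY_SCHEMA["datetime"], "2024-01-03T12:00:00Z"))  # True
```

This mirrors the ADR's example: the Item that fulfills the Order (off_nadir 4.8) validates against the Opportunity's predicted 4-8 range, while a value outside it, or a datetime outside the described interval, would not.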

Diff for: bin/run-mypy.bash (+6 −0)

```diff
@@ -0,0 +1,6 @@
+#!/usr/bin/env bash
+
+set -Eeuo pipefail
+# set -x # print each command before executing
+
+MYPYPATH=src mypy src/ tests/
```

Diff for: bin/run-ruff.bash (+7 −0)

```diff
@@ -0,0 +1,7 @@
+#!/usr/bin/env bash
+
+set -Eeuo pipefail
+# set -x # print each command before executing
+
+ruff check --fix # lint python files
+ruff format # format python files
```
