Work towards OpenAPI 3.1 support #22
Draft
kevindew wants to merge 53 commits into main from openapi-3.1
Commits (53)
All commits are by kevindew:

- 0cb50b5 Add 3.1 to list of supported versions
- 58256ed Put v3.0 examples into a specific directory
- 9ec9260 Add some 3.1 integration tests
- b97bc98 Add #openapi_version methods to context classes
- aee5278 Add OpenapiVersion class for version comparisons
- dca077d Required option in field config accepts lambda
- 8109b18 Only require paths for OpenAPI < 3.1
- 12bd103 Add a webhooks field to the root OpenAPI node
- c3d6e9b Fields have a concept of allowed
- a0f747c Only allow webhooks on OpenAPI >= 3.1 documents
- 11994dc Start an OpenAPI v3.1 checklist
- 93cab13 Don't require responses on an Operation in OpenAPI 3.1
- ecd50e0 Require at least one of components, paths and webhooks
- c94e21c Add summary field to Info for OpenAPI v3.1
- 5c2d043 Fix typo on "at least"
- 16a9806 Put together notes on OpenAPI 3.1
- a96a733 Breaking: Create OpenAPI version specific Schema class
- 734b005 Create an object for the OAS Dialect 3.1 schema
- 08e1a5b Make extension regex configurable
- 24b4264 Refactor reference resolution for OpenAPI 3.1
- 79df56c Mark Node::Context as a long class
- 4f213a1 Allow 3.1 references to have summary and description
- 13854eb Add identifier, as a mutually exclusive field, to License
- 995b5ae No changes needed for ServerVariable
- 8b4c4a0 Add pathItems to components for OpenAPI 3.1
- 4711f4c Mark mutualTLS as resolved
- 145cef3 Version specific allow_extensions for Discriminator
- c6b9618 Fix rubocop issues from rebase
- 4c60fbe Add explicit require for node_factory
- ab44778 Share behaviour between schema objects
- fe52a9a Configure type field for OpenAPI 3.1
- 882ab1a Add const field to OpenAPI 3.1 schema
- 43d197b Add a few basic JSON schema fields
- e8163b2 Update fields we're tracking for Schema in 3.1
- d7f2f81 Add content* fields for v3.1 schema object
- 70b10af Add definitions for if, then and else schema fields
- a35b54b Add prefixItems field to Schema
- af96354 Contains field for schema
- 123b76b Correct handling of minimum and maximum of schema
- ed007b9 Add patternProperties to Schema
- 8733278 Add some notes about boolean schemas
- 3ed1a7d Add dependentRequired field for 3.1 schema
- 2f11358 Add DependentSchemas schema field
- f75b2d6 Check explicitly for ::Hash rather than respond_to?(:[])
- 6d16da6 Support JsonSchemas that are a boolean value
- 8a52e58 Add some dubious hacks to get appropriate errors
- 07dabbf Move additionalProperties to 3.0 Schema node
- f43d88e Add methods for boolean schemas
- b12715f Add support for additionalProperties, unevaluatedItems and unevaluate…
- 53fae01 Add output of warnings to Kernel#warn
- 2215772 Rename Validators::Url to Validators::Uri
- a4d6dbb Add jsonSchemaDialect field to openapi node
- 0292aff JSON schema dialect warnings
# JSON schema with 3.1

Temporary document to be removed with the merge of support for OpenAPI 3.1.

Things have become complex with schemas in OpenAPI 3.1.

How things might work (a sketch follows this list):

- when a schema factory is created, it determines whether the dialect is supported
- it then creates a factory based on the dialect
- if there is a reference in it, this is resolved
- there could be complexities in the resolving process because of the $id field - does it become relative to this?
- skip dynamicAnchor and dynamicRef for now - they are quite complex: https://stackoverflow.com/questions/69728686/explanation-of-dynamicref-dynamicanchor-in-json-schema-as-opposed-to-ref-and
- let's allow extra properties for schema since it's complex
- there's all the $defs stuff, but this might just work as a type of reference - presumably not really used in OpenAPI anyway
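
A minimal sketch of the first two bullets, assuming only the default OAS 3.1 dialect is supported to begin with; the helper name is illustrative rather than anything in the gem, though the dialect URI is the real OAS 3.1 default:

```ruby
# Hypothetical check that a document's jsonSchemaDialect is one we can parse,
# falling back to the OAS 3.1 default dialect when the field is absent.
SUPPORTED_DIALECTS = ["https://spec.openapis.org/oas/3.1/dialect/base"].freeze

def dialect_supported?(json_schema_dialect)
  dialect = json_schema_dialect || SUPPORTED_DIALECTS.first
  SUPPORTED_DIALECTS.include?(dialect)
end

dialect_supported?(nil)                                            # => true (OAS default)
dialect_supported?("https://json-schema.org/draft/2020-12/schema") # => false in this sketch
```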

So how might we start:

- Perhaps add a class method to Schema which can identify which Schema factory is used: an OAS 3.1 one, an optionally referenced OAS 3.0 one, or a non-optional reference (if such a need exists), based on context. Error if given an unexpected dialect. (A sketch follows this list.)
- Learn whether you have to care about $id for resolving
- Create a node factory for OAS 3.1 Schema:
  - allow arbitrary fields perhaps? Probably not needed, just a pain to keep up with JsonSchema
  - load a merged reference
  - perhaps have context support a merge concept for source location
- Think about dealing with recursive references defined as "#"
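
A sketch of that first bullet, assuming the choice hangs off the document's OpenAPI version; the module layout and class names are placeholders rather than the gem's real constants:

```ruby
module NodeFactory
  module Schema
    V3_0 = Class.new # placeholder for an optionally referenced OAS 3.0 factory
    V3_1 = Class.new # placeholder for an OAS 3.1 (dialect-aware) factory

    # Picks the schema factory class for a document's OpenAPI version.
    # A real implementation would lean on the OpenapiVersion comparison class
    # added in this branch rather than splitting strings.
    def self.factory_for(openapi_version)
      major, minor = openapi_version.split(".").map(&:to_i)

      if major > 3 || (major == 3 && minor >= 1)
        V3_1
      else
        V3_0
      end
    end
  end
end

NodeFactory::Schema.factory_for("3.0.3") # => NodeFactory::Schema::V3_0
NodeFactory::Schema.factory_for("3.1.0") # => NodeFactory::Schema::V3_1
```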

Dealing with the new JSON Schema approach for OpenAPI 3.1.

There are some meta fields:

$ref - in 3.0
$dynamicRef
$defs
$schema
$id
$comment
$anchor
$dynamicAnchor

Then a ton of fields (a small example using several of the new ones follows these lists):

type: string - in 3.0
enum: array - in 3.0
const: any type - done
multipleOf: number - in 3.0
maximum: number - in 3.0
exclusiveMaximum: number - done
minimum: number - in 3.0
exclusiveMinimum: number - done
maxLength: integer >= 0 - in 3.0 (missing >= val)
minLength: integer >= 0 - in 3.0
pattern: string - in 3.0
maxItems: integer >= 0 - in 3.0
minItems: integer >= 0 - in 3.0
uniqueItems: boolean - in 3.0
maxContains: integer >= 0 - done
minContains: integer >= 0 - done
maxProperties: integer >= 0 - in 3.0
minProperties: integer >= 0 - in 3.0
required: array, strings, unique - in 3.0 (missing unique)
dependentRequired: something complex - done
contentEncoding: string - done
contentMediaType: string / media type - done
contentSchema: schema - done
title: string - in 3.0
description: string - in 3.0
default: any - in 3.0
deprecated: boolean (default false) - in 3.0
readOnly: boolean (default false) - in 3.0
writeOnly: boolean (default false) - in 3.0
examples: array - done
format: any - in 3.0

allOf - non-empty array of schemas - in 3.0
anyOf - non-empty array of schemas - in 3.0
oneOf - non-empty array of schemas - in 3.0
not - schema - in 3.0

if - single schema - done
then - single schema - done
else - single schema - done
dependentSchemas - map of schemas - done

prefixItems: array of schemas - done
items: schema - in 3.0
contains: schema - done

properties: object, each value a JSON schema - in 3.0
patternProperties: object, each value a JSON schema, each key a regex - done
additionalProperties: single JSON schema - done

unevaluatedItems - single schema - done
unevaluatedProperties: single schema - done
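
As a purely illustrative example (not test data from this branch), here is the kind of 3.1 schema the parser needs to accept, exercising several of the keywords marked "done" above:

```ruby
# An OpenAPI 3.1 / JSON Schema 2020-12 style schema expressed as a Ruby hash.
schema = {
  "type" => %w[object null],                 # 3.1 allows an array of types
  "properties" => {
    "status" => { "const" => "active" },     # const: any value
    "scores" => {
      "type" => "array",
      "prefixItems" => [{ "type" => "string" }, { "type" => "number" }],
      "contains" => { "type" => "number" },
      "maxContains" => 5
    },
    "price" => {
      "type" => "number",
      "exclusiveMinimum" => 0                # a number in 3.1, not a boolean as in 3.0
    }
  },
  "patternProperties" => { "^x-" => { "type" => "string" } },
  "dependentRequired" => { "credit_card" => ["billing_address"] }
}
```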

## Returning to this in 2025

Assumption: usage of the advanced schema fields like dynamicRef and dynamicAnchor will be extremely rare, so let's see what we can implement that meets most use cases and hopefully doesn't crash on complex ones.

The current idea is to create a Schema::Common which can share the methods that are common to both schema objects, then add distinctions for the differences.

At the point of shutting down on 10th January 2025 I was wondering about how schemas merge. I also decided to defer thinking about the referenceable node object factory.

I learnt that merging seems largely undefined in JSON Schema, as far as I can tell, so I'm just going with a strategy of most recent field wins.
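
A minimal sketch of that "most recent field wins" strategy, assuming schema data arrives as plain hashes; the method name is made up for illustration:

```ruby
# Resolve a reference, then let any sibling fields defined alongside the $ref
# overwrite fields of the same name from the referenced schema.
def merge_schema_data(referenced_data, local_data)
  referenced_data.merge(local_data.reject { |key, _| key == "$ref" })
end

merge_schema_data(
  { "type" => "string", "description" => "from the referenced schema" },
  { "$ref" => "#/components/schemas/Name", "description" => "local override" }
)
# => { "type" => "string", "description" => "local override" }
```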

I've set up a Node::Schema class for common schema methods, and Node::Schema::v3_0 and v3_1Up classes for the specific changes. Need to flesh out tests and then the behaviour that differs between them.
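
A rough shape of that arrangement, using nullability handling as an assumed example of version-specific behaviour; the class and method names here are illustrative rather than the gem's final API:

```ruby
module Node
  # Behaviour shared by schema nodes across OpenAPI versions
  class Schema
    def initialize(data)
      @data = data
    end

    def title
      @data["title"]
    end

    def description
      @data["description"]
    end

    class V3_0 < Schema
      # OpenAPI 3.0 has a dedicated nullable keyword
      def nullable?
        @data["nullable"] == true
      end
    end

    class V3_1Up < Schema
      # OpenAPI 3.1 expresses nullability through the type keyword instead
      def nullable?
        Array(@data["type"]).include?("null")
      end
    end
  end
end

Node::Schema::V3_0.new({ "nullable" => true }).nullable?          # => true
Node::Schema::V3_1Up.new({ "type" => %w[string null] }).nullable? # => true
```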

Little things (a sketch of a few of these checks follows this list):
- schema integer fields generally are required to be non-negative
- quite common for arrays to be invalid if not unique (required, type)
- probably want a quick way to get coverage of the methods on nodes
- could validate that pattern and patternProperties contain regexes
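
A minimal sketch of the non-negative, uniqueness and regex checks, assuming they'd be expressed as standalone validation lambdas (the gem's real validator interface may differ):

```ruby
# Hypothetical validation lambdas for the checks noted above.
non_negative_integer = ->(value) { value.is_a?(Integer) && value >= 0 }

unique_array = ->(value) { value.is_a?(Array) && value.uniq == value }

valid_regex = lambda do |value|
  begin
    Regexp.new(value)
    true
  rescue RegexpError
    false
  end
end

non_negative_integer.call(-1)    # => false
unique_array.call(%w[name name]) # => false
valid_regex.call("^x-")          # => true
valid_regex.call("[a-")          # => false (unterminated character class)
```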

JSON Schema specs:

meta: https://datatracker.ietf.org/doc/html/draft-bhutton-json-schema-00
validation: https://datatracker.ietf.org/doc/html/draft-bhutton-json-schema-validation-00

What about piggybacking on an existing implementation like https://github.com/voxpupuli/json-schema?

I'm a bit rusty in memory, but there are some tricky challenges as I recall. If I remember right, that gem doesn't support a version of JSON Schema as new as what OpenAPI supports, and my recollection is that it's more useful for validation than it is for traversal.