diff --git a/openspec/changes/archive/2026-03-07-add-search-web-frontend/.openspec.yaml b/openspec/changes/archive/2026-03-07-add-search-web-frontend/.openspec.yaml new file mode 100644 index 0000000..f1842c5 --- /dev/null +++ b/openspec/changes/archive/2026-03-07-add-search-web-frontend/.openspec.yaml @@ -0,0 +1,2 @@ +schema: spec-driven +created: 2026-03-07 diff --git a/openspec/changes/archive/2026-03-07-add-search-web-frontend/design.md b/openspec/changes/archive/2026-03-07-add-search-web-frontend/design.md new file mode 100644 index 0000000..76378b7 --- /dev/null +++ b/openspec/changes/archive/2026-03-07-add-search-web-frontend/design.md @@ -0,0 +1,75 @@ +## Context + +Grogbot currently has a search core library plus CLI and API packages, but no browser-facing frontend. The desired first web experience is intentionally small: a root page for the domain, a Search landing page at `/search`, and a Search results page at `/search/query?q=...`. + +The repository is currently Python-based and already uses FastAPI/ASGI patterns. The new web package should fit that ecosystem, render HTML on the server, and read from the server-side replicated SQLite database through `SearchService`. This change explicitly avoids changing the existing `packages/api` package so that the first frontend slice stays focused on human-facing pages instead of JSON API redesign. + +## Goals / Non-Goals + +**Goals:** +- Add a new `packages/web` package that provides a Python-rendered frontend. +- Provide domain-level HTML routes for `/`, `/search`, and `/search/query`. +- Render the initial search experience on the server, including the first page of results. +- Reuse `SearchService` directly so web and other surfaces share the same search behavior and database configuration. +- Keep the implementation small and easy to evolve into additional top-level systems later. + +**Non-Goals:** +- Changing or removing any routes from `packages/api`. +- Introducing a JavaScript-heavy SPA architecture. 
+- Redesigning search ranking or converting chunk results into document-deduplicated results. +- Defining a final unified public deployment topology for web and API on the same host. + +## Decisions + +### 1. Build `packages/web` as a Python-rendered ASGI package +- Decision: The frontend will be a Python package in the uv workspace, using server-rendered templates and static asset serving. +- Why: This matches the current Python-only repository, avoids introducing a second toolchain, and is sufficient for the initial search experience. +- Alternative considered: adding a JS/TS frontend package; rejected for now because the first scope is simple and does not justify extra build and deployment complexity. + +### 2. Call `SearchService` directly from web routes +- Decision: The web package will resolve configuration the same way as CLI/API and will execute searches through `SearchService` directly. +- Why: This avoids internal HTTP calls, reduces latency and coupling, and keeps the API package out of scope. +- Alternative considered: rendering pages by calling the existing API over HTTP; rejected because it adds unnecessary indirection and would couple page rendering to API route design. + +### 3. Use top-level human-facing routes +- Decision: The web package will own `/`, `/search`, and `/search/query`. +- Why: These routes match the intended product shape where each Grogbot system lives at a top-level path segment and Search owns its main interface directly under `/search`. +- Alternative considered: nesting Search UI deeper under another prefix; rejected because it weakens the intended domain structure. + +### 4. Server-render `/search/query` from the query string +- Decision: `GET /search/query?q=...` will render HTML with results already present in the response. +- Why: This keeps the first version simple, makes URLs shareable, works without JavaScript, and aligns with the familiar search-engine interaction model. 
+- Alternative considered: serving an app shell that fetches results client-side after load; rejected because it adds complexity without clear benefit for v1. + +### 5. Redirect empty queries back to `/search` +- Decision: Requests to `/search/query` with a missing or blank `q` parameter will redirect to `/search` rather than rendering an empty results page. +- Why: The landing page is the canonical empty-search experience, and redirecting keeps the route semantics clean. +- Alternative considered: rendering a no-results or empty-state page at `/search/query`; rejected because it duplicates the landing-page role. + +### 6. Preserve current chunk-level result behavior in the web UI +- Decision: The results page will display the top 25 results returned by `SearchService.search(..., limit=25)` in service order, even when that means duplicate documents appear. +- Why: This keeps the frontend aligned with current engine behavior and avoids introducing document-grouping semantics in the first web change. +- Alternative considered: deduplicating or regrouping results by document in the web layer; rejected for now because it changes presentation semantics and raises ranking questions outside this change. + +### 7. Keep deployment coupling loose +- Decision: The design defines the web package and its route behavior, but does not require a specific merged deployment with the existing API package. +- Why: The current package scope is intentionally limited to web. Deployment composition can be decided later without blocking implementation of the frontend itself. +- Alternative considered: coupling this design to an `/api/*` migration or combined host strategy; rejected because that would expand scope into API redesign. + +## Risks / Trade-offs + +- **[Risk] Route collisions with the existing API if both are later exposed on the same host/path space** → **Mitigation:** treat deployment composition as a later decision and keep API changes out of this change. 
+- **[Risk] Duplicate documents in results may feel less polished than mainstream search engines** → **Mitigation:** accept this as an explicit v1 trade-off and revisit grouped results in a later change if needed. +- **[Risk] Server-rendered templates may need refactoring if the frontend becomes highly interactive later** → **Mitigation:** keep presentation concerns isolated in `packages/web` so richer client-side behavior can be added incrementally. +- **[Risk] Web requests depend directly on database availability and search-service performance** → **Mitigation:** reuse existing service/config patterns and keep the initial page design lightweight. + +## Migration Plan + +1. Add `packages/web` to the workspace and create its ASGI entrypoint, templates, and static assets. +2. Implement root and search routes using existing config resolution and `SearchService` access. +3. Deploy the new web package in the chosen environment without changing `packages/api` behavior. +4. If rollback is needed, undeploy or disable the web package; no data migration is required because this change only adds a read-only presentation layer. + +## Open Questions + +- No blocking product questions remain for this first slice. Public coexistence with the existing API can be decided in a later change if both need to share a single hostname. diff --git a/openspec/changes/archive/2026-03-07-add-search-web-frontend/proposal.md b/openspec/changes/archive/2026-03-07-add-search-web-frontend/proposal.md new file mode 100644 index 0000000..a85bc46 --- /dev/null +++ b/openspec/changes/archive/2026-03-07-add-search-web-frontend/proposal.md @@ -0,0 +1,30 @@ +## Why + +Grogbot currently exposes search through the CLI and API, but it does not provide a human-facing web interface. 
Adding a simple web frontend now creates the first top-level system experience on `www.grogbot.com`, lets users search through a browser without going through the JSON API, and establishes the package and route pattern for future systems. + +## What Changes + +- Add a new Python-rendered `packages/web` package as a web frontend surface for Grogbot. +- Add a root web page at `/` that acts as the top-level domain entry point for Grogbot systems. +- Add a search landing page at `/search` with a “Grogbot Search” heading, query input, and Search button. +- Add a search results page at `/search/query?q=...` that renders the top 25 search results and includes a query input at the top for running another search. +- Make the web package use `SearchService` directly against the server-side replicated SQLite database instead of going through the API package. +- Redirect `/search/query` requests with a missing or blank `q` parameter back to `/search`. +- Allow duplicate documents in initial results when multiple chunks from the same document rank in the top 25. +- Keep the existing `packages/api` package unchanged and out of scope for this change. + +## Capabilities + +### New Capabilities +- `search-web`: Python-rendered web pages for Grogbot root navigation and the Search landing/results experience. + +### Modified Capabilities +- None. + +## Impact + +- New package: `packages/web`. +- New Python web dependencies for HTML rendering and static asset serving. +- New public HTML routes at `/`, `/search`, and `/search/query`. +- New templates/static assets for the web frontend. +- No API contract changes and no `packages/api` modifications in this change. 
diff --git a/openspec/changes/archive/2026-03-07-add-search-web-frontend/specs/search-web/spec.md b/openspec/changes/archive/2026-03-07-add-search-web-frontend/specs/search-web/spec.md new file mode 100644 index 0000000..0dd28b9 --- /dev/null +++ b/openspec/changes/archive/2026-03-07-add-search-web-frontend/specs/search-web/spec.md @@ -0,0 +1,40 @@ +## ADDED Requirements + +### Requirement: Grogbot root page +The system SHALL provide an HTML page at `/` that serves as the top-level Grogbot domain entry point. + +#### Scenario: Root page is available +- **WHEN** a browser requests `GET /` +- **THEN** the system returns an HTML page for the Grogbot root experience + +### Requirement: Search landing page +The system SHALL provide an HTML search landing page at `/search` that displays the title "Grogbot Search", a text input for the query, and a Search submit control. + +#### Scenario: Search landing page renders form +- **WHEN** a browser requests `GET /search` +- **THEN** the system returns an HTML page containing the text "Grogbot Search" +- **THEN** the page contains a query input field +- **THEN** the page contains a Search submit control + +### Requirement: Search results page +The system SHALL provide an HTML search results page at `/search/query?q=` that renders the top 25 search results for the query and includes a query input at the top for running another search. 
+ +#### Scenario: Search results page renders ranked results +- **WHEN** a browser requests `GET /search/query?q=hello+world` +- **THEN** the system returns an HTML page containing a query input at the top +- **THEN** the page displays up to 25 ranked search results for `hello world` + +#### Scenario: Duplicate documents are preserved in v1 results +- **WHEN** the top 25 ranked search results contain multiple chunks from the same document +- **THEN** the page renders those results in ranked order without deduplicating by document + +### Requirement: Empty search redirects to landing page +The system SHALL redirect requests for `/search/query` without a non-blank `q` parameter to `/search`. + +#### Scenario: Missing query redirects to search landing +- **WHEN** a browser requests `GET /search/query` without a `q` parameter +- **THEN** the system responds with a redirect to `/search` + +#### Scenario: Blank query redirects to search landing +- **WHEN** a browser requests `GET /search/query?q= ` +- **THEN** the system responds with a redirect to `/search` diff --git a/openspec/changes/archive/2026-03-07-add-search-web-frontend/tasks.md b/openspec/changes/archive/2026-03-07-add-search-web-frontend/tasks.md new file mode 100644 index 0000000..c5f57d8 --- /dev/null +++ b/openspec/changes/archive/2026-03-07-add-search-web-frontend/tasks.md @@ -0,0 +1,24 @@ +## 1. Workspace and package setup + +- [x] 1.1 Add `packages/web` to the uv workspace and create the package metadata/build configuration. +- [x] 1.2 Create the web package module structure, including the ASGI entrypoint, template directory, and static asset directory. +- [x] 1.3 Add the Python web rendering/static-serving dependencies needed by the new package. + +## 2. Root and landing page implementation + +- [x] 2.1 Implement the root `/` HTML route for the top-level Grogbot entry page. +- [x] 2.2 Implement the `/search` HTML route with the “Grogbot Search” heading, query input, and Search submit control. 
+- [x] 2.3 Add the shared base layout and initial CSS needed for the root and search landing pages. + +## 3. Search results page integration + +- [x] 3.1 Implement the `/search/query` HTML route that reads the `q` parameter and redirects blank or missing queries to `/search`. +- [x] 3.2 Integrate the results route with `SearchService` using the existing configuration/database resolution pattern. +- [x] 3.3 Render the top 25 search results on the results page, preserving service order and allowing duplicate documents. +- [x] 3.4 Add the results-page search box at the top so users can submit a new query from the results screen. + +## 4. Verification + +- [x] 4.1 Add automated tests for `/`, `/search`, and `/search/query` covering HTML rendering and redirect behavior. +- [x] 4.2 Add automated tests verifying `/search/query` renders up to 25 results and preserves duplicate-document results. +- [x] 4.3 Run the relevant test suite(s) and confirm the new web package works without changing `packages/api`. 
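Task 3.3's "preserve service order, allowing duplicate documents" amounts to rendering the service's list as-is. A small sketch with hypothetical data (plain tuples stand in for real `SearchResult` objects):

```python
# Hypothetical chunk-level results as (document_id, snippet) pairs.
service_results = [
    ("doc-shared", "chunk A"),
    ("doc-shared", "chunk B"),  # same document ranks twice -- kept in v1
    ("doc-2", "chunk C"),
]

# v1 rendering rule: take up to 25 results in service order, no dedup.
rendered = service_results[:25]
assert rendered == service_results  # order and duplicates preserved

# A later change could group by document instead (explicitly out of scope):
seen, grouped = set(), []
for doc_id, snippet in service_results:
    if doc_id not in seen:
        seen.add(doc_id)
        grouped.append((doc_id, snippet))
assert grouped == [("doc-shared", "chunk A"), ("doc-2", "chunk C")]
```

Keeping the v1 rule this trivial is deliberate: any document-grouping logic would raise ranking questions that the design defers to a future change.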
diff --git a/packages/web/pyproject.toml b/packages/web/pyproject.toml new file mode 100644 index 0000000..9f25e3f --- /dev/null +++ b/packages/web/pyproject.toml @@ -0,0 +1,30 @@ +[project] +name = "grogbot-web" +version = "0.1.0" +description = "Grogbot Python-rendered web frontend" +requires-python = ">=3.11" +dependencies = [ + "fastapi>=0.110", + "grogbot-search-core", + "jinja2>=3.1", + "uvicorn>=0.29", +] + +[project.optional-dependencies] +test = [ + "httpx>=0.27", + "pytest>=8.0", +] + +[build-system] +requires = ["hatchling>=1.24"] +build-backend = "hatchling.build" + +[tool.hatch.build] +include = [ + "src/grogbot_web/templates/*.html", + "src/grogbot_web/static/*.css", +] + +[tool.hatch.build.targets.wheel] +packages = ["src/grogbot_web"] diff --git a/packages/web/src/grogbot_web/__init__.py b/packages/web/src/grogbot_web/__init__.py new file mode 100644 index 0000000..3aa511f --- /dev/null +++ b/packages/web/src/grogbot_web/__init__.py @@ -0,0 +1,5 @@ +"""Grogbot web frontend package.""" + +from grogbot_web.app import app + +__all__ = ["app"] diff --git a/packages/web/src/grogbot_web/app.py b/packages/web/src/grogbot_web/app.py new file mode 100644 index 0000000..c5a2fb6 --- /dev/null +++ b/packages/web/src/grogbot_web/app.py @@ -0,0 +1,64 @@ +from __future__ import annotations + +from pathlib import Path +from fastapi import FastAPI, Request +from fastapi.responses import HTMLResponse, RedirectResponse +from fastapi.staticfiles import StaticFiles +from fastapi.templating import Jinja2Templates + +from grogbot_search import SearchService, load_config + +PACKAGE_DIR = Path(__file__).resolve().parent +TEMPLATES_DIR = PACKAGE_DIR / "templates" +STATIC_DIR = PACKAGE_DIR / "static" + +app = FastAPI(title="Grogbot Web") +app.mount("/assets", StaticFiles(directory=str(STATIC_DIR)), name="assets") +templates = Jinja2Templates(directory=str(TEMPLATES_DIR)) + + +def search_results(query: str, *, limit: int = 25): + config = load_config() + with 
SearchService(config.db_path) as service: + return service.search(query, limit=limit) + + +@app.get("/", response_class=HTMLResponse) +def root_page(request: Request): + return templates.TemplateResponse( + request, + "index.html", + { + "page_title": "Grogbot", + }, + ) + + +@app.get("/search", response_class=HTMLResponse) +def search_page(request: Request, q: str = ""): + return templates.TemplateResponse( + request, + "search_landing.html", + { + "page_title": "Grogbot Search", + "query": q.strip(), + }, + ) + + +@app.get("/search/query", response_class=HTMLResponse) +def search_query_page(request: Request, q: str | None = None): + query = (q or "").strip() + if not query: + return RedirectResponse(url="/search", status_code=302) + + results = search_results(query, limit=25) + return templates.TemplateResponse( + request, + "search_results.html", + { + "page_title": f"{query} - Grogbot Search", + "query": query, + "results": results, + }, + ) diff --git a/packages/web/src/grogbot_web/static/app.css b/packages/web/src/grogbot_web/static/app.css new file mode 100644 index 0000000..3282fb8 --- /dev/null +++ b/packages/web/src/grogbot_web/static/app.css @@ -0,0 +1,181 @@ +:root { + color-scheme: light; + font-family: Arial, Helvetica, sans-serif; + --text: #202124; + --muted: #5f6368; + --link: #1a0dab; + --border: #dfe1e5; + --bg: #ffffff; + --panel: #f8f9fa; +} + +* { + box-sizing: border-box; +} + +body.page-body { + margin: 0; + color: var(--text); + background: var(--bg); +} + +.shell { + min-height: 100vh; + padding: 2rem; +} + +.shell--centered { + display: grid; + place-items: center; +} + +.card { + width: min(40rem, 100%); +} + +.card--hero { + text-align: center; +} + +.eyebrow { + margin: 0 0 0.5rem; + color: var(--muted); + text-transform: uppercase; + letter-spacing: 0.08em; + font-size: 0.8rem; +} + +.hero-title, +.search-home__title { + margin: 0 0 1rem; + font-size: clamp(2rem, 6vw, 4rem); + font-weight: 600; +} + +.hero-copy { + margin: 0 0 1.5rem; 
+ color: var(--muted); +} + +.system-list { + display: flex; + justify-content: center; +} + +.system-link, +.brand-link, +.search-result__title a, +.search-result__url { + color: var(--link); + text-decoration: none; +} + +.search-home { + width: min(42rem, 100%); + text-align: center; +} + +.search-form { + display: flex; + gap: 0.75rem; +} + +.search-form--home { + flex-direction: column; + align-items: center; +} + +.search-form--results { + flex: 1; + max-width: 42rem; +} + +.search-input { + width: 100%; + border: 1px solid var(--border); + border-radius: 999px; + padding: 0.9rem 1.25rem; + font-size: 1rem; +} + +.search-button { + border: 1px solid transparent; + border-radius: 999px; + padding: 0.8rem 1.5rem; + background: var(--panel); + color: var(--text); + cursor: pointer; + font-size: 1rem; +} + +.results-shell { + padding: 1.5rem 2rem 3rem; +} + +.results-header { + display: flex; + align-items: center; + gap: 1rem; + margin-bottom: 2rem; + padding-bottom: 1rem; + border-bottom: 1px solid var(--border); +} + +.results-header__brand { + min-width: max-content; +} + +.results-list { + width: min(48rem, 100%); + margin-left: clamp(0rem, 10vw, 8rem); +} + +.search-result { + margin-bottom: 2rem; +} + +.search-result__url { + display: inline-block; + margin-bottom: 0.35rem; + color: var(--muted); + font-size: 0.9rem; +} + +.search-result__title { + margin: 0 0 0.35rem; + font-size: 1.25rem; + font-weight: 400; +} + +.search-result__snippet, +.empty-state { + margin: 0; + color: var(--muted); + line-height: 1.5; +} + +.sr-only { + position: absolute; + width: 1px; + height: 1px; + padding: 0; + margin: -1px; + overflow: hidden; + clip: rect(0, 0, 0, 0); + border: 0; +} + +@media (max-width: 720px) { + .results-header { + flex-direction: column; + align-items: stretch; + } + + .results-list { + margin-left: 0; + } + + .search-form--results { + max-width: none; + } +} diff --git a/packages/web/src/grogbot_web/templates/base.html 
b/packages/web/src/grogbot_web/templates/base.html new file mode 100644 index 0000000..e6f63d5 --- /dev/null +++ b/packages/web/src/grogbot_web/templates/base.html @@ -0,0 +1,12 @@
+<!doctype html>
+<html lang="en">
+  <head>
+    <meta charset="utf-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1">
+    <title>{{ page_title }}</title>
+    <link rel="stylesheet" href="/assets/app.css">
+  </head>
+  <body class="page-body">
+    {% block body %}{% endblock %}
+  </body>
+</html>
diff --git a/packages/web/src/grogbot_web/templates/index.html b/packages/web/src/grogbot_web/templates/index.html new file mode 100644 index 0000000..1ae3390 --- /dev/null +++ b/packages/web/src/grogbot_web/templates/index.html @@ -0,0 +1,14 @@
+{% extends "base.html" %}
+
+{% block body %}
+<main class="shell shell--centered">
+  <section class="card card--hero">
+    <p class="eyebrow">Grogbot</p>
+    <h1 class="hero-title">Grogbot</h1>
+    <p class="hero-copy">A home for Grogbot systems.</p>
+    <nav class="system-list">
+      <a class="system-link" href="/search">Search</a>
+    </nav>
+  </section>
+</main>
+{% endblock %}
diff --git a/packages/web/src/grogbot_web/templates/search_landing.html b/packages/web/src/grogbot_web/templates/search_landing.html new file mode 100644 index 0000000..775b184 --- /dev/null +++ b/packages/web/src/grogbot_web/templates/search_landing.html @@ -0,0 +1,21 @@
+{% extends "base.html" %}
+
+{% block body %}
+<main class="shell shell--centered">
+  <section class="search-home">
+    <h1 class="search-home__title">Grogbot Search</h1>
+    <form class="search-form search-form--home" action="/search/query" method="get">
+      <label class="sr-only" for="q">Search query</label>
+      <input
+        class="search-input"
+        id="q"
+        name="q"
+        type="search"
+        value="{{ query }}"
+        autofocus
+      >
+      <button class="search-button" type="submit">Search</button>
+    </form>
+  </section>
+</main>
+{% endblock %}
diff --git a/packages/web/src/grogbot_web/templates/search_results.html b/packages/web/src/grogbot_web/templates/search_results.html new file mode 100644 index 0000000..6677b77 --- /dev/null +++ b/packages/web/src/grogbot_web/templates/search_results.html @@ -0,0 +1,41 @@
+{% extends "base.html" %}
+
+{% block body %}
+<div class="shell results-shell">
+  <header class="results-header">
+    <a class="brand-link results-header__brand" href="/search">Grogbot Search</a>
+    <form class="search-form search-form--results" action="/search/query" method="get">
+      <label class="sr-only" for="q">Search query</label>
+      <input
+        class="search-input"
+        id="q"
+        name="q"
+        type="search"
+        value="{{ query }}"
+      >
+      <button class="search-button" type="submit">Search</button>
+    </form>
+  </header>
+  <main class="results-list">
+    {% if results %}
+    {% for result in results %}
+    <article class="search-result">
+      <a class="search-result__url" href="{{ result.document.canonical_url }}">
+        {{ result.document.canonical_url }}
+      </a>
+      <h3 class="search-result__title">
+        <a href="{{ result.document.canonical_url }}">
+          {{ result.document.title or result.document.canonical_url }}
+        </a>
+      </h3>
+      <p class="search-result__snippet">
+        {{ result.chunk.content_text }}
+      </p>
+    </article>
+    {% endfor %}
+    {% else %}
+    <p class="empty-state">No results found.</p>
+    {% endif %}
+  </main>
+</div>
+{% endblock %} diff --git a/packages/web/tests/test_app.py b/packages/web/tests/test_app.py new file mode 100644 index 0000000..542a46b --- /dev/null +++ b/packages/web/tests/test_app.py @@ -0,0 +1,121 @@ +from __future__ import annotations + +import importlib +from datetime import datetime, timezone + +from fastapi.testclient import TestClient + +from grogbot_search.models import Chunk, Document, SearchResult + +web_app_module = importlib.import_module("grogbot_web.app") + + +def _result(*, index: int, document_id: str, canonical_url: str, title: str | None, snippet: str) -> SearchResult: + document = Document( + id=document_id, + source_id="source-1", + canonical_url=canonical_url, + title=title, + published_at=datetime(2025, 1, 1, tzinfo=timezone.utc), + content_hash=f"hash{index:02d}", + ) + chunk = Chunk( + id=index, + document_id=document_id, + chunk_index=index - 1, + content_text=snippet, + ) + return SearchResult( + chunk=chunk, + document=document, + score=1.0 / index, + fts_score=1.0 / index, + vector_score=0.0, + link_score=0.0, + ) + + +def test_root_page_renders_html(): + with TestClient(web_app_module.app) as client: + response = client.get("/") + + assert response.status_code == 200 + assert response.headers["content-type"].startswith("text/html") + assert "Grogbot" in response.text + assert 'href="/search"' in response.text + assert "/assets/app.css" in response.text + + +def test_search_page_renders_form(): + with TestClient(web_app_module.app) as client: + response = client.get("/search") + + assert response.status_code == 200 + assert "Grogbot Search" in response.text + assert 'action="/search/query"' in response.text + assert 'name="q"' in response.text + assert ">Search<" in response.text + + +def test_search_query_redirects_without_query(): + with TestClient(web_app_module.app) as client: + response = client.get("/search/query", follow_redirects=False) + + assert response.status_code == 302 + assert response.headers["location"] == "/search" 
+ + +def test_search_query_redirects_blank_query(): + with TestClient(web_app_module.app) as client: + response = client.get("/search/query?q=%20%20%20", follow_redirects=False) + + assert response.status_code == 302 + assert response.headers["location"] == "/search" + + +def test_search_query_renders_top_25_results_and_preserves_duplicates(monkeypatch): + captured: list[tuple[str, int]] = [] + results = [ + _result( + index=1, + document_id="doc-shared", + canonical_url="https://example.com/shared", + title="Shared Document", + snippet="Duplicate snippet 1", + ), + _result( + index=2, + document_id="doc-shared", + canonical_url="https://example.com/shared", + title="Shared Document", + snippet="Duplicate snippet 2", + ), + ] + results.extend( + _result( + index=index, + document_id=f"doc-{index}", + canonical_url=f"https://example.com/article-{index}", + title=f"Article {index}", + snippet=f"Unique snippet {index}", + ) + for index in range(3, 31) + ) + + def fake_search_results(query: str, *, limit: int = 25): + captured.append((query, limit)) + return results[:limit] + + monkeypatch.setattr(web_app_module, "search_results", fake_search_results) + + with TestClient(web_app_module.app) as client: + response = client.get("/search/query?q=hello+world") + + assert response.status_code == 200 + assert captured == [("hello world", 25)] + assert 'value="hello world"' in response.text + assert response.text.count('class="search-result"') == 25 + assert response.text.count("https://example.com/shared") >= 2 + assert "Unique snippet 25" in response.text + assert "Unique snippet 26" not in response.text + assert response.text.index("Duplicate snippet 1") < response.text.index("Duplicate snippet 2") diff --git a/pyproject.toml b/pyproject.toml index 0f94707..e057bad 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -3,9 +3,11 @@ members = [ "packages/search-core", "packages/cli", "packages/api", + "packages/web", ] [tool.uv.sources] "grogbot-search-core" = { workspace = 
true } "grogbot-cli" = { workspace = true } "grogbot-api" = { workspace = true } +"grogbot-web" = { workspace = true } diff --git a/uv.lock b/uv.lock index c104ca1..ee89701 100644 --- a/uv.lock +++ b/uv.lock @@ -11,6 +11,7 @@ members = [ "grogbot-api", "grogbot-cli", "grogbot-search-core", + "grogbot-web", ] [[package]] @@ -366,6 +367,34 @@ requires-dist = [ ] provides-extras = ["test"] +[[package]] +name = "grogbot-web" +version = "0.1.0" +source = { editable = "packages/web" } +dependencies = [ + { name = "fastapi" }, + { name = "grogbot-search-core" }, + { name = "jinja2" }, + { name = "uvicorn" }, +] + +[package.optional-dependencies] +test = [ + { name = "httpx" }, + { name = "pytest" }, +] + +[package.metadata] +requires-dist = [ + { name = "fastapi", specifier = ">=0.110" }, + { name = "grogbot-search-core", editable = "packages/search-core" }, + { name = "httpx", marker = "extra == 'test'", specifier = ">=0.27" }, + { name = "jinja2", specifier = ">=3.1" }, + { name = "pytest", marker = "extra == 'test'", specifier = ">=8.0" }, + { name = "uvicorn", specifier = ">=0.29" }, +] +provides-extras = ["test"] + [[package]] name = "h11" version = "0.16.0"
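One cross-cutting detail behind the tests above: the results URL carries the query via standard form encoding, so `q=hello+world` round-trips to `hello world`. An illustration using only the standard library (not code from this change):

```python
from urllib.parse import parse_qs, urlencode, urlsplit

# Building the shareable results URL from a user query...
url = "/search/query?" + urlencode({"q": "hello world"})
print(url)  # /search/query?q=hello+world

# ...and recovering the query server-side as the `q` parameter,
# which is what FastAPI's query-parameter binding does for the route.
q = parse_qs(urlsplit(url).query).get("q", [""])[0]
print(q)  # hello world
```

This is why the test asserts that `GET /search/query?q=hello+world` reaches `SearchService` as the decoded string `"hello world"`.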