Read the docs with demo videos here.
MCP is an open protocol that standardizes how apps provide context to LLMs.

- Seamlessly integrates LLMs with a growing list of community integrations found here
- No LLM provider lock-in

The template also includes:

- An agent framework for customizable agentic orchestration
  - Native streaming for UX in complex agentic workflows
  - Native persisted chat history and state management
- FastAPI for the Python backend API
- A SQL ORM for Python database interactions (ORM + validation)
- Data validation and settings management
- DB RBAC
- Nginx reverse proxy for development and production
- Langfuse for LLM observability and LLM metrics
- Metrics scraping
- Grafana for visualizing metrics
- A SaaS for JWT authentication
- CI/CD via GitHub Actions
The Inspector communicates with each MCP server over the SSE protocol, and each server adheres to the MCP specification.
```mermaid
graph LR
    subgraph localhost
        A[Inspector]
        B[DBHub Server]
        C[Youtube Server]
        D[Custom Server]
    end
    subgraph Supabase Cloud
        E[Supabase DB]
    end
    subgraph Google Cloud
        F[Youtube API]
    end
    A<-->|Protocol|B
    A<-->|Protocol|C
    A<-->|Protocol|D
    B<-->E
    C<-->F
```
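The SSE side of this exchange is just a stream of `event:`/`data:` frames separated by blank lines; the MCP messages themselves are JSON-RPC payloads carried in the `data` fields. A minimal, illustrative parser for that wire format (the frame contents below are made up, not real MCP traffic):

```python
# Minimal Server-Sent Events (SSE) frame parser, for illustration only.
# Real MCP clients/servers handle the full SSE spec (retry, id, comments)
# and the complete MCP JSON-RPC message flow.

def parse_sse(stream: str) -> list[dict]:
    """Split a raw SSE stream into events with `event` and `data` fields."""
    events = []
    for block in stream.split("\n\n"):
        event = {"event": "message", "data": []}  # "message" is the SSE default event type
        for line in block.splitlines():
            if line.startswith("event:"):
                event["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                event["data"].append(line[len("data:"):].strip())
        if event["data"]:
            events.append({"event": event["event"], "data": "\n".join(event["data"])})
    return events

# Hypothetical stream: an endpoint announcement followed by a JSON-RPC payload.
raw = 'event: endpoint\ndata: /messages?session=abc\n\ndata: {"jsonrpc": "2.0"}\n\n'
events = parse_sse(raw)
```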
The current template does not connect the API server to every MCP server. Additionally, the API server communicates with the database through a SQL ORM.
```mermaid
graph LR
    subgraph localhost
        A[API Server]
        B[DBHub Server]
        C[Youtube Server]
        D[Custom Server]
    end
    subgraph Supabase Cloud
        E[Supabase DB]
    end
    A<-->|Protocol|D
    A<-->E
```
```mermaid
graph LR
    A[Web Browser]
    subgraph localhost
        B[Nginx Reverse Proxy]
        C[API Server]
    end
    A-->B
    B-->C
```
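The routing idea can be sketched as an nginx `location` block. Everything below is an assumption for illustration: the service name `api`, port `8000`, and the `/api/` prefix (suggested by the `--root-path=/api` flag used elsewhere in this README) may differ from the template's actual nginx config:

```nginx
server {
    listen 80;

    # Forward /api/... to the FastAPI container on the compose network.
    # "api:8000" is an assumed service name and port, not taken from the template.
    location /api/ {
        proxy_pass http://api:8000/;
        proxy_set_header Host $host;
    }
}
```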
```mermaid
graph LR
    subgraph localhost
        A[API Server]
    end
    subgraph Grafana Cloud
        B[Grafana]
    end
    subgraph Langfuse Cloud
        C[Langfuse]
    end
    A -->|Metrics & Logs| B
    A -->|Traces & Events| C
```
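Scraped metrics are conventionally exposed in the Prometheus text exposition format, i.e. `name{labels} value` lines. A tiny sketch of what one rendered line looks like (the metric and label names below are invented, not the template's):

```python
# Render a single metric line in the Prometheus text exposition format.
# Metric name and labels here are illustrative only.

def render_metric(name: str, labels: dict[str, str], value: float) -> str:
    label_str = ",".join(f'{k}="{v}"' for k, v in labels.items())
    return f"{name}{{{label_str}}} {value}"

line = render_metric("llm_requests_total", {"model": "gpt-4o"}, 42)
```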
Build the community YouTube MCP server image with:

```shell
./community/youtube/build.sh
```
> [!TIP]
> Instead of cloning or submoduling the repository locally and then building the image, this script builds the Docker image inside a temporary Docker-in-Docker container. This avoids polluting your local environment with throwaway files, since everything is cleaned up when the container exits.
Then build the other images with:

```shell
docker compose -f compose-dev.yaml build
```
Copy the environment file:

```shell
cp .env.sample .env
```
Add the following API keys and values to the respective files: `./envs/backend.env`, `./envs/youtube.env`, and `.env`:

```shell
OPENAI_API_KEY=sk-proj-...
POSTGRES_DSN=postgresql://postgres...
YOUTUBE_API_KEY=...
```
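Once exported, the API server can read these variables from its process environment at startup. The template's actual settings machinery is not shown here, so the following is a plain `os.environ` sketch with a made-up DSN:

```python
import os
from urllib.parse import urlparse

# Hypothetical DSN used only as a fallback for this example;
# in the real setup the value comes from ./envs/backend.env.
os.environ.setdefault(
    "POSTGRES_DSN",
    "postgresql://postgres:secret@db.example.supabase.co:5432/postgres",
)

# Parse the DSN to pass host/port/database to a DB client or ORM engine.
dsn = urlparse(os.environ["POSTGRES_DSN"])
```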
Set the environment variables in your shell (compatible with bash and zsh):

```shell
set -a; for env_file in ./envs/*; do source "$env_file"; done; set +a
```
Start the production containers:

```shell
docker compose up -d
```
First, set the environment variables as described above.
> [!WARNING]
> Only make the following replacement if you plan to start the debugger for the FastAPI server in VSCode.

Replace the entrypoint in `./compose-dev.yaml` to allow debugging the FastAPI server:
```yaml
# ...
api:
  # ...
  # entrypoint: uv run fastapi run api/main.py --root-path=/api --reload
  # replace the line above with:
  entrypoint: bash -c "sleep infinity"
  # ...
```
Open the project in VSCode:

```shell
code --no-sandbox .
```

Press `F1` and run `Dev Containers: Rebuild and Reopen in Container` to open the containerized environment with IntelliSense and the debugger for FastAPI.
Run the development environment with:

```shell
docker compose -f compose-dev.yaml up -d
```
The following markdown files provide additional details on other features:
Sometimes during development, the Nginx reverse proxy needs to reload its configuration to route services properly:

```shell
docker compose -f compose-dev.yaml exec nginx sh -c "nginx -s reload"
```