AI-powered platform to automate trade matching, detect breaks, predict reconciliation failures, route exceptions, and provide operational analytics.
- Multi-system ingestion from OMS/custodian connectors (extensible to prime broker/exchange)
- Fuzzy semantic trade matching with weighted scoring
- Predictive break detection with ML
- Exception routing with SLA escalation
- Root-cause pattern analysis
- Auto-remediation suggestions for low-risk breaks
- Reporting APIs for SLA, aging, and run history
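The weighted-scoring idea behind the fuzzy matcher can be sketched as follows. This is an illustrative sketch, not the project's actual implementation (which lives in `src/matching`); the field names and weights here are assumptions:

```python
from difflib import SequenceMatcher

# Hypothetical field weights; the real engine in src/matching may use
# different fields, weights, and similarity functions.
WEIGHTS = {"symbol": 0.4, "quantity": 0.3, "counterparty": 0.3}

def field_similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized field values."""
    return SequenceMatcher(None, a.strip().upper(), b.strip().upper()).ratio()

def match_score(oms_trade: dict, custodian_trade: dict) -> float:
    """Weighted sum of per-field similarities between two trade records."""
    return sum(
        w * field_similarity(str(oms_trade.get(f, "")), str(custodian_trade.get(f, "")))
        for f, w in WEIGHTS.items()
    )

oms = {"symbol": "XBTUSD", "quantity": "100", "counterparty": "ACME"}
cust = {"symbol": "xbtusd", "quantity": "100", "counterparty": "ACME Corp"}
score = match_score(oms, cust)  # high but below 1.0: counterparty differs slightly
```

A score near 1.0 indicates a likely match despite case differences or minor counterparty-name variations; a threshold below which a pair becomes a candidate break would be tuned per deployment.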
- API: FastAPI (`src/api`)
- Data Layer: SQLAlchemy models (`src/models`)
- Ingestion: Connector framework (`src/ingestion`)
- Matching: Fuzzy engine + reconciliation orchestration (`src/matching`)
- ML: Feature engineering, training, and inference (`src/ml`)
- Workflows: Routing, root cause, remediation (`src/workflows`)
- Async Tasks: Celery workers + beat (`src/tasks`)
- Analytics: Reporting service (`src/reporting`)
```
trade-reconciliation-ai/
  src/
    api/
    config/
    ingestion/
    matching/
    ml/
    models/
    reporting/
    rules/
    tasks/
    workflows/
  tests/
  data/
  models/
  scripts/
  dashboards/
```
- Create env file:
  ```bash
  cp .env.example .env
  ```
  The default `.env.example` is configured for Kraken public API mode (no credentials required).
- Install dependencies:
  ```bash
  python3 -m pip install -r requirements.txt
  ```
  Optional extended packages (NLP, extra connectors, advanced ML):
  ```bash
  python3 -m pip install -r requirements.optional.txt
  ```
- Run the API:
  ```bash
  uvicorn src.api.main:app --host 0.0.0.0 --port 8000 --reload
  ```
- Run the worker and scheduler:
  ```bash
  celery -A src.tasks.worker.celery_app worker --loglevel=info --concurrency=4
  celery -A src.tasks.worker.celery_app beat --loglevel=info
  ```
- Seed optional demo data:
  ```bash
  python3 scripts/seed_demo_data.py
  ```
- One-shot smoke flow (health + seed + reconcile + reports):
  ```bash
  make smoke
  ```

To run the full stack with Docker:
```bash
docker compose up --build
```
Docker Compose injects container-safe defaults (`POSTGRES_HOST=postgres`, `REDIS_URL=redis://redis:6379/0`) for API/worker/beat, so `.env` can remain local-first.
For a standalone API container (without Postgres) you can run:
```bash
docker build -t trade-reconciliation-ai:local .
docker run --rm -p 8000:8000 -e DATABASE_URL=sqlite+pysqlite:///./app.db trade-reconciliation-ai:local
```

Key API endpoints:
- `GET /api/v1/health`
- `POST /api/v1/ingestion/run`
- `POST /api/v1/reconciliation/run`
- `POST /api/v1/exceptions/{break_id}/route`
- `POST /api/v1/exceptions/{break_id}/auto-remediate`
- `GET /api/v1/exceptions/overdue`
- `GET /api/v1/breaks/open`
- `GET /api/v1/reports/summary`
- `GET /api/v1/reports/aging`
- `GET /api/v1/reports/runs`
- `GET /api/v1/reports/root-cause`
- `POST /api/v1/prediction/score`
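As a quick illustration of calling these routes, a minimal client can expand the path templates into full URLs. The base URL is an assumption for local development; the route paths come from the list above:

```python
from urllib.parse import urljoin

BASE = "http://localhost:8000"  # assumed local dev host from the uvicorn command

def endpoint(path: str, **params) -> str:
    """Fill path parameters such as {break_id} and join onto the base URL."""
    return urljoin(BASE, path.format(**params))

route_url = endpoint("/api/v1/exceptions/{break_id}/route", break_id=42)
health_url = endpoint("/api/v1/health")
# route_url → "http://localhost:8000/api/v1/exceptions/42/route"
```

Any HTTP client (`curl`, `requests`, the CI smoke job) can then issue `GET`/`POST` requests against these URLs.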
Beat schedule configured in `src/tasks/worker.py`:
- `tasks.check_sla_breaches`: every 15 minutes

Additional tasks in `src/tasks/jobs.py`:
- `tasks.run_ingestion`
- `tasks.run_reconciliation`
- `tasks.daily_pipeline`
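The SLA check cadence above would look roughly like this as a beat schedule entry. This is a sketch (the actual wiring lives in `src/tasks/worker.py`); Celery's `conf.beat_schedule` accepts plain-second intervals of this shape:

```python
# Sketch of a Celery beat schedule entry; entry names are illustrative.
BEAT_SCHEDULE = {
    "check-sla-breaches": {
        "task": "tasks.check_sla_breaches",
        "schedule": 15 * 60.0,  # every 15 minutes, expressed in seconds
    },
}
```

In a real worker this dict would be assigned to `celery_app.conf.beat_schedule` so the beat process enqueues the task on that interval.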
- Configure secrets in `.env` (leave unknown keys blank initially).
- Deploy the API and workers using Docker Compose or your orchestration platform.
- Set up a CI pipeline:
  ```bash
  python3 -m compileall -q src tests
  pytest
  ```
- Configure alerting from the SLA and escalation endpoints.
- Add production-grade connectors and a model retraining schedule.
Run the test suite locally:
```bash
pytest -q
python3 -m compileall -q src tests
```

GitHub Actions workflow: `.github/workflows/ci-docker.yml`
- Runs unit tests on Python 3.11
- Builds the Docker image
- Starts the container and verifies `GET /api/v1/health`
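The container verification step can be reproduced locally with a small polling helper. This is a standard-library-only sketch; the URL, retry count, and delay are assumptions, not values taken from the CI workflow:

```python
import time
import urllib.error
import urllib.request

def wait_for_health(url: str = "http://localhost:8000/api/v1/health",
                    attempts: int = 10, delay: float = 1.0) -> bool:
    """Poll the health endpoint until it answers HTTP 200 or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(delay)
    return False
```

Returning `False` after the retry budget lets a wrapper script fail the build instead of hanging indefinitely.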
- Credentials and API keys are intentionally not hard-coded.
- An alternative free OMS option is Kraken public trades (no key required): set `OMS_API_URL=https://api.kraken.com` and optionally `KRAKEN_PAIR` (e.g., `XBTUSD`).
- Another optional free OMS option is Alpaca paper trading: set `OMS_API_URL=https://paper-api.alpaca.markets` plus your Alpaca key/secret.
- If `models/<BREAK_PREDICTION_MODEL>` does not exist, the prediction endpoint returns 404 with a clear message.
- The default testing path uses SQLite; production should use Postgres.
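The SQLite-by-default behavior can be mirrored with a small resolver. This is a sketch, not the project's actual settings code (which presumably lives in `src/config`); only the default URL is taken from the `docker run` example above:

```python
import os

# Default testing path: local SQLite file, matching the DATABASE_URL used
# in the standalone container example. Production should set DATABASE_URL
# to a Postgres URL instead.
DEFAULT_URL = "sqlite+pysqlite:///./app.db"

def database_url() -> str:
    """Resolve the SQLAlchemy connection URL, falling back to local SQLite."""
    return os.environ.get("DATABASE_URL", DEFAULT_URL)
```

Keeping the fallback in one place means tests run against SQLite with zero configuration while deployments only need to export `DATABASE_URL`.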