Onramp is an intelligent onboarding assistant that helps developers discover, understand, and contribute to open-source projects. Built for hackathons and real-world use, it combines real GitHub data with AI-powered analysis to provide personalized guidance.
Demo video: https://vimeo.com/1163975111?share=copy&fl=sv&fe=ci
New developers face significant barriers when trying to contribute to open-source:
- Overwhelming codebases - Hard to understand large, unfamiliar projects
- Finding the right project - Difficult to match skills with suitable repositories
- Getting started - No clear path from interest to first contribution
- Issue selection - Hard to identify beginner-friendly tasks
Onramp provides an AI-powered onboarding experience that:
- Analyzes repositories - Understands architecture, modules, and complexity
- Matches developers - Recommends projects based on skills and interests
- Guides contributions - Provides step-by-step onboarding paths
- Classifies issues - Identifies beginner-friendly contribution opportunities
- Real GitHub Integration - Fetches actual repository data (stars, structure, README)
- Intelligent Analysis - Analyzes architecture patterns, modules, and technologies
- Entry Points - Identifies best files to start exploring
- Detailed Insights - Shows file lists, complexity levels, and module purposes
- Profile-Based - Matches based on languages, frameworks, and experience level
- Guest Mode - Works without signup for instant access
- Personalized Scores - Calculates match scores across multiple dimensions
- Curated Recommendations - Suggests React, VS Code, Next.js, and more
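The "match scores across multiple dimensions" idea can be sketched as a weighted average. The dimension names and weights below are illustrative assumptions, not the project's actual scoring code:

```typescript
// Hypothetical per-dimension signals, each normalized to 0..1.
interface MatchInputs {
  languageOverlap: number;  // fraction of shared languages
  frameworkOverlap: number; // fraction of shared frameworks
  experienceFit: number;    // how well repo difficulty fits experience level
}

// Weighted average across dimensions, returned as a 0-100 score.
// The weights are placeholder values for illustration.
export function matchScore(
  inputs: MatchInputs,
  weights = { languageOverlap: 0.4, frameworkOverlap: 0.35, experienceFit: 0.25 }
): number {
  const raw =
    inputs.languageOverlap * weights.languageOverlap +
    inputs.frameworkOverlap * weights.frameworkOverlap +
    inputs.experienceFit * weights.experienceFit;
  return Math.round(raw * 100);
}
```

A developer with full language overlap but a poor experience fit would score lower than a perfect match, which is the behavior the recommendation list relies on.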
- 4-Step Onboarding Path - From documentation to first contribution
- Personalized Difficulty - Adjusts based on experience level
- Resource Links - Direct links to README, issues, and documentation
- Progress Tracking - Mark steps as complete
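A minimal data model for the 4-step path and its progress tracking might look like the following. The step ids and titles are assumptions for illustration, not the app's real schema:

```typescript
// Hypothetical model of the 4-step onboarding path.
type StepId = "read-docs" | "setup-env" | "pick-issue" | "first-pr";

interface OnboardingStep {
  id: StepId;
  title: string;
  done: boolean;
}

export function defaultPath(): OnboardingStep[] {
  return [
    { id: "read-docs", title: "Read the README and contributing guide", done: false },
    { id: "setup-env", title: "Set up a local development environment", done: false },
    { id: "pick-issue", title: "Pick a beginner-friendly issue", done: false },
    { id: "first-pr", title: "Open your first pull request", done: false },
  ];
}

// Mark a step complete and return overall progress in 0..1.
export function complete(path: OnboardingStep[], id: StepId): number {
  for (const step of path) if (step.id === id) step.done = true;
  return path.filter((s) => s.done).length / path.length;
}
```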
- Modern Design - Clean, professional interface with smooth animations
- Welcome Modal - Sign in, sign up, or continue as guest
- Responsive - Works on desktop and mobile
- Fast & Smooth - Optimized performance, no lag
- Graceful Fallbacks - Works without API keys (demo mode)
- Error Handling - Comprehensive try-catch blocks throughout
- Caching - Redis caching for performance
- Type Safety - Full TypeScript coverage
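The caching and graceful-fallback behavior described above follows a cache-aside pattern. This sketch uses an in-memory `Map` in place of Redis so the fallback path is self-contained; the interface and helper names are assumptions, not the project's actual client code:

```typescript
// Minimal cache interface; a real deployment would back this with Redis.
interface Cache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// In-memory stand-in with TTL expiry, usable when Redis is unavailable.
export class MemoryCache implements Cache {
  private store = new Map<string, { value: string; expiresAt: number }>();
  async get(key: string): Promise<string | null> {
    const hit = this.store.get(key);
    if (!hit || hit.expiresAt < Date.now()) return null;
    return hit.value;
  }
  async set(key: string, value: string, ttlSeconds: number): Promise<void> {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }
}

// Cache-aside: return a cached result if present, otherwise compute and store.
export async function cached<T>(
  cache: Cache, key: string, ttl: number, compute: () => Promise<T>
): Promise<T> {
  const hit = await cache.get(key);
  if (hit !== null) return JSON.parse(hit) as T;
  const value = await compute();
  await cache.set(key, JSON.stringify(value), ttl);
  return value;
}
```

Swapping `MemoryCache` for a Redis-backed implementation changes only the `Cache` implementation, not the call sites.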
Backend:

- Node.js + Express
- TypeScript
- PostgreSQL (Prisma ORM)
- Redis (caching)
- OpenAI/Anthropic (LLM)
- Vitest + fast-check (testing)

Frontend:

- React + TypeScript
- Vite
- Tailwind CSS
- React Router
- Axios
- Node.js 18+ and npm
- Docker and Docker Compose
- GitHub Personal Access Token
- OpenAI API Key or Anthropic API Key
```bash
git clone <repository-url>
cd onramp
npm install
docker-compose up -d
```

This starts PostgreSQL and Redis containers.
Note: If you don't have Docker installed, the app will run in Demo Mode with sample data. See the Demo Mode section below for details.
Copy the example file and update with your credentials:
```bash
cp packages/backend/.env.example packages/backend/.env
```

For Demo Mode (no API keys needed):

- The app will work with sample data
- You'll see "⚠️ Demo Mode" warnings in the analysis
- Perfect for testing the UI and flow
For Full Functionality (requires API keys):
- `GITHUB_TOKEN`: Your GitHub Personal Access Token
- `OPENAI_API_KEY`: Your OpenAI API key (or `ANTHROPIC_API_KEY` for Anthropic)
- `DATABASE_URL`: PostgreSQL connection string (default works with Docker Compose)
- `REDIS_URL`: Redis connection string (default works with Docker Compose)
The frontend uses Vite's proxy in development, so no configuration is needed. For production:
```bash
cp packages/frontend/.env.example packages/frontend/.env.production
```

Update `VITE_API_BASE_URL` with your production backend URL.
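The development proxy mentioned above is typically configured in `vite.config.ts`. The fragment below is a plausible sketch, assuming the standard `@vitejs/plugin-react` setup, not a copy of the project's actual config:

```typescript
// Hypothetical vite.config.ts: forward /api requests to the backend on port
// 5000 during development, so no VITE_API_BASE_URL is needed locally.
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      "/api": { target: "http://localhost:5000", changeOrigin: true },
    },
  },
});
```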
```bash
cd packages/backend
npx prisma migrate dev
npx prisma generate
cd ../..
```

Start the backend:

```bash
cd packages/backend
npm run dev
```

Backend runs on http://localhost:5000

Start the frontend:

```bash
cd packages/frontend
npm run dev
```

Frontend runs on http://localhost:3000
Onramp includes a Demo Mode that allows you to test the application without setting up API keys or external services. This is perfect for hackathons, quick demos, or exploring the UI.
✅ Full UI Experience: All pages and components work normally
✅ Sample Data: Repository analysis returns realistic mock data
✅ No Setup Required: Just run `npm run dev` in both packages
✅ Clear Indicators: Demo data is marked with "⚠️ Demo Mode" warnings
To get real GitHub data and AI-powered analysis:
1. Get API keys:
   - GitHub Token: https://github.com/settings/tokens (select the `repo` scope)
   - OpenAI Key: https://platform.openai.com/api-keys
2. Update `.env`:

   ```
   GITHUB_TOKEN=ghp_your_token_here
   OPENAI_API_KEY=sk-your_key_here
   ```

3. Restart the backend:

   ```bash
   cd packages/backend
   npm run dev
   ```
The app will automatically detect valid API keys and switch to full functionality!
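The key-detection step could work by treating missing or placeholder values as demo mode. This is a hedged sketch of the idea, not the project's actual startup code:

```typescript
// Hypothetical demo-mode check: keys that are absent or still set to the
// .env.example placeholders mean the app should serve sample data.
export function isDemoMode(
  env: Record<string, string | undefined> = process.env
): boolean {
  const github = env.GITHUB_TOKEN ?? "";
  const llm = env.OPENAI_API_KEY ?? env.ANTHROPIC_API_KEY ?? "";
  const isPlaceholder = (v: string) =>
    v === "" || v.includes("your_token_here") || v.includes("your_key_here");
  return isPlaceholder(github) || isPlaceholder(llm);
}
```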
Run backend tests:

```bash
cd packages/backend
npm test
```

Build the frontend:

```bash
cd packages/frontend
npm run build
```

Lint and format:

```bash
# Lint all packages
npm run lint

# Format code
npm run format
```

Project structure:

```
onramp/
├── packages/
│   ├── backend/              # Express API server
│   │   ├── src/
│   │   │   ├── clients/      # External service clients (GitHub, LLM, Cache)
│   │   │   ├── services/     # Business logic services
│   │   │   ├── routes/       # API route handlers
│   │   │   ├── middleware/   # Express middleware
│   │   │   ├── types/        # TypeScript types and interfaces
│   │   │   ├── validation/   # Zod schemas
│   │   │   └── utils/        # Utility functions
│   │   └── prisma/           # Database schema
│   └── frontend/             # React application
│       └── src/
│           ├── components/   # React components
│           ├── pages/        # Page components
│           ├── services/     # API client services
│           ├── context/      # React Context providers
│           └── types/        # TypeScript types
├── docker-compose.yml        # Infrastructure services
└── package.json              # Workspace configuration
```
- `POST /api/repositories/analyze` - Analyze a GitHub repository
- `GET /api/repositories/:owner/:repo` - Get cached repository analysis
- `POST /api/users/profile` - Create/update user profile
- `GET /api/users/:userId/profile` - Get user profile
- `POST /api/recommendations` - Get project recommendations
- `POST /api/guidance` - Get contribution path for a repository
- `GET /api/issues/:owner/:repo` - Get classified issues for a repository
- `GET /api/health` - Health check endpoint
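A small typed helper for building requests to these endpoints might look like this. The function names are illustrative assumptions, not the project's actual frontend client:

```typescript
// Hypothetical request builders for the endpoints listed above.
const BASE = "/api";

// POST /api/repositories/analyze with the target repo in the body.
export function analyzeRepoRequest(owner: string, repo: string) {
  return {
    method: "POST" as const,
    url: `${BASE}/repositories/analyze`,
    body: { owner, repo },
  };
}

// GET /api/issues/:owner/:repo with path segments safely encoded.
export function issuesUrl(owner: string, repo: string): string {
  return `${BASE}/issues/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}
```

Centralizing URL construction like this keeps path encoding consistent across the frontend's API calls.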
| Variable | Description | Required | Default |
|---|---|---|---|
| `NODE_ENV` | Environment (development/production) | No | `development` |
| `PORT` | Server port | No | `5000` |
| `DATABASE_URL` | PostgreSQL connection string | Yes | - |
| `REDIS_URL` | Redis connection string | Yes | - |
| `GITHUB_TOKEN` | GitHub Personal Access Token | Yes | - |
| `LLM_PROVIDER` | LLM provider (openai/anthropic) | No | `openai` |
| `OPENAI_API_KEY` | OpenAI API key | Conditional | - |
| `ANTHROPIC_API_KEY` | Anthropic API key | Conditional | - |
| `CORS_ORIGIN` | Allowed CORS origin | No | `*` |
| `RATE_LIMIT_WINDOW_MS` | Rate limit window (ms) | No | `900000` |
| `RATE_LIMIT_MAX_REQUESTS` | Max requests per window | No | `100` |
| `CACHE_TTL_REPOSITORY_ANALYSIS` | Cache TTL for repository analysis (seconds) | No | `3600` |
| `CACHE_TTL_ISSUE_CLASSIFICATION` | Cache TTL for issue classification (seconds) | No | `1800` |
| Variable | Description | Required | Default |
|---|---|---|---|
| `VITE_API_BASE_URL` | Backend API URL | No | `/api` (proxied) |
Build and start the backend:

```bash
cd packages/backend
npm run build
npm start
```

Build the frontend:

```bash
cd packages/frontend
npm run build
```

The built files will be in `packages/frontend/dist/` and can be served by any static file server.
See Task 21.3 for Docker configurations (coming soon).
The project uses Vitest for unit testing and fast-check for property-based testing.
- 108 tests covering all services, clients, middleware, and validation
- Property-based tests for critical correctness properties
- Run with `npm test` in the backend package
- Component rendering verified through build process
- End-to-end testing can be added with Playwright or Cypress
This is a hackathon MVP project. Contributions are welcome!
MIT