# Contributing to OCode

Thank you for your interest in contributing to OCode! This guide will help you get started.

## Code of Conduct

By participating in this project, you agree to abide by our Code of Conduct: be respectful, inclusive, and constructive in all interactions.
## Reporting Bugs

- Check existing issues first to avoid duplicates
- Use the issue templates when available
- Include:
  - A clear description of the problem
  - Steps to reproduce
  - Expected vs. actual behavior
  - System information (OS, Python version, Ollama version)
  - Relevant logs or error messages
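A minimal illustrative report (the details below are made up):

```text
Description: ocode crashes on startup in an empty directory.

Steps to reproduce:
1. Create an empty directory and cd into it
2. Run ocode

Expected: The CLI starts normally.
Actual: It exits with a traceback.

System: macOS 14, Python 3.11, Ollama 0.3.x
```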
## Suggesting Features

- Open a discussion in the Ideas category
- Describe the use case and benefits
- Consider implementation complexity
- Be open to feedback and alternatives
## Development Setup

```bash
# Fork on GitHub, then:
git clone https://github.com/YOUR-USERNAME/ocode.git
cd ocode
git remote add upstream https://github.com/haasonsaas/ocode.git

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

# Update main branch
git checkout main
git pull upstream main

# Create feature branch
git checkout -b feature/your-feature-name
# Or for fixes:
git checkout -b fix/issue-description
```

Follow these guidelines:
- Follow PEP 8 (enforced by Black formatter)
- Use type hints where appropriate
- Maximum line length: 88 characters
- Use descriptive variable names
- Add docstrings to all public functions/classes
- Use Google docstring style
- Update relevant documentation
- Include examples where helpful
- Write tests for new functionality
- Ensure all tests pass
- Maintain or improve code coverage
- Test edge cases
- Use clear, descriptive commit messages
- Follow conventional commits format:
  ```text
  type(scope): subject

  body (optional)

  footer (optional)
  ```

- Types: feat, fix, docs, style, refactor, test, chore
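For example, a commit message following this format might look like (the scope and issue number are illustrative):

```text
fix(cli): handle missing config file gracefully

Fall back to default settings when the config file is absent
instead of raising an error.

Closes #123
```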
Run the quality checks before pushing:

```bash
# Format code
make format

# Run linters
make lint

# Run tests
make test

# Run all checks
make ci
```

Then:

1. Push your branch:

   ```bash
   git push origin feature/your-feature-name
   ```

2. Create a PR on GitHub with:
   - Clear title and description
   - Link to related issues
   - List of changes made
   - Screenshots if UI changes
   - Test results

3. Address review feedback promptly
## Project Structure

```text
ocode/
├── ocode_python/
│   ├── core/          # Core engine and CLI
│   ├── tools/         # Tool implementations
│   ├── languages/     # Language analyzers
│   ├── mcp/           # MCP integration
│   └── utils/         # Utilities
├── tests/
│   ├── unit/          # Unit tests
│   ├── integration/   # Integration tests
│   └── fixtures/      # Test data
├── docs/              # Documentation
└── examples/          # Example code
```
## Adding a New Tool

1. Create a tool file in `ocode_python/tools/`:

   ```python
   from .base import Tool, ToolDefinition, ToolParameter, ToolResult


   class YourTool(Tool):
       @property
       def definition(self) -> ToolDefinition:
           return ToolDefinition(
               name="your_tool",
               description="Clear description",
               parameters=[
                   ToolParameter(
                       name="param",
                       description="Parameter description",
                       type="string",
                       required=True,
                   )
               ],
               category="appropriate_category",
           )

       async def execute(self, **kwargs) -> ToolResult:
           try:
               # Implementation
               return ToolResult(success=True, output="Result")
           except Exception as e:
               return ToolResult(success=False, output="", error=str(e))
   ```

2. Register it in `ToolRegistry.register_core_tools()`
3. Add tests in `tests/unit/test_your_tool.py`
4. Document it in `docs/user-guide/tool-reference/`
## Adding Language Support

1. Create an analyzer in `ocode_python/languages/`:

   ```python
   from typing import List

   from .base import LanguageAnalyzer, Symbol


   class YourLangAnalyzer(LanguageAnalyzer):
       file_extensions = [".ext"]

       def extract_symbols(self, content: str) -> List[Symbol]:
           # Parse and extract symbols
           ...

       def extract_imports(self, content: str) -> List[str]:
           # Extract imports
           ...
   ```

2. Register it in the language registry
3. Add tests with sample files
4. Update documentation
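As a rough standalone sketch of what symbol extraction might do (this is a toy, returning plain names rather than the project's `Symbol` objects, for a made-up language where functions are declared as `fn name(...)`):

```python
import re
from typing import List


def extract_symbols(content: str) -> List[str]:
    """Return function names declared as ``fn name(...)`` in a toy language."""
    return re.findall(r"\bfn\s+(\w+)\s*\(", content)


sample = "fn main() {}\nfn helper(x) {}"
print(extract_symbols(sample))  # prints ['main', 'helper']
```

A real analyzer would typically use a proper parser (e.g. the `ast` module for Python) rather than regexes.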
## Testing

Example unit test:

```python
import pytest

from ocode_python.tools.your_tool import YourTool


@pytest.mark.asyncio
async def test_your_tool_success():
    tool = YourTool()
    result = await tool.execute(param="value")
    assert result.success
    assert "expected" in result.output


@pytest.mark.asyncio
async def test_your_tool_error():
    tool = YourTool()
    result = await tool.execute(param="invalid")
    assert not result.success
    assert result.error
```

Integration tests should:

- Test component interactions
- Use real file operations when needed
- Mock external services

Test markers:

- `@pytest.mark.unit`: Unit tests
- `@pytest.mark.integration`: Integration tests
- `@pytest.mark.slow`: Slow tests
- `@pytest.mark.security`: Security tests
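Custom marks like these are normally registered with pytest so it does not warn about unknown markers; a sketch of that configuration (the actual file used by this repo may differ):

```ini
# pytest.ini, or the equivalent [tool.pytest.ini_options] table in pyproject.toml
[pytest]
markers =
    unit: Unit tests
    integration: Integration tests
    slow: Slow tests
    security: Security tests
```

Subsets can then be selected on the command line, e.g. `pytest -m unit` or `pytest -m "not slow"`.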
## Documentation

Example Google-style docstring:

```python
def process_data(
    data: List[Dict[str, Any]], options: ProcessOptions
) -> ProcessResult:
    """Process data according to specified options.

    Args:
        data: List of data items to process
        options: Processing configuration options

    Returns:
        ProcessResult containing processed data and metadata

    Raises:
        ValueError: If data is empty or invalid
        ProcessingError: If processing fails

    Example:
        >>> result = process_data([{"id": 1}], options)
        >>> print(result.count)
        1
    """
```

User-facing documentation should use:

- Clear, concise language
- Practical examples
- Common use cases
- A troubleshooting section
## Performance Considerations

- **Async Operations**: Use async/await for I/O
- **Streaming**: Stream large responses
- **Caching**: Cache expensive operations
- **Memory**: Avoid loading large files entirely
- **Concurrency**: Use asyncio for parallel operations
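The memory and concurrency points often combine: a hedged sketch (not OCode's actual helpers) that streams a large file in fixed-size chunks, running the blocking reads in the event loop's default executor:

```python
import asyncio

CHUNK_SIZE = 64 * 1024  # read 64 KiB at a time


async def stream_file(path: str):
    """Yield a file's bytes in chunks without loading the whole file."""
    loop = asyncio.get_running_loop()
    with open(path, "rb") as f:
        while True:
            # Run the blocking read off the event loop.
            chunk = await loop.run_in_executor(None, f.read, CHUNK_SIZE)
            if not chunk:
                break
            yield chunk


async def total_size(path: str) -> int:
    """Consume the stream; only one chunk is resident at a time."""
    total = 0
    async for chunk in stream_file(path):
        total += len(chunk)
    return total
```

Callers consume the generator with `async for`, so memory use stays bounded by the chunk size regardless of file size.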
## Security Considerations

- **Input Validation**: Always validate user input
- **Path Security**: Use the path validation utilities
- **Command Injection**: Sanitize shell commands
- **Permissions**: Respect the security configuration
- **Secrets**: Never log sensitive information
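Path security usually boils down to resolving candidate paths and rejecting anything that escapes an allowed root. OCode ships its own utilities for this; the minimal standalone sketch below (`validate_path` is hypothetical, and `Path.is_relative_to` requires Python 3.9+) just shows the principle:

```python
from pathlib import Path


def validate_path(base: Path, candidate: str) -> Path:
    """Resolve candidate against base, rejecting traversal (e.g. '../')."""
    resolved = (base / candidate).resolve()
    if not resolved.is_relative_to(base.resolve()):
        raise ValueError(f"path escapes allowed root: {candidate}")
    return resolved
```

Resolving before checking is important: it collapses `..` segments and symlinks, so a check on the raw string cannot be bypassed.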
## Pull Request Process

1. **Pre-submission Checklist**:
   - Code follows style guidelines
   - Tests pass locally
   - Documentation updated
   - Commits are clean and descriptive
   - PR description is complete

2. **Review Process**:
   - Automated checks must pass
   - At least one maintainer review
   - Address feedback promptly
   - Keep the PR focused and manageable

3. **Merge Requirements**:
   - All tests passing
   - Documentation complete
   - No unresolved conversations
   - Approved by a maintainer
## Release Process

- **Version Numbering**: Semantic versioning (MAJOR.MINOR.PATCH)
- **Changelog**: Update CHANGELOG.md following Keep a Changelog
- **Testing**: Full test suite passes
- **Documentation**: All docs updated
- **Tag**: Create a git tag for the release
- **Package**: Build and publish to PyPI
## Getting Help

- **Discord**: Join our community server
- **Discussions**: GitHub Discussions for questions
- **Issues**: GitHub Issues for bugs
- **Email**: jonathan@haasonsaas.com for security issues
## Recognition

Contributors are recognized in:

- The CONTRIBUTORS.md file
- Release notes
- Project documentation

## License

By contributing, you agree that your contributions will be licensed under the project's AGPL-3.0 license.