Allow OpenAI compatible model providers #17

Merged 4 commits on Feb 1, 2025
18 changes: 10 additions & 8 deletions README.md
```diff
@@ -14,21 +14,23 @@ A developer-friendly Lua interface for working with various generative AI providers
 - Stream output for real-time responses
 - Structured JSON response abstraction layer
 - Token usage tracking with cost calculation
+- Open-source models via OpenAI compatibility

 ### Providers

-- [OpenAI](https://platform.openai.com/docs/overview)
+- OpenAI: https://platform.openai.com/docs/overview

-- [Anthropic](https://docs.anthropic.com/en/home)
+- Anthropic: https://docs.anthropic.com/en/home

+- Anything OpenAI compatible, e.g. **Perplexity, Together AI, etc.**, by prefixing the endpoint with openai and a double colon: `"openai::https://api.perplexity.ai/chat/completions"`

 ### Roadmap

-1. Advanced error handling
-2. Google Gemini integration
-3. Audio models
-4. Image models
-5. Open-Source model integration
-6. Video models
+- [ ] Audio models
+
+- [ ] Image models
+
+- [ ] Video models

 ## Installation
```
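The `openai::` endpoint convention added to the README might be used along these lines; note this is a hedged sketch — the `genai.new` constructor name and its arguments are assumptions for illustration, not the library's documented API. Only the `"openai::<url>"` prefix convention comes from this PR:

```lua
-- Hypothetical usage sketch: constructor name and arguments are assumed.
local genai = require("genai")

-- An OpenAI-compatible provider (here Perplexity) is selected by
-- prefixing its chat-completions URL with "openai::".
local client = genai.new(
  "openai::https://api.perplexity.ai/chat/completions",
  os.getenv("PERPLEXITY_API_KEY")
)
```

The same request/response handling as for the native OpenAI provider would then apply, with only the URL swapped out.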
12 changes: 11 additions & 1 deletion src/genai/genai.lua
```diff
@@ -27,13 +27,23 @@ end
 ---@return table? provider_module Collection of functions determining input and output structure
 function GenAI:_determine_provider(providers)
   local provider = nil
+  local endpoint = self._endpoint
   for provider_name, provider_module in pairs(providers) do
-    if self._endpoint:find(provider_name) then provider = provider_module end
+    if endpoint:find(provider_name) then provider = provider_module end
   end
   assert(provider, "GenAI provider could not be determined from provided endpoint")
+  self._endpoint = self:check_if_openai_compatible(endpoint)
   return provider
 end

+---Check if the endpoint starts with 'openai::' for API compatibility
+---@param endpoint string
+---@return string endpoint
+function GenAI:check_if_openai_compatible(endpoint)
+  local prefix, url = endpoint:match("^(.-)::(.+)$")
+  return (prefix == "openai") and url or endpoint
+end
+
 ---Prepare streaming requirements if set to stream
 ---@param processor function? Display of streamed text chunks
 ---@return table? accumulator Schema storing full streamed response
```
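The prefix stripping added above hinges on Lua's lazy `-` quantifier: `^(.-)::(.+)$` captures everything up to the *first* `::`, so the `://` inside the URL is never mistaken for the separator. A standalone sketch of the same match logic (illustration only, not the library's code path):

```lua
-- Standalone illustration of the pattern used in check_if_openai_compatible.
local function strip_openai_prefix(endpoint)
  local prefix, url = endpoint:match("^(.-)::(.+)$")
  -- No "::" present -> match returns nil, so the endpoint passes through.
  return (prefix == "openai") and url or endpoint
end

print(strip_openai_prefix("openai::https://api.perplexity.ai/chat/completions"))
-- -> https://api.perplexity.ai/chat/completions
print(strip_openai_prefix("https://api.anthropic.com/v1/messages"))
-- -> https://api.anthropic.com/v1/messages (unchanged)
```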