Setup luarock structure and expose stream handler #14

Merged 15 commits on Dec 31, 2024
42 changes: 29 additions & 13 deletions README.md
@@ -1,6 +1,6 @@
# Unified Lua Interface for Generative AI
# Generative AI SDK for Lua

A developer-friendly Lua interface for working with multiple generative AI providers, abstracting away provider-specific payload structures and response parsing so you can easily switch between various models and providers without rewriting any code.
A developer-friendly Lua interface for working with various generative AI providers, abstracting away provider-specific payload structures and response parsing so that using multiple models is easy.

## Providers

@@ -15,33 +15,49 @@ A developer-friendly Lua interface for working with multiple generative AI provi
- Easily switch between AI chat model providers
- Pass in prompts and get replies without the provider complexity
- Easily integrate new models and adjust settings
- Work directly with the `src.ai` client for more granular control
- Use the `chat` object for integrated message handling
- Use the `genai` client directly for more granular control if needed
- Abstraction for structured JSON response output
- Token usage tracking with cost calculation
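The token-usage tracking with cost calculation listed above can be sketched as a plain Lua function. This is an illustrative sketch only: the `pricing` table, its per-million-token USD rates, and the function name are hypothetical, not the library's actual figures or API.

```lua
-- Illustrative sketch of token-usage cost tracking (rates are made up).
-- Usage accumulates as { input = <tokens>, output = <tokens> }, and cost
-- is derived from hypothetical per-million-token USD rates.
local pricing = {
	["gpt-4o-mini"] = { input = 0.15, output = 0.60 }, -- hypothetical USD per 1M tokens
}

local function calc_token_cost(model, usage, rates)
	local p = assert(rates[model], "no pricing for model: " .. model)
	return (usage.input * p.input + usage.output * p.output) / 1e6
end

local usage = { input = 1200, output = 800 }
print(calc_token_cost("gpt-4o-mini", usage, pricing)) -- 0.00066
```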

## Installation

```
luarocks install lua-genai
```

## Usage

```lua
local AI = require("src.ai")
local genai = require("genai")

local client = genai.new("<YOUR_API_KEY>", "https://api.openai.com/v1/chat/completions")

local client = AI.new("<YOUR_API_KEY>", "https://api.openai.com/v1/chat/completions")
local chat = client:chat("gpt-4o-mini")
print(chat:say("Hello, world!"))
```

### Minimal
### System Prompt

```lua
local chat = client:chat("gpt-4o-mini")
print(chat:say("Hello, world!"))
local chat = client:chat("gpt-4o-mini", { system_prompt = "You are a fish." })
print(chat:say("What are you?"))
```

### Streaming

```lua
local chat = client:chat("gpt-4o-mini", { settings = { stream = true } })
chat:say("Hello, world!")
local process_stream = function(text)
	io.write(text)
	io.flush()
end

local chat = client:chat("gpt-4o-mini", { settings = { stream = process_stream } })
chat:say("Tell me a very short story.")
print()
```
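The stream handler above is simply a callback invoked once per text chunk. The contract can be sketched without any API call; the chunk list below is invented to stand in for what a provider would deliver.

```lua
-- Sketch of the per-chunk callback contract: the handler is called for
-- each text fragment while the full reply accumulates on the side.
local parts = {}
local process_stream = function(text)
	table.insert(parts, text)
	io.write(text)
	io.flush()
end

-- Simulated chunks as a provider might deliver them (invented data).
for _, chunk in ipairs({ "Once ", "upon ", "a time." }) do
	process_stream(chunk)
end
print()

local full_reply = table.concat(parts) -- "Once upon a time."
```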

### JSON
### JSON Response

```lua
local npc_schema = {
@@ -60,9 +76,9 @@ local chat = client:chat("gpt-4o-mini", { settings = { json = json_object } })
print(chat:say("Create a powerful wizard called Torben."))
```
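The `npc_schema` definition is truncated in the diff view above. A complete, self-contained shape mirroring the `title`/`description`/`schema` table from `example.lua` might look like the following; the fields beyond `name` are illustrative guesses, not the PR's actual schema.

```lua
-- Hypothetical completion of the truncated schema; only the
-- title/description/schema shape is taken from example.lua.
local npc_schema = {
	name = { type = "string" },
	class = { type = "string" },
	level = { type = "number" },
}

local json_object = {
	title = "NPC",
	description = "Response schema of NPCs.",
	schema = npc_schema,
}
print(json_object.title)
```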

See `main.lua` for a more detailed example.
See `example.lua` for a full-featured Anthropic implementation.

### Dependencies
## Dependencies

- [lua-cjson](https://github.com/openresty/lua-cjson)

50 changes: 50 additions & 0 deletions example.lua
@@ -0,0 +1,50 @@
local genai = require("genai")

local api_key = "<YOUR_API_KEY>"
local endpoint = "https://api.anthropic.com/v1/messages"
local model = "claude-3-5-sonnet-20241022"

local client = genai.new(api_key, endpoint)

local response_schema = {
	name = {
		type = "string",
	},
	response = {
		type = "string",
	},
}

local chat = client:chat(model, {
	system_prompt = "Respond extremely briefly.",
	settings = {
		json = {
			title = "NPC",
			description = "Response schema of NPCs.",
			schema = response_schema,
		},
		stream = function(text)
			io.write(text)
			io.flush()
		end,
	},
})

while true do
	local user_prompt = "You are King Torben giving advice."
	print(user_prompt)
	print()

	local reply = chat:say(user_prompt) -- API call

	if not chat.settings.stream then
		print(reply)
	else
		print()
	end
	print()
	break
end

local usd_token_cost = chat:get_cost()
print(usd_token_cost .. "usd")
59 changes: 0 additions & 59 deletions main.lua

This file was deleted.

35 changes: 35 additions & 0 deletions rockspecs/lua-genai-0.1-1.rockspec
@@ -0,0 +1,35 @@
package = "lua-genai"
version = "0.1-1"

source = {
	url = "https://github.com/emilrueh/lua-genai.git",
	tag = "v0.1",
}

description = {
	summary = "Generative AI SDK",
	detailed = "Interface for generative AI providers like OpenAI, Anthropic, Google Gemini, etc., abstracting away provider-specific payload structures and response parsing to simplify switching models.",
	homepage = "https://github.com/emilrueh/lua-genai",
	license = "Zlib",
}

dependencies = {
	"lua >= 5.1",
	"lua-cjson",
	"luasec",
}

build = {
	type = "builtin",
	-- copy_directories = { "docs" },
	modules = {
		["genai"] = "src/genai/init.lua",
		["genai.genai"] = "src/genai/genai.lua",
		["genai.utils"] = "src/genai/utils.lua",
		["genai.features"] = "src/genai/features/init.lua",
		["genai.features.chat"] = "src/genai/features/chat.lua",
		["genai.providers"] = "src/genai/providers/init.lua",
		["genai.providers.anthropic"] = "src/genai/providers/anthropic.lua",
		["genai.providers.openai"] = "src/genai/providers/openai.lua",
	},
}
6 changes: 0 additions & 6 deletions src/features.lua

This file was deleted.

File renamed without changes.
20 changes: 10 additions & 10 deletions src/features/chat.lua → src/genai/features/chat.lua
@@ -1,7 +1,7 @@
local utils = require("src.utils")
local utils = require("genai.utils")

---@class Chat Accumulating chat history and usage
---@field ai table
---@field client table
---@field model string
---@field settings table?
---@field usage table
@@ -10,21 +10,21 @@ local utils = require("src.utils")
local Chat = {}
Chat.__index = Chat

---@param ai table
---@param client table
---@param model string
---@param opts table? Containing **settings** and/or **system_prompt**
function Chat.new(ai, model, opts)
function Chat.new(client, model, opts)
	local self = setmetatable({}, Chat)

	self.ai = ai
	self.client = client
	self.model = model
	self.settings = opts and opts.settings or {}
	self.usage = { input = 0, output = 0 }
	self.history = {}
	self.system_prompt = opts and opts.system_prompt

	-- insert system prompt into chat history at the start if provided
	local system_message = self.ai.provider.construct_system_message(self.system_prompt)
	local system_message = self.client.provider.construct_system_message(self.system_prompt)
	if system_message then -- some providers use system message as top-level arg
		table.insert(self.history, system_message)
	end
@@ -36,9 +36,9 @@ end
---@param user_prompt string
---@return string reply Full response text whether streamed or not
function Chat:say(user_prompt)
	table.insert(self.history, self.ai.provider.construct_user_message(user_prompt))
	local reply, input_tokens, output_tokens = self.ai:call(self)
	table.insert(self.history, self.ai.provider.construct_assistant_message(reply))
	table.insert(self.history, self.client.provider.construct_user_message(user_prompt))
	local reply, input_tokens, output_tokens = self.client:call(self)
	table.insert(self.history, self.client.provider.construct_assistant_message(reply))
	self.usage.input = self.usage.input + input_tokens
	self.usage.output = self.usage.output + output_tokens
	return reply
@@ -47,7 +47,7 @@ end
---Calculate model pricing from input and output tokens in USD
---@return number
function Chat:get_cost()
	return utils.calc_token_cost(self.model, self.usage, self.ai.provider.pricing)
	return utils.calc_token_cost(self.model, self.usage, self.client.provider.pricing)
end

return Chat
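The `Chat` methods above delegate message construction to `construct_system_message`, `construct_user_message`, and `construct_assistant_message` on the client's provider. That contract can be sketched with a stub provider; the role/content message shape and all data below are illustrative, not taken from the PR's provider implementations.

```lua
-- Stub provider mirroring the construct_* hooks Chat calls above.
-- The { role = ..., content = ... } shape is an assumption for illustration.
local provider = {
	construct_system_message = function(text)
		if text then return { role = "system", content = text } end
	end,
	construct_user_message = function(text)
		return { role = "user", content = text }
	end,
	construct_assistant_message = function(text)
		return { role = "assistant", content = text }
	end,
}

-- History accumulates exactly as in Chat.new and Chat:say.
local history = {}
local system_message = provider.construct_system_message("You are a fish.")
if system_message then
	table.insert(history, system_message)
end
table.insert(history, provider.construct_user_message("What are you?"))
table.insert(history, provider.construct_assistant_message("Blub."))

print(#history) -- 3
```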
6 changes: 6 additions & 0 deletions src/genai/features/init.lua
@@ -0,0 +1,6 @@
---@module "genai.features"
local features = {}

features.Chat = require("genai.features.chat")

return features