RELEASE.md
# Releases
## 0.15.15 - Docker Compose
* Quick Start using Docker Compose for the Chatbot.
* Chatbot - Bug Fix: Remove token limit on responses. The `MAXTOKENS` setting is used to prune content sent to the LLM. If not set, no pruning will happen.
* Chatbot - Added additional LiteLLM support via the environment variables `LITELLM_PROXY` and `LITELLM_KEY`. If set, these override the OpenAI API settings to use LiteLLM and remove the `EXTRA_BODY` defaults that conflict with LiteLLM.
* LiteLLM - Added a Docker Compose file to start LiteLLM, PostgreSQL, and the Chatbot.
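The override behavior described in these notes can be sketched roughly as follows. This is a hypothetical illustration, not the actual TinyLLM code: `resolve_llm_settings` is an invented helper, and the `EXTRA_BODY` default shown is only an example of a setting LiteLLM rejects.

```python
import os

def resolve_llm_settings(env=os.environ):
    """Hypothetical sketch: pick the API endpoint and key, preferring
    LiteLLM when LITELLM_PROXY is set, and drop conflicting defaults."""
    settings = {
        "api_base": env.get("OPENAI_API_BASE", "https://api.openai.com/v1"),
        "api_key": env.get("OPENAI_API_KEY", ""),
        "extra_body": {"top_k": 40},  # example EXTRA_BODY default
    }
    if env.get("LITELLM_PROXY"):
        # LITELLM_PROXY/LITELLM_KEY override the OpenAI API settings
        settings["api_base"] = env["LITELLM_PROXY"]
        settings["api_key"] = env.get("LITELLM_KEY", settings["api_key"])
        settings["extra_body"] = None  # remove defaults that conflict with LiteLLM
    # MAXTOKENS prunes content sent to the LLM; unset means no pruning
    settings["max_context"] = int(env["MAXTOKENS"]) if env.get("MAXTOKENS") else None
    return settings
```

For example, with `LITELLM_PROXY` set the resolved `api_base` points at the proxy and no pruning limit is applied unless `MAXTOKENS` is also set.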
## 0.15.14 - Multi-model Support
* Chatbot - Add `/model` command to list available models and dynamically set models during the session.
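A `/model` command of this kind might be handled along these lines. This is a hypothetical sketch, not the Chatbot's actual implementation; `handle_model_command` and the `session` dict are invented for illustration.

```python
def handle_model_command(args, available, session):
    """Hypothetical /model handler: with no argument, list the available
    models; with a model name, switch the session's model dynamically."""
    if not args:
        return "Available models: " + ", ".join(sorted(available))
    if args[0] in available:
        session["model"] = args[0]  # subsequent requests use this model
        return f"Model set to {args[0]}"
    return f"Unknown model: {args[0]}"
```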
chatbot/README.md
Below are steps to get the Chatbot and Document Manager running.
The Chatbot can be launched as a Docker container or via command line.
### Method 1: Docker Compose
A quickstart method is located in the [litellm](./litellm/) folder. This setup launches the Chatbot, LiteLLM, and PostgreSQL together, and works on Mac and Linux (or WSL) systems.
```bash
cd litellm
# Edit compose.yaml and config.yaml for your setup.
nano compose.yaml
nano config.yaml
# Launch
docker compose up -d
```
The containers will download and launch. The database will be set up in the `./db` folder.
- The Chatbot will be available at http://localhost:5000
- The LiteLLM usage dashboard will be available at http://localhost:4000/ui
### Method 2: Docker
```bash
# Create placeholder prompts.json
# ...
```

```bash
docker run \
  -d \
  -p 5000:5000 \
  -e PORT=5000 \
  -e LITELLM_PROXY="http://localhost:4000/v1" \
  -e LITELLM_KEY="sk-mykey" \
  -e LLM_MODEL="local-pixtral" \
  -e TZ="America/Los_Angeles" \
  -v $PWD/.tinyllm:/app/.tinyllm \
  # ...
```
The Chatbot will try to use the specified model (`LLM_MODEL`) but if it is not a...
This folder contains a docker-compose file that will start a TinyLLM Chatbot with a LiteLLM proxy and a PostgreSQL database. The Chatbot will connect to the LiteLLM proxy to access the models. The LiteLLM proxy will connect to the PostgreSQL database to store usage data.
## Instructions
1. Edit the config.yaml file to add your models and settings.
2. Edit the compose.yaml file to adjust the environment variables in the services as needed.
3. Run `docker compose up -d` to start the services.
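As a rough illustration of how the three services fit together, a compose file for this setup might look like the sketch below. This is a hypothetical fragment, not the compose.yaml shipped in this folder; the image names and password are placeholders you would replace with your own.

```yaml
# Hypothetical sketch of the service layout; images and credentials
# are placeholders, not the values in the actual compose.yaml.
services:
  db:
    image: postgres:16                      # placeholder tag
    environment:
      POSTGRES_PASSWORD: changeme           # placeholder credential
    volumes:
      - ./db:/var/lib/postgresql/data       # database set up in ./db

  litellm:
    image: litellm-proxy-image              # placeholder image name
    depends_on: [db]
    ports:
      - "4000:4000"                         # proxy and usage dashboard
    volumes:
      - ./config.yaml:/app/config.yaml      # your models and settings

  chatbot:
    image: tinyllm-chatbot-image            # placeholder image name
    depends_on: [litellm]
    ports:
      - "5000:5000"                         # Chatbot web UI
    environment:
      LITELLM_PROXY: "http://litellm:4000/v1"
```

The key design point is that the Chatbot reaches LiteLLM by its service name (`litellm`) on the internal compose network, while the published ports expose the UIs on localhost.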
The containers will download and launch. The database will be set up in the `./db` folder.
- The Chatbot will be available at http://localhost:5000
- The LiteLLM proxy will be available at http://localhost:4000/ui
- The PostgreSQL pgAdmin interface will be available at http://localhost:5050