Open WebUI - Integration #97
aarseneau-idexx started this conversation in General
Hi,
Just wanted to post this here in case anyone uses Open WebUI and wants to use their server as the back-end for Elia.
For reference, Open WebUI has two APIs: the Ollama API Proxy and the Standard API. Ignore the Ollama API Proxy; it only forwards requests to a local Ollama server (if one is configured). Instead, we want to use the Standard API endpoints for Open WebUI itself. This lets you use Open WebUI as a proxy for both internally and externally hosted LLMs.
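As a quick sanity check that the Standard API works end to end, here is a minimal sketch (Python, using `requests`) of a chat completion request against Open WebUI's OpenAI-compatible endpoint. The `/api/chat/completions` path comes from the Open WebUI API docs linked below; the placeholder values and the OpenAI-style response shape are assumptions you should adapt to your own instance.

```python
# Minimal sketch: a chat completion through Open WebUI's Standard
# (OpenAI-compatible) API. Endpoint path per the Open WebUI docs;
# substitute your own <open-webui-url>, <api-token>, and <model_name>.
import requests

OPEN_WEBUI_URL = "https://<open-webui-url>"  # your Open WebUI instance
API_TOKEN = "<api-token>"                    # your Open WebUI API token
MODEL_NAME = "<model_name>"                  # full model name (see below)

response = requests.post(
    f"{OPEN_WEBUI_URL}/api/chat/completions",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": "Hello from Elia!"}],
    },
    timeout=60,
)
response.raise_for_status()
# Assumes an OpenAI-style response body.
print(response.json()["choices"][0]["message"]["content"])
```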
To point Elia at Open WebUI you need three values:

- `<model_name>`: the full name of the model. If you aren't sure what to put here, select the model in the web UI and copy the URL; it will contain a query parameter `model=<model_name>`, and the `<model_name>` part is what you need. Alternatively, curl the API to get the list of models (see https://docs.openwebui.com/getting-started/api-endpoints#-list-available-models, or the sketch after this list).
- `<open-webui-url>`: the http/https URL of your Open WebUI instance.
- `<api-token>`: the API token for your Open WebUI user; see https://docs.openwebui.com/getting-started/api-endpoints/#authentication for how to generate one.
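If you'd rather script the model lookup than copy it out of the web UI, here is a minimal sketch of the model-list call mentioned above. It assumes the `GET /api/models` endpoint from the linked docs and an OpenAI-style response (a `data` array of objects with an `id` field); adjust if your Open WebUI version differs.

```python
# Minimal sketch: fetch the model list from Open WebUI to find the
# exact <model_name> string. GET /api/models per the docs link above.
import requests

OPEN_WEBUI_URL = "https://<open-webui-url>"
API_TOKEN = "<api-token>"

resp = requests.get(
    f"{OPEN_WEBUI_URL}/api/models",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])  # use one of these ids as <model_name>
```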