(Docs) LiteLLM container networking config not working as documented #5638
jordantgh asked this question in Troubleshooting (Unanswered)
Replies: 2 comments
-
Jordan, check out this Discord chat: https://discord.com/channels/1086345563026489514/1335943424850788422
-
Jordan, check out this Discord chat: https://discord.com/channels/1086345563026489514/1335943424850788422
-
What happened?
I spent a while trying to follow the docs here:
https://www.librechat.ai/blog/2023-11-30_litellm
I expected to be able to chat with models listed in litellm-config.yaml, but was unable to. In the LibreChat error logs, the error was as shown below.
I established that the LiteLLM container itself was working fine with a few `curl` requests. On Claude's suggestion, I changed the documented `baseURL: "host.docker.internal:4000"` to `baseURL: "http://litellm:8000"` in `librechat.yaml`. That worked! Everything seems fine now; I even have Langfuse monitoring. However, since I spent a long time on this today, I'm wondering if it is worth updating the docs. Do you think the issue I experienced is because the docs are outdated, a bespoke issue on my side, or even just that `http://litellm:8000` is the safer option anyway? The working config is sketched below.
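For reference, here is a minimal sketch of the `librechat.yaml` block that worked for me; the endpoint name, API key, and model names are placeholders from my setup, so adjust them to match your `litellm-config.yaml`:

```yaml
# librechat.yaml (sketch) - custom endpoint pointing at the LiteLLM container
endpoints:
  custom:
    - name: "LiteLLM"
      apiKey: "sk-1234"                # placeholder; must match the key LiteLLM expects
      baseURL: "http://litellm:8000"   # compose service name + container port, not host.docker.internal
      models:
        default: ["gpt-3.5-turbo"]     # placeholder; list the models defined in litellm-config.yaml
        fetch: true
      titleConvo: true
      titleModel: "gpt-3.5-turbo"
```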
Steps to Reproduce
1. `docker compose up` in the LibreChat repo dir (see the compose sketch after these steps)
2. Navigate to the LiteLLM endpoint and try to interact with the default model
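For context, this is roughly the compose setup I was running when reproducing; the image tag, config path, and port are assumptions from my environment, and the service name is what makes `http://litellm:8000` resolvable from the LibreChat container:

```yaml
# docker-compose.override.yml (sketch) - LiteLLM on the same compose network as LibreChat
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    volumes:
      - ./litellm/litellm-config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "8000"]
```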
Relevant log output
Code of Conduct