
Support for Ollama #16

Open
ClarkKentIsSuperman opened this issue Jan 30, 2025 · 1 comment
@ClarkKentIsSuperman

I'd like to use this against an Ollama model, which usually follows the OpenAI formats/spec. Is the only current way to do this to modify the OpenAI implementation to point at localhost (plus any other minor changes) and then rebuild? Roughly the change I have in mind is sketched below.
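
Something like this, using the official `openai` Python client pointed at Ollama's OpenAI-compatible `/v1` endpoint (the model name is just an example, not something this repo ships):

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API at /v1. The client requires an
# api_key argument, but Ollama ignores it, so any placeholder works.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # assumes `ollama pull llama3` has been run locally
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```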

Also wondering if anyone has already done this, or if it's in the works, so I don't duplicate work.

@jplhughes
Owner

This seems like the easiest approach. You might run into problems with how it keeps track of rate limits, costs, and context length; you'll either have to bypass that or add a way to get it via Ollama. Alternatively, you could just create a separate class that makes requests to the localhost URL (rough sketch below). No one else is working on this AFAIK, so feel free to pick it up.
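
The separate-class route could look something like this. To be clear, this is a sketch against Ollama's native API, not code that exists in this repo; the class name and the context-length lookup are hypothetical:

```python
import requests


class OllamaModel:
    """Minimal standalone client for Ollama's native API (sketch only)."""

    def __init__(self, model: str, base_url: str = "http://localhost:11434"):
        self.model = model
        self.base_url = base_url

    def chat(self, messages: list[dict]) -> str:
        # Native chat endpoint; stream=False returns a single JSON object.
        resp = requests.post(
            f"{self.base_url}/api/chat",
            json={"model": self.model, "messages": messages, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["message"]["content"]

    def context_length(self) -> int | None:
        # /api/show returns model metadata (the request field was "name" in
        # older Ollama versions). The exact key varies by model family,
        # e.g. "llama.context_length", hence the suffix scan.
        resp = requests.post(f"{self.base_url}/api/show", json={"model": self.model})
        resp.raise_for_status()
        for key, value in resp.json().get("model_info", {}).items():
            if key.endswith("context_length"):
                return value
        return None
```

Since local inference has no per-token cost or provider rate limits, the cost and rate-limit tracking could plausibly be stubbed out rather than fetched from Ollama.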
