```python
ollama_host = "https://ollama.example.com"  # https://github.com/kagisearch/pyllms/blob/main/README.md?plain=1#L192
models_list = ["tinyllama:latest"]
models = llms.init(model=models_list, ollama_host=ollama_host)
```

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[13], line 1
----> 1 models = llms.init(model=models_list, ollama_host=ollama_host)

File ~/Lab1/others/venv-jupyter/lib64/python3.12/site-packages/llms/__init__.py:8, in init(*args, **kwargs)
      4 if len(args) > 1 and not kwargs.get('model'):
      5     raise ValueError(
      6         "Please provide a list of models, like this: model=['j2-grande-instruct', 'claude-v1', 'gpt-3.5-turbo']"
      7     )
----> 8 return LLMS(*args, **kwargs)

File ~/Lab1/others/venv-jupyter/lib64/python3.12/site-packages/llms/llms.py:92, in LLMS.__init__(self, model, **kwargs)
     90 self._load_api_keys(kwargs)
     91 self._set_models(model)
---> 92 self._initialize_providers(kwargs)

File ~/Lab1/others/venv-jupyter/lib64/python3.12/site-packages/llms/llms.py:904, in LLMS._initialize_providers(self, kwargs)
    896 self._providers = [
    897     provider.provider(model=single_model, **({**kwargs, 'api_key': provider.api_key} if provider.needs_api_key else kwargs))
    898     for single_model in self._models
    899     for provider in self._provider_map.values()
    900     if self._validate_model(single_model, provider)
    901 ]
    903 if not self._providers:
--> 904     raise ValueError("No valid providers found for the specified models")
    906 for provider in self._providers:
    907     LOGGER.info(
    908         f"Initialized {provider.model} with {provider.__class__.__name__}"
    909     )

ValueError: No valid providers found for the specified models
```
However, the Ollama provider implementation suggests this usage should be supported: https://github.com/kagisearch/pyllms/blob/main/llms/providers/ollama.py#L49
Tested with pyllms 0.7.1 (current release on PyPI) and ollama 0.5.7.
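
For convenience, here is a standalone reproduction script. The host URL and model tag are placeholders taken from the snippet above; it assumes an Ollama instance is reachable at that URL.

```python
# Standalone reproduction (pyllms 0.7.1 installed from PyPI).
# The host URL and model tag are placeholders; substitute your own Ollama host.
import llms

ollama_host = "https://ollama.example.com"
models_list = ["tinyllama:latest"]

# Expected: an initialized Ollama-backed model.
# Actual:   ValueError: No valid providers found for the specified models
models = llms.init(model=models_list, ollama_host=ollama_host)
```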