fix: auto-enable Ollama provider when configured via environment variables #1957
Summary
This PR fixes issue #1881 where the Ollama local AI provider was not appearing in the UI despite being correctly configured in the
`.env` file.

Problem

Local providers configured through `.env` variables remained disabled in the UI until the user enabled them manually in settings, so on a fresh installation Ollama never appeared even though it was fully configured.
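As an illustration (using Ollama's usual default address, an assumption here rather than something stated in the PR), the bug meant an entry like `OLLAMA_API_BASE_URL=http://127.0.0.1:11434` in `.env` had no visible effect on the provider list.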
Solution
The fix implements automatic detection and enablement of environment-configured local providers.
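A minimal sketch of the detection step, assuming a Remix-style loader on a Node runtime; the provider keys, the `PROVIDER_ENV_VARS` map, and the trimmed-down `ModelsResponse` shape are illustrative assumptions rather than the exact PR code, while the three environment variable names come from the change itself:

```ts
// Sketch only: detect which local providers are configured via env vars.
import { json } from '@remix-run/node';

// Env vars that mark a local provider as configured (names from this PR);
// the provider keys are assumed labels, not necessarily the app's exact IDs.
const PROVIDER_ENV_VARS: Record<string, string> = {
  Ollama: 'OLLAMA_API_BASE_URL',
  LMStudio: 'LMSTUDIO_API_BASE_URL',
  OpenAILike: 'OPENAI_LIKE_API_BASE_URL',
};

// Trimmed-down response shape; the real interface also carries the model list.
export interface ModelsResponse {
  configuredProviders: string[];
}

export async function loader() {
  // A provider counts as configured when its base-URL variable is non-empty.
  const configuredProviders = Object.entries(PROVIDER_ENV_VARS)
    .filter(([, envVar]) => Boolean(process.env[envVar]))
    .map(([provider]) => provider);

  return json<ModelsResponse>({ configuredProviders });
}
```

The UI can then treat `configuredProviders` as a signal to enable those providers by default, as the changes below describe.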
Changes Made
- `/app/routes/_index.tsx`: reads the detected providers from the route data via `useLoaderData`
- `/app/routes/api.models.ts`: extended the `ModelsResponse` interface to include a `configuredProviders` array, populated by checking `OLLAMA_API_BASE_URL`, `LMSTUDIO_API_BASE_URL`, and `OPENAI_LIKE_API_BASE_URL`
- `/app/lib/stores/settings.ts`: auto-enables the detected providers while preserving any previously saved user settings
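A sketch of the enable step on the client, again hedged: `useLoaderData` is Remix's hook, but the hook name, the `provider_settings` localStorage key, and the helper functions below are hypothetical stand-ins for whatever `/app/lib/stores/settings.ts` actually does. The key point is that auto-enablement only runs when the user has no saved settings:

```ts
// Sketch only: auto-enable env-configured providers without clobbering
// the user's saved preferences.
import { useEffect } from 'react';
import { useLoaderData } from '@remix-run/react';

// Mirrors the loader's response shape from the sketch above.
interface ModelsResponse {
  configuredProviders: string[];
}

// Hypothetical persistence helpers; the real settings store differs.
function hasSavedProviderSettings(): boolean {
  return localStorage.getItem('provider_settings') !== null;
}

function enableProviders(names: string[]): void {
  const settings: Record<string, { enabled: boolean }> = {};
  for (const name of names) {
    settings[name] = { enabled: true };
  }
  localStorage.setItem('provider_settings', JSON.stringify(settings));
}

export function useAutoEnableConfiguredProviders() {
  const { configuredProviders } = useLoaderData<ModelsResponse>();

  useEffect(() => {
    // Fresh install only: saved user choices always win.
    if (hasSavedProviderSettings()) {
      return;
    }
    enableProviders(configuredProviders);
  }, [configuredProviders]);
}
```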
Testing

✅ Tested with fresh installation - Ollama appears immediately when configured
✅ Tested with existing saved settings - User preferences preserved
✅ Tested with multiple local providers configured
✅ Build and lint checks pass
Benefits
Local providers configured in `.env` are now enabled and visible immediately, with no manual settings step required.

Fixes #1881
Documentation
Comprehensive documentation has been added in `/docs/OLLAMA_AUTO_ENABLE_FIX.md`, explaining the issue, solution, and implementation details.