
Conversation


@embire2 embire2 commented Sep 2, 2025

Summary

This PR fixes issue #1881 where the Ollama local AI provider was not appearing in the UI despite being correctly configured in the .env file.

Problem

  • Local providers (Ollama, LMStudio, OpenAILike) were disabled by default in the UI
  • No mechanism existed to detect environment-configured providers
  • Users had to manually enable Ollama even when properly configured via .env

Solution

The fix implements automatic detection and enablement of environment-configured local providers:

  1. Server-side detection: Modified API endpoints to detect which local providers are configured via environment variables
  2. Client-side auto-enable: Automatically enables detected providers on first load (when no saved settings exist)
  3. Preserves user choice: Respects manual configuration if user has already saved provider settings
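The detection step can be sketched as a small helper. This is a minimal illustration of the approach described above, not the PR's actual code; the function name and the exact shape of the env map are assumptions, though the three environment variable names come from the PR description:

```typescript
// Map each local provider to the env var that configures it
// (variable names taken from the PR description).
const LOCAL_PROVIDER_ENV_VARS: Record<string, string> = {
  Ollama: 'OLLAMA_API_BASE_URL',
  LMStudio: 'LMSTUDIO_API_BASE_URL',
  OpenAILike: 'OPENAI_LIKE_API_BASE_URL',
};

// Hypothetical helper: return the names of providers whose base URL
// is set to a non-empty value in the environment.
function detectConfiguredProviders(
  env: Record<string, string | undefined>,
): string[] {
  return Object.entries(LOCAL_PROVIDER_ENV_VARS)
    .filter(([, envVar]) => Boolean(env[envVar]?.trim()))
    .map(([provider]) => provider);
}
```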

Changes Made

/app/routes/_index.tsx

  • Added loader to detect configured providers from environment
  • Implements auto-enable logic in the Index component using useLoaderData
  • Only enables providers if no saved settings exist (first-time users)
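The first-load guard described above can be sketched as a pure function. The type and function names here are illustrative, not the repository's actual identifiers:

```typescript
type ProviderSettings = Record<string, { enabled: boolean }>;

// Hypothetical sketch of the auto-enable guard: saved settings always win;
// only a first-time load (no saved settings) triggers auto-enable.
function applyAutoEnable(
  saved: ProviderSettings | null, // settings persisted by the user, if any
  configuredProviders: string[], // provider names reported by the loader
): ProviderSettings {
  // Respect user choice: if settings were already saved, leave them untouched.
  if (saved !== null) {
    return saved;
  }

  // First-time load: enable every environment-configured provider.
  return Object.fromEntries(
    configuredProviders.map((name) => [name, { enabled: true }]),
  );
}
```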

/app/routes/api.models.ts

  • Extended ModelsResponse interface to include configuredProviders array
  • Added detection logic for OLLAMA_API_BASE_URL, LMSTUDIO_API_BASE_URL, and OPENAI_LIKE_API_BASE_URL
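A sketch of the extended response shape: only the `configuredProviders` field and the three env var names are taken from the PR text; the model fields and the helper name are assumptions for illustration:

```typescript
interface ModelInfo {
  name: string;
  provider: string;
}

// ModelsResponse extended with the list of env-configured providers.
interface ModelsResponse {
  models: ModelInfo[];
  configuredProviders: string[];
}

// Hypothetical helper: attach the detected providers to the models payload.
function buildModelsResponse(
  models: ModelInfo[],
  env: Record<string, string | undefined>,
): ModelsResponse {
  const configuredProviders = (
    [
      ['Ollama', env.OLLAMA_API_BASE_URL],
      ['LMStudio', env.LMSTUDIO_API_BASE_URL],
      ['OpenAILike', env.OPENAI_LIKE_API_BASE_URL],
    ] as const
  )
    .filter(([, url]) => Boolean(url))
    .map(([name]) => name);

  return { models, configuredProviders };
}
```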

/app/lib/stores/settings.ts

  • Cleaned up provider initialization logic
  • Removed problematic async fetch call
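A synchronous initialization along these lines avoids the removed async fetch; the provider list and function name below are assumptions, not the repository's actual store code:

```typescript
const LOCAL_PROVIDERS = ['Ollama', 'LMStudio', 'OpenAILike'] as const;

// Hypothetical sketch: start all local providers disabled. The route loader
// flips the configured ones after the server reports them, so no network
// call runs during store setup.
function initialProviderState(): Record<string, { enabled: boolean }> {
  return Object.fromEntries(
    LOCAL_PROVIDERS.map((name) => [name, { enabled: false }]),
  );
}
```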

Testing

✅ Tested with fresh installation - Ollama appears immediately when configured
✅ Tested with existing saved settings - User preferences preserved
✅ Tested with multiple local providers configured
✅ Build and lint checks pass

Benefits

  • Zero-configuration experience: Ollama appears immediately when configured via .env
  • Backward compatible: Existing users with saved settings are unaffected
  • Consistent: Works the same for all local providers (Ollama, LMStudio, OpenAILike)
  • User-friendly: Eliminates confusion for new users setting up local AI providers

Fixes

Fixes #1881

Documentation

Comprehensive documentation has been added in /docs/OLLAMA_AUTO_ENABLE_FIX.md explaining the issue, solution, and implementation details.

…ables

Fixes stackblitz-labs#1881 - Ollama provider not appearing in UI despite correct configuration

Problem:
- Local providers (Ollama, LMStudio, OpenAILike) were disabled by default
- No mechanism to detect environment-configured providers
- Users had to manually enable Ollama even when properly configured

Solution:
- Server detects environment-configured providers and reports to client
- Client auto-enables configured providers on first load
- Preserves user preferences if manually configured

Changes:
- Modified _index.tsx loader to detect configured providers
- Extended api.models.ts to include configuredProviders in response
- Added auto-enable logic in Index component
- Cleaned up provider initialization in settings store

This ensures zero-configuration experience for Ollama users while
respecting manual configuration choices.
@Stijnus
Collaborator

Stijnus commented Sep 3, 2025

I will put this on hold as I'm going to rewrite the local providers logic.

@embire2
Author

embire2 commented Sep 6, 2025

> I will put this on hold as I'm going to rewrite the local providers logic.

No problem, Stijnus. FYI, I applied to become an official contributor; would you mind checking on my application? The best way to reach me is via email: ceo@openweb.email

Have a great day
Keoma

@Stijnus
Collaborator

Stijnus commented Sep 6, 2025

I have a new PR ready with some of your improvements.

@Stijnus Stijnus closed this Sep 6, 2025

