
Conversation

@caxu-rh caxu-rh commented Oct 28, 2025

Description

This change makes it possible to use Llama Stack as the LLM provider by setting the LLM_PROVIDER environment variable to llamastack. The default provider remains openai.
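The selection logic can be sketched as follows. This is a minimal illustration, not the PR's actual code: only the LLM_PROVIDER variable, the llamastack value, and the openai default come from the description; the variable names are hypothetical.

```python
import os

# Sketch: read the provider from the environment, falling back to
# "openai" when LLM_PROVIDER is unset (the PR's documented default).
provider = os.environ.get("LLM_PROVIDER", "openai")

# "llamastack" routes queries to Llama Stack; anything else keeps
# the existing OpenAI path.
use_llamastack = provider == "llamastack"
```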

How was this tested / what tests were added?

  • Manual: confirmed that setting LLM_PROVIDER=llamastack routes queries to Llama Stack.

Screenshots or recordings (if applicable)

Documentation updates

  • Changes are self-documenting
  • Inline docs added
  • Formal docs added

@caxu-rh caxu-rh force-pushed the feat/api-llamastack-support branch 4 times, most recently from a3b0df5 to 78d8460 Compare October 28, 2025 16:59
@caxu-rh caxu-rh force-pushed the feat/api-llamastack-support branch from 78d8460 to d6fac71 Compare November 12, 2025 16:02
@sauagarwa sauagarwa merged commit f46c833 into rh-ai-quickstart:main Nov 13, 2025
3 checks passed
@yashoza19
🎉 This PR is included in version 2.2.0 🎉

The release is available on the GitHub releases page

Your semantic-release bot 📦🚀
