Problem
While setting up Sugar-AI locally, I noticed that `app/ai.py` only
supports HuggingFace models through the `RAGAgent` class. There's no
way for users to plug in their own OpenAI, Anthropic, or Ollama models.
This means that if a school has an OpenAI API key and wants to use it
with Sugar-AI, they currently can't: passing `openai/gpt-4` to the
`/change-model` endpoint would crash the app, since it tries to load
the string as a HuggingFace model.
Proposed Solution
Add provider detection in `app/ai.py`, specifically in the `__init__`
and `set_model` methods of `RAGAgent`: check the model string's
prefix and route to the matching client.
That way `run` and `run_chat_completion` would behave the same
regardless of provider.
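A minimal sketch of what the prefix check could look like. The `detect_provider` helper, the prefix list, and the provider names are hypothetical illustrations, not existing Sugar-AI code:

```python
# Hypothetical sketch: map a model string to a provider by its prefix.
# The prefixes and provider names here are assumptions for illustration.

PROVIDER_PREFIXES = {
    "openai/": "openai",
    "anthropic/": "anthropic",
    "ollama/": "ollama",
}

def detect_provider(model: str) -> str:
    """Return the provider for a model string like 'openai/gpt-4'."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    # Anything else (including 'org/model' HF repo ids) falls back
    # to the current HuggingFace loading path.
    return "huggingface"
```

`set_model` could then branch on the detected provider: strip the prefix and hand the rest to the matching client, or keep the existing HuggingFace path as the default.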