6 changes: 3 additions & 3 deletions README.md
@@ -11,7 +11,7 @@ Sugar-AI provides a Docker-based deployment option for an isolated and reproduci
Open your terminal in the project's root directory and run:

```diff
-docker build -t sugar-ai .
+docker build -t sugar-ai
```

**Review comment** (Member): Why remove the dot at the end? The argument is needed and you can confirm this by running with and without it.

### Run the Docker container
@@ -69,7 +69,7 @@ Sugar-AI provides three different endpoints for different use cases:
| Endpoint | Purpose | Input | Key Features |
|----------|---------|--------------|----------|
| `/ask` | RAG-enabled answers | Query parameter | • Retrieval-Augmented Generation<br>• Sugar/Pygame/GTK documentation<br>• Child-friendly responses |
| `/ask-llm` | Direct LLM without RAG | Query parameter | • No document retrieval<br>• Direct model access<br>• Faster responses<br>• Default system prompt and parameters |
| `/ask-llm-prompted` (prompted mode, default) | Custom prompt with advanced controls | JSON body | • Custom system prompts<br>• Configurable model parameters |
| `/ask-llm-prompted` (`chat=True`) | Accepts chat history with system prompt | JSON body | • Send chat history along with system prompt<br>• Configurable model parameters |
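To make the table concrete, here is a minimal client-side sketch of how the three endpoints might be addressed. The base URL and the JSON field names (`question`, `system_prompt`, `parameters`) are assumptions for illustration, not the confirmed Sugar-AI schema; only URL and payload construction is shown, and no request is actually sent.

```python
# Sketch of client-side request construction for the Sugar-AI endpoints.
# BASE_URL and all JSON field names are assumptions -- verify against the
# actual API schema before use.
import json
from urllib.parse import urlencode

BASE_URL = "http://localhost:8000"  # assumed local deployment address


def ask_url(question: str, rag: bool = True) -> str:
    """GET-style endpoints (/ask, /ask-llm) take the question as a
    query parameter; rag=False selects the direct-LLM endpoint."""
    endpoint = "/ask" if rag else "/ask-llm"
    return f"{BASE_URL}{endpoint}?{urlencode({'question': question})}"


def prompted_body(question: str, system_prompt: str, **params) -> str:
    """/ask-llm-prompted takes a JSON body carrying a custom system
    prompt and configurable generation parameters."""
    return json.dumps({
        "question": question,
        "system_prompt": system_prompt,
        "parameters": params,
    })


url = ask_url("How do I draw a circle in Pygame?")
body = prompted_body(
    "Explain GTK signals",
    "You are a patient tutor for children.",
    temperature=0.3,
)
```

The split mirrors the table: query-parameter endpoints only need URL encoding, while the prompted endpoint bundles its controls into one JSON body.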

- **GET endpoint**
@@ -233,7 +233,7 @@ Sugar-AI provides three different endpoints for different use cases:

**Use Cases:**
Prompted Mode: Different activities can now use different system prompts and different generation parameters to achieve a model that is personalized to that activities needs.
**Review comment** (@chimosky, Mar 25, 2026): The second activities should be "activity's needs" as it's referring to just one activity.

Chat Mode: Activities can also use chat mode to pass chat history as context to the LLM, which is better suited for conversational-style features.
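The chat-mode use case might be exercised with a request body like the following sketch. The field names (`chat`, `chat_history`, `role`, `content`) are hypothetical illustrations, not the confirmed endpoint schema, and should be checked against the actual `/ask-llm-prompted` documentation.

```python
# Hedged sketch of a chat-mode request body: prior turns plus a system
# prompt. All field names are assumptions, not the verified schema.
import json


def chat_body(history, system_prompt, **params):
    """Build an assumed JSON body for /ask-llm-prompted with chat=True,
    sending the conversation history alongside the system prompt."""
    return json.dumps({
        "chat": True,
        "system_prompt": system_prompt,
        "chat_history": history,
        "parameters": params,
    })


chat_request = chat_body(
    [
        {"role": "user", "content": "What is a sprite?"},
        {"role": "assistant", "content": "A sprite is a 2D image you can move around the screen."},
        {"role": "user", "content": "How do I move one?"},
    ],
    "You are a friendly Pygame helper for kids.",
    temperature=0.5,
)
```

Carrying the history in the body lets the model resolve follow-up questions ("How do I move one?") against earlier turns, which is the point of the conversational mode.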

**Generation Parameter Guidelines:**