diff --git a/README.md b/README.md
index 2fc373f..499a1a1 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@ Sugar-AI provides a Docker-based deployment option for an isolated and reproduci
Open your terminal in the project's root directory and run:
```sh
docker build -t sugar-ai .
```
### Run the Docker container
@@ -69,7 +69,7 @@ Sugar-AI provides three different endpoints for different use cases:
|----------|---------|--------------|----------|
| `/ask` | RAG-enabled answers | Query parameter | • Retrieval-Augmented Generation<br>• Sugar/Pygame/GTK documentation<br>• Child-friendly responses |
| `/ask-llm` | Direct LLM without RAG | Query parameter | • No document retrieval<br>• Direct model access<br>• Faster responses<br>• Default system prompt and parameters |
-| `/ask-llm-prompted(promoted mode[default])` | Custom prompt with advanced controls | JSON body | • Custom system prompts<br>• Configurable model parameters |
+| `/ask-llm-prompted` (prompted mode, default) | Custom prompt with advanced controls | JSON body | • Custom system prompts<br>• Configurable model parameters |
| `/ask-llm-prompted` (`chat=True`) | Accepts chat history with system prompt | JSON body | • Send chat history along with the system prompt<br>• Configurable model parameters |
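As a rough sketch of how a client might call these endpoints — the base URL, port, and JSON field names below are assumptions for illustration, not confirmed by this README:

```python
import json
from urllib.parse import urlencode

# Assumed base URL -- adjust to wherever your Sugar-AI server is listening.
BASE = "http://localhost:8000"

# /ask and /ask-llm take the question as a query parameter.
ask_url = f"{BASE}/ask?{urlencode({'question': 'How do I draw a circle in Pygame?'})}"

# /ask-llm-prompted takes a JSON body; the field names here are
# illustrative assumptions, not confirmed by this README.
payload = {
    "question": "How do I draw a circle in Pygame?",
    "system_prompt": "You are a patient coding mentor for children.",
    "temperature": 0.7,
}
body = json.dumps(payload)

print(ask_url)
print(body)
```

The GET endpoints can then be fetched with any HTTP client, while the prompted endpoint expects `body` as a POST payload with a `Content-Type: application/json` header.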
- **GET endpoint**
@@ -233,7 +233,7 @@ Sugar-AI provides three different endpoints for different use cases:
```
**Use Cases:**
- Prompted Mode: Different activites can now use different system prompts and different generation parameters to achieve a model that is personalized to that activites needs.
+ Prompted Mode: Different activities can now use different system prompts and generation parameters to obtain a model personalized to each activity's needs.
  Chat Mode: Activities can also use chat mode to pass chat history to the LLM, which is better suited for conversational features.
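For chat mode, the request body would additionally carry the message history. A minimal sketch of such a payload — the field names (`chat`, `messages`, `role`, `content`) are assumptions, not confirmed by this README:

```python
import json

# Illustrative chat-mode payload; field names are assumed, not confirmed.
chat_payload = {
    "chat": True,
    "system_prompt": "You are a patient coding mentor for children.",
    "messages": [
        {"role": "user", "content": "What is a sprite?"},
        {"role": "assistant", "content": "A sprite is a 2D image your program can move around."},
        {"role": "user", "content": "How do I move one with the arrow keys?"},
    ],
    "temperature": 0.7,
}
print(json.dumps(chat_payload, indent=2))
```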
**Generation Parameter Guidelines:**