# LLM Chatbot

The LLM Chatbot example demonstrates how an ICP smart contract can be used to interact with a large language model (LLM) to generate text. The user can input a prompt, and the smart contract will use the LLM to generate a response.

The response is then returned to the user, and the user can submit follow-up prompts to continue the conversation.

This application's logic is written in [Rust](https://internetcomputer.org/docs/building-apps/developer-tools/cdks/rust/intro-to-rust), a primary programming language for developing canisters on ICP.

## Deploying from ICP Ninja
When viewing this project in ICP Ninja, you can deploy it directly to the mainnet for free by clicking "Deploy" in the upper right corner. To open this project in ICP Ninja, click [here](https://icp.ninja/i?g=https://github.com/dfinity/examples/tree/master/rust/llm_chatbot).

The `/frontend` folder contains web assets for the application's user interface. The user interface is written using the React framework.

## Build and deploy from the command-line

To migrate your ICP Ninja project off of the web browser and develop it locally, follow these steps.

### 1. [Download and install the IC SDK.](https://internetcomputer.org/docs/building-apps/getting-started/install)
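The linked page walks through installation in detail. On most systems it boils down to the official install script shown below; treat this as a sketch and defer to the linked instructions if they differ.

```
# Install dfx (the IC SDK) using the official install script.
sh -ci "$(curl -fsSL https://internetcomputer.org/install.sh)"
```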
### 2. Download your project from ICP Ninja using the 'Download files' button on the upper left corner (under the pink ninja star icon), or [clone the GitHub examples repository.](https://github.com/dfinity/examples/)

### 3. Navigate into the project's directory.
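If you take the repository route, steps 2 and 3 together look like this; the path matches this example's location in the examples repository.

```
# Clone the examples repository and enter this example's directory.
git clone https://github.com/dfinity/examples.git
cd examples/rust/llm_chatbot
```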
### 4. Set up Ollama
To be able to test the agent locally, you'll need a server for processing the agent's prompts. For that, we'll use `ollama`, which is a tool that can download and serve LLMs.

See the documentation on the [Ollama website](https://ollama.com/) to install it. Once it's installed, run:
```
ollama run llama3.1:8b
```

The above command will download an 8B parameter model, which is around 4GiB. Once the command executes and the model is loaded, you can terminate it. You won't need to do this step again.
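Note that when you test the chatbot against a local deployment later on, the local setup generally expects an Ollama server to be reachable (Ollama listens on port 11434 by default). Keeping one running in a separate terminal is typically just:

```
# Start a local Ollama server (listens on port 11434 by default).
ollama serve
```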
### 5. Deploy the project to your local environment:
```
dfx start --background --clean && dfx deploy
```
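When `dfx deploy` finishes, it prints URLs for the deployed canisters. As a quick smoke test, you can also call the backend from the terminal; the canister and method names below are hypothetical, so check this project's `dfx.json` and generated Candid interface for the real ones.

```
# Hypothetical smoke test -- substitute the real canister and method names
# from dfx.json and the backend's .did file.
dfx canister call backend prompt '("What is the Internet Computer?")'
```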
## Security considerations and best practices
If you base your application on this example, it is recommended that you familiarize yourself with and adhere to the [security best practices](https://internetcomputer.org/docs/building-apps/security/overview) for developing on ICP. This example may not implement all the best practices.