Commit c29064a

Merge pull request #1205 from dfinity/jessiemongeon1-patch-36
Standardize README
2 parents: 32c5092 + 8c29683

1 file changed: 16 additions, 14 deletions

rust/llm_chatbot/README.md
````diff
@@ -3,31 +3,25 @@
 The LLM Chatbot example demonstrates how an ICP smart contract can be used to interact with a large language model (LLM) to generate text. The user can input a prompt, and the smart contract will use the LLM to generate a response.
 The response is then returned to the user, and the user can submit some follow-up prompts to continue the conversation.
 
-This application's logic is written in [Rust](https://internetcomputer.org/docs/building-apps/developer-tools/cdks/rust/intro-to-rust), a primary programming language for developing canisters on ICP.
-
 ## Deploying from ICP Ninja
 
 When viewing this project in ICP Ninja, you can deploy it directly to the mainnet for free by clicking "Deploy" in the upper right corner. Open this project in ICP Ninja:
 
 [![](https://icp.ninja/assets/open.svg)](https://icp.ninja/i?g=https://github.com/dfinity/examples/rust/llm_chatbot)
 
-## Project structure
-
-The `/backend` folder contains the Rust smart contract:
+## Deploying from ICP Ninja
 
-- `Cargo.toml`, which defines the crate that will form the backend
-- `lib.rs`, which contains the actual smart contract, and exports its interface
+[![](https://icp.ninja/assets/open.svg)](https://icp.ninja/editor?g=https://github.com/dfinity/examples/tree/master/rust/counter)
 
-The `/frontend` folder contains web assets for the application's user interface. The user interface is written using the React framework.
+## Build and deploy from the command-line
 
-## Continue building locally
+### 1. [Download and install the IC SDK.](https://internetcomputer.org/docs/building-apps/getting-started/install)
 
-To migrate your ICP Ninja project off of the web browser and develop it locally, follow these steps.
-To open this project in ICP Ninja, click [here](https://icp.ninja/i?g=https://github.com/dfinity/examples/tree/master/rust/llm_chatbot).
+### 2. Download your project from ICP Ninja using the 'Download files' button on the upper left corner, or [clone the GitHub examples repository.](https://github.com/dfinity/examples/)
 
-### 1. Download your project from ICP Ninja using the 'Download files' button on the upper left corner under the pink ninja star icon.
+### 3. Navigate into the project's directory.
 
-### 2. Setting up Ollama
+### 4. Set up Ollama
 
 To be able to test the agent locally, you'll need a server for processing the agent's prompts. For that, we'll use `ollama`, which is a tool that can download and serve LLMs.
 See the documentation on the [Ollama website](https://ollama.com/) to install it. Once it's installed, run:
@@ -45,4 +39,12 @@ ollama run llama3.1:8b
 
 The above command will download an 8B parameter model, which is around 4GiB. Once the command executes and the model is loaded, you can terminate it. You won't need to do this step again.
 
-### 3. Open the `BUILD.md` file for further instructions.
+### 5. Deploy the project to your local environment:
+
+```
+dfx start --background --clean && dfx deploy
+```
+
+## Security considerations and best practices
+
+If you base your application on this example, it is recommended that you familiarize yourself with and adhere to the [security best practices](https://internetcomputer.org/docs/building-apps/security/overview) for developing on ICP. This example may not implement all the best practices.
````
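Taken together, the command-line steps introduced by this change amount to a short shell session. The following is a sketch only, assuming the IC SDK (`dfx`) and `ollama` are already installed per steps 1 and 4 of the updated README; the final `dfx stop` is an addition for cleanup and does not appear in the diff.

```shell
# One-time: download and load the ~4 GiB llama3.1 8B model.
# Once the model has loaded, you can terminate this command;
# the step does not need to be repeated.
ollama run llama3.1:8b

# Start a clean local replica in the background, then build and
# deploy the project's canisters to it.
dfx start --background --clean && dfx deploy

# Cleanup (not shown in the diff): stop the local replica when done.
dfx stop
```

Note that `--clean` wipes any previous local replica state, so redeploying this way starts from a fresh canister state each time.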
