# Next.js with Together AI and LlamaIndex Integration

This project demonstrates how to build a Next.js application that integrates Together AI and LlamaIndex for AI-powered functionality. The project consists of a frontend built with Next.js and a backend service handling the AI operations.

## Project Structure

- `frontend/`: Next.js frontend application
- `backend/`: Node.js backend service with Together AI and LlamaIndex integration

## Prerequisites

- Node.js (LTS version)
- Docker
- Together AI API key
- Restack Engine setup

## Getting Started

### 1. Install Restack Web UI

First, install the Restack Web UI using Docker:

```bash
docker run -d --pull always --name studio -p 5233:5233 -p 6233:6233 -p 7233:7233 ghcr.io/restackio/restack:main
```

### 2. Set Up Backend

1. Navigate to the backend directory:

```bash
cd backend
```

2. Install dependencies:

```bash
npm install
```

3. Create a `.env` file with your credentials:

```
TOGETHER_API_KEY=your_together_api_key
RESTACK_ENGINE_ID=your_engine_id
RESTACK_ENGINE_ADDRESS=your_engine_address
RESTACK_ENGINE_API_KEY=your_engine_api_key
```

4. Start the backend service:

```bash
npm run dev
```
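
The backend expects the credentials from step 3 in its environment. A minimal sketch of validating them at startup (this helper is illustrative, not part of the project's actual code):

```typescript
// Illustrative helper: verify the required credentials are present at startup
// and fail fast with a clear message when one is missing.
const REQUIRED_VARS = [
  "TOGETHER_API_KEY",
  "RESTACK_ENGINE_ID",
  "RESTACK_ENGINE_ADDRESS",
  "RESTACK_ENGINE_API_KEY",
] as const;

function loadCredentials(): Record<string, string> {
  const missing = REQUIRED_VARS.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  // Every variable was checked above, so the non-null assertion is safe here.
  return Object.fromEntries(
    REQUIRED_VARS.map((name) => [name, process.env[name]!])
  );
}
```

Failing fast here is preferable to letting a missing key surface later as an opaque authentication error from Together AI or the Restack Engine.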

### 3. Set Up Frontend

1. Navigate to the frontend directory:

```bash
cd frontend
```

2. Install dependencies:

```bash
npm install
```

3. Create a `.env` file:

```
RESTACK_ENGINE_ID=your_engine_id
RESTACK_ENGINE_ADDRESS=your_engine_address
RESTACK_ENGINE_API_KEY=your_engine_api_key
```
|
29 |
| -# Schedule Restack workflow from NextJS frontend |
30 | 77 |
|
31 |
| -The example is a NextJS application with front and backend. You can schedule the workflow example from the user interface. |
| 78 | +4. Start the development server: |
32 | 79 |
|
33 |
| - |
| 80 | +```bash |
| 81 | +npm run dev |
| 82 | +``` |
34 | 83 |
|
35 |
| -When the client successfully schedules the workflow, you can see the started workflow in the Restack Web UI. You should see the following screen: |
| 84 | +Visit [http://localhost:3000](http://localhost:3000) to see the application. |

## Available Features

1. **Chat Completion Example**: Demonstrates basic chat completion using Together AI's Llama models
2. **LlamaIndex Integration**: Shows how to query models using LlamaIndex with Together AI integration
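
For orientation, a chat completion against Together AI's OpenAI-compatible REST endpoint looks roughly like the sketch below. The model name and payload shape are assumptions based on Together AI's public API, not this project's exact code:

```typescript
// Sketch of a chat-completion call against Together AI's
// OpenAI-compatible endpoint. The model name is illustrative.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(messages: ChatMessage[], model: string) {
  return {
    url: "https://api.together.xyz/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

async function chatCompletion(
  messages: ChatMessage[],
  model: string
): Promise<string> {
  const { url, init } = buildChatRequest(messages, model);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`Together AI request failed: ${res.status}`);
  const data = await res.json();
  // The response follows the OpenAI chat-completion shape.
  return data.choices[0].message.content;
}
```

The LlamaIndex integration wraps the same endpoint behind LlamaIndex's LLM abstraction, so queries can be composed with the rest of the LlamaIndex toolchain.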

## Docker Deployment

You can deploy both frontend and backend using Docker Compose:

```bash
docker-compose up -d
```

This will:

- Build and start the frontend on port 3000
- Build and start the backend on port 8000
- Set up proper networking between services
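
A minimal `docker-compose.yml` along these lines would produce that setup. The service names and build paths here are assumptions matching the directory layout above; adjust them to the actual repository:

```yaml
services:
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    environment:
      - RESTACK_ENGINE_ID
      - RESTACK_ENGINE_ADDRESS
      - RESTACK_ENGINE_API_KEY
    depends_on:
      - backend
  backend:
    build: ./backend
    ports:
      - "8000:8000"
    environment:
      - TOGETHER_API_KEY
      - RESTACK_ENGINE_ID
      - RESTACK_ENGINE_ADDRESS
      - RESTACK_ENGINE_API_KEY
```

Compose places both services on a shared default network, so each can reach the other by its service name.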

## Deploy on Restack

To deploy this application on Restack:

1. Ensure you have Restack Cloud credentials
2. Set up required environment variables
3. Run the deployment script:

```bash
node restack_up.mjs
```

For detailed deployment information, see the [Restack Cloud documentation](https://docs.restack.io/restack-cloud/deployrepo).

## Project Components

### Frontend

- Modern Next.js 14 application
- Server Actions for workflow triggering
- Tailwind CSS for styling
- Type-safe API integration

### Backend

- Node.js backend service
- Together AI integration for LLM operations
- LlamaIndex for enhanced AI capabilities
- Rate-limited API calls (60 RPM)
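
The 60 RPM budget can be enforced client-side before each Together AI call. A minimal sliding-window limiter might look like this (illustrative sketch, not the project's actual implementation):

```typescript
// Illustrative sliding-window rate limiter: allow at most `limit`
// calls within any `windowMs` period.
class RateLimiter {
  private timestamps: number[] = [];

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the call is allowed, recording it; false otherwise.
  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have left the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) return false;
    this.timestamps.push(now);
    return true;
  }
}

// 60 requests per minute, matching the budget for Together AI calls.
const togetherLimiter = new RateLimiter(60, 60_000);
```

A caller would check `togetherLimiter.tryAcquire()` before each request and delay or queue when it returns false.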

## Example Workflows

The application includes two main workflows:

1. **Chat Completion Basic**: Generates greeting and farewell messages using Together AI models
2. **LlamaIndex Together Simple**: Demonstrates LlamaIndex integration with Together AI for complex queries

## Screenshots

![Frontend UI](./frontend.png)
_Main application interface_

![Restack UI](./restack.png)
_Restack Web UI showing workflow execution_

## Contributing

Feel free to submit issues and enhancement requests.