
Generative Understanding & Responsive Intelligent Assistant
A powerful and intuitive chat interface for Ollama models

Choose from a variety of powerful Ollama models, with automatic model download and initialization, all in a modern, responsive chat interface with real-time streaming.
- 🎨 Modern, responsive UI with dark mode support
- 🔄 Real-time chat interface with streaming responses
- 🔒 Secure HTTPS with auto-generated SSL certificates
- 📱 Mobile-friendly design
- 💾 Local chat history storage
- 📤 Export chats to multiple formats (PDF, Markdown, Text)
- 🎯 Multiple Ollama model support
- 🔐 Privacy-focused (all data stays local)
- 🚀 Easy setup with automated installation script
- 💻 Cross-platform support (Windows, macOS, Linux)
GURIA comes with a smart setup script that handles everything for you! No need to manually install prerequisites - the script will check and install what's needed.
- Clone the repository:

  ```bash
  git clone https://github.com/vinipx/guria-ai-app.git
  cd guria-ai-app
  ```

- Run GURIA:

  On Windows:

  ```powershell
  .\guria
  ```

  On macOS/Linux:

  ```bash
  ./guria
  ```
That's it! The script will automatically:
- Check and install prerequisites (Python, Ollama, etc.)
- Set up the virtual environment
- Install all dependencies
- Generate SSL certificates for secure HTTPS
- Configure the application
- Start the server
When everything is ready, your default web browser will open to https://localhost:7860.
Note: Your browser may show a security warning on first access because we use a local certificate for development. This is normal and safe to proceed.
GURIA works seamlessly across all major platforms:
- ✅ Windows: Native support via PowerShell (Windows 10/11)
- ✅ macOS: Full support for both Intel and Apple Silicon
- ✅ Linux: Compatible with all major distributions
- ✅ WSL: Windows Subsystem for Linux supported
Just run the GURIA script and you're good to go:
```
# Windows
.\guria

# macOS/Linux
./guria
```
The script will ensure Ollama is running and handle everything else for you!
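If you want to reproduce that check yourself, the idea is simply to ask the local Ollama server whether it responds. A minimal sketch, assuming the default endpoint and Ollama's standard `/api/version` route (this is not GURIA's actual code):

```python
# Ask the local Ollama server for its version to confirm it is running.
# Endpoint and timeout are assumptions, not GURIA's actual implementation.
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default API endpoint

def ollama_is_running(timeout: float = 2.0) -> bool:
    try:
        return requests.get(f"{OLLAMA_URL}/api/version", timeout=timeout).ok
    except requests.exceptions.RequestException:
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_is_running())
```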
- Backend: Flask 3.0.0
- Frontend: HTML5, TailwindCSS, JavaScript
- AI Integration: Ollama API
- Database: SQLite
- PDF Generation: ReportLab
- Process Management: psutil
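To see how these pieces fit together, here is a minimal sketch of a streaming chat route: Flask forwards the prompt to the Ollama API and yields tokens to the browser as they arrive. The `/chat` route name, payload fields, and default model are illustrative assumptions, not GURIA's actual code:

```python
# Minimal Flask route that proxies a prompt to Ollama and streams the reply.
# Route name, payload fields, and default model are assumptions.
import json

import requests
from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434"

@app.route("/chat", methods=["POST"])
def chat():
    payload = {
        "model": request.json.get("model", "llama3"),
        "prompt": request.json["prompt"],
        "stream": True,
    }

    def generate():
        with requests.post(f"{OLLAMA_URL}/api/generate", json=payload, stream=True) as r:
            for line in r.iter_lines():
                if line:
                    yield json.loads(line).get("response", "")  # one token chunk at a time

    return Response(stream_with_context(generate()), mimetype="text/plain")
```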
GURIA is designed to work out of the box, but you can customize:
- Port number (default: 5000)
- Ollama API endpoint (default: http://localhost:11434)
- Available models (automatically detected from Ollama)
- Export formats and styling
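Model auto-detection most likely boils down to asking Ollama which models are installed locally. A minimal sketch, assuming Ollama's standard `/api/tags` endpoint (GURIA's actual implementation may differ):

```python
# List locally installed Ollama models via the /api/tags endpoint.
import requests

OLLAMA_URL = "http://localhost:11434"  # change if you customized the endpoint

def list_local_models() -> list[str]:
    response = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    response.raise_for_status()
    return [model["name"] for model in response.json().get("models", [])]

if __name__ == "__main__":
    print(list_local_models())  # e.g. ['llama3:latest', 'mistral:latest']
```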
```
guria-ai-app/
├── app.py                 # Main Flask application
├── templates/             # HTML templates
├── static/                # Static assets
├── setup/                 # OS-specific setup scripts
│   ├── mac_setup.sh       # macOS/Linux setup
│   └── windows_setup.ps1  # Windows setup
├── guria                  # Main setup script
├── requirements.txt       # Python dependencies
└── README.md              # This file
```
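Chat history never leaves your machine: messages are kept in a local SQLite database (see the tech stack above). A minimal sketch of what that storage could look like; the database filename, table name, and columns are assumptions, not GURIA's actual schema:

```python
# Hypothetical local chat-history storage with SQLite.
# Filename, table name, and columns are assumptions, not GURIA's schema.
import sqlite3

DB_PATH = "chat_history.db"

def init_db() -> None:
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS messages (
                   id INTEGER PRIMARY KEY AUTOINCREMENT,
                   role TEXT NOT NULL,          -- 'user' or 'assistant'
                   content TEXT NOT NULL,
                   created_at TEXT DEFAULT CURRENT_TIMESTAMP
               )"""
        )

def save_message(role: str, content: str) -> None:
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute("INSERT INTO messages (role, content) VALUES (?, ?)", (role, content))
```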
GURIA runs in production mode by default with HTTPS enabled. Here are the available options:
```bash
./guria.sh              # Run in production mode with HTTPS
./guria.sh --http       # Force HTTP mode
./guria.sh --debug      # Enable debug mode
./guria.sh --port 8443  # Use custom port
```

You can combine multiple options:

```bash
./guria.sh --http --debug --port 8080
```
For development and troubleshooting, you can enable debug mode with the `--debug` flag. However, debug mode should never be used in production, as it may expose sensitive information.
GURIA runs in HTTPS mode by default for enhanced security. The application will automatically generate and install SSL certificates using `mkcert` if they don't exist.

If certificate generation fails, or if you specifically need HTTP mode, you can force HTTP using the `--http` flag:

```bash
./guria.sh --http
```
You can also specify a custom port:

```bash
./guria.sh --port 8443         # Run with HTTPS on port 8443
./guria.sh --http --port 8080  # Run with HTTP on port 8080
```
Note: When running in HTTPS mode for the first time, you might see a security warning in your browser. This is normal for locally-generated certificates and you can safely proceed.
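Under the hood, serving a Flask app over HTTPS usually just means handing the generated certificate and key to the server. A minimal sketch, assuming `cert.pem`/`key.pem` produced by mkcert and port 8443 (not necessarily how GURIA wires it up):

```python
# Serve a Flask app over HTTPS using locally generated certificates.
# Certificate paths and port are assumptions.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "GURIA over HTTPS"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8443, ssl_context=("cert.pem", "key.pem"))
```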
Vinicius Peixoto Fagundes - @vinipx
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the amazing AI models
- Flask for the web framework
- TailwindCSS for the styling system
Vinicius Peixoto Fagundes - @vinipx
Project Link: https://github.com/vinipx/guria-ai-app