This repository contains scripts and configuration for running and interacting with models served by Ollama using Docker.
First, pull the latest Ollama image from Docker Hub:
docker pull ollama/ollama:latest
Start the Ollama container and open a shell inside it:
docker run -d -p 11434:11434 --name ollama_test ollama/ollama
docker exec -it <containerID> /bin/bash
Alternatively, you can start the container and open a terminal from the Docker Desktop UI.
Inside the container, list the installed models, then pull and run tinyllama:
ollama list
ollama pull tinyllama
ollama run tinyllama
Ensure that the requests library is installed:
pip install requests
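As an optional sanity check (not part of the original scripts), the snippet below queries Ollama's /api/tags endpoint from the host to confirm the server is reachable and to list the models that have been pulled. The URL assumes the default port mapping used in the docker run command above.

```python
import requests

# List models available in the running Ollama container (default host port 11434).
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()

for model in resp.json().get("models", []):
    print(model["name"])
```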
Run the Python script to interact with the model:
python .\main.py
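For reference, here is a minimal sketch of what main.py might contain, assuming it calls Ollama's /api/generate endpoint with the tinyllama model pulled above. The prompt text and the non-streaming mode are illustrative choices, not taken from the repository.

```python
import requests

# Send a single, non-streaming generation request to the Ollama HTTP API.
payload = {
    "model": "tinyllama",
    "prompt": "Explain what Ollama is in one sentence.",
    "stream": False,
}

resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()

# With "stream": False the whole answer is returned in a single JSON object.
print(resp.json()["response"])
```

If the container is running and tinyllama has been pulled, this prints the model's reply to stdout.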
When you are finished, stop the container:
docker stop ollama_test
To clean up, remove all exited containers:
docker rm $(docker ps -a -f "status=exited" -q)