
jetson-containers build ollama error #785

Open
kentsuiGitHub opened this issue Jan 18, 2025 · 9 comments
kentsuiGitHub commented Jan 18, 2025

I am trying to build the ollama container, but the build fails.

Could you guide me on how to fix it?

Error:
Step 12/19 : COPY --from=ollama-l4t-build /opt/ollama/dist/linux-arm64/lib/ollama /usr/lib/ollama
COPY failed: stat opt/ollama/dist/linux-arm64/lib/ollama: file does not exist
Traceback (most recent call last):
File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/opt/jetson-containers/jetson_containers/build.py", line 112, in
build_container(args.name, args.packages, args.base, args.build_flags, args.build_args, args.simulate, args.skip_tests, args.test_only, args.push, args.no_github_api, args.skip_packages)
File "/opt/jetson-containers/jetson_containers/container.py", line 147, in build_container
status = subprocess.run(cmd.replace(NEWLINE, ' '), executable='/bin/bash', shell=True, check=True)
File "/usr/lib/python3.10/subprocess.py", line 526, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command 'DOCKER_BUILDKIT=0 docker build --network=host --tag ollama:r36.4.3-ollama --file /opt/jetson-containers/packages/llm/ollama/Dockerfile --build-arg BASE_IMAGE=ollama:r36.4.3-python --build-arg OLLAMA_REPO="ollama/ollama" --build-arg OLLAMA_BRANCH="v0.5.7" --build-arg GOLANG_VERSION="1.22.8" --build-arg CMAKE_VERSION="3.22.1" --build-arg JETPACK_VERSION="6.2" --build-arg CUDA_VERSION_MAJOR="12" --build-arg CMAKE_CUDA_ARCHITECTURES="87" /opt/jetson-containers/packages/llm/ollama 2>&1 | tee /opt/jetson-containers/logs/20250119_015021/build/ollama_r36.4.3-ollama.txt; exit ${PIPESTATUS[0]}' returned non-zero exit status 1.
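For anyone hitting a similar multi-stage COPY failure, one way to narrow it down is to build only the failing stage and inspect its filesystem. The following is a hedged sketch: the stage name `ollama-l4t-build` and the expected path are taken from the error message above, the `:debug` tag and the `DEBUG_BUILD` guard variable are my own assumptions, and the original `--build-arg` values may need to be passed as well.

```shell
# Path the failing COPY step expects, taken from the error message above.
EXPECTED_LIB_DIR=/opt/ollama/dist/linux-arm64/lib/ollama

# Set DEBUG_BUILD=1 on the Jetson host to actually run the docker commands.
if [ "${DEBUG_BUILD:-0}" = "1" ]; then
    # Build only the failing stage so its filesystem can be inspected
    # (the original --build-arg values may also need to be supplied).
    docker build --target ollama-l4t-build --tag ollama-l4t-build:debug \
        /opt/jetson-containers/packages/llm/ollama
    # If this listing fails, the stage never produced the runtime libraries
    # that the later COPY step tries to copy into the final image.
    docker run --rm ollama-l4t-build:debug ls -la "$EXPECTED_LIB_DIR"
fi
```

If the directory is missing or empty, the upstream build inside that stage failed before the COPY step, which matches what the fix in this thread later addresses.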

iueoa commented Jan 22, 2025

I've been seeing the same behavior.

@WiktorRembielakA4BEE

I am getting the same error.

tokk-nv (Collaborator) commented Jan 29, 2025

Hi @kentsuiGitHub , @iueoa , @WiktorRembielakA4BEE ,

Just so we can understand the whole picture, did you get to pull the dustynv/ollama:0.5.1-r36.4.0 container image?

jetson-containers run --name ollama $(autotag ollama)

It looks like @kentsuiGitHub is on JetPack 6.2, but from what I tested, the container image can run on JetPack 6.2 (though you may need to run ollama serve & first inside the container).
Is there any specific reason why you want to build an Ollama container using jetson-containers?

Just so we have all the alternative paths in scope: Ollama can now be installed natively as well.
https://www.jetson-ai-lab.com/tutorial_ollama.html#1-native-install
I just want to make sure we understand your specific motivation for building a custom Ollama container.
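As a reference, the native install path mentioned above boils down to Ollama's official install script. This is a sketch assuming the standard upstream installer; the `RUN_INSTALL` guard variable is my own addition, and the Jetson tutorial linked above may add JetPack-specific steps on top.

```shell
# Upstream installer URL (the tutorial above may add JetPack-specific steps).
INSTALL_URL=https://ollama.com/install.sh

# Set RUN_INSTALL=1 on the Jetson host to actually install
# (requires network access and sudo privileges).
if [ "${RUN_INSTALL:-0}" = "1" ]; then
    curl -fsSL "$INSTALL_URL" | sh
    # Start the server in the background and sanity-check the install.
    ollama serve > /tmp/ollama-serve.log 2>&1 &
    sleep 2
    ollama --version
fi
```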

cyber-nico commented Jan 30, 2025

Hi, I got the same error - also on JetPack 6.2. I was going to build a new container with the latest version of Ollama (v0.5.7), since the latest container image available for JetPack is based on v0.5.1. Issuing the command

jetson-containers build --name ollama-latest ollama

led to the exact same error message as reported by @kentsuiGitHub.

tokk-nv (Collaborator) commented Jan 30, 2025

We just fixed the issue with PR #807, rebuilt the container with the latest Ollama (0.5.7), and updated the container image.

If you just want to run the container with the latest Ollama:

jetson-containers run dustynv/ollama:main-r36.4.0

Or you can build it yourself with the following:

jetson-containers build --skip-tests=ollama ollama

tokk-nv (Collaborator) commented Jan 31, 2025

I just realized that my former PR did not install Ollama with GPU support, so the resulting dustynv/ollama:main-r36.4.0 image that was pushed (9e63cacd3b60) suffers from the same issue.

I filed a new PR #808, which should enable GPU support.
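A quick way to check whether a given image actually ends up with GPU support is to start the server and look at which compute backend it reports. This is a hedged sketch: the exact log wording varies across Ollama versions, and the `CHECK_GPU` guard variable and log path are my own assumptions.

```shell
# Run inside the container; set CHECK_GPU=1 to perform the check.
LOG=/tmp/ollama-serve.log
if [ "${CHECK_GPU:-0}" = "1" ]; then
    ollama serve > "$LOG" 2>&1 &
    sleep 5
    # A CUDA-enabled build should log an inference compute entry with
    # library=cuda; a CPU-only build falls back to library=cpu.
    grep -iE "inference compute|library=" "$LOG"
fi
```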

@kentsuiGitHub (Author)

> Hi, I got the same error - also on JetPack 6.2. I was going to build a new container with the latest version of Ollama (v0.5.7) […]

Thank you for the clarification, @cyber-nico.

@kentsuiGitHub (Author)

> Just so we can understand the whole picture, did you get to pull the dustynv/ollama:0.5.1-r36.4.0 container image? […] Is there any specific reason why you want to build an Ollama container using jetson-containers?

Hi @tokk-nv

Thank you. I followed the procedure and installed Ollama natively as a workaround for now.

I will try the latest container image and post the results here.

@kentsuiGitHub (Author)

> I just realized that my former PR was not installing Ollama with GPU support […] I filed the new PR #808, and this should enable GPU.

Understood.
I will try running the container again when the new image is ready.
