
tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64 · v0.0.1-b6ecf68e706b-b9564bf364e9 (Public, latest)

Install from the command line
$ docker pull ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64:v0.0.1-b6ecf68e706b-b9564bf364e9
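
After pulling, the container is typically started with docker run. The invocation below is a minimal sketch, not a documented command for this image: the /dev/tenstorrent device mapping and port 8000 (a common vLLM default) are assumptions about the host and server configuration.

# Minimal sketch; the device path and port are assumptions, adjust for your host.
$ docker run -it --rm \
    --device /dev/tenstorrent \
    -p 8000:8000 \
    ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64:v0.0.1-b6ecf68e706b-b9564bf364e9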

Recent tagged image versions

  • Published 20 days ago · Digest sha256:4e4d7a2b21f8dbb7e5029258eba8599ecaa8fce0d971eb964db4247f4b8ad392 · 3 version downloads
  • Published about 1 month ago · Digest sha256:a5c90b076b10de76646f5e7605584e27b1e8400f10a8abdb3f6ad9c489576905 · 2 version downloads
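
To pin a deployment to an exact build rather than a mutable tag, the image can also be pulled by digest. A sketch using the most recently published digest listed above:

$ docker pull ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-cloud-ubuntu-20.04-amd64@sha256:4e4d7a2b21f8dbb7e5029258eba8599ecaa8fce0d971eb964db4247f4b8ad392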


Details

  • Last published: 20 days ago
  • Issues: 34
  • Total downloads: 5