
Often getting error: Could not locate file: "https://huggingface.co/.../onnx/model.onnx" #1485

@customautosys

System Info

Running on Ubuntu Studio, tested on both Chrome and Firefox.

The example demo in my repo uses the Quasar Framework.

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

With the old @xenova/transformers package, I was able to load many of these models.

However, on @huggingface/transformers 3.x, models often fail to load because of an allegedly missing "model.onnx".

An example is Xenova/LaMini-GPT-774M, where the loader reports that model.onnx (a required file) is missing, even though the repo does contain decoder_model.onnx and other similarly named files.

As a result, many models on the Hugging Face Hub cannot be loaded at all, even after filtering for models marked as compatible with Transformers.js.
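If I understand the docs correctly, from_pretrained accepts a model_file_name option for picking a specific ONNX file, so something like the sketch below might work around this for repos that only ship decoder_model.onnx. This is untested guesswork on my part: I am assuming that pipeline() forwards model_file_name to the model loader and that 'decoder_model' is the right value for this repo.

```js
import { pipeline } from '@huggingface/transformers';

// Assumption: model_file_name is forwarded to from_pretrained, so the loader
// fetches onnx/decoder_model.onnx instead of the default onnx/model.onnx.
const generator = await pipeline('text-generation', 'Xenova/LaMini-GPT-774M', {
  model_file_name: 'decoder_model',
});
```

Even if that works case by case, it does not scale to loading arbitrary models the way @xenova/transformers 2.x could.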

Reproduction

Try to load any of a large number of models on the Hugging Face Hub and you will get this error.
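A minimal snippet that reproduces it for me (run as an ES module in the browser):

```js
import { pipeline } from '@huggingface/transformers';

// Loads fine with @xenova/transformers 2.x, but fails with 3.x:
const generator = await pipeline('text-generation', 'Xenova/LaMini-GPT-774M');
// Error: Could not locate file: "https://huggingface.co/.../onnx/model.onnx"
```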

My current project is https://github.com/customautosys/ai-compare-candidates
