System Info
Running on Ubuntu Studio, testing on both Chrome and Firefox.
The example demo in my repo uses the Quasar Framework.
Environment/Platform
Website/web-app (in-browser, per the system info above).
Description
With the old @xenova/transformers package, I was able to load many models.
However, on @huggingface/transformers 3.x, many of the same models fail to load with an error saying the required file "model.onnx" is missing.
An example is Xenova/LaMini-GPT-774M: the loader reports that model.onnx is required but missing, even though the repo does contain decoder_model.onnx and other similarly named ONNX files.
As a result, a large number of Hugging Face models cannot be loaded at all, even after filtering the Hub for models tagged as compatible with transformers.js.
Reproduction
Try to load one of the affected models (e.g. Xenova/LaMini-GPT-774M) with @huggingface/transformers 3.x and the error appears.
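A minimal in-browser reproduction sketch, assuming @huggingface/transformers 3.x is installed; the model name is the one from the description above, and the `model_file_name` workaround line is an assumption based on the library's `PretrainedOptions` and may not apply to every affected model:

```javascript
// Reproduction sketch for the missing "model.onnx" error on v3.x.
import { pipeline } from '@huggingface/transformers';

// This throws for me on 3.x, complaining that model.onnx cannot be found,
// even though the repo contains decoder_model.onnx etc.
const generator = await pipeline('text-generation', 'Xenova/LaMini-GPT-774M');

// Possible (untested) workaround: point the loader at the file that does
// exist via model_file_name, if the options support it for this model type.
// const generator = await pipeline('text-generation', 'Xenova/LaMini-GPT-774M', {
//   model_file_name: 'decoder_model_merged',
// });
```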
My current project is https://github.com/customautosys/ai-compare-candidates