feat: add ondeviceml.space — browser-based on-device AI inference #2141
Open
abhid1234 wants to merge 1 commit into huggingface:main from
Conversation
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Reviewed by Cursor Bugbot for commit fcba75d.
```ts
return (
	model.library_name === "transformers" &&
	(model.pipeline_tag === "text-generation" || model.pipeline_tag === "image-text-to-text") &&
	supportedModelTypes.some((t) => model.config?.model_type?.startsWith(t))
);
```
Prefix allows unsupported architectures
Medium Severity
`startsWith` broadens the allow-list, so distinct architectures like `qwen2_5_vl` pass the `qwen2` check. The button can appear for models outside the ondeviceml catalog or validated runtime, producing deeplinks the site cannot load.
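To make the gap concrete, here is a small standalone sketch (the supported-type list is copied from the PR description; the helper names are hypothetical, not the repo's code) contrasting the current prefix check with an exact-match allow-list:

```typescript
// Supported model types, per the PR description.
const supportedModelTypes = ["gemma3n", "gemma4", "gemma3", "qwen2"];

// Current-style check: prefix match via startsWith.
const prefixAllows = (modelType: string): boolean =>
	supportedModelTypes.some((t) => modelType.startsWith(t));

// Tightened check: exact membership in a Set.
const exactTypes = new Set(supportedModelTypes);
const exactAllows = (modelType: string): boolean => exactTypes.has(modelType);

console.log(prefixAllows("qwen2_5_vl")); // true — unvalidated arch slips through
console.log(exactAllows("qwen2_5_vl")); // false
console.log(exactAllows("qwen2")); // true
```

An exact `Set` lookup keeps the gate limited to the validated architectures while staying O(1) per check; new architectures can be added to the list explicitly once validated.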


What this adds
Registers ondeviceml.space as a local app for Gemma and Qwen2 transformer models.
ondeviceml.space is a client-side-only AI gallery: models run entirely in the browser via WebGPU / WASM (MediaPipe Tasks GenAI runtime). No server, no install, no account — the model weights download once to the browser cache and all inference stays on-device.

Deep-link behavior
When a user clicks "Open in ondeviceml.space" from a model page, the URL lands on the Gallery page, which:

- matches the `hf_model` param against the local catalog
- shows a prompt: [ Gemma 3n E2B → Chat ] [ Load & Open ] [ Dismiss ]

The deep-link intake is live on production — you can test it now:
`displayOnModelPage` logic

Shows only for transformers-library models with a `text-generation` or `image-text-to-text` pipeline whose `model_type` is one of `gemma3n`, `gemma4`, `gemma3`, `qwen2` — the architectures validated to run via MediaPipe in-browser. Happy to expand or tighten the list based on your feedback.

Catalog today
Checklist
- `displayOnModelPage` targets only supported model types

Note
Low Risk
Low risk: adds a new `LOCAL_APPS` entry and display gating logic plus an external deeplink URL; no existing execution paths or core inference logic are modified.

Overview
Adds `ondeviceml.space` to the `LOCAL_APPS` registry so supported Hugging Face model pages can show an "Open in ondeviceml.space" option.

The new entry is only shown for `transformers` models with `text-generation` or `image-text-to-text` pipelines whose `config.model_type` starts with one of `gemma3n`, `gemma4`, `gemma3`, or `qwen2`, and it deep-links to `https://ondeviceml.space/?hf_model=...&task=...`.
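As a sketch of the deeplink construction described above (the helper and the model id are illustrative, not the actual `LOCAL_APPS` entry), the URL can be built from a model id and pipeline tag with the standard `URL` API:

```typescript
// Hypothetical helper: build the ondeviceml.space deeplink for a model.
// The query-param names (hf_model, task) match the PR description.
const deeplink = (modelId: string, pipelineTag: string): string => {
	const url = new URL("https://ondeviceml.space/");
	url.searchParams.set("hf_model", modelId); // "/" is percent-encoded as %2F
	url.searchParams.set("task", pipelineTag);
	return url.toString();
};

console.log(deeplink("google/gemma-3n-E2B-it", "text-generation"));
// → https://ondeviceml.space/?hf_model=google%2Fgemma-3n-E2B-it&task=text-generation
```

Using `URLSearchParams` rather than string concatenation keeps model ids with slashes or unusual characters safely encoded in the deeplink.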