Add deep link to local app for AiBrow#1065
Conversation
Vaibhavs10
left a comment
Hi @rhys101 - sorry for the delay in getting back to you! It has been a bit crazy at work! Is there a limit to the size of GGUFs you can run via AiBrow?
In addition, could you send us an SVG of the logo as well?
Hey @Vaibhavs10 There's no particular limit on the size of the GGUFs. AiBrow uses llama.cpp and node-llama-cpp under the hood, and it calculates a suitability score before attempting to run any particular GGUF on the machine. If the model is too big it gets a score of 0 and refuses to run, so a fancy new M4 Pro with 128 GB will run a larger GGUF than an 8 GB M1 Air, for example. SVG attached.
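To illustrate the idea, here is a minimal sketch of a suitability check. This is not AiBrow's actual algorithm - the function name, the headroom factor, and the scoring formula are all assumptions for illustration - but it shows the shape of the check described above: a model that cannot fit in memory scores 0 and is refused.

```javascript
// Illustrative sketch (assumed, not AiBrow's real implementation):
// score a GGUF's suitability by comparing its size to available memory.
function suitabilityScore(modelBytes, availableMemoryBytes) {
  // Leave headroom for the KV cache and the rest of the system
  // (the 0.7 factor is an assumption for this sketch).
  const usableBytes = availableMemoryBytes * 0.7;
  if (modelBytes > usableBytes) return 0; // too big: refuse to run
  // Otherwise, score by how comfortably the model fits (closer to 1 = more headroom).
  return 1 - modelBytes / usableBytes;
}

// A 10 GB model on an 8 GB machine scores 0; on a 128 GB machine it scores well.
const GB = 1024 ** 3;
console.log(suitabilityScore(10 * GB, 8 * GB));   // 0
console.log(suitabilityScore(10 * GB, 128 * GB)); // ~0.89
```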
@rhys101, you should change the name of this PR. It has nothing to do with my app Local Chat.
Good call - your PR was submitted just before this one, so I assumed the HF link was called Local Chat. It's Local Apps, so I've amended the title as suggested.
Hi, I've added AiBrow to the list of local apps. AiBrow is a new open-source (https://github.com/axonzeta/aibrow) all-in-one local LLM extension for Chromium-based browsers and Firefox. It comes bundled with llama.cpp and works on Mac, Windows, and Linux.
It implements the new Chrome AI Prompt API (https://github.com/explainers-by-googlers/prompt-api) under its own namespace (window.aibrow) and polyfills the window.ai Chrome proposal in other browsers. Beyond the Prompt API implementation, AiBrow supports different models (such as GGUF models from HF), embeddings, grammar, and more. The installer also includes the Q4_K_M version of SmolLM2 1.7B Instruct as the default model, ready to go.
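As a sketch of how a page might pick between the two namespaces described above (window.aibrow vs. the window.ai polyfill), the helper below prefers AiBrow's own namespace and falls back to the polyfill. The session methods in the comment (languageModel.create, prompt) follow the Prompt API proposal; the exact surface AiBrow exposes may differ, so treat the names as assumptions.

```javascript
// Illustrative sketch (assumed shapes, not AiBrow's documented API surface):
// prefer AiBrow's own namespace, then a window.ai polyfill, else null.
function pickPromptApi(globalObj) {
  return globalObj.aibrow ?? globalObj.ai ?? null;
}

// In a page, usage might look like this (names taken from the Prompt API proposal):
// const api = pickPromptApi(window);
// if (api) {
//   const session = await api.languageModel.create();
//   console.log(await session.prompt("Hello!"));
// }
```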
We just released a new version today with support for Hugging Face models and deep linking. If you would like to check it out, please visit: https://aibrow.ai/