WebLLM-Tools-Sample

Sample of using tools with WebLLM. (Note: it works only in browsers with WebGPU support, and you must have at least 6 GB of free VRAM.) Tested with Chrome and Brave on Windows and macOS.
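If you want to confirm up front that the current browser meets the WebGPU requirement, a minimal feature-detection sketch using the standard navigator.gpu API looks like this (run it before loading any model):

// Feature-detect WebGPU before attempting to load a model.
if (!("gpu" in navigator)) {
  console.error("WebGPU is not supported in this browser.");
} else {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) {
    console.error("WebGPU is available, but no suitable GPU adapter was found.");
  }
}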

Demo App

const tools = [
  {
    type: "function",
    function: {
      name: "fetch_wikipedia_content",
      description: "Search Wikipedia and fetch the introduction of the most relevant article. " +
                   "Always use this if the user is asking for something that is likely on Wikipedia. " +
                   "If the user has a typo in their search query, correct it before searching.",
      parameters: {
        type: "object",
        properties: {
          search_query: {
            type: "string",
            description: "Search query for finding the Wikipedia article"
          }
        },
        required: ["search_query"],
      },
    },
  },
  {
    type: "function",
    function: {
      name: "sparql_exec",
      description: "Execute a SPARQL SELECT query and fetch the results. " +
                   "Always use this if the user is asking to execute a SPARQL SELECT query. " +
                   "If the user has a typo in their SPARQL SELECT query, correct it before executing.",
      parameters: {
        type: "object",
        properties: {
          query: {
            type: "string",
            description: "SPARQL SELECT query"
          }
        },
        required: ["query"],
      },
      "return": {
        type: "object",
        description: "Data in application/sparql-results+json format"
      }
    },
  },
];
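A minimal sketch of passing these tool definitions to WebLLM's OpenAI-compatible chat API and dispatching the resulting tool calls. The model id is only an example, and fetchWikipediaContent / sparqlExec are hypothetical helpers (e.g. wrappers around the Wikipedia API and a SPARQL endpoint), not part of this repository:

import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Load one of the supported models (see the list below); the exact id is an example.
const engine = await CreateMLCEngine("Qwen2.5-7B-Instruct-q4f16_1-MLC");

const messages = [{ role: "user", content: "Who was Nikola Tesla?" }];

// The model may answer directly or request one or more tool calls.
const reply = await engine.chat.completions.create({ messages, tools });

for (const call of reply.choices[0].message.tool_calls ?? []) {
  const args = JSON.parse(call.function.arguments);
  if (call.function.name === "fetch_wikipedia_content") {
    console.log(await fetchWikipediaContent(args.search_query));
  } else if (call.function.name === "sparql_exec") {
    console.log(await sparqlExec(args.query));
  }
}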

Sample of using tool calls with WebLLM. It currently works with the following models:

  • Qwen3-8B
  • Qwen3-4B
  • Qwen2.5-7B-Instruct
  • Qwen2.5-14B-Instruct
  • Llama-3.1-8B-Instruct
  • Hermes-3-Llama-3.1-8B

Best results are with Qwen3-8B, Qwen2.5-14B-Instruct, and Qwen3-4B (on Linux, use the model variants with the q4f32 suffix), as in the sketch below.
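For example, reusing the CreateMLCEngine import from the sketch above, the Linux case just means picking the q4f32 build of the model; the exact id below is an assumption and may differ from the ids in the WebLLM prebuilt model list:

// q4f32 variant for Linux; q4f16 variants work on Windows and macOS.
const engine = await CreateMLCEngine("Qwen2.5-7B-Instruct-q4f32_1-MLC");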
