griffindavidson/llm-web-gui
llm-web-gui

A web GUI for interacting with locally-run LLMs (DeepSeek, for example)

Screenshot showing DeepSeek

Instructions

  1. This requires Ollama installed with an LLM model pulled locally. Please do that first
  2. Clone this repository
  3. Navigate to the root directory
  4. Add a .env file with the entry MODEL='<model name here>'
  5. Run npm install
  6. Run node app.js
  7. In a web browser, navigate to http://localhost:3000
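The steps above can be condensed into a shell session like the following (the model name deepseek-r1 is only an example; substitute whichever model you have pulled with Ollama):

```shell
# Prerequisite: Ollama is installed and a model has been pulled, e.g.:
ollama pull deepseek-r1

# Get the code and enter the project root
git clone https://github.com/griffindavidson/llm-web-gui.git
cd llm-web-gui

# Tell the app which model to use
echo "MODEL='deepseek-r1'" > .env

# Install dependencies and start the server
npm install
node app.js
# Then open http://localhost:3000 in a browser
```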

Other Tidbits

  • This project saves messages in sessionStorage, so closing the tab will delete your chats
  • I don't intend to do much more with this beyond perhaps prettifying it; I mainly just wanted to try it
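The sessionStorage behavior could look something like the sketch below (the names are illustrative, not the repo's actual code). sessionStorage survives page reloads but is cleared when the tab closes, which is exactly why chats disappear:

```javascript
// Hypothetical sketch of sessionStorage-backed chat history.
// sessionStorage persists across reloads but is wiped when the tab closes.
const KEY = 'chatHistory';

function loadChats() {
  // Returns the saved transcript, or an empty array on first load
  return JSON.parse(sessionStorage.getItem(KEY) ?? '[]');
}

function saveChat(role, content) {
  const chats = loadChats();
  chats.push({ role, content });
  sessionStorage.setItem(KEY, JSON.stringify(chats));
  return chats;
}
```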

Known Issues

  • Text formatting is applied only after the LLM finishes streaming, not as tokens arrive. This can be jarring with models like DeepSeek, which stream a lengthy thought process that is then removed. Code is also pulled out and enlarged for better readability, which is a jarring shift from the packed-in format it first displays in
  • The name above each response block does not change color with the color scheme. Workaround: a color that looks acceptable in both schemes is used.
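One possible direction for the first issue (a hypothetical sketch, not the repo's code): DeepSeek-R1 wraps its reasoning in <think>...</think> tags, so the visible text could be filtered on every streamed chunk instead of only at the end, and the thought process would never flash on screen before being removed:

```javascript
// Hypothetical sketch: given the full buffered response so far, return only
// the text that should be shown, hiding DeepSeek-style <think> blocks.
function visibleText(buffer) {
  // Drop any completed <think>...</think> block
  let out = buffer.replace(/<think>[\s\S]*?<\/think>/g, '');
  // Hide a still-open block while the model is mid-thought
  out = out.replace(/<think>[\s\S]*$/, '');
  return out.trim();
}
```

Calling this on each chunk keeps the displayed text stable as the stream progresses, rather than reflowing everything once at the end.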
