
FastChat support for Open_llama_3b_v2 inference - help sought #100

@RDouglasSharp

Description

I use FastChat as the framework for both training and dialog-based inference, and FastChat supports Meta/Llama. I was excited to try the 3B Open-Llama model, and the FastChat finetuning scripts all work perfectly with open_llama_3b_v2. Oddly, the FastChat inference framework works with neither my finetuned model nor the original model. Has anyone figured out how to get FastChat's fastchat.serve.cli to support openlm-research models?
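To make the problem concrete, here is a minimal sketch of loading the checkpoint directly through transformers, independent of FastChat (the model path is the published Hugging Face one, and use_fast=False follows openlm-research's note that the auto-converted fast tokenizer can produce incorrect tokenizations for these checkpoints):

```python
# Minimal transformers-only sanity check for open_llama_3b_v2, outside FastChat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "openlm-research/open_llama_3b_v2"  # or a local finetuned checkpoint dir

# openlm-research advises against the auto-converted fast tokenizer,
# which can mis-tokenize with these checkpoints, so load the slow one.
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("Q: What is the capital of France?\nA:", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If this runs cleanly but fastchat.serve.cli still fails, that would point at how FastChat loads the model or tokenizer rather than at the checkpoint itself.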
