
Allow Non-Standard Model Parameters #6343

@MikeNatC

Description


Is your feature request related to a problem? Please describe.

Yes. When configuring model parameters for inference with llama.cpp, one commonly used parameter is min_p. However, this parameter (and some others) does not appear on the configuration form in the LocalAI Web UI. I tried editing the model's YAML to add min_p, but the change does not seem to persist, and I am not sure what else to try.
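For reference, the kind of per-model YAML edit I attempted looked roughly like the sketch below. The model name and file are placeholders, and whether min_p is actually recognized at this level may depend on the LocalAI version and backend:

```yaml
# Sketch of a per-model config file (names are placeholders).
name: my-llama-model
parameters:
  model: my-model.gguf
  temperature: 0.7
  top_p: 0.9
  min_p: 0.05   # the less common sampler parameter that does not persist
```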

Describe the solution you'd like

I think it makes sense for the configuration form to have an open-ended field where people can insert less commonly used parameters for the various backends, without having to build a massively long form just to cater to parameters that are almost never used.

Describe alternatives you've considered

I considered suggesting dedicated fields for min_p and other parameters, but I think that would make the configuration form too complicated.

Additional context

Metadata


Labels: enhancement (New feature or request)
