llama-cpp-python agent framework for chat, structured output and function calling #4690
Maximilian-Winter started this conversation in Show and tell
Replies: 1 comment
Awesome, this might end up being less work for me! I just noticed this after I posted mine. We shall see whether it ends up as good as or better than mine; otherwise I'll stick to a bash version for more options.
llama-cpp-agent Framework
Introduction
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It provides a simple yet robust interface built on llama-cpp-python, allowing users to chat with LLM models, execute structured function calls, and get structured output.
Key Features
- Simple chat interface for talking to LLM models
- Structured output (e.g. JSON) from model responses
- Structured function calling
Installation
To get started with the llama-cpp-agent LLM framework, follow these steps:
Install the dependencies listed in the requirements.txt file.

Usage Examples
Simple Chat Example
This example demonstrates how to initiate a chat with an LLM model.
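The post's chat example code is not included here, and the framework's exact chat API is not shown, so the following is only a sketch of the underlying idea: keep a role-tagged message history and flatten it into a single prompt string for the model. The `ChatHistory` class and `format_chatml` method are hypothetical names for illustration, not library API.

```python
# Illustrative sketch only: models a chat history that is flattened into a
# ChatML-style prompt, as many chat wrappers around llama.cpp do internally.
# `ChatHistory` and `format_chatml` are hypothetical, not llama-cpp-agent API.

class ChatHistory:
    def __init__(self, system_prompt: str):
        self.messages = [{"role": "system", "content": system_prompt}]

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

    def format_chatml(self) -> str:
        # ChatML-style layout used by many chat-tuned models.
        parts = [
            f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
            for m in self.messages
        ]
        # Leave the assistant turn open so the model continues from here.
        return "\n".join(parts) + "\n<|im_start|>assistant\n"


history = ChatHistory("You are a helpful assistant.")
history.add("user", "Hello!")
prompt = history.format_chatml()
print(prompt)
```

In a real session, `prompt` would be passed to the model for completion, the assistant's reply appended to the history, and the loop repeated for each user turn.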
Structured Output
This example shows how to get structured JSON output.
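The structured-output example code is likewise missing from this copy of the post, so here is a hedged standard-library sketch of the general pattern: declare the fields you want, then parse and type-check the model's JSON reply against them. The `Book` type and `parse_structured` helper are hypothetical names for illustration.

```python
# Hedged sketch of structured output: parse a model's JSON reply into a
# typed object and validate it. `Book` and `parse_structured` are
# hypothetical, not llama-cpp-agent API.
import json
from dataclasses import dataclass

@dataclass
class Book:
    title: str
    author: str
    year: int

def parse_structured(reply: str) -> Book:
    """Parse a model's JSON reply into a Book, checking the year's type."""
    data = json.loads(reply)
    book = Book(**data)
    if not isinstance(book.year, int):
        raise TypeError("year must be an integer")
    return book

# A reply an LLM might produce when prompted to emit JSON with these fields.
reply = '{"title": "Dune", "author": "Frank Herbert", "year": 1965}'
book = parse_structured(reply)
print(book.title, book.year)
```

Libraries in this space typically also send the field schema to the model (or constrain generation with a grammar) so the reply is valid JSON in the first place; the validation step above is the fallback safety net.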
Function Calling Example
This example shows how to do function calling.
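The function-calling example code is also absent here, so this is a hedged sketch of the common pattern rather than the framework's actual API: register Python functions by name, have the model emit a JSON call description, and dispatch it to the matching function. `get_weather`, `TOOLS`, and `dispatch` are hypothetical names for illustration.

```python
# Hedged sketch of function calling: dispatch a model-emitted JSON call
# to a registered Python function. All names here are hypothetical.
import json

def get_weather(city: str) -> str:
    # Stand-in for a real tool (e.g. a weather API lookup).
    return f"Sunny in {city}"

# Registry mapping the names the model may call to Python callables.
TOOLS = {"get_weather": get_weather}

def dispatch(model_output: str) -> str:
    """Execute a call like {"function": ..., "arguments": {...}}."""
    call = json.loads(model_output)
    func = TOOLS[call["function"]]
    return func(**call["arguments"])

result = dispatch('{"function": "get_weather", "arguments": {"city": "Paris"}}')
print(result)  # -> Sunny in Paris
```

In a full loop, the tool's return value would be fed back to the model as a new message so it can compose its final answer.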
Additional Information