How to send the AI's output to Mac Say using only llama.cpp? #856
classicjazz started this discussion in General
I would like to have an interactive chat session using llama.cpp on a Mac, where only the AI's output is piped to the Mac "say" command, which then uses the voice selected in Accessibility settings (e.g. English Ireland > Voice 2). I want to type my input myself.
@ggerganov has an example of this, used in conjunction with whisper.cpp, here:
https://twitter.com/ggerganov/status/1640035474205995011
See also
ggml-org/whisper.cpp@master...talk.llama-coreml#diff-919c16cc39dd4e7769d99b1a2866078795836b3a94285a574f865d13199de72e
Can anyone suggest the best way to invoke "say" on the AI's output from main in a bash script, for the AI output only, without needing whisper.cpp?
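One approach that avoids whisper.cpp entirely is a plain bash wrapper: run one generation of main per user turn, strip the echoed prompt from its output, and hand only the remainder to say. Below is a minimal sketch; the ./main binary location, the model path, and the strip_prompt/chat_loop helper names are all assumptions for illustration, not part of llama.cpp itself.

```shell
#!/usr/bin/env bash
# Sketch only: assumes llama.cpp is built as ./main with a model at $MODEL.
# Idea: one generation per turn, strip the echoed prompt from main's
# output, and speak only the remainder with macOS "say".

MODEL=./models/7B/ggml-model-q4_0.bin   # hypothetical model path

# main prints the prompt before the completion; remove that prefix so
# only the AI's reply is spoken, not the typed input.
strip_prompt() {
  local full=$1 prompt=$2
  printf '%s' "${full#"$prompt"}"
}

chat_loop() {
  local user_input full_output reply
  while IFS= read -r -p "You: " user_input; do
    [ -z "$user_input" ] && break
    full_output=$(./main -m "$MODEL" -p "$user_input" 2>/dev/null)
    reply=$(strip_prompt "$full_output" "$user_input")
    printf 'AI: %s\n' "$reply"
    # "say" uses the system voice chosen in Spoken Content settings;
    # pass -v "VoiceName" to override it.
    say "$reply"
  done
}

# On a Mac, start the chat with: chat_loop
```

Note the trade-offs of this per-turn approach: there is no token streaming (say only gets the reply once main exits), and each turn is stateless unless you accumulate the conversation history into the prompt yourself. For true interactive mode (-i), you would instead need to separate the model's tokens from your typed input in the stream, which is what the talk-llama example handles.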
Thanks