chatgpt / Exports / ChatGPTUnofficialProxyAPI
• new ChatGPTUnofficialProxyAPI(opts)
Name | Type | Description |
---|---|---|
opts | Object | - |
opts.accessToken | string | - |
opts.apiReverseProxyUrl? | string | Default Value: `https://bypass.duti.tech/api/conversation` |
opts.debug? | boolean | Default Value: `false` |
opts.fetch? | (input: RequestInfo \| URL, init?: RequestInit) => Promise<Response> | - |
opts.headers? | Record<string, string> | Default Value: `undefined` |
opts.model? | string | Default Value: `text-davinci-002-render-sha` |
src/chatgpt-unofficial-proxy-api.ts:20
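A minimal construction sketch based on the options table above, assuming the `chatgpt` package is installed and that an `OPENAI_ACCESS_TOKEN` environment variable (an assumption for this example, not part of the API) holds a valid ChatGPT session access token:

```typescript
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

// accessToken is the only required option; the rest fall back to the
// defaults listed in the table above.
const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN!, // assumed env var
  // apiReverseProxyUrl defaults to https://bypass.duti.tech/api/conversation
  debug: false,
  model: 'text-davinci-002-render-sha'
})
```

Note that the access token is a browser session token, not an OpenAI API key, and expires periodically.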
• get accessToken(): string

Returns: string
src/chatgpt-unofficial-proxy-api.ts:66
• set accessToken(value): void

Name | Type |
---|---|
value | string |

Returns: void
src/chatgpt-unofficial-proxy-api.ts:70
▸ sendMessage(text, opts?): Promise<ChatMessage>
Sends a message to ChatGPT, waits for the response to resolve, and returns the response.

If you want your response to have historical context, you must provide a valid `parentMessageId`.

If you want to receive a stream of partial responses, use `opts.onProgress`.

If you want to receive the full response, including message and conversation IDs, you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation` helper.

Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI completions API. You can override the `promptPrefix` and `promptSuffix` in `opts` to customize the prompt.
Name | Type |
---|---|
text | string |
opts | SendMessageBrowserOptions |

Returns: Promise<ChatMessage>

The response from ChatGPT
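A usage sketch for `sendMessage`, showing streaming via `opts.onProgress` and a follow-up that keeps historical context by threading `conversationId` and `parentMessageId` from the previous response. It assumes the `chatgpt` package is installed and an `OPENAI_ACCESS_TOKEN` environment variable (an assumption for this example) holds a valid access token:

```typescript
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN! // assumed env var
})

// First message starts a new conversation; partial responses are
// streamed through onProgress as they arrive.
const res = await api.sendMessage('What is TypeScript?', {
  onProgress: (partial) => process.stdout.write(partial.text)
})

// To give the follow-up historical context, pass the conversationId
// and the previous message's id as parentMessageId.
const followUp = await api.sendMessage('Can you give an example?', {
  conversationId: res.conversationId,
  parentMessageId: res.id
})
console.log(followUp.text)
```

Without the `conversationId`/`parentMessageId` pair, each call is treated as the start of a fresh conversation.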