VinF Hybrid Inference: introduce Chrome Adapter class (rebased) #8932
Conversation
* Adding LanguageModel types. These are based off https://github.com/webmachinelearning/prompt-api?tab=readme-ov-file#full-api-surface-in-web-idl
* Adding LanguageModel types.
* Remove bunch of exports
* yarn formatted
* after lint
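For context, a minimal sketch of what ambient declarations for that Prompt API surface could look like. The names below loosely follow the explainer's Web IDL but are assumptions here, not the PR's actual definitions:

```ts
// Hypothetical ambient declarations for Chrome's Prompt API, loosely
// following the explainer's Web IDL; the PR ships its own definitions.
type LanguageModelAvailability =
  | 'unavailable'
  | 'downloadable'
  | 'downloading'
  | 'available';

interface LanguageModelSession {
  // Sends a text prompt to the on-device model and resolves with its reply.
  prompt(input: string): Promise<string>;
  // Frees the resources held by the session.
  destroy(): void;
}

interface LanguageModelStatic {
  // Reports whether the on-device model can be used (or must be downloaded first).
  availability(): Promise<LanguageModelAvailability>;
  // Creates a prompting session; creation options (temperature, topK, etc.) omitted here.
  create(options?: object): Promise<LanguageModelSession>;
}

// Chrome exposes the entry point as a global when the API is enabled.
declare const LanguageModel: LanguageModelStatic | undefined;
```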
Replaced by #8942
This change minimally introduces a ChromeAdapter class to abstract interactions with Chrome's Prompt API.
To constrain the scope of the change, this only injects the adapter into the generateContent path. If we agree on this approach, a subsequent PR can inject the adapter into the generateContentStream and countTokens paths.
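As a rough, non-authoritative sketch of the idea: the class name ChromeAdapter comes from the PR, but the method names and the availability check below are assumptions, and the types are the ambient declarations sketched earlier.

```ts
/**
 * Illustrative only: an adapter that hides Chrome's Prompt API behind a
 * small surface the generateContent path can call into.
 */
export class ChromeAdapter {
  private session?: LanguageModelSession;

  // True only when the on-device model is present and ready to serve prompts.
  async isAvailable(): Promise<boolean> {
    return (
      typeof LanguageModel !== 'undefined' &&
      (await LanguageModel.availability()) === 'available'
    );
  }

  // Routes a plain-text request to the on-device model, creating a session lazily.
  async generateContent(prompt: string): Promise<string> {
    if (!this.session) {
      // Callers are expected to check isAvailable() first, so the global is defined here.
      this.session = await LanguageModel!.create();
    }
    return this.session.prompt(prompt);
  }
}
```

The generateContent path could then consult isAvailable() and fall back to the cloud backend when the on-device model is not ready; limiting the injection to that one path keeps this PR small, as described above.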
This builds on #8930, which updates the SDK API.
This replaces #8877 to clarify that we're now merging into the new vaihi-exp integration branch.