feat(ai): introduce vercel ai sdk support #316
base: main
Conversation
Added some initial comments.
@rmarescu Feedback addressed
Submitting a partial review. High-level thoughts:
- No need to map Vercel's models to a local list; it complicates the implementation without much gain (see the sketch after this list)
- Use `AI` instead of `LLM` (no need for two names that represent almost the same thing within the codebase)
- Config schema seems complicated. Can it be simplified?
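A minimal sketch of the first point, assuming a hypothetical `getModel` helper: the configured model ID can be forwarded straight to the Vercel AI SDK, which resolves it at call time, so no locally maintained model list is needed.

```ts
// Hypothetical sketch: forward the model ID directly to the SDK
// instead of validating it against a local allowlist.
import { anthropic } from "@ai-sdk/anthropic";
import type { LanguageModel } from "ai";

const getModel = (modelId: string): LanguageModel =>
  // Unknown IDs fail at request time with a provider error,
  // so a local model enum adds maintenance without much gain.
  anthropic(modelId);
```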
```ts
}

export interface LLMPPublicConfig {
  provider: LLMSupportedProvidersType;
```
I think even this one can be optional. Ideally, there should be zero config to run Shortest. No `ai` prop provided should default to the provider we think is best, e.g. `anthropic`, which tries to read the `apiKey` from ENV (as the Vercel AI SDK supports it by default).
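A minimal sketch of that default, with hypothetical `AIConfig` and `resolveModel` names; the Anthropic provider in the Vercel AI SDK reads `ANTHROPIC_API_KEY` from the environment when no key is passed explicitly.

```ts
// Hypothetical sketch: a missing `ai` prop falls back to anthropic,
// letting the SDK pick up ANTHROPIC_API_KEY from the environment.
import { anthropic, createAnthropic } from "@ai-sdk/anthropic";

interface AIConfig {
  provider?: "anthropic";
  model?: string;
  apiKey?: string;
}

const resolveModel = (config?: AIConfig) => {
  const modelId = config?.model ?? "claude-3-5-sonnet-latest";
  return config?.apiKey
    ? createAnthropic({ apiKey: config.apiKey })(modelId)
    : anthropic(modelId); // key resolved from ENV by the SDK
};
```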
I personally don't see a strong case for simplifying the configuration. Keeping all settings in a single file actually reduces cognitive load, as it makes it clear where each value comes from and how the final configuration is assembled.

> Ideally, there should be zero config to run Shortest

Achieving that is unlikely, especially as users increasingly demand more flexibility and control over Shortest (see, for example, issue #313). What is possible, though, is keeping all non-sensitive data in the config and sensitive data (e.g. keys) in the env, so they are not duplicated in the config.
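A sketch of that split, using an illustrative `shortest.config.ts` (file name and shape are assumptions): non-sensitive settings stay in the config file, while the key lives only in the environment.

```ts
// Hypothetical shortest.config.ts: non-sensitive settings only.
// No apiKey here; it is resolved at runtime from
// process.env.ANTHROPIC_API_KEY, avoiding duplication.
export default {
  ai: {
    provider: "anthropic",
    model: "claude-3-5-sonnet-latest",
  },
};
```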
Remaining work (will update as needed):
Issue #291
What
Integrate Vercel AI SDK
Why
Simplify the introduction of new providers and models in the future, specifically Amazon Bedrock (#310) and OpenAI when available.
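A sketch of what that enables, assuming a hypothetical environment switch: once the Vercel AI SDK is in place, adding Amazon Bedrock is largely a matter of pulling in `@ai-sdk/amazon-bedrock` and swapping the model behind the same `generateText` call.

```ts
// Illustrative only: the env switch and model IDs are examples, not
// this PR's actual wiring. Both providers satisfy the same interface.
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { bedrock } from "@ai-sdk/amazon-bedrock";

const model =
  process.env.SHORTEST_AI_PROVIDER === "bedrock" // hypothetical switch
    ? bedrock("anthropic.claude-3-5-sonnet-20241022-v2:0")
    : anthropic("claude-3-5-sonnet-latest");

const { text } = await generateText({
  model,
  prompt: "Outline the steps to test the login flow.",
});
```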