✨ Support for Featherless.ai as inference provider. #1310

Open
wants to merge 2 commits into main
Conversation

wxgeorge

Implements support for Featherless.ai as an inference provider, fully for chat and partially for completions (streaming completions to be covered in a future PR).
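
For context, a minimal sketch of how this could be exercised from @huggingface/inference once merged (assuming the chatCompletion helper's provider option; the model ID below is a placeholder):

import { chatCompletion } from "@huggingface/inference";

// Illustrative call routed through the Featherless.ai provider; the model ID is a placeholder.
const response = await chatCompletion({
	accessToken: process.env.HF_TOKEN,
	provider: "featherless-ai",
	model: "meta-llama/Meta-Llama-3.1-8B-Instruct",
	messages: [{ role: "user", content: "Hello from Featherless!" }],
});

console.log(response.choices[0].message.content);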

@julien-c added the inference-providers (integration of a new or existing Inference Provider) label on Apr 2, 2025
@julien-c (Member) commented on Apr 2, 2025

Hi @wxgeorge, we are currently finishing a refactoring of the Inference Providers integration code in #1315. It should be merged soon, but we will need to rewrite part of your implementation (it should be even simpler to integrate). We'll ping you again after it's been merged.

@hanouticelina (Contributor) commented:

Hi @wxgeorge,
We've merged a refactoring of the Inference Providers integration into main, which should make adding new providers much easier. Could you merge main into your branch and update your PR accordingly? It should be relatively straightforward with the new structure:
1 - You have to update the PROVIDERS mapping here: inference/src/lib/getProviderHelper.ts#L49 and add featherless-ai (let's ensure we respect the alphabetical order):

import * as FeatherlessAI from "../providers/featherless-ai";
...
export const PROVIDERS: Record<InferenceProvider, Partial<Record<InferenceTask, TaskProviderHelper>>> = {
	...
	"featherless-ai": {
		"conversational": new FeatherlessAI.FeatherlessAIConversationalTask(),
		"text-generation": new FeatherlessAI.FeatherlessAITextGenerationTask(),
	},
	...
};

2 - Update packages/inference/src/providers/featherless-ai.ts to implement the two classes FeatherlessAIConversationalTask and FeatherlessAITextGenerationTask that inherit from BaseConversationalTask and BaseTextGenerationTask respectively:

import { BaseConversationalTask, BaseTextGenerationTask } from "./providerHelper";

const FEATHERLESS_API_BASE_URL = "https://api.featherless.ai";

export class FeatherlessAIConversationalTask extends BaseConversationalTask {
	constructor() {
		super("featherless-ai", FEATHERLESS_API_BASE_URL);
	}
}

export class FeatherlessAITextGenerationTask extends BaseTextGenerationTask {
	constructor() {
		super("featherless-ai", FEATHERLESS_API_BASE_URL);
	}
}

And that's it :) Let us know if you need any help! You can find more details in the documentation here: https://huggingface.co/docs/inference-providers/register-as-a-provider#2-js-client-integration
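
As a quick end-to-end check once both tasks are registered, something along these lines should work (a sketch assuming the textGeneration helper accepts a provider option; the model ID is a placeholder):

import { textGeneration } from "@huggingface/inference";

// Illustrative text-generation call routed through the new provider mapping.
const { generated_text } = await textGeneration({
	accessToken: process.env.HF_TOKEN,
	provider: "featherless-ai",
	model: "meta-llama/Meta-Llama-3.1-8B",
	inputs: "Once upon a time",
	parameters: { max_new_tokens: 32 },
});

console.log(generated_text);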

@julien-c (Member) commented on Apr 8, 2025

(Sorry for the moving parts, @wxgeorge – we can help move this PR over the finish line if needed.)
