Add a configuration option to make callback logging synchronous #8202

Open · wants to merge 3 commits into base: main

Conversation

B-Step62 (Contributor) commented Feb 3, 2025

Title

LiteLLM executes success handlers in a background thread. This is generally preferred because it avoids overhead in the main application; however, we sometimes want to invoke callbacks synchronously.

One example is debugging with inline trace rendering. MLflow supports rendering the trace object directly in a Jupyter notebook (ref). In that case, the trace must be completed by the end of the cell, so a non-blocking callback does not work well.

This PR adds a configuration flag, litellm.sync_logging, that makes callback execution blocking for such debugging purposes. The default behavior remains the same (non-blocking), so there is no downside. Other tools such as LangChain and LlamaIndex let developers control this as well.
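A minimal usage sketch of the flag proposed here (the model name and callback wiring are illustrative, not part of this PR):

```python
import litellm

# Flag proposed in this PR: when True, success/failure callbacks run
# inline instead of in a background thread. The default stays False,
# preserving the current non-blocking behavior.
litellm.sync_logging = True

# Illustrative call: with the flag enabled, registered callbacks
# (e.g. an MLflow tracing callback) finish before completion() returns,
# so an inline-rendered trace is complete by the end of a notebook cell.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
```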

Type

🆕 New Feature

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

Sync call: screenshot of the test passing locally (attached)

Sync streaming: screenshot of the test passing locally (attached)

Async call: screenshot of the test passing locally (attached)

Async streaming: screenshot of the test passing locally (attached)


target=self.run_success_logging_and_cache_storage,
args=(response, cache_hit),
).start() # log response
if litellm.sync_logging:
krrishdholakia (Contributor) commented:
Having so many if/else blocks around logging in the codebase can lead to bugs.

Can we use a more general pattern / function here that ensures consistent behaviour? @B-Step62 @ishaan-jaff

B-Step62 (Contributor, Author) replied:

@krrishdholakia Sure, I can pull this combination into a shared utility function. Is that what you suggested?

B-Step62 (Contributor, Author) replied:

@krrishdholakia I've updated the PR to encapsulate the conditional logic into a single place. Would you mind taking another look? Thank you in advance.
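For context, a rough sketch of what such a consolidation could look like (the helper name and signature are hypothetical, not the code actually added in this PR):

```python
import threading

import litellm


def run_logging_callback(target, *args):
    # Hypothetical helper: execute the logging handler inline when
    # litellm.sync_logging is enabled; otherwise keep the existing
    # behavior of dispatching it to a background thread.
    if getattr(litellm, "sync_logging", False):
        target(*args)
    else:
        threading.Thread(target=target, args=args, daemon=True).start()
```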

complete_streaming_response, None, None, cache_hit
)
else:
executor.submit(
B-Step62 (Contributor, Author) commented on the diff:

Note: I don't have clear context on why the ThreadPoolExecutor is used only in some places, so I did not try to remove this if/else and kept the behavior the same.
