
Commit 07baf4c

Authored by github-actions[bot], lievan, and Kyle-Verhoog
fix(llmobs): fix token extraction for chat completion streams [backport 2.20] (#12091)
Backport 75179ef from #12070 to 2.20.

Fixes token chunk extraction to account for the `choices` field in a chunk being an empty list.

#### Before

```
Error generating LLMObs span event for span <Span(id=16151817411339450163,trace_id=137677390470467884790869841527646927357,parent_id=None,name=openai.request)>, likely due to malformed span
Traceback (most recent call last):
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 118, in __aiter__
    await self._extract_token_chunk(chunk)
  File "/XXXXX/ddtrace/contrib/internal/openai/utils.py", line 157, in _extract_token_chunk
    choice = getattr(chunk, "choices", [None])[0]
IndexError: list index out of range
```

#### After

Traced successfully

<img width="904" alt="image" src="https://github.com/user-attachments/assets/43c68edd-03f7-4105-a3d3-213eeb5fb0ab" />

Co-authored-by: lievan <[email protected]>
Co-authored-by: kyle <[email protected]>
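For context on why `choices` can be empty: when a chat completion is streamed with `stream_options={"include_usage": True}`, the API emits one final chunk that carries the token usage stats and an empty `choices` list. The sketch below simulates that chunk sequence with stand-in `SimpleNamespace` objects (not the real openai client types) to show the shape the extraction code has to handle:

```python
# Simulated stream: content chunks first, then a usage-only final chunk
# whose `choices` list is EMPTY -- the case the old code did not handle.
# These are illustrative stand-ins, not real openai ChatCompletionChunk objects.
from types import SimpleNamespace

simulated_stream = [
    SimpleNamespace(choices=[SimpleNamespace(delta="Hel", finish_reason=None)], usage=None),
    SimpleNamespace(choices=[SimpleNamespace(delta="lo", finish_reason="stop")], usage=None),
    # Final chunk: usage populated, choices == []
    SimpleNamespace(choices=[], usage=SimpleNamespace(prompt_tokens=5, completion_tokens=2)),
]

for chunk in simulated_stream:
    if not chunk.choices:
        # Guarding on emptiness lets us read usage without indexing choices[0]
        print("usage chunk:", chunk.usage)
        continue
    print("content chunk, finish_reason =", chunk.choices[0].finish_reason)
```

Indexing `choices[0]` on that last chunk is exactly what raised the `IndexError` in the traceback below.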
1 parent 3ff00cc commit 07baf4c

File tree

2 files changed: +12 −2 lines changed


ddtrace/contrib/internal/openai/utils.py

8 additions & 2 deletions

```diff
@@ -89,7 +89,10 @@ def _extract_token_chunk(self, chunk):
         """Attempt to extract the token chunk (last chunk in the stream) from the streamed response."""
         if not self._dd_span._get_ctx_item("_dd.auto_extract_token_chunk"):
             return
-        choice = getattr(chunk, "choices", [None])[0]
+        choices = getattr(chunk, "choices")
+        if not choices:
+            return
+        choice = choices[0]
         if not getattr(choice, "finish_reason", None):
             # Only the second-last chunk in the stream with token usage enabled will have finish_reason set
             return
@@ -152,7 +155,10 @@ async def _extract_token_chunk(self, chunk):
         """Attempt to extract the token chunk (last chunk in the stream) from the streamed response."""
         if not self._dd_span._get_ctx_item("_dd.auto_extract_token_chunk"):
             return
-        choice = getattr(chunk, "choices", [None])[0]
+        choices = getattr(chunk, "choices")
+        if not choices:
+            return
+        choice = choices[0]
         if not getattr(choice, "finish_reason", None):
             return
         try:
```
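The guard added in the diff above can be illustrated with a standalone sketch. The functions and objects below are illustrative stand-ins (not ddtrace's actual code): they show why the old `[None]` default did not help when `choices` exists but is empty, and how the emptiness check fixes it.

```python
# Minimal sketch of the bug and the fix. `SimpleNamespace` stands in for
# a streamed chat completion chunk; these are not ddtrace's real helpers.
from types import SimpleNamespace

def extract_choice_before(chunk):
    # Old logic: the [None] default only covers a MISSING `choices`
    # attribute. If choices exists but is [], indexing [0] raises.
    return getattr(chunk, "choices", [None])[0]

def extract_choice_after(chunk):
    # New logic: bail out early when `choices` is empty (or missing).
    choices = getattr(chunk, "choices", None)
    if not choices:
        return None
    return choices[0]

usage_only_chunk = SimpleNamespace(choices=[])
try:
    extract_choice_before(usage_only_chunk)
except IndexError:
    print("before: IndexError")  # the failure from the traceback
print("after:", extract_choice_after(usage_only_chunk))  # after: None
```

Note the patched ddtrace code uses `getattr(chunk, "choices")` without a default, so a chunk with no `choices` attribute at all would still raise `AttributeError`; the sketch passes `None` as a default to stay total.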
4 additions & 0 deletions

```diff
@@ -0,0 +1,4 @@
+---
+fixes:
+  - |
+    LLM Observability: This fix resolves an issue where extracting token metadata from openai streamed chat completion token chunks caused an IndexError.
```
