Commit e2ebfec: feat(agent-insights): product docs (#14148)

1 parent: fd7118a
File tree: 17 files changed, +349 -29 lines

docs/platforms/python/integrations/anthropic/index.mdx

Lines changed: 4 additions & 5 deletions

````diff
@@ -1,15 +1,14 @@
 ---
 title: Anthropic
 description: "Learn about using Sentry for Anthropic."
+sidebar_hidden: true
 ---

-<Alert title="Beta">
+<Alert level="info" title="Product compatibility">

-The support for **Anthropic** is in its beta phase.
+This integration is designed for legacy [LLM Monitoring](/product/insights/ai/llm-monitoring/) and is not currently compatible with [AI Agents Insights](/product/insights/agents/).

-We are working on supporting different AI libraries (see [GitHub discussion](https://github.com/getsentry/sentry-python/discussions/3007)).
-
-If you want to try the beta features and are willing to give feedback, please let us know on [Discord](https://discord.com/invite/Ww9hbqr).
+We are working on this, and it will soon be compatible with [AI Agents Insights](/product/insights/agents/).

 </Alert>

````

docs/platforms/python/integrations/cohere/index.mdx

Lines changed: 5 additions & 5 deletions

````diff
@@ -1,18 +1,18 @@
 ---
 title: Cohere
 description: "Learn about using Sentry for Cohere."
+sidebar_hidden: true
 ---

-<Alert title="Beta">
+<Alert level="info" title="Product compatibility">

-The support for **Cohere** is in its beta phase.
+This integration is designed for legacy [LLM Monitoring](/product/insights/ai/llm-monitoring/) and is not currently compatible with [AI Agents Insights](/product/insights/agents/).

-We are working on supporting different AI libraries (see [GitHub discussion](https://github.com/getsentry/sentry-python/discussions/3007)).
-
-If you want to try the beta features and are willing to give feedback, please let us know on [Discord](https://discord.com/invite/Ww9hbqr).
+We are working on this, and it will soon be compatible with [AI Agents Insights](/product/insights/agents/).

 </Alert>

+
 This integration connects Sentry with the [Cohere Python SDK](https://github.com/cohere-ai/cohere-python).
 The integration has been confirmed to work with Cohere 5.3.3.
````

docs/platforms/python/integrations/huggingface_hub/index.mdx

Lines changed: 5 additions & 5 deletions

````diff
@@ -1,15 +1,14 @@
 ---
 title: Huggingface Hub
 description: "Learn about using Sentry for Huggingface Hub."
+sidebar_hidden: true
 ---

-<Alert title="Beta">
+<Alert level="info" title="Product compatibility">

-The support for **Huggingface Hub** is in its beta phase.
+This integration is designed for legacy [LLM Monitoring](/product/insights/ai/llm-monitoring/) and is not currently compatible with [AI Agents Insights](/product/insights/agents/).

-We are working on supporting different AI libraries (see [GitHub discussion](https://github.com/getsentry/sentry-python/discussions/3007)).
-
-If you want to try the beta features and are willing to give feedback, please let us know on [Discord](https://discord.com/invite/Ww9hbqr).
+We are working on this, and it will soon be compatible with [AI Agents Insights](/product/insights/agents/).

 </Alert>

@@ -26,6 +25,7 @@ Install `sentry-sdk` from PyPI with the `huggingface_hub` extra:
 ```bash {tabTitle:pip}
 pip install "sentry-sdk[huggingface_hub]"
 ```
+
 ```bash {tabTitle:uv}
 uv add "sentry-sdk[huggingface_hub]"
 ```
````

docs/platforms/python/integrations/langchain/index.mdx

Lines changed: 5 additions & 5 deletions

````diff
@@ -1,18 +1,18 @@
 ---
 title: Langchain
 description: "Learn about using Sentry for Langchain."
+sidebar_hidden: true
 ---

-<Alert title="Beta">
+<Alert level="info" title="Product compatibility">

-The support for **LangChain** is in its beta phase.
+This integration is designed for legacy [LLM Monitoring](/product/insights/ai/llm-monitoring/) and is not currently compatible with [AI Agents Insights](/product/insights/agents/).

-We are working on supporting different AI libraries (see [GitHub discussion](https://github.com/getsentry/sentry-python/discussions/3007)).
-
-If you want to try the beta features and are willing to give feedback, please let us know on [Discord](https://discord.com/invite/Ww9hbqr).
+We are working on this, and it will soon be compatible with [AI Agents Insights](/product/insights/agents/).

 </Alert>

+
 This integration connects Sentry with [Langchain](https://github.com/langchain-ai/langchain).
 The integration has been confirmed to work with Langchain 0.1.11.
````

docs/platforms/python/integrations/openai/index.mdx

Lines changed: 5 additions & 7 deletions

````diff
@@ -1,15 +1,14 @@
 ---
 title: OpenAI
 description: "Learn about using Sentry for OpenAI."
+sidebar_hidden: true
 ---

-<Alert title="Beta">
+<Alert level="info" title="Product compatibility">

-The support for **OpenAI** is in its beta phase.
+This integration is designed for legacy [LLM Monitoring](/product/insights/ai/llm-monitoring/) and is not currently compatible with [AI Agents Insights](/product/insights/agents/).

-We are working on supporting different AI libraries (see [GitHub discussion](https://github.com/getsentry/sentry-python/discussions/3007)).
-
-If you want to try the beta features and are willing to give feedback, please let us know on [Discord](https://discord.com/invite/Ww9hbqr).
+We are working on this, and it will soon be compatible with [AI Agents Insights](/product/insights/agents/).

 </Alert>

@@ -27,6 +26,7 @@ Install `sentry-sdk` from PyPI with the `openai` extra:
 ```bash {tabTitle:pip}
 pip install "sentry-sdk[openai]"
 ```
+
 ```bash {tabTitle:uv}
 uv add "sentry-sdk[openai]"
 ```
@@ -43,7 +43,6 @@ An additional dependency, `tiktoken`, is required if you want to calculate token

 Verify that the integration works by creating an AI pipeline. The resulting data should show up in your LLM monitoring dashboard.

-
 ```python
 import sentry_sdk
 from sentry_sdk.ai.monitoring import ai_track
@@ -115,7 +114,6 @@ You can pass the following keyword arguments to `OpenAIIntegration()`:

 The default is `None`.

-
 ## Supported Versions

 - OpenAI: 1.0+
````

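The verification step above decorates a pipeline function with `ai_track` so that the LLM calls inside it are grouped under one named pipeline. As a rough, stdlib-only sketch of that decorator pattern (a hypothetical stand-in for illustration, not Sentry's implementation):

```python
import functools
import time

# Hypothetical stand-in for sentry_sdk.ai.monitoring.ai_track: it wraps a
# pipeline function and records a named, timed "span", roughly how the real
# decorator groups a pipeline's LLM calls in the monitoring dashboard.
SPANS = []

def ai_track(description):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                return fn(*args, **kwargs)
            finally:
                SPANS.append({
                    "description": description,
                    "duration_s": time.monotonic() - start,
                })
        return wrapper
    return decorator

@ai_track("My AI pipeline")
def my_pipeline(prompt):
    # In the real docs snippet, the OpenAI client call would go here.
    return f"echo: {prompt}"

result = my_pipeline("hello")
```

The real decorator attaches the pipeline name to the surrounding trace instead of appending to a module-level list.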
Lines changed: 67 additions & 0 deletions

````diff
@@ -0,0 +1,67 @@
+---
+title: AI Agents Dashboard
+sidebar_order: 10
+description: "Learn how to use Sentry's AI Agents Dashboard."
+---
+
+<Include name="feature-limited-on-team-retention.mdx" />
+
+Once you've [configured the Sentry SDK](/product/insights/agents/getting-started/) for your AI agent project, you'll start receiving data in the Sentry [AI Agents Insights](https://sentry.io/orgredirect/organizations/:orgslug/insights/agents/) dashboard.
+
+The main dashboard provides a comprehensive view of all your AI agent activities, performance metrics, and recent executions.
+
+![AI Agents Monitoring Overview](./img/overview.png)
+
+The dashboard displays key widgets like:
+
+- **Traffic**: Shows agent runs over time, error rates, and releases to track overall activity and health
+- **Duration**: Displays response times for your agent executions to monitor performance
+- **Recommended Issues**: Highlights recent errors and problems that need attention, including agent failures and exceptions
+- **LLM Generations**: Shows the number of language model calls with breakdowns by specific models (claude, 4o-mini, etc.)
+- **Tool Usage**: Shows which tools your agents use most frequently
+- **Token Usage**: Tracks token consumption over time with breakdown by model
+
+Underneath these widgets are tables that allow you to view data in more detail:
+
+- **Traces**: Recent agent runs with duration, errors, number of LLM and tool calls and token usage
+- **Models**: Traffic, duration, token usage and errors grouped by model
+- **Tools**: Number of requests and their usual durations grouped by tool
+
+![AI Agent Trace Table](./img/trace-table.png)
+
+Click on any trace to open the abbreviated trace view in a drawer.
+
+## Abbreviated Trace View
+
+Opens as a drawer when clicking any trace, showing essential details:
+
+![AI Agent Abbreviated Trace View](./img/abbreviated-trace-view.png)
+
+- **Agent Invocations**: Each agent execution and nested calls
+- **LLM Generations**: Language model interactions with token breakdown
+- **Tool Calls**: External API calls with inputs and outputs
+- **Handoffs**: Agent-to-agent transitions and human handoffs
+- **Critical Timing**: Duration metrics for each step
+- **Errors**: Any failures that occurred
+
+Click **"View in full trace"** for comprehensive debugging details.
+
+## Detailed Trace View
+
+Shows complete agent workflow with full context:
+
+![AI Agent Detailed Trace View](./img/trace-view.png)
+
+This detailed view reveals:
+
+- **Complete Agent Flow**: Every step from initial request to final response
+- **Tool Calls**: When and how the agent used external tools or APIs
+- **Model Interactions**: All LLM calls with prompts and responses (if PII is enabled)
+- **Timing Breakdown**: Duration of each step in the agent workflow
+- **Error Context**: Detailed information about any failures or issues
+
+When your AI agents are part of larger applications (like web servers or APIs), the trace view will include context from other Sentry integrations, giving you a complete picture of how your agents fit into your overall application architecture.
+
+<Alert title="Agent Pipelines vs Direct LLM Calls">
+
+</Alert>
````

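The Models table described in the new dashboard page groups traffic, token usage, and errors by model. A toy illustration of that kind of per-model rollup over span records (the record shape here is an assumption for the example, not Sentry's internal format):

```python
from collections import defaultdict

# Hypothetical span records, one per LLM generation, with the fields the
# Models table summarizes: model name, token count, and error flag.
spans = [
    {"model": "gpt-4o-mini", "tokens": 120, "error": False},
    {"model": "gpt-4o-mini", "tokens": 80, "error": True},
    {"model": "claude", "tokens": 200, "error": False},
]

# Group by model, summing runs, tokens, and errors.
rollup = defaultdict(lambda: {"runs": 0, "tokens": 0, "errors": 0})
for span in spans:
    row = rollup[span["model"]]
    row["runs"] += 1
    row["tokens"] += span["tokens"]
    row["errors"] += int(span["error"])
```

Each `rollup` entry then corresponds to one row of the table: traffic (`runs`), token usage, and error counts per model.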
Lines changed: 187 additions & 0 deletions

````diff
@@ -0,0 +1,187 @@
+---
+title: Set Up
+sidebar_order: 0
+description: "Learn how to set up Sentry AI Agent Monitoring"
+---
+
+Sentry AI Agent Monitoring helps you track and debug AI agent applications using our supported SDKs and integrations. Monitor your complete agent workflows from user interaction to final response, including tool calls, model interactions, and custom logic.
+
+To start sending AI agent data to Sentry, make sure you've created a Sentry project for your AI-enabled repository, then follow one of the guides below:
+
+## Supported SDKs
+
+### JavaScript - Vercel AI SDK
+
+The Sentry JavaScript SDK supports AI agent monitoring through the Vercel AI integration, which works with the Node.js and Bun runtimes. This integration automatically captures spans for your AI agent workflows using the AI SDK's built-in telemetry.
+
+#### Supported Platforms
+
+- <LinkWithPlatformIcon
+    platform="javascript.node"
+    label="Node.js"
+    url="/platforms/javascript/guides/node/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.nextjs"
+    label="Next.js"
+    url="/platforms/javascript/guides/nextjs/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.sveltekit"
+    label="SvelteKit"
+    url="/platforms/javascript/guides/sveltekit/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.nuxt"
+    label="Nuxt"
+    url="/platforms/javascript/guides/nuxt/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.astro"
+    label="Astro"
+    url="/platforms/javascript/guides/astro/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.remix"
+    label="Remix"
+    url="/platforms/javascript/guides/remix/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.solidstart"
+    label="SolidStart"
+    url="/platforms/javascript/guides/solidstart/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.express"
+    label="Express"
+    url="/platforms/javascript/guides/express/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.fastify"
+    label="Fastify"
+    url="/platforms/javascript/guides/fastify/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.nestjs"
+    label="Nest.js"
+    url="/platforms/javascript/guides/nestjs/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.hapi"
+    label="Hapi"
+    url="/platforms/javascript/guides/hapi/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.koa"
+    label="Koa"
+    url="/platforms/javascript/guides/koa/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.connect"
+    label="Connect"
+    url="/platforms/javascript/guides/connect/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.hono"
+    label="Hono"
+    url="/platforms/javascript/guides/hono/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.bun"
+    label="Bun"
+    url="/platforms/javascript/guides/bun/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.aws-lambda"
+    label="AWS Lambda"
+    url="/platforms/javascript/guides/aws-lambda/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.azure-functions"
+    label="Azure Functions"
+    url="/platforms/javascript/guides/azure-functions/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.gcp-functions"
+    label="Google Cloud Functions"
+    url="/platforms/javascript/guides/gcp-functions/configuration/integrations/vercelai/"
+  />
+- <LinkWithPlatformIcon
+    platform="javascript.electron"
+    label="Electron"
+    url="/platforms/javascript/guides/electron/configuration/integrations/vercelai/"
+  />
+
+#### Quick Start with Vercel AI SDK
+
+```javascript
+import * as Sentry from "@sentry/node";
+import { generateText } from "ai";
+import { openai } from "@ai-sdk/openai";
+
+Sentry.init({
+  tracesSampleRate: 1.0,
+  integrations: [
+    Sentry.vercelAIIntegration({
+      recordInputs: true,
+      recordOutputs: true,
+    }),
+  ],
+});
+
+// Your AI agent function
+async function aiAgent(userQuery) {
+  const result = await generateText({
+    model: openai("gpt-4o"),
+    prompt: userQuery,
+    experimental_telemetry: {
+      isEnabled: true,
+      functionId: "ai-agent-main",
+    },
+  });
+
+  return result.text;
+}
+```
+
+### Python - OpenAI Agents
+
+The Sentry Python SDK supports the OpenAI Agents SDK.
+
+#### Quick Start with OpenAI Agents
+
+```python
+import sentry_sdk
+from sentry_sdk.integrations.openai_agents import OpenAIAgentsIntegration
+import agents
+
+sentry_sdk.init(
+    dsn="YOUR_DSN",
+    traces_sample_rate=1.0,
+    send_default_pii=True,  # Include LLM inputs/outputs
+    integrations=[
+        OpenAIAgentsIntegration(),
+    ],
+)
+
+# Create your AI agent
+my_agent = agents.Agent(
+    name="My Agent",
+    instructions="You are a helpful assistant.",
+    model="gpt-4o-mini",
+)
+
+# Your AI agent function (run from an async context)
+async def run_agent(user_query):
+    result = await agents.Runner.run(
+        my_agent,
+        input=user_query,
+    )
+    return result
+```
+
+<Alert title="Don't see your platform?">
+
+We'll be adding AI agent integrations continuously. You can also instrument AI agents manually by following our [manual instrumentation guide](/platforms/python/tracing/instrumentation/custom-instrumentation/ai-agents-module).
+
+</Alert>
````

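The closing alert in the new Set Up page points to manual instrumentation for platforms without a built-in integration. A minimal stdlib sketch of recording nested agent spans by hand (the `gen_ai.*` op names follow the AI agents span conventions; the recorder itself is a hypothetical stand-in for the SDK's span API):

```python
import contextlib
import time

# Collected span records; the real SDK would attach these to a trace.
spans = []

@contextlib.contextmanager
def start_span(op, name):
    """Record a timed span; inner spans finish (and are appended) first."""
    start = time.monotonic()
    try:
        yield
    finally:
        spans.append({
            "op": op,
            "name": name,
            "duration_s": time.monotonic() - start,
        })

# An agent invocation wrapping one LLM generation and one tool call.
with start_span("gen_ai.invoke_agent", "My Agent"):
    with start_span("gen_ai.chat", "gpt-4o-mini"):
        pass  # the LLM call would happen here
    with start_span("gen_ai.execute_tool", "get_weather"):
        pass  # the tool call would happen here
```

Nesting the context managers is what produces the agent-invocation hierarchy the dashboard's trace views display.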
The remaining changed files in this commit are binary image assets (not rendered in the diff).