---
title: Vercel AI
description: "Adds instrumentation for the Vercel AI SDK."
supported:
  - javascript.node
  - javascript.aws-lambda
  - javascript.azure-functions
  - javascript.connect
  - javascript.express
  - javascript.fastify
  - javascript.gcp-functions
  - javascript.hapi
  - javascript.koa
  - javascript.nestjs
  - javascript.electron
  - javascript.nextjs
  - javascript.nuxt
  - javascript.sveltekit
  - javascript.remix
  - javascript.astro
  - javascript.bun
---
<Alert level="info">

This integration only works in the Node.js and Bun runtimes. It requires SDK version `8.43.0` or higher.

</Alert>

_Import name: `Sentry.vercelAIIntegration`_

The `vercelAIIntegration` adds instrumentation for Vercel's [`ai`](https://www.npmjs.com/package/ai) library, capturing spans via the [AI SDK's built-in telemetry](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry).

```javascript
Sentry.init({
  integrations: [Sentry.vercelAIIntegration()],
});
```
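
Since this integration emits spans, it only produces data when tracing is enabled in your `Sentry.init` call. A minimal sketch (the DSN value is a placeholder):

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "__YOUR_DSN__", // placeholder — use your project's DSN
  // Spans are only sent when tracing is enabled; 1.0 captures
  // every transaction, which you will likely lower in production.
  tracesSampleRate: 1.0,
  integrations: [Sentry.vercelAIIntegration()],
});
```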

To enhance the spans collected by this integration, we recommend providing a `functionId` to identify the function that the telemetry data is for. For more details, see the [AI SDK Telemetry Metadata docs](https://sdk.vercel.ai/docs/ai-sdk-core/telemetry#telemetry-metadata).

```javascript
const result = await generateText({
  model: openai("gpt-4-turbo"),
  experimental_telemetry: { functionId: "my-awesome-function" },
});
```
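
The AI SDK also accepts an arbitrary `metadata` object alongside `functionId`, which is attached to the emitted telemetry. A sketch — the key names and values below are illustrative, not part of any API:

```javascript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai("gpt-4-turbo"),
  experimental_telemetry: {
    functionId: "my-awesome-function",
    // Hypothetical metadata keys for illustration; use whatever
    // context is useful for filtering your spans.
    metadata: { userId: "user-123", route: "/chat" },
  },
});
```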

## Configuration

By default, this integration adds tracing support to all `ai` function call sites. If you need to disable span collection for a specific call, set `experimental_telemetry.isEnabled` to `false` in the first argument of the function call.

```javascript
const result = await generateText({
  model: openai("gpt-4-turbo"),
  experimental_telemetry: { isEnabled: false },
});
```

If you want to collect inputs and outputs for a specific call, you must explicitly opt in for each function call by setting `experimental_telemetry.recordInputs` and `experimental_telemetry.recordOutputs` to `true`.

```javascript
const result = await generateText({
  model: openai("gpt-4-turbo"),
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: true,
    recordOutputs: true,
  },
});
```

## Supported Versions

- `ai`: `>=3.0.0 <5`