
Commit f4da261

feat(ai-monitoring): Langchain integration docs (#9799)
* Langchain integration docs
* Add to python integrations
* Review comments
* Update docs/platforms/python/integrations/langchain/index.mdx

Co-authored-by: vivianyentran <20403606+vivianyentran@users.noreply.github.com>
1 parent 1d3203a commit f4da261

2 files changed, +96 -3 lines changed


docs/platforms/python/integrations/index.mdx

Lines changed: 4 additions & 3 deletions
@@ -35,9 +35,10 @@ The Sentry SDK uses integrations to hook into the functionality of popular libra

 ## AI

-| | **Auto enabled** |
-| ------------------------------------------------------------------------------------------------------------------ | :--------------: |
-| <LinkWithPlatformIcon platform="openai" label="OpenAI" url="/platforms/python/integrations/openai" /> ||
+| | **Auto enabled** |
+|-----------------------------------------------------------------------------------------------------------------------|:----------------:|
+| <LinkWithPlatformIcon platform="openai" label="OpenAI" url="/platforms/python/integrations/openai" /> ||
+| <LinkWithPlatformIcon platform="langchain" label="Langchain" url="/platforms/python/integrations/langchain" /> ||

 ## Data Processing

docs/platforms/python/integrations/langchain/index.mdx

Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
---
title: Langchain
description: "Learn about using Sentry for Langchain."
---

This integration connects Sentry with [Langchain](https://github.com/langchain-ai/langchain).
The integration has been confirmed to work with Langchain 0.1.11.

## Install

Install `sentry-sdk` from PyPI and the appropriate langchain packages:

```bash
pip install --upgrade 'sentry-sdk' 'langchain-openai' 'langchain-core'
```

## Configure

If you have the `langchain` package in your dependencies, the Langchain integration will be enabled automatically when you initialize the Sentry SDK.

An additional dependency, `tiktoken`, must be installed if you want to calculate token usage for streaming chat responses.
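
If you plan to stream chat responses and want token counts recorded, a minimal sketch of adding that dependency (assuming the same environment as the install step above) would be:

```bash
# tiktoken is only needed for token counting on streaming chat responses
pip install --upgrade 'tiktoken'
```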

<SignInNote />

```python
from langchain_openai import ChatOpenAI
import sentry_sdk

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    enable_tracing=True,
    traces_sample_rate=1.0,
    send_default_pii=True,  # send personally identifiable information like LLM responses to Sentry
)

llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0)
```

## Verify

Verify that the integration works by inducing an error:

```python
from langchain_openai import ChatOpenAI
import sentry_sdk

sentry_sdk.init(...)  # same as above

llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0, api_key="bad API key")
with sentry_sdk.start_transaction(op="ai-inference", name="The result of the AI inference"):
    response = llm.invoke([("system", "What is the capital of paris?")])
    print(response)
```

After running this script, a transaction will be created in the Performance section of [sentry.io](https://sentry.io). Additionally, an error event (about the bad API key) will be sent to [sentry.io](https://sentry.io) and will be connected to the transaction.

It may take a couple of moments for the data to appear in [sentry.io](https://sentry.io).

## Behavior

- The Langchain integration will connect Sentry with Langchain and automatically monitor all LLM, tool, and function calls (see the sketch after this list).

- All exceptions raised during the execution of a chain are reported.

- Sentry considers LLM and tokenizer inputs/outputs as PII and, by default, does not include PII data. If you want to include the data, set `send_default_pii=True` in the `sentry_sdk.init()` call. To explicitly exclude prompts and outputs despite `send_default_pii=True`, configure the integration with `include_prompts=False` as shown in the [Options section](#options) below.

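To illustrate the first point, here is a minimal sketch of a small chain run inside a transaction; the prompt text, question, and transaction name are illustrative, and the `sentry_sdk.init()` options are assumed to be the same as in the Configure section above. The LLM calls made while the chain runs are monitored automatically and attached to the surrounding transaction.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
import sentry_sdk

sentry_sdk.init(...)  # same options as in the Configure section

# A small chain: prompt formatting -> chat model -> output parsing.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant."),
    ("human", "{question}"),
])
llm = ChatOpenAI(model="gpt-3.5-turbo-0125", temperature=0)
chain = prompt | llm | StrOutputParser()

# Running the chain inside a transaction groups the monitored calls under it.
with sentry_sdk.start_transaction(op="ai-inference", name="Answer a question"):
    print(chain.invoke({"question": "What is the capital of France?"}))
```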
## Options

By adding `LangchainIntegration` explicitly to your `sentry_sdk.init()` call, you can set options to change its behavior:

```python
import sentry_sdk
from sentry_sdk.integrations.langchain import LangchainIntegration

sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    enable_tracing=True,
    send_default_pii=True,
    traces_sample_rate=1.0,
    integrations=[
        LangchainIntegration(
            include_prompts=False,  # LLM/tokenizer inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
```

## Supported Versions

- Langchain: 0.1.11+
- tiktoken: 0.6.0+
- Python: 3.9+
