content/en/llm_observability/quickstart.md
71 additions & 1 deletion
@@ -21,7 +21,7 @@ This guide uses the LLM Observability SDKs for [Python][1] and [Node.js][2]. If
To better understand LLM Observability terms and concepts, you can explore the examples in the [LLM Observability Jupyter Notebooks repository][12]. These notebooks provide a hands-on experience and allow you to apply these concepts in real time.

-## Command line
+## Trace an LLM application

To generate an LLM Observability trace, you can run a Python or Node.js script.
@@ -129,6 +129,75 @@ The trace you see is composed of a single LLM span. The `ddtrace-run` or `NODE_O
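The `ddtrace-run` and `NODE_OPTIONS` launch commands referenced above take roughly the following shape; this is a sketch, assuming hypothetical entry points `app.py` and `app.js`, with placeholder values for the API key and site:

```shell
# Python: ddtrace-run auto-instruments the script (placeholders throughout).
DD_SITE=<YOUR_DD_SITE> DD_API_KEY=<YOUR_API_KEY> \
DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=quickstart-app DD_LLMOBS_AGENTLESS_ENABLED=1 \
ddtrace-run python app.py

# Node.js: dd-trace is loaded via --require before the script runs.
DD_SITE=<YOUR_DD_SITE> DD_API_KEY=<YOUR_API_KEY> \
DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=quickstart-app DD_LLMOBS_AGENTLESS_ENABLED=1 \
NODE_OPTIONS="--require dd-trace/init" node app.js
```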
If your application consists of more elaborate prompting or complex chains or workflows involving LLMs, you can trace it using the [Setup documentation][11] and the [SDK documentation][1].
+## Trace an LLM application in AWS Lambda
+
+The following steps generate an LLM Observability trace in an AWS Lambda environment by creating an Amazon Bedrock-based chatbot that runs with LLM Observability in AWS Lambda.
+
+1. Create a [Lambda function chatbot using Amazon Bedrock][13].
+2. Instrument your Lambda function:
+   1. Open a CloudShell.
+   2. Install the Datadog CLI client:
+
+      ```shell
+      npm install -g @datadog/datadog-ci
+      ```
+   3. Set the Datadog API key and site:
+
+      ```shell
+      export DD_SITE=<YOUR_DD_SITE>
+      export DD_API_KEY=<YOUR_DATADOG_API_KEY>
+      ```
+
+      If you already have, or prefer to use, a secret in Secrets Manager, you can set the API key by using the secret ARN:
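A minimal sketch of that approach, assuming the `DD_API_KEY_SECRET_ARN` environment variable supported by Datadog's serverless tooling; the ARN value is a placeholder:

```shell
# Placeholder ARN; replace with the ARN of your Secrets Manager secret.
export DD_API_KEY_SECRET_ARN=<YOUR_SECRET_ARN>
```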
+   1. In the Datadog UI, navigate to `Infrastructure > Serverless`.
+   2. Search for the name of your function.
+   3. Click it to open the details panel.
+   4. Under the `Configuration` tab are the details of the Lambda function, attached layers, and a list of `DD_` Datadog-related environment variables under the `Datadog Environment Variables` section.
+4. Invoke your Lambda function and verify that LLM Observability traces are visible in the Datadog UI.
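One way to invoke the function from CloudShell is the AWS CLI; this is a sketch, with a placeholder function name and a payload shape that depends on your chatbot:

```shell
# Invoke the chatbot and print its response (placeholders throughout).
aws lambda invoke \
  --function-name <FUNCTION_NAME> \
  --payload '{"prompt": "Hello!"}' \
  --cli-binary-format raw-in-base64-out \
  response.json
cat response.json
```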
+### Force flushing traces
+
+For serverless environments other than AWS Lambda, or if you have trouble seeing traces from AWS Lambda, use the `flush` method to ensure traces are flushed before the process exits.
+
+{{< tabs >}}
+{{% tab "Python" %}}
+
+```python
+from ddtrace.llmobs import LLMObs
+
+def handler():
+    # function body
+    LLMObs.flush()
+```
+
+{{% /tab %}}
+{{% tab "Node.js" %}}
+
+```javascript
+import tracer from 'dd-trace';
+
+const llmobs = tracer.llmobs;
+
+export const handler = async (event) => {
+  // your function body
+  llmobs.flush();
+};
+```
+
+{{% /tab %}}
+{{< /tabs >}}
## Further Reading

{{< partial name="whats-next/whats-next.html">}}
@@ -144,3 +213,4 @@ If your application consists of more elaborate prompting or complex chains or wo
content/en/llm_observability/setup/sdk/python.md
5 additions & 10 deletions
@@ -107,13 +107,7 @@ LLMObs.enable(
### AWS Lambda setup

-Enable LLM Observability by specifying the required environment variables in your [command line setup](#command-line-setup) and following the setup instructions for the [Datadog-Python and Datadog-Extension][14] AWS Lambda layers. Additionally, set `DD_TRACE_ENABLED` to `true` in your Lambda function's environment variables.
-
-If you are only expecting traces from LLM Observability, set `DD_LLMOBS_AGENTLESS_ENABLED` to `true` in your Lambda function's environment variables.
-
-If you are expecting APM traces from your Lambda function in addition to LLM Observability, leave `DD_EXTENSION_VERSION` unset in your Lambda function's environment variables if you are using `v66` or earlier of the Datadog-Extension layer. Otherwise, set `DD_EXTENSION_VERSION` to `compatibility` if you are using `v67` or later.
-
-**Note**: Using the `Datadog-Python` and `Datadog-Extension` layers automatically turns on all LLM Observability integrations and force flushes spans at the end of the Lambda function.
+See the [AWS Lambda Quickstart Guide][15] to quickly integrate LLM Observability into your Lambda functions.
#### Application naming guidelines
@@ -480,12 +474,12 @@ The SDK's `LLMObs.annotation_context()` method returns a context manager that can
The `LLMObs.annotation_context()` method accepts the following arguments:

-`name`
+`name`
: optional - _str_
<br />Name that overrides the span name for any auto-instrumented spans that are started within the annotation context.

-`prompt`
-: optional - _dictionary_
+`prompt`
+: optional - _dictionary_
<br />A dictionary that represents the prompt used for an LLM call in the following format:<br />`{"template": "...", "id": "...", "version": "...", "variables": {"variable_1": "...", ...}}`.<br />You can also import the `Prompt` object from `ddtrace.utils` and pass it in as the `prompt` argument. **Note**: This argument only applies to LLM spans.