[MLOB-2380] Add LLM Observability Serverless Quickstart Guide (#28118) · DataDog/documentation@885ad08 · GitHub

Commit 885ad08

Authored by sabrenner, cswatt, and urseberry

[MLOB-2380] Add LLM Observability Serverless Quickstart Guide (#28118)

* add quickstart from reference
* wording fixes
* nodejs tab
* fixes
* enhance quickstart guide steps
* remove images in favor of direct steps
* remove compatibility notes section
* link to quickstart guide
* add flush mention
* Update content/en/llm_observability/quickstart.md
  Co-authored-by: cecilia saixue watt <cecilia.watt@datadoghq.com>
* Update content/en/llm_observability/quickstart.md
  Co-authored-by: cecilia saixue watt <cecilia.watt@datadoghq.com>
* Update content/en/llm_observability/quickstart.md
  Co-authored-by: cecilia saixue watt <cecilia.watt@datadoghq.com>
* add to guides section
* fix wording for new approach
* try formatting fix
* Update content/en/llm_observability/setup/sdk/python.md
* Update content/en/llm_observability/setup/sdk/nodejs.md
* Update content/en/llm_observability/quickstart.md
  Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com>
* try and add some newlines for tab formatting
* try different newlining
* newlines
* get rid of newlines entirely

Co-authored-by: cecilia saixue watt <cecilia.watt@datadoghq.com>
Co-authored-by: Ursula Chen <58821586+urseberry@users.noreply.github.com>

1 parent 8467506 commit 885ad08
File tree

4 files changed: +80, -13 lines changed


content/en/llm_observability/guide/_index.md

Lines changed: 2 additions & 1 deletion

@@ -14,7 +14,8 @@ cascade:
 {{< /site-region >}}
 
 {{< whatsnext desc="LLM Observability Guides:" >}}
-    {{< nextlink href="/llm_observability/quickstart" >}}Trace an LLM Application{{< /nextlink >}}
+    {{< nextlink href="/llm_observability/quickstart#trace-an-llm-application" >}}Trace an LLM Application{{< /nextlink >}}
+    {{< nextlink href="/llm_observability/quickstart#trace-an-llm-application-in-aws-lambda" >}}Trace an LLM Application in AWS Lambda{{< /nextlink >}}
     {{< nextlink href="/llm_observability/submit_evaluations" >}}Submit Evaluations{{< /nextlink >}}
     {{< nextlink href="/llm_observability/submit_nemo_evaluations" >}}Submit NVIDIA NeMo Custom Evaluations{{< /nextlink >}}
     {{< nextlink href="/llm_observability/guide/ragas_quickstart" >}}Ragas Quickstart{{< /nextlink >}}

content/en/llm_observability/quickstart.md

Lines changed: 71 additions & 1 deletion

@@ -21,7 +21,7 @@ This guide uses the LLM Observability SDKs for [Python][1] and [Node.js][2]. If
 
 To better understand LLM Observability terms and concepts, you can explore the examples in the [LLM Observability Jupyter Notebooks repository][12]. These notebooks provide a hands-on experience, and allow you to apply these concepts in real time.
 
-## Command line
+## Trace an LLM application
 
 To generate an LLM Observability trace, you can run a Python or Node.js script.

@@ -129,6 +129,75 @@ The trace you see is composed of a single LLM span. The `ddtrace-run` or `NODE_O
 
 If your application consists of more elaborate prompting or complex chains or workflows involving LLMs, you can trace it using the [Setup documentation][11] and the [SDK documentation][1].
 
+## Trace an LLM application in AWS Lambda
+The following steps generate an LLM Observability trace in an AWS Lambda environment and create an Amazon Bedrock-based chatbot running with LLM Observability in AWS Lambda.
+
+1. Create a [Lambda function chatbot using Amazon Bedrock][13].
+2. Instrument your Lambda function:
+   1. Open an AWS CloudShell.
+   2. Install the Datadog CLI client:
+      ```shell
+      npm install -g @datadog/datadog-ci
+      ```
+   3. Set the Datadog API key and site:
+      ```shell
+      export DD_SITE=<YOUR_DD_SITE>
+      export DD_API_KEY=<YOUR_DATADOG_API_KEY>
+      ```
+      If you already have or prefer to use a secret in Secrets Manager, you can set the API key by using the secret ARN:
+      ```shell
+      export DATADOG_API_KEY_SECRET_ARN=<DATADOG_API_KEY_SECRET_ARN>
+      ```
+   4. Instrument your Lambda function with LLM Observability (this requires at least version 77 of the Datadog Extension layer):
+      {{< tabs >}}
+      {{% tab "Python" %}}
+      ```shell
+      datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="python" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
+      ```
+      {{% /tab %}}
+      {{% tab "Node.js" %}}
+      ```shell
+      datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="node" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
+      ```
+      {{% /tab %}}
+      {{< /tabs >}}
+3. Verify that your function was instrumented:
+   1. In the Datadog UI, navigate to `Infrastructure > Serverless`.
+   2. Search for the name of your function.
+   3. Click on it to open the details panel.
+   4. Under the `Configuration` tab are the details of the Lambda function, attached layers, and a list of `DD_` Datadog-related environment variables under the `Datadog Environment Variables` section.
+4. Invoke your Lambda function and verify that LLM Observability traces are visible in the Datadog UI.
+
+### Force flushing traces
+
+In serverless environments other than AWS Lambda, or if you do not see traces from your AWS Lambda functions, use the `flush` method to ensure traces are flushed before the process exits.
+
+{{< tabs >}}
+{{% tab "Python" %}}
+
+```python
+from ddtrace.llmobs import LLMObs
+
+def handler():
+    # function body
+    LLMObs.flush()
+```
+
+{{% /tab %}}
+{{% tab "Node.js" %}}
+
+```javascript
+import tracer from 'dd-trace';
+const llmobs = tracer.llmobs;
+
+export const handler = async (event) => {
+  // your function body
+  llmobs.flush();
+};
+```
+
+{{% /tab %}}
+{{< /tabs >}}
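The flush-before-exit pattern above can be sketched end to end. This is an editor's illustration, not part of the diff: `FakeLLMObs` is a hypothetical stand-in for `ddtrace.llmobs.LLMObs` so the control flow runs without the SDK or AWS, and the `try`/`finally` placement is one common way to ensure spans are flushed even when the handler raises.

```python
class FakeLLMObs:
    """Hypothetical stand-in for ddtrace.llmobs.LLMObs (illustration only)."""

    def __init__(self):
        self.flushed = False

    def flush(self):
        # In the real SDK, this sends any buffered spans to Datadog.
        self.flushed = True


llmobs = FakeLLMObs()


def handler(event, context=None):
    try:
        # ... call your LLM and build a response here ...
        return {"statusCode": 200, "body": "ok"}
    finally:
        # Flush in `finally` so spans are sent even if the body raises,
        # before the Lambda sandbox freezes the process.
        llmobs.flush()
```

Flushing inside `finally` (rather than as the last statement of the body) means an exception path still delivers the spans collected so far.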
+
 ## Further Reading
 
 {{< partial name="whats-next/whats-next.html" >}}

@@ -144,3 +213,4 @@ If your application consists of more elaborate prompting or complex chains or wo
 [10]: /llm_observability/setup/auto_instrumentation/
 [11]: /llm_observability/setup/
 [12]: https://github.com/DataDog/llm-observability
+[13]: https://repost.aws/articles/ARixmsXALpSWuxI02zHgv1YA/bedrock-unveiled-a-quick-lambda-example
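As an editor's sketch of what the linked Bedrock chatbot Lambda might send, here is the request body such a handler could build before calling Bedrock's `InvokeModel`. The model payload shape below follows Anthropic's Bedrock messages format and is an assumption; check the Bedrock documentation for the schema of your chosen model, and note that the actual AWS call is omitted so the sketch runs standalone.

```python
import json


def build_claude_body(prompt: str, max_tokens: int = 256) -> str:
    """Build a JSON request body for an Anthropic Claude model on Bedrock.

    Assumed schema (verify against the Bedrock docs for your model):
    anthropic_version, max_tokens, and a messages list of role/content pairs.
    """
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    # In a real handler this string would be passed to
    # bedrock_runtime.invoke_model(modelId=..., body=...).
    return json.dumps(body)
```
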

content/en/llm_observability/setup/sdk/nodejs.md

Lines changed: 2 additions & 1 deletion

@@ -102,7 +102,7 @@ These options can be set on the general tracer configuration:
 
 ### AWS Lambda setup
 
-Use the `llmobs.flush()` function to flush all remaining spans from the tracer to LLM Observability at the end of the Lambda function.
+See the [AWS Lambda Quickstart Guide][7] to quickly integrate LLM Observability into your Lambda functions.
 
 #### Application naming guidelines

@@ -709,3 +709,4 @@ tracer.use('http', false) // disable the http integration
 [4]: /tracing/trace_collection/compatibility/nodejs/#web-framework-compatibility
 [5]: /llm_observability/setup/auto_instrumentation/?tab=nodejs
 [6]: /tracing/trace_collection/custom_instrumentation/nodejs/dd-api/?tab=wrapper
+[7]: /llm_observability/quickstart?tab=nodejs#trace-an-llm-application-in-aws-lambda

content/en/llm_observability/setup/sdk/python.md

Lines changed: 5 additions & 10 deletions

@@ -107,13 +107,7 @@ LLMObs.enable(
 
 ### AWS Lambda setup
 
-Enable LLM Observability by specifying the required environment variables in your [command line setup](#command-line-setup) and following the setup instructions for the [Datadog-Python and Datadog-Extension][14] AWS Lambda layers. Additionally, set `DD_TRACE_ENABLED` to `true` in your Lambda function's environment variables.
-
-If you are only expecting traces from LLM Observability, set `DD_LLMOBS_AGENTLESS_ENABLED` to `true` in your Lambda function's environment variables.
-
-If you are expecting APM traces from your Lambda function in addition to LLM Observability, leave `DD_EXTENSION_VERSION` unset in your Lambda function's environment variables if you are using `v66` or earlier of the Datadog-Extension layer. Otherwise, set `DD_EXTENSION_VERSION` to `compatibility` if you are using `v67` or later.
-
-**Note**: Using the `Datadog-Python` and `Datadog-Extension` layers automatically turns on all LLM Observability integrations, and force flushes spans at the end of the Lambda function.
+See the [AWS Lambda Quickstart Guide][15] to quickly integrate LLM Observability into your Lambda functions.
 
 #### Application naming guidelines

@@ -480,12 +474,12 @@ The SDK's `LLMObs.annotation_context()` method returns a context manager that can
 
 The `LLMObs.annotation_context()` method accepts the following arguments:
 
-`name` 
+`name`
 : optional - _str_
 <br />Name that overrides the span name for any auto-instrumented spans that are started within the annotation context.
 
-`prompt` 
-: optional - _dictionary_ 
+`prompt`
+: optional - _dictionary_
 <br />A dictionary that represents the prompt used for an LLM call in the following format:<br />`{"template": "...", "id": "...", "version": "...", "variables": {"variable_1": "...", ...}}`.<br />You can also import the `Prompt` object from `ddtrace.utils` and pass it in as the `prompt` argument. **Note**: This argument only applies to LLM spans.
 
 `tags`

@@ -772,3 +766,4 @@ def server_process_request(request):
 [12]: /tracing/trace_collection/compatibility/python/#library-compatibility
 [13]: /llm_observability/setup/auto_instrumentation/
 [14]: /serverless/aws_lambda/installation/python/?tab=custom#installation
+[15]: /llm_observability/quickstart?tab=python#trace-an-llm-application-in-aws-lambda
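The prompt dictionary format documented in the diff above can be exercised with a short sketch. This is an editor's illustration: `render_prompt` is a hypothetical helper that fills `template` with `variables`, and is not part of the ddtrace SDK.

```python
def render_prompt(prompt: dict) -> str:
    """Fill a prompt template with its variables.

    Hypothetical helper (not part of ddtrace): substitutes each
    `{variable_name}` placeholder in `template` with its value
    from the `variables` mapping.
    """
    template = prompt["template"]
    for name, value in prompt.get("variables", {}).items():
        template = template.replace("{" + name + "}", str(value))
    return template


# A prompt dict in the shape the documentation describes.
prompt = {
    "template": "Summarize: {document}",
    "id": "summarize-v1",
    "version": "1",
    "variables": {"document": "..."},
}
```

In the real SDK, a dictionary like `prompt` (or a `Prompt` object) would be passed as the `prompt` argument to `LLMObs.annotation_context()`; the rendering here only illustrates how `template` and `variables` relate.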
