[MLOB-2380] Add LLM Observability Serverless Quickstart Guide by sabrenner · Pull Request #28118 · DataDog/documentation · GitHub
Merged · 24 commits · May 21, 2025
278922f
add quickstart from reference
sabrenner Mar 12, 2025
57ae6f6
wording fixes
sabrenner Mar 12, 2025
9509bdd
nodejs tab
sabrenner Mar 13, 2025
27217f9
fixes
sabrenner Mar 13, 2025
25533fc
enhance quickstart guide steps
sabrenner Mar 14, 2025
781bfe0
remove images in favor of direct steps
sabrenner Mar 14, 2025
7b0b522
remove compatibility notes section
sabrenner Mar 14, 2025
f589f93
link to quickstart guide
sabrenner Mar 18, 2025
4e3464d
add flush mention
sabrenner Mar 18, 2025
6ad014d
Update content/en/llm_observability/quickstart.md
sabrenner Mar 19, 2025
c2dd6f5
Update content/en/llm_observability/quickstart.md
sabrenner Mar 19, 2025
849d5ed
Update content/en/llm_observability/quickstart.md
sabrenner Mar 19, 2025
289c1ea
add to guides section
sabrenner Mar 19, 2025
0dcbbdf
Merge branch 'master' of github.com:DataDog/documentation into sabren…
sabrenner Apr 16, 2025
4a4c41d
fix wording for new approach
sabrenner Apr 16, 2025
969bddd
try formatting fix
sabrenner Apr 16, 2025
dba554d
Merge branch 'master' into sabrenner/llmobs-serverless-quickstart
sabrenner May 8, 2025
a11eec4
Update content/en/llm_observability/setup/sdk/python.md
sabrenner May 12, 2025
489349d
Update content/en/llm_observability/setup/sdk/nodejs.md
sabrenner May 12, 2025
12821a4
Update content/en/llm_observability/quickstart.md
sabrenner May 15, 2025
4a2a4ef
try and add some newlines for tab formatting
sabrenner May 15, 2025
44cddc4
try different newlining
sabrenner May 15, 2025
b34f215
newlines
sabrenner May 15, 2025
9fc48d2
get rid of newlines entirely
sabrenner May 19, 2025
3 changes: 2 additions & 1 deletion content/en/llm_observability/guide/_index.md
@@ -10,7 +10,8 @@ cascade:
---

{{< whatsnext desc="LLM Observability Guides:" >}}
{{< nextlink href="/llm_observability/quickstart" >}}Trace an LLM Application{{< /nextlink >}}
{{< nextlink href="/llm_observability/quickstart#trace-an-llm-application" >}}Trace an LLM Application{{< /nextlink >}}
{{< nextlink href="/llm_observability/quickstart#trace-an-llm-application-in-aws-lambda" >}}Trace an LLM Application in AWS Lambda{{< /nextlink >}}
{{< nextlink href="/llm_observability/submit_evaluations" >}}Submit Evaluations{{< /nextlink >}}
{{< nextlink href="/llm_observability/submit_nemo_evaluations" >}}Submit NVIDIA NeMo Custom Evaluations{{< /nextlink >}}
{{< nextlink href="/llm_observability/guide/ragas_quickstart" >}}Ragas Quickstart{{< /nextlink >}}
72 changes: 71 additions & 1 deletion content/en/llm_observability/quickstart.md
@@ -21,7 +21,7 @@ This guide uses the LLM Observability SDKs for [Python][1] and [Node.js][2]. If

To better understand LLM Observability terms and concepts, you can explore the examples in the [LLM Observability Jupyter Notebooks repository][12]. These notebooks provide a hands-on experience, and allow you to apply these concepts in real time.

## Command line
## Trace an LLM application

To generate an LLM Observability trace, you can run a Python or Node.js script.

@@ -129,6 +129,75 @@ The trace you see is composed of a single LLM span. The `ddtrace-run` or `NODE_O

If your application consists of more elaborate prompting or complex chains or workflows involving LLMs, you can trace it using the [Setup documentation][11] and the [SDK documentation][1].

## Trace an LLM application in AWS Lambda

The following steps generate an LLM Observability trace in an AWS Lambda environment by creating an Amazon Bedrock-based chatbot that runs with LLM Observability in AWS Lambda.

1. Create a [Lambda function chatbot using Amazon Bedrock][13].
2. Instrument your Lambda function:
1. Open AWS CloudShell.
2. Install the Datadog CLI client:
```shell
npm install -g @datadog/datadog-ci
```
3. Set the Datadog API key and site:
```shell
export DD_SITE=<YOUR_DD_SITE>
export DD_API_KEY=<YOUR_DATADOG_API_KEY>
```
If you already have a secret in AWS Secrets Manager, or prefer to use one, you can set the API key by using the secret ARN:
```shell
export DATADOG_API_KEY_SECRET_ARN=<DATADOG_API_KEY_SECRET_ARN>
```
4. Instrument your Lambda function with LLM Observability. This requires at least version 77 of the Datadog Extension layer:
{{< tabs >}}
> **Reviewer (Contributor):** Because this tabbed section is nested inside a numbered list, the tabs are indented. I think it looks a bit odd, but perhaps it is intentional and desired? Take a look and tell me what you think. If you'd like the tabs to be left justified, I think adding some extra newlines will separate them from the numbered list. To see what I mean about the indentation, go to the preview, search on the page for the string "77", and look at the tabs just below the search result.
>
> **sabrenner (Author), May 15, 2025:** Ahh yes, I see what you're saying! Let me try adding some newlines around it to see if it helps a bit. It is intentional to have a tab section here; the misalignment wasn't intentional, but otherwise I don't think it's something we're too concerned about.
>
> **sabrenner (Author):** I tried a bunch of newline configurations and couldn't quite get it to look right. Any ideas for how to get the spacing a little better are appreciated 😄 We wanted the tabs here because of the different layer versions for the different language extensions. Thanks!!
>
> **Reviewer (Contributor):** I also played around with newlines. I was able to get the tab left justified, but it looked strange. I think the current configuration is the best. Thanks for being willing to experiment!

{{% tab "Python" %}}
```shell
datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="python" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
```
{{% /tab %}}
{{% tab "Node.js" %}}
```shell
datadog-ci lambda instrument -f <YOUR_LAMBDA_FUNCTION_NAME> -r <AWS_REGION> -v {{< latest-lambda-layer-version layer="node" >}} -e {{< latest-lambda-layer-version layer="extension" >}} --llmobs <YOUR_LLMOBS_ML_APP>
```
{{% /tab %}}
{{< /tabs >}}
3. Verify that your function was instrumented:
1. In the Datadog UI, navigate to `Infrastructure > Serverless`.
2. Search for the name of your function.
3. Click it to open the details panel.
4. Under the `Configuration` tab, find the details of the Lambda function, its attached layers, and a list of Datadog-related (`DD_`-prefixed) environment variables under the `Datadog Environment Variables` section.
4. Invoke your Lambda function and verify that LLM Observability traces are visible in the Datadog UI.
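For repeatable setups, the `datadog-ci lambda instrument` invocation from step 2 can be assembled programmatically. A minimal stdlib-only sketch, assuming hypothetical placeholder values (the function name, region, ML app name, and layer version numbers below are illustrative, not real releases):

```python
import subprocess

def build_instrument_command(function_name, region, layer_version,
                             extension_version, ml_app):
    # Assemble the datadog-ci arguments from step 2. The numeric layer
    # versions passed in by the caller are placeholders, not real releases.
    return [
        "datadog-ci", "lambda", "instrument",
        "-f", function_name,
        "-r", region,
        "-v", str(layer_version),
        "-e", str(extension_version),
        "--llmobs", ml_app,
    ]

cmd = build_instrument_command("my-bedrock-chatbot", "us-east-1", 98, 77, "my-ml-app")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the CLI
```

This only mirrors the documented flags (`-f`, `-r`, `-v`, `-e`, `--llmobs`); substitute the current layer versions from the tabs above before running it.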

### Force flushing traces

If you use a serverless environment other than AWS Lambda, or if you do not see traces from your AWS Lambda functions, use the `flush` method to ensure traces are flushed before the process exits.

{{< tabs >}}
{{% tab "Python" %}}

```python
from ddtrace.llmobs import LLMObs

def handler():
    # function body
    LLMObs.flush()
```

{{% /tab %}}
{{% tab "Node.js" %}}

```javascript
import tracer from 'dd-trace';
const llmobs = tracer.llmobs;

export const handler = async (event) => {
  // your function body
  llmobs.flush();
};
```

{{% /tab %}}
{{< /tabs >}}
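To illustrate why the explicit flush matters: telemetry clients typically buffer spans and ship them on a background interval, so a serverless process that freezes or exits can strand whatever is still buffered. A stdlib-only toy sketch of that pattern (not the ddtrace implementation):

```python
class BufferedSpanExporter:
    """Toy exporter: spans sit in a buffer until flushed."""

    def __init__(self):
        self.buffer = []    # spans recorded but not yet sent
        self.exported = []  # spans safely delivered

    def record(self, span):
        self.buffer.append(span)

    def flush(self):
        # Drain everything still buffered. This is the guarantee
        # an explicit flush() call provides before the sandbox freezes.
        self.exported.extend(self.buffer)
        self.buffer.clear()

exporter = BufferedSpanExporter()

def handler(event):
    exporter.record({"name": "llm_call", "input": event})
    # Without this flush, the span could be lost when the process freezes.
    exporter.flush()

handler({"prompt": "hello"})
print(len(exporter.exported))
```

In the real SDKs, `LLMObs.flush()` (Python) and `llmobs.flush()` (Node.js) play the role of the `flush()` call above.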

## Further Reading

{{< partial name="whats-next/whats-next.html" >}}
@@ -144,3 +213,4 @@ If your application consists of more elaborate prompting or complex chains or wo
[10]: /llm_observability/setup/auto_instrumentation/
[11]: /llm_observability/setup/
[12]: https://github.com/DataDog/llm-observability
[13]: https://repost.aws/articles/ARixmsXALpSWuxI02zHgv1YA/bedrock-unveiled-a-quick-lambda-example
3 changes: 2 additions & 1 deletion content/en/llm_observability/setup/sdk/nodejs.md
@@ -102,7 +102,7 @@ These options can be set on the general tracer configuration:

### AWS Lambda setup

Use the `llmobs.flush()` function to flush all remaining spans from the tracer to LLM Observability at the end of the Lambda function.
See the [AWS Lambda Quickstart Guide][7] to quickly integrate LLM Observability into your Lambda functions.

#### Application naming guidelines

@@ -709,3 +709,4 @@ tracer.use('http', false) // disable the http integration
[4]: /tracing/trace_collection/compatibility/nodejs/#web-framework-compatibility
[5]: /llm_observability/setup/auto_instrumentation/?tab=nodejs
[6]: /tracing/trace_collection/custom_instrumentation/nodejs/dd-api/?tab=wrapper
[7]: /llm_observability/quickstart?tab=nodejs#trace-an-llm-application-in-aws-lambda
15 changes: 5 additions & 10 deletions content/en/llm_observability/setup/sdk/python.md
@@ -107,13 +107,7 @@ LLMObs.enable(

### AWS Lambda setup

Enable LLM Observability by specifying the required environment variables in your [command line setup](#command-line-setup) and following the setup instructions for the [Datadog-Python and Datadog-Extension][14] AWS Lambda layers. Additionally, set `DD_TRACE_ENABLED` to `true` in your Lambda function's environment variables.

If you are only expecting traces from LLM Observability, set `DD_LLMOBS_AGENTLESS_ENABLED` to `true` in your Lambda function's environment variables.

If you are expecting APM traces from your Lambda function in addition to LLM Observability, leave `DD_EXTENSION_VERSION` unset in your Lambda function's environment variables if you are using `v66` or earlier of the Datadog-Extension layer. Otherwise, set `DD_EXTENSION_VERSION` to `compatibility` if you are using `v67` or later.

**Note**: Using the `Datadog-Python` and `Datadog-Extension` layers automatically turns on all LLM Observability integrations, and force flushes spans at the end of the Lambda function.
See the [AWS Lambda Quickstart Guide][15] to quickly integrate LLM Observability into your Lambda functions.

#### Application naming guidelines

@@ -480,12 +474,12 @@ The SDK's `LLMObs.annotation_context()` method returns a context manager that can

The `LLMObs.annotation_context()` method accepts the following arguments:

`name`
: optional - _str_
<br />Name that overrides the span name for any auto-instrumented spans that are started within the annotation context.

`prompt`
: optional - _dictionary_
<br />A dictionary that represents the prompt used for an LLM call in the following format:<br />`{"template": "...", "id": "...", "version": "...", "variables": {"variable_1": "...", ...}}`.<br />You can also import the `Prompt` object from `ddtrace.utils` and pass it in as the `prompt` argument. **Note**: This argument only applies to LLM spans.

`tags`
@@ -772,3 +766,4 @@ def server_process_request(request):
[12]: /tracing/trace_collection/compatibility/python/#library-compatibility
[13]: /llm_observability/setup/auto_instrumentation/
[14]: /serverless/aws_lambda/installation/python/?tab=custom#installation
[15]: /llm_observability/quickstart?tab=python#trace-an-llm-application-in-aws-lambda