Implement OpenAI Agents span processing by nagkumar91 · Pull Request #3817 · open-telemetry/opentelemetry-python-contrib · GitHub

Conversation

Contributor
@nagkumar91 nagkumar91 commented Oct 7, 2025

Description

Updates the previous barebones PR to include genai spans and traces.

Fixes # (issue)

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

Testing

  • Manual agents example: cd instrumentation-genai/opentelemetry-instrumentation-openai-agents/examples/manual && cp .env.example .env (populate OPENAI_API_KEY and optional OTLP variables), then python3 main.py; confirmed the travel response prints and spans export cleanly with integer timestamps.
  • Zero-code example: cd instrumentation-genai/opentelemetry-instrumentation-openai-agents/examples/zero-code && cp .env.example .env (set the API key), then python3 main.py; verified the auto-configured instrumentation loads env defaults and completes without exporter errors.
  • Unit tests: from the repo root, with .venv active and pip install -e .[examples] applied, ran python3 -m pytest opentelemetry-instrumentation-openai-agents/tests/test_tracer.py; all tests passed, confirming the span translation logic handles nanosecond timestamps and attribute mapping.

Environment

  • Python 3.9.6 inside the project’s .venv.
  • Dependencies installed via pip install -e instrumentation-genai/opentelemetry-instrumentation-openai-agents[examples].
  • OTLP collector optional; when unavailable, spans export successfully to the default console exporter.

Does This PR Require a Core Repo Change?

  • Yes. - Link to PR:
  • No.

Checklist:

See contributing.md for styleguide, changelog guidelines, and more.

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added
  • Documentation has been updated

@nagkumar91 nagkumar91 marked this pull request as ready for review October 8, 2025 20:28
@nagkumar91 nagkumar91 requested a review from a team as a code owner October 8, 2025 20:28

processor = _OpenAIAgentsSpanProcessor(tracer=tracer, system=system)

tracing = _load_tracing_module()
Contributor:

Any reason why we are loading the module during init of the instrumentor and not at the beginning of runtime?

Contributor Author:

To avoid errors at import time if the OpenAI Agents SDK is not installed. We can move it around if it makes more sense.

Contributor:

Don't think there is a need for that. All other instrumentations follow the convention that the underlying library is assumed to be installed. You can move the import statements to top level.

start_time = _parse_iso8601(getattr(trace, "started_at", None))

with self._lock:
span = self._tracer.start_span(
Contributor:

Does it make sense to have a span created for trace start? It's not really clearly defined in the semantic conventions what this span should look like or where in the agent process does on_trace_start get called. Might be something to discuss in the SIG.

Contributor Author:

on_trace_start fires in the OpenAI Agents SDK whenever Trace.start() begins a new agent workflow. We translate that event into a root span so every downstream generation/response/tool span has a common parent. We currently tag that root as SpanKind.SERVER because, in the Agents runtime model, the trace represents the server side orchestrating a request; this mirrors how other server frameworks build a tree rooted at a SERVER span. The GenAI spec doesn't yet spell out guidance for this top-level agent span, so we can bring it to the SIG for confirmation, but today the root span gives us the correct hierarchy and timing for the whole workflow.


def _resolve_system(value: str | None) -> str:
if not value:
return GenAI.GenAiSystemValues.OPENAI.value
Contributor:

I think gen_ai.provider.name MUST be openai for all openai spans. https://github.com/open-telemetry/semantic-conventions/blob/main/docs/gen-ai/openai.md#spans

}
)

_GEN_AI_PROVIDER_NAME = "gen_ai.provider.name"
Contributor:

Why not use the sem conv symbol?



6 participants
