POtel implementation base branch #3152 · getsentry/sentry-python

Draft: sl0thentr0py wants to merge 344 commits into master from potel-base

Conversation

@sl0thentr0py (Member) commented Jun 10, 2024

Full state of CI: #3744

Contains:

Simple test

import sentry_sdk
from time import sleep

sentry_sdk.init(
    debug=True,
    traces_sample_rate=1.0,
    _experiments={"otel_powered_performance": True},
)

with sentry_sdk.start_span(description="sentry request"):
    sleep(0.1)
    with sentry_sdk.start_span(description="sentry db"):
        sleep(0.5)
        with sentry_sdk.start_span(description="sentry redis"):
            sleep(0.2)
    with sentry_sdk.start_span(description="sentry http"):
        sleep(1)

References

Misc

In OTel, this:

from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("parent") as parent:
    with tracer.start_span("child1"):
        pass
    with tracer.start_span("child2"):
        pass

is equivalent to

from opentelemetry import trace, context

tracer = trace.get_tracer(__name__)

parent = tracer.start_span("parent")

# Creates a Context object with parent set as current span
ctx = trace.set_span_in_context(parent)

# Set as the implicit current context
token = context.attach(ctx)

# Child will automatically be a child of parent
child1 = tracer.start_span("child1")
child1.end()

# Child will automatically be a child of parent
child2 = tracer.start_span("child2")
child2.end()

# Don't forget to detach or parent will remain the parent above this call stack
context.detach(token)
parent.end()
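As a side note, the manual attach/detach bookkeeping above is roughly what `opentelemetry.trace.use_span` does for you; a minimal sketch of the same structure using it (nothing in this branch depends on it, it's just for illustration):

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

parent = tracer.start_span("parent")

# use_span() attaches the context on enter, detaches it on exit,
# and (with end_on_exit=True) also ends the span for us
with trace.use_span(parent, end_on_exit=True):
    # Both children pick up "parent" from the implicit current context
    child1 = tracer.start_span("child1")
    child1.end()

    child2 = tracer.start_span("child2")
    child2.end()
```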

@sl0thentr0py sl0thentr0py requested a review from sentrivana June 10, 2024 19:00
@sl0thentr0py sl0thentr0py force-pushed the potel-base branch 2 times, most recently from f7f153c to 28effd6 on June 11, 2024 11:43
@sl0thentr0py sl0thentr0py force-pushed the potel-base branch 2 times, most recently from 16f9341 to 951477f on June 25, 2024 15:16
codecov bot commented Jun 26, 2024

Codecov Report

Attention: Patch coverage is 84.41687% with 314 lines in your changes missing coverage. Please review.

Project coverage is 84.37%. Comparing base (cb82483) to head (07ba0f0).
Report is 1 commit behind head on master.

✅ All tests successful. No failed tests found.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| sentry_sdk/integrations/aws_lambda.py | 11.11% | 48 Missing ⚠️ |
| sentry_sdk/tracing.py | 80.49% | 37 Missing and 10 partials ⚠️ |
| sentry_sdk/opentelemetry/utils.py | 81.93% | 25 Missing and 18 partials ⚠️ |
| sentry_sdk/opentelemetry/span_processor.py | 81.81% | 13 Missing and 17 partials ⚠️ |
| sentry_sdk/integrations/gcp.py | 0.00% | 27 Missing ⚠️ |
| sentry_sdk/tracing_utils.py | 79.16% | 10 Missing and 5 partials ⚠️ |
| sentry_sdk/integrations/ray.py | 7.14% | 13 Missing ⚠️ |
| sentry_sdk/opentelemetry/sampler.py | 91.66% | 7 Missing and 4 partials ⚠️ |
| sentry_sdk/utils.py | 86.11% | 5 Missing and 5 partials ⚠️ |
| sentry_sdk/integrations/tornado.py | 75.86% | 4 Missing and 3 partials ⚠️ |
| ... and 23 more | | |
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #3152      +/-   ##
==========================================
+ Coverage   80.31%   84.37%   +4.06%     
==========================================
  Files         142      144       +2     
  Lines       15961    14697    -1264     
  Branches     2727     2339     -388     
==========================================
- Hits        12819    12401     -418     
+ Misses       2267     1567     -700     
+ Partials      875      729     -146     
| Files with missing lines | Coverage Δ |
|---|---|
| sentry_sdk/__init__.py | 100.00% <100.00%> (ø) |
| sentry_sdk/_compat.py | 84.21% <ø> (-1.84%) ⬇️ |
| sentry_sdk/_init_implementation.py | 100.00% <100.00%> (+23.33%) ⬆️ |
| sentry_sdk/ai/monitoring.py | 88.73% <100.00%> (+2.43%) ⬆️ |
| sentry_sdk/ai/utils.py | 85.00% <100.00%> (+7.72%) ⬆️ |
| sentry_sdk/api.py | 95.34% <100.00%> (+15.77%) ⬆️ |
| sentry_sdk/client.py | 84.80% <100.00%> (+5.34%) ⬆️ |
| sentry_sdk/consts.py | 99.59% <100.00%> (+6.15%) ⬆️ |
| sentry_sdk/debug.py | 95.00% <ø> (+3.69%) ⬆️ |
| sentry_sdk/envelope.py | 82.98% <100.00%> (+3.08%) ⬆️ |
| ... and 84 more | |

... and 42 files with indirect coverage changes

@sl0thentr0py sl0thentr0py changed the title from "Skeletons for new POTEL components" to "New POTEL base branch" on Jul 9, 2024
@sl0thentr0py sl0thentr0py changed the title from "New POTEL base branch" to "potel implementation base branch" on Jul 9, 2024
@antonpirker antonpirker changed the title from "potel implementation base branch" to "POtel implementation base branch" on Aug 5, 2024
@sentrivana sentrivana removed their request for review August 28, 2024 09:12
sentrivana and others added 18 commits October 17, 2024 17:13
…n` to integration. (#3637)

Move `subprocess` breadcrumbs from `maybe_create_breadcrumbs_from_span` into the stdlib integration and preserve the breadcrumb behavior in POTel.
This moves the creation of `redis` breadcrumbs from `maybe_create_breadcrumbs_from_span` into the integrations. It also fixes some span-related tests by using `render_span_tree`.
This moves the creation of breadcrumbs for outgoing HTTP requests from `maybe_create_breadcrumbs_from_span` into the integrations.
It is possible to use the return value of `sentry_sdk.init` as a context manager; however, this functionality has not been maintained for a long time, and it does not seem to be documented anywhere.

So, we are deprecating this functionality, and we will remove it in the next major release.

Closes #3282
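For reference, a minimal sketch of the pattern being deprecated versus the recommended call (the DSN is a placeholder):

```python
import sentry_sdk

# Deprecated: using the return value of init() as a context manager
with sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0"):
    ...

# Recommended: just call init() without the `with` block
sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")
```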
szokeasaurusrex and others added 30 commits April 24, 2025 13:00
Currently, this property has type `Any`, but it can now be changed to
`Optional[Span]`

Depends on:
  - #4263
`ThreadingIntegration` can optionally **NOT** propagate scope data to
threads (`propagate_scope=False`). In that case, in POTel we were
wrapping the thread's task in an `isolation_scope()`:

```python
with sentry_sdk.isolation_scope() as scope:
    return _run_old_run_func()
```

But as this forks the currently active isolation scope, the thread
effectively gets all scope data from the parent isolation scope -- so
the scope is actually propagated to the thread, even though it shouldn't
be since `propagate_scope=False`.

~We effectively need some way to give the thread a clear isolation scope
instead. In this PR, I'm just clearing the forked iso scope, but I'm not
sure if this is good enough and if something doesn't need to be done on
the OTel side too.~

~Another option would be to set the iso/current scopes to the initial,
empty iso/current scopes instead, before running the thread's target
function.~

UPDATE: we're just instantiating new scopes now


Another change is that in OTel, the spans in the threads, now without a
parent, automatically get promoted to transactions. (On master they'd
just be orphaned spans, so they wouldn't be taken into account at all.)
We probably need to instruct folks to add `only_if_parent` if they don't
want this to happen.

---------

Co-authored-by: Neel Shah <neel.shah@sentry.io>
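A hedged sketch of what that `only_if_parent` instruction could look like, assuming it is exposed as a keyword argument on `sentry_sdk.start_span` (as the note above implies); treat the exact spelling as illustrative:

```python
import threading

import sentry_sdk

def task():
    # Hypothetical usage: without a parent span in this thread, the span would
    # be promoted to a transaction; only_if_parent opts out of that promotion.
    with sentry_sdk.start_span(description="thread work", only_if_parent=True):
        ...

threading.Thread(target=task).start()
```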
With the switch to OTel, the Common test suite is now dependent on an
otel package, so it technically fits the toxgen usecase. By letting
toxgen take care of it, we're making sure we're always testing a good
range of otel versions, including the oldest one (to catch regressions)
and the newest one (to catch incompatibilities early).

A couple of things surfaced in terms of incompatibility with older versions:
- Some semantic attributes we're using weren't there from the get-go
(open-telemetry/opentelemetry-python@495d705). Changed the code that uses
them to handle failure.
- The signature of `span.set_status()` changed at some point
(open-telemetry/opentelemetry-python@6e282d2). Added a compat version of
`set_status()` for older otel.

Also included:
- removing the `opentelemetry-experimental` extra (not used anymore)
- ❗ switching to using `opentelemetry-sdk` instead of
`opentelemetry-distro` -- the `distro` only seems to [be setting up some
defaults](https://github.com/open-telemetry/opentelemetry-python-contrib/blob/8390db35ae2062c09d4d74a08d310c7bde1912c4/opentelemetry-distro/src/opentelemetry/distro/__init__.py)
that we're not using


Closes #3241
Closes #4332

---------

Co-authored-by: Daniel Szoke <7881302+szokeasaurusrex@users.noreply.github.com>
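To illustrate the `set_status()` compat issue mentioned above (this is not the exact helper from the branch): older otel releases only accept a `Status` object, newer ones also accept a bare `StatusCode`, so always constructing a `Status` works on both:

```python
from opentelemetry.trace.status import Status, StatusCode

def set_status_compat(span, code, description=None):
    # Passing a Status object is supported by both old and new otel versions,
    # unlike span.set_status(StatusCode.ERROR, "..."), which is newer-only.
    span.set_status(Status(code, description))

# Usage sketch:
# set_status_compat(span, StatusCode.ERROR, "internal error")
```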
Introduce the convention of underscore-prefixed span attributes. These
won't be sent to Sentry and are meant for internal SDK usage.

Changed `flag.count` to internal. Looked through the rest of the attributes
we're setting; they need a bigger, comprehensive cleanup to align with OTel,
so nothing else was touched for now.

Closes #4329
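A minimal sketch of the convention (the helper name is hypothetical; the actual filtering lives in the span conversion code):

```python
def filter_internal_attributes(attributes):
    # Attributes prefixed with an underscore are for internal SDK use only
    # and are dropped before the span is sent to Sentry.
    return {k: v for k, v in attributes.items() if not k.startswith("_")}

# "_flag.count" stays SDK-internal, "http.request.method" is sent
print(filter_internal_attributes({"_flag.count": 3, "http.request.method": "GET"}))
```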
Revert changing the default of `traces_sample_rate` done in
#4240
Store feature flags on the isolation scope, which is the correct place.

I also checked back with Colton about the behavior of feature flags, and
having the flags on the isolation scope (meaning: one set of flags per
request-response cycle) is the expected behavior.
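A hedged usage sketch, assuming the `add_feature_flag` helper from `sentry_sdk.feature_flags`; the point is that flags recorded this way now live on the isolation scope, i.e. one set of flags per request-response cycle:

```python
import sentry_sdk
from sentry_sdk.feature_flags import add_feature_flag

sentry_sdk.init(dsn="...")  # placeholder DSN

with sentry_sdk.isolation_scope():
    # Recorded on the current isolation scope: visible for this
    # request-response cycle, not leaked across requests
    add_feature_flag("new-checkout-flow", True)
    sentry_sdk.capture_exception(Exception("something went wrong"))
```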
…eader (#4356)

Since we don't automatically have unsampled spans running, this caused a
change in behavior when an upstream sampling decision needs to be
propagated further downstream.

### Explanation of problem
When an incoming trace has `sampled` set to 0 (`trace_id-span_id-0`),
in the past we would propagate this since we would have an active
span/transaction running but just not sampled, so downstream would also
receive `trace_id-span_id-0` from that active span.
Now we actually don't have an active span, since unsampled spans simply
aren't started (that's just how otel works). So instead of sending
`trace_id-span_id-0` as before, we would have sent
`trace_id-other_span_id` from the `propagation_context`.
This would cause the downstream service to not receive the `-0` flag and
would thus sample independently, which is a regression.
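For context, the sampled flag is the third field of the `sentry-trace` header (`traceid-spanid-sampled`); a tiny sketch of reading it (the helper is illustrative, not the SDK's parser):

```python
def parse_sampled(sentry_trace_header):
    # "trace_id-span_id-0" -> False, "trace_id-span_id-1" -> True,
    # "trace_id-span_id"   -> None (deferred sampling decision)
    parts = sentry_trace_header.split("-")
    if len(parts) < 3 or parts[2] == "":
        return None
    return parts[2] == "1"

assert parse_sampled("771a43a4192642f0b136d5159a501700-bd429c44b67a3eb4-0") is False
```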
In some cases FastAPI emits an exception whose `__cause__` is an
ExceptionGroup containing a single exception, and that single exception is
the original exception. This PR prevents an infinite loop when adding this
construct to the `exception.values` field.

It also introduces a hard upper limit on chained/nested exceptions, so we
never run into an infinite loop.
This PR also needs an update of the docs to make sure people use the top
level API. See this docs issue:
getsentry/sentry-docs#13592
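A minimal sketch of the ExceptionGroup construct described above and of a loop guard; the helper and the limit are illustrative, not the SDK's actual code:

```python
MAX_CHAINED_EXCEPTIONS = 100  # illustrative hard upper limit

def walk_exception_chain(exc):
    # FastAPI can produce an exception whose __cause__ is an ExceptionGroup
    # whose only member is the original exception itself, so naively following
    # __cause__/__context__ (and group members) never terminates.
    seen = set()
    chain = []
    while exc is not None and len(chain) < MAX_CHAINED_EXCEPTIONS:
        if id(exc) in seen:
            break
        seen.add(id(exc))
        chain.append(exc)
        exc = exc.__cause__ or exc.__context__
    return chain
```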
To preserve behavior from 2.x
Don't use plain strings; always use `SPANDATA`. Follow-up to this
PR: #4373
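A small usage sketch of the convention, assuming `SPANDATA.DB_SYSTEM` and `span.set_data()` as in current sentry-python; the span itself is illustrative:

```python
import sentry_sdk
from sentry_sdk.consts import SPANDATA

with sentry_sdk.start_span(description="db query") as span:
    # Use the SPANDATA constant instead of the raw "db.system" string
    span.set_data(SPANDATA.DB_SYSTEM, "postgresql")
```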