Migrate from 2.x to 3.x

Learn about migrating from sentry-python 2.x to 3.x

This guide describes the common patterns involved in migrating to version 3.x of the sentry-python SDK. For the full list of changes, check out the detailed migration guide in the repository.

Sentry Python SDK 3.x only supports Python 3.7 and higher. If you're on an older Python version, you'll need to stay on an older version of the SDK:

  • Python 2.7-3.5: SDK 1.x
  • Python 3.6: SDK 2.x

The enable_tracing option was removed. Use traces_sample_rate directly, or configure a traces_sampler for more fine-grained control over which spans should be sampled.

  sentry_sdk.init(
-     enable_tracing=True,
+     traces_sample_rate=1.0,
  )
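
If you need more control than a flat rate, a traces_sampler is the place for it. The sketch below is illustrative; it assumes the url.path sampling context key described in the sampling-context tables further down, and the exact keys available depend on the integration in use:

import sentry_sdk

def traces_sampler(sampling_context):
    # drop health checks entirely, sample everything else at 10%
    if sampling_context.get("url.path") == "/healthz":
        return 0
    return 0.1

sentry_sdk.init(
    traces_sampler=traces_sampler,
)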

The deprecated propagate_traces option was removed. Use trace_propagation_targets instead.

  sentry_sdk.init(
      # don't propagate trace info downstream
-     propagate_traces=False,
+     trace_propagation_targets=[],
  )

Note that this only affects the global SDK option. The propagate_traces option of the Celery integration remains unchanged.
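
If you only want to stop propagating trace information to Celery workers, you can still do that on the integration itself. A minimal sketch, assuming you configure the Celery integration explicitly:

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    # don't propagate trace info to Celery workers,
    # while leaving HTTP trace propagation untouched
    integrations=[CeleryIntegration(propagate_traces=False)],
)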

The profiles_sample_rate and profiler_mode options previously nested under _experiments have been removed. They're replaced by top-level options of the same name:

  sentry_sdk.init(
-     _experiments={
-         "profiles_sample_rate": 1.0,
-         "profiler_mode": "thread",
-     },
+     profiles_sample_rate=1.0,
+     profiler_mode="thread",
  )

add_attachment() is now part of the top-level API and should be imported and used directly from sentry_sdk.

  import sentry_sdk

- scope = sentry_sdk.get_current_scope()
- scope.add_attachment(bytes=b"Hello World!", filename="attachment.txt")
+ sentry_sdk.add_attachment(bytes=b"Hello World!", filename="attachment.txt")

Using sentry_sdk.add_attachment() directly also makes sure the attachment is added to the correct scope internally.

Tracing in Sentry Python SDK 3.x is powered by OpenTelemetry under the hood, which also means we're moving away from the Sentry-specific concept of transactions and toward a span-only future. sentry_sdk.start_transaction() is now deprecated in favor of sentry_sdk.start_span().

- with sentry_sdk.start_transaction():
+ with sentry_sdk.start_span():
      ...

Any spans without a parent span will become transactions by default. If you want to avoid promoting a span without a parent to a transaction, you can pass the only_if_parent=True keyword argument to sentry_sdk.start_span().
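
For example, the following sketch (the span name is illustrative) creates a span that is only recorded if there is already an active parent span, instead of being promoted to a transaction:

import sentry_sdk

# dropped if there is no active parent span,
# rather than becoming its own transaction
with sentry_sdk.start_span(name="cache-lookup", only_if_parent=True):
    ...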

sentry_sdk.start_transaction() and sentry_sdk.start_span() no longer take the following arguments: trace_id, baggage, span_id, parent_span_id. Use sentry_sdk.continue_trace() for propagating trace data.

sentry_sdk.continue_trace() no longer returns a Transaction and is now a context manager. To continue a trace from headers or environment variables, start a new span inside sentry_sdk.continue_trace():

- transaction = sentry_sdk.continue_trace({...})
- with sentry_sdk.start_transaction(transaction=transaction):
-     ...
+ with sentry_sdk.continue_trace({...}):
+     with sentry_sdk.start_span():
+         ...

The functions continue_from_headers, continue_from_environ and from_traceparent have been removed. Use the sentry_sdk.continue_trace() context manager instead.
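
For instance, where you previously called continue_from_headers() with the incoming request headers, you now pass them to sentry_sdk.continue_trace(). A sketch, assuming a hypothetical request object with a headers mapping:

import sentry_sdk

def handle_request(request):
    # continue the trace the caller started (sentry-trace / baggage headers)
    with sentry_sdk.continue_trace(dict(request.headers)):
        with sentry_sdk.start_span(name="handle-request"):
            ...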

In OpenTelemetry, there is no concept of separate categories of data on a span: everything is simply a span attribute. This is a concept the Sentry SDK is also adopting. We deprecated set_data() and added a new span method called set_attribute():

  with sentry_sdk.start_span(...) as span:
-      span.set_data("my_attribute", "my_value")
+      span.set_attribute("my_attribute", "my_value")

You can also set attributes directly when creating the span. This has the advantage that these initial attributes will be accessible in the sampling context in your traces_sampler/profiles_sampler (see also the Sampling section).

with sentry_sdk.start_span(attributes={"my_attribute": "my_value"}):
    ...

It's no longer possible to change the sampling decision of a span by setting span.sampled directly after the span has been created. Use either a custom traces_sampler (preferred) or the sampled argument to start_span() for determining whether a span should be sampled.

with sentry_sdk.start_span(sampled=True) as span:
    ...

The sampling_context argument of traces_sampler and profiles_sampler has changed considerably for spans coming from our auto-instrumented integrations. Since we now use OpenTelemetry under the hood, spans can only carry specific, primitive types of data, which means custom objects, such as the Request objects of several web frameworks, can no longer be made accessible on the span.

AIOHTTP sampling context changes

The AIOHTTP integration doesn't add the aiohttp_request object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| --- | --- |
| path | url.path |
| query_string | url.query |
| method | http.request.method |
| host | server.address, server.port |
| scheme | url.scheme |
| full URL | url.full |
| request.headers | http.request.header.{header} |

Celery sampling context changes

The Celery integration doesn't add the celery_job dictionary anymore. Instead, the individual keys are now available as:

| Dictionary keys | Sampling context key | Example |
| --- | --- | --- |
| celery_job["args"] | celery.job.args.{index} | celery.job.args.0 |
| celery_job["kwargs"] | celery.job.kwargs.{kwarg} | celery.job.kwargs.kwarg_name |
| celery_job["task"] | celery.job.task | |

Tornado sampling context changes

The Tornado integration doesn't add the tornado_request object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| --- | --- |
| path | url.path |
| query | url.query |
| protocol | url.scheme |
| method | http.request.method |
| host | server.address, server.port |
| version | network.protocol.name, network.protocol.version |
| full URL | url.full |
| request.headers | http.request.header.{header} |

WSGI sampling context changes

The WSGI integration doesn't add the wsgi_environ object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:

| Env property | Sampling context key(s) |
| --- | --- |
| PATH_INFO | url.path |
| QUERY_STRING | url.query |
| REQUEST_METHOD | http.request.method |
| SERVER_NAME | server.address |
| SERVER_PORT | server.port |
| SERVER_PROTOCOL | server.protocol.name, server.protocol.version |
| wsgi.url_scheme | url.scheme |
| full URL | url.full |
| HTTP_* | http.request.header.{header} |

ASGI sampling context changes

The ASGI integration doesn't add the asgi_scope object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:

| Scope property | Sampling context key(s) |
| --- | --- |
| type | network.protocol.name |
| scheme | url.scheme |
| path | url.path |
| query | url.query |
| http_version | network.protocol.version |
| method | http.request.method |
| server | server.address, server.port |
| client | client.address, client.port |
| full URL | url.full |
| headers | http.request.header.{header} |

RQ sampling context changes

The RQ integration doesn't add the rq_job object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:

| RQ property | Sampling context key | Example |
| --- | --- | --- |
| rq_job.args | rq.job.args.{index} | rq.job.args.0 |
| rq_job.kwargs | rq.job.kwargs.{kwarg} | rq.job.kwargs.my_kwarg |
| rq_job.func | rq.job.func | |
| queue.name | messaging.destination.name | |
| rq_job.id | messaging.message.id | |

Note that rq.job.args, rq.job.kwargs, and rq.job.func are serialized and not the actual objects on the job.

AWS Lambda sampling context changes

The AWS Lambda integration doesn't add the aws_event and aws_context objects anymore. Instead, the following, if available, is accessible:

| AWS property | Sampling context key(s) |
| --- | --- |
| aws_event["httpMethod"] | http.request.method |
| aws_event["queryStringParameters"] | url.query |
| aws_event["path"] | url.path |
| full URL | url.full |
| aws_event["headers"]["X-Forwarded-Proto"] | network.protocol.name |
| aws_event["headers"]["Host"] | server.address |
| aws_context["function_name"] | faas.name |
| aws_event["headers"] | http.request.headers.{header} |

GCP sampling context changes

The GCP integration doesn't add the gcp_env and gcp_event keys anymore. Instead, the following, if available, is accessible:

| Old sampling context key | New sampling context key |
| --- | --- |
| gcp_env["function_name"] | faas.name |
| gcp_env["function_region"] | faas.region |
| gcp_env["function_project"] | gcp.function.project |
| gcp_env["function_identity"] | gcp.function.identity |
| gcp_env["function_entry_point"] | gcp.function.entry_point |
| gcp_event.method | http.request.method |
| gcp_event.query_string | url.query |
| gcp_event.headers | http.request.header.{header} |

The ability to set custom_sampling_context on start_transaction was removed. If there is custom data that you want to have accessible in the sampling_context of a traces_sampler or profiles_sampler, set it on the span via the attributes argument, as all span attributes are now included in the sampling_context by default:

- with start_transaction(custom_sampling_context={"custom_attribute": "custom_value"}):
+ with start_span(attributes={"custom_attribute": "custom_value"}) as span:
      # custom_attribute will now be accessible in the sampling context
      # of your traces_sampler/profiles_sampler
      ...
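
The attribute is then available in the sampling_context under the same key, so a traces_sampler can branch on it. A short sketch, reusing the attribute name from the example above:

def traces_sampler(sampling_context):
    # span attributes passed via `attributes=...` end up in the sampling context
    if sampling_context.get("custom_attribute") == "custom_value":
        return 1.0
    return 0.1

# pass this function to sentry_sdk.init(traces_sampler=...) as shown earlier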

We've updated how we handle ExceptionGroups. You will now get more data if ExceptionGroups appear in chained exceptions. As an indirect consequence, you might notice a change in how issues are grouped in Sentry.

Additional integrations will now be activated automatically if the SDK detects the respective package is installed: Ariadne, ARQ, asyncpg, Chalice, clickhouse-driver, GQL, Graphene, huey, Loguru, PyMongo, Quart, Starlite, Strawberry. You can opt out of specific integrations with the disabled_integrations option.
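
For example, to keep one of the auto-enabled integrations turned off even though its package is installed (the Loguru integration is just an illustration here):

import sentry_sdk
from sentry_sdk.integrations.loguru import LoguruIntegration

sentry_sdk.init(
    # Loguru support stays disabled even though loguru is installed
    disabled_integrations=[LoguruIntegration()],
)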

We no longer support Django older than 2.0, trytond older than 5.0, and Falcon older than 3.0.

The logging integration, which provides out-of-the-box support for the Python standard library logging framework, no longer captures error logs as events by default. You can restore the original behavior by passing a custom event_level to the LoggingIntegration:

import sentry_sdk
from sentry_sdk.integrations.logging import LoggingIntegration

sentry_sdk.init(
    integrations=[
        # capture error, critical, exception logs
        # and send them to Sentry as errors
        LoggingIntegration(event_level="ERROR"),
    ],
)

The query being executed is now available under the db.query.text span attribute (only if send_default_pii is True).

The PyMongo integration no longer sets tags automatically. The data is still accessible via span attributes.

The PyMongo integration doesn't set operation_ids anymore. The individual IDs (operation_id, request_id, session_id) are now accessible as separate span attributes.

In Redis pipeline spans, the span["data"]["redis.commands"] dictionary (for example, {"count": 3, "first_ten": ["cmd1", "cmd2", ...]}) no longer exists. Instead, there are two separate attributes: span["data"]["redis.commands.count"] (here, 3) and span["data"]["redis.commands.first_ten"] (here, ["cmd1", "cmd2", ...]).

The set_measurement() API was removed. You can set custom attributes on the span instead with set_attribute().
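
For example, a value you previously recorded with set_measurement() can become a span attribute (the span and attribute names below are illustrative):

import sentry_sdk

with sentry_sdk.start_span(name="process-frames") as span:
    # previously: span.set_measurement("frames_dropped", 17)
    span.set_attribute("frames_dropped", 17)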

The auto_session_tracking() context manager was removed. Use track_session() instead.

Setting Scope.user directly is no longer supported. Use Scope.set_user() instead.
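
A minimal sketch of the replacement (the user fields are illustrative):

import sentry_sdk

scope = sentry_sdk.get_current_scope()
# previously: scope.user = {"id": "42"}
scope.set_user({"id": "42", "email": "jane@example.com"})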

The sentry_sdk.metrics API doesn't exist anymore in SDK 3.x as the metrics beta has come to an end. The associated experimental options enable_metrics, before_emit_metric and metric_code_locations have been removed as well.

There is no concept of a hub anymore and all APIs and attributes that were connected to hubs have been removed.
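
If you still have hub-based code from older SDK versions, switch to the top-level API or to the scope-based API. A sketch:

import sentry_sdk

# previously (hub-based):
#   from sentry_sdk import Hub
#   Hub.current.capture_message("something happened")

# SDK 3.x: use the top-level API instead
sentry_sdk.capture_message("something happened")

# or work with scopes directly
sentry_sdk.get_current_scope().set_tag("component", "checkout")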
