
OpenTelemetry

Where is this feature available?
  • Hobby
  • Pro
  • Team
  • Self Hosted
⚠️ OpenTelemetry support in Langfuse is experimental. All APIs may change at any time without prior notice. On Langfuse Cloud, strict rate limits apply to OpenTelemetry span ingestion.

We share this feature to gather feedback and improve it based on our users' needs. Please share your feedback in the OpenTelemetry Support GitHub Discussion.

OpenTelemetry is a CNCF project that provides a set of specifications, APIs, and libraries defining a standard way to collect distributed traces and metrics from your application. OpenTelemetry maintains an experimental set of Semantic Conventions for GenAI attributes on traces. In addition to our native SDKs and our vendor-specific integrations, we have added experimental support for OpenTelemetry.

Getting Started

To get started using the OpenTelemetry integration, you will need the Langfuse Trace API endpoint and your Langfuse API keys. For our EU data region the endpoint is https://cloud.langfuse.com/api/public/otel and for the US data region it is https://us.cloud.langfuse.com/api/public/otel. We will use the EU data region in the following examples.

In addition, you will need your Langfuse API keys, e.g. pk-lf-1234567890 and sk-lf-1234567890. Run $ echo -n ${LANGFUSE_PUBLIC_KEY}:${LANGFUSE_SECRET_KEY} | base64 to get the base64-encoded API keys (referred to as "AUTH_STRING" going forward).
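If you prefer to build the AUTH_STRING programmatically, the same base64 encoding can be sketched in Python (the keys below are the placeholder example values from above, not real credentials):

```python
import base64

# Placeholder keys from the example above; substitute your real Langfuse keys.
public_key = "pk-lf-1234567890"
secret_key = "sk-lf-1234567890"

# Equivalent to: echo -n "$PUBLIC:$SECRET" | base64
auth_string = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()

# Used as an HTTP header: Authorization: Basic <auth_string>
print(auth_string)
```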

Using those parameters, you can configure the OpenTelemetry exporters for your tracing framework as follows:

Add a Langfuse exporter to your OpenTelemetry Collector configuration:

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
 
processors:
  batch:
  memory_limiter:
    # 80% of maximum memory up to 2G
    limit_mib: 1500
    # 25% of limit up to 2G
    spike_limit_mib: 512
    check_interval: 5s
 
exporters:
  otlphttp/langfuse:
    endpoint: "https://cloud.langfuse.com/api/public/otel"
    headers:
      Authorization: "Basic ${AUTH_STRING}"
 
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [otlphttp/langfuse]
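Alternatively, if your application uses an OpenTelemetry SDK directly (without a Collector), most SDKs read the standard OTLP environment variables, so a configuration along these lines should work (shown for the EU data region; per the OTLP specification, http/protobuf exporters append the /v1/traces signal path to the base endpoint automatically):

```shell
# Standard OTLP env vars read by most OpenTelemetry SDKs.
# Replace the AUTH_STRING value with the base64 string computed above.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://cloud.langfuse.com/api/public/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic ${AUTH_STRING}"
```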

Property Mapping

Langfuse accepts any span that adheres to the OpenTelemetry specification. In addition, we map many GenAI-specific properties to properties in the Langfuse data model to provide a seamless experience when using OpenTelemetry with Langfuse. We primarily follow the OpenTelemetry GenAI semantic conventions, but also map vendor-specific properties from common frameworks. All attributes and resourceAttributes are available within the Langfuse metadata property as a fallback.

Below, we share a non-exhaustive list of mappings that Langfuse applies:

| OpenTelemetry Attribute | Langfuse Property | Description |
| --- | --- | --- |
| gen_ai.usage.cost | costDetails.total | The total cost of the request. |
| gen_ai.usage.* | usageDetails.* | Maps all keys within usage aside from cost to usageDetails. Token properties are simplified to input, output, and total. |
| gen_ai.request.model | model | The model used for the request. |
| gen_ai.response.model | model | The model used for the response. |
| gen_ai.request.* | modelParameters | Maps all keys within request to modelParameters. |
| langfuse.session.id | sessionId | The session ID for the request. |
| session.id | sessionId | The session ID for the request. |
| langfuse.user.id | userId | The user ID for the request. |
| user.id | userId | The user ID for the request. |
| gen_ai.prompt | input | Input field. Deprecated by OpenTelemetry as event properties should be preferred. |
| gen_ai.completion | output | Output field. Deprecated by OpenTelemetry as event properties should be preferred. |
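As an illustration only (this is not Langfuse's actual implementation), the mapping above could be sketched as a plain Python function; the attribute names come from the table, while the function and dictionary shapes are hypothetical:

```python
def map_attributes(attributes: dict) -> dict:
    """Sketch of mapping OpenTelemetry span attributes to Langfuse properties."""
    # All attributes are available under metadata as a fallback.
    observation = {"metadata": dict(attributes)}
    for key, value in attributes.items():
        if key == "gen_ai.usage.cost":
            observation.setdefault("costDetails", {})["total"] = value
        elif key.startswith("gen_ai.usage."):
            # Token properties are simplified to input, output, and total.
            token_key = key.removeprefix("gen_ai.usage.")
            simplified = {
                "prompt_tokens": "input",
                "completion_tokens": "output",
                "total_tokens": "total",
            }.get(token_key, token_key)
            observation.setdefault("usageDetails", {})[simplified] = value
        elif key in ("gen_ai.request.model", "gen_ai.response.model"):
            observation["model"] = value
        elif key.startswith("gen_ai.request."):
            param = key.removeprefix("gen_ai.request.")
            observation.setdefault("modelParameters", {})[param] = value
        elif key in ("langfuse.session.id", "session.id"):
            observation["sessionId"] = value
        elif key in ("langfuse.user.id", "user.id"):
            observation["userId"] = value
        elif key == "gen_ai.prompt":
            observation["input"] = value
        elif key == "gen_ai.completion":
            observation["output"] = value
    return observation
```

Note that the more specific checks (gen_ai.usage.cost, gen_ai.request.model) must run before the wildcard prefix matches, mirroring the precedence implied by the table.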
