Learn how to export OpenLLMetry traces to Openlayer
OpenLLMetry (by Traceloop) is an open-source project that makes it easy to monitor and trace the execution of LLM applications. It builds on top of OpenTelemetry and captures traces in a non-intrusive way. This guide shows how you can export traces captured by OpenLLMetry to Openlayer.
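1. Set the environment variables

Point OpenLLMetry at Openlayer by setting the Traceloop environment variables before calling Traceloop.init(). The snippet below is a minimal sketch in Python (the variables can just as well be exported in your shell); the endpoint value and YOUR_OPENLAYER_API_KEY are placeholders, so substitute the OTLP endpoint and API key from your Openlayer workspace.

import os

# Placeholders: replace with the OTLP endpoint and API key from your Openlayer workspace.
os.environ["TRACELOOP_BASE_URL"] = "<OPENLAYER_OTLP_ENDPOINT>"
os.environ["TRACELOOP_HEADERS"] = "Authorization=Bearer%20YOUR_OPENLAYER_API_KEY"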
Make sure to include %20 between Bearer and your API key in the TRACELOOP_HEADERS value; it URL-encodes the space character.
2. Initialize OpenLLMetry instrumentation

Initialize OpenLLMetry instrumentation in your application:
from traceloop.sdk import Traceloop

# disable_batch=True exports spans immediately rather than batching them.
Traceloop.init(disable_batch=True)
3. Run LLMs and workflows as usual

Once instrumentation is set up, you can run your LLM calls as usual. Trace data will be automatically captured and exported to Openlayer, where you can begin testing and analyzing it. For example:
from openai import OpenAI

client = OpenAI()

client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "How are you doing today?"}],
)