Traces let you see what’s happening inside your application—LLM calls, tool invocations, retrieval steps, and more. This guide walks you through sending your first traces to Phoenix.
1. Set environment variables to connect to your Phoenix instance:
# API key (required for Phoenix Cloud; optional for a local instance)
export PHOENIX_API_KEY="your-api-key"

# Local (default, no API key required)
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006"

# Phoenix Cloud
# export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/s/your-space-name"

# Self-hosted
# export PHOENIX_COLLECTOR_ENDPOINT="https://your-phoenix-instance.com"
You can find your collector endpoint and API key in the Settings page of your Phoenix instance.
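The Phoenix SDK reads these variables from the process environment at startup. As a quick sanity check (a sketch using only the standard library; the fallback mirrors the local default above):

```python
import os

# These names are the ones the Phoenix SDK looks for; the fallback here
# mirrors the local default endpoint shown above.
endpoint = os.environ.get("PHOENIX_COLLECTOR_ENDPOINT", "http://localhost:6006")
api_key = os.environ.get("PHOENIX_API_KEY")  # None is fine for local Phoenix

print(f"Collector endpoint: {endpoint}")
print(f"API key set: {api_key is not None}")
```

If the endpoint printed is not the instance you expect, re-export the variables in the same shell session before starting your app.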
2. Install the Phoenix OTEL package and an instrumentation library. This quickstart uses OpenAI, but Phoenix supports many providers and frameworks, including Anthropic, LangChain, LlamaIndex, Vercel AI SDK, Mastra, and more.
pip install arize-phoenix-otel openinference-instrumentation-openai openai
See all Python integrations.
3. Register Phoenix as your trace collector. The register function configures your application to export spans to Phoenix and auto-instruments supported libraries.
from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    auto_instrument=True,  # Auto-instruments OpenAI, LangChain, etc.
    batch=False,  # Send spans immediately (recommended for local dev)
)
See the arize-phoenix-otel reference for all options.
Local vs Production: We set batch=False here so traces appear immediately—ideal for local development. In production, use batch=True (the default) for better performance. See the SDK reference for configuration options like batching, gRPC transport, and custom headers.
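For production, the same call might look like the sketch below (a configuration sketch, not a prescription: the endpoint and API key are still taken from the environment variables set in step 1, and batch=True is the only change from the snippet above):

```python
from phoenix.otel import register

# Production-style setup: batched export for lower per-span overhead.
# Endpoint and API key are picked up from PHOENIX_COLLECTOR_ENDPOINT
# and PHOENIX_API_KEY, set earlier.
tracer_provider = register(
    project_name="my-llm-app",
    auto_instrument=True,
    batch=True,  # buffer spans and export them in the background
)
```

With batching enabled, spans may take a few seconds to appear in the UI; they are flushed on process shutdown.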
4. Make an LLM call to generate your first trace. OpenAI calls are automatically captured:
import openai

client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
Use @tracer.agent, @tracer.tool, and @tracer.chain decorators to create custom spans around your own functions. See the tracing guide for details.
5. Open Phoenix to see your traces:
(Screenshot: Phoenix traces view)

Next Steps