
AI observability beyond Python and TypeScript

22 December 2025 · Ornella Altunyan · 3 min

Most AI observability tools only support Python and TypeScript. That's a problem for banks adding LLM features to Java backends, infrastructure teams building AI tooling in Go, and startups shipping products in Ruby or C#. These developers are left with two options: build custom instrumentation from scratch, or bolt on generic tracing tools that don't understand AI-specific concepts like token usage, costs, or prompt/completion pairs.

We've heard this directly from customers, so we built native SDKs for Java, Go, Ruby, and C#, all based on OpenTelemetry for vendor-neutral observability.

Every SDK provides:

  • Automatic tracing that fits into existing infrastructure. Export traces to Braintrust, Datadog, Honeycomb, or any OTLP-compatible backend (see the setup sketch after this list).
  • Client wrappers for OpenAI, Anthropic, and other providers that automatically capture inputs, outputs, latency, token usage, and costs.
  • Evaluation frameworks for running evals in CI/CD with custom scorers.
  • Prompt management for fetching prompts from Braintrust at runtime.
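
Because every SDK emits standard OpenTelemetry spans, traces flow through whatever OTLP pipeline you already run. As a rough sketch (not SDK-specific code, and with a placeholder endpoint), here's what pointing the standard Go OTLP exporter at a backend looks like:

go
import (
    "context"
    "log"

    "go.opentelemetry.io/otel"
    "go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp"
    sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func initTracing(ctx context.Context) func() {
    // Export spans over OTLP/HTTP to any compatible backend
    // (Braintrust, Datadog, Honeycomb, or a local collector).
    // localhost:4318 is the OTLP/HTTP default; swap in your backend's endpoint.
    exporter, err := otlptracehttp.New(ctx,
        otlptracehttp.WithEndpoint("localhost:4318"),
        otlptracehttp.WithInsecure(),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Register a tracer provider that batches and exports spans.
    tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
    otel.SetTracerProvider(tp)

    // Flush any remaining spans on shutdown.
    return func() { _ = tp.Shutdown(ctx) }
}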

Get started

Each SDK is open source and available now.

Here's how to instrument an OpenAI client, using the Go SDK as an example:

go
import (
    traceopenai "github.com/braintrustdata/braintrust-sdk-go/trace/contrib/openai"
    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

// Wrap your OpenAI client with automatic tracing
client := openai.NewClient(
    option.WithMiddleware(traceopenai.NewMiddleware()),
)

// Use the client as normal - all calls automatically traced
resp, _ := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
    Messages: []openai.ChatCompletionMessageParamUnion{
        openai.UserMessage("Explain quantum computing"),
    },
    Model: openai.ChatModelGPT4oMini,
})
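
Under the hood, the wrapper is essentially an HTTP middleware that opens a span around each API call. The real middleware also parses request and response bodies to attach inputs, outputs, token usage, and cost; the following is only a rough sketch of the pattern (the function name and attributes are illustrative), not the shipped implementation:

go
import (
    "net/http"
    "time"

    "github.com/openai/openai-go/option"
    "go.opentelemetry.io/otel"
    "go.opentelemetry.io/otel/attribute"
)

// Simplified tracing middleware: start a span, forward the request,
// and record a couple of basic attributes.
func tracingMiddleware(req *http.Request, next option.MiddlewareNext) (*http.Response, error) {
    ctx, span := otel.Tracer("openai").Start(req.Context(), "openai.request")
    defer span.End()

    start := time.Now()
    resp, err := next(req.WithContext(ctx))

    span.SetAttributes(
        attribute.String("http.route", req.URL.Path),
        attribute.Int64("latency_ms", time.Since(start).Milliseconds()),
    )
    if err != nil {
        span.RecordError(err)
    }
    return resp, err
}

It plugs in the same way as the SDK's middleware: openai.NewClient(option.WithMiddleware(tracingMiddleware)).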

Check out the README for each SDK for full documentation and examples. If you have questions or run into issues, reach out on Discord.

