
traceAI


Open-source observability for AI applications, tracing every LLM call, prompt, and agent decision.


TL;DR - traceAI

  • Open-source AI tracing built on OpenTelemetry.
  • Captures LLM calls, prompts, tokens, and agent decisions.
  • Integrates with existing OTel-compatible observability backends.
Pricing: Free plan available
Best for: Growing teams

Pros & Cons

Pros

  • Leverages existing observability infrastructure (OpenTelemetry-native)
  • Supports multiple programming languages and AI frameworks
  • Provides deep, granular insights into AI application behavior
  • Easy to integrate with minimal code changes
  • Open-source and community-driven

Cons

  • Requires familiarity with OpenTelemetry concepts for advanced usage
  • Initial setup might involve configuring environment variables and API keys
  • Relies on external observability platforms for data visualization and analysis

Key Features

  • Standardized tracing for AI workflows using OpenTelemetry spans and attributes
  • Zero-config setup with drop-in instrumentation for minimal code changes
  • Multi-framework support with 50+ integrations across Python, TypeScript, Java, and C#
  • Vendor-agnostic, compatible with any OpenTelemetry backend (e.g., Datadog, Grafana, Jaeger)
  • Rich context capture including prompts, completions, tokens, and model parameters
  • Production-ready with async support, streaming, error handling, and performance optimization

Pricing Plans

Free

$0 USD per month

  • Open-source tracing SDKs with no usage fees
  • Works with any OpenTelemetry-compatible backend; hosting and visualization costs depend on the backend you choose
  • Community support

What is traceAI?

Editorial review
traceAI is an open-source AI tracing framework built on OpenTelemetry, designed to provide full visibility into AI applications. It captures detailed information about every LLM call, prompt, token count, retrieval step, and agent decision, transforming this data into structured traces. These traces are then sent to any OpenTelemetry-compatible backend, such as Datadog, Grafana, Jaeger, or Future AGI, allowing users to leverage their existing observability tools without needing new vendors or dashboards. The tool is ideal for developers and teams building AI applications who need deep insights into the performance and behavior of their LLMs, agents, and other AI components. It supports over 50 AI frameworks across Python, TypeScript, Java, and C#, offering zero-config tracing and consistent APIs. By providing rich context including prompts, completions, tokens, model parameters, and tool calls, traceAI helps in debugging, optimizing, and understanding complex AI workflows in production environments.


traceAI FAQ

How does traceAI ensure compatibility with various AI frameworks and programming languages?

traceAI achieves broad compatibility by offering specific instrumentors for over 50 AI frameworks across Python, TypeScript, Java, and C#. These instrumentors provide consistent APIs and are designed for zero-config setup, allowing developers to easily integrate tracing into their existing AI applications regardless of the underlying framework or language.

What kind of data does traceAI capture from an AI application, beyond basic LLM calls?

Beyond basic LLM calls, traceAI captures rich contextual data including the full prompt, completion details, token counts, model parameters, tool calls made by agents, and decisions made during agent execution. It also traces retrieval steps in RAG (Retrieval Augmented Generation) systems, providing a comprehensive view of the AI workflow.

Can traceAI be used with a custom OpenTelemetry collector or a self-hosted OpenTelemetry backend?

Yes, traceAI is built on OpenTelemetry and is designed to be vendor-agnostic. This means it can send structured traces to any OpenTelemetry-compatible backend, including custom OpenTelemetry collectors or self-hosted solutions, in addition to popular commercial offerings like Datadog, Grafana, and Jaeger.
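For example, a self-hosted OpenTelemetry Collector could receive traceAI's traces over OTLP with a standard configuration like the minimal sketch below; the `debug` exporter just prints spans, and in practice you would swap in your backend's exporter.

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  debug: {}   # prints received spans; replace with your backend's exporter

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [debug]
```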

What are the specific environment variables or configuration steps required to get traceAI working with an OpenAI application?

For an OpenAI application, you typically set three environment variables: FI_API_KEY and FI_SECRET_KEY (used for project registration, or for traceAI's own backend if you use it), plus OPENAI_API_KEY. After installing the relevant traceai-openai package, you register a tracer provider and then instrument the OpenAI client, which automatically captures tracing data.
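In code, the environment setup described above might look like the stdlib-only sketch below. The commented import paths and function signatures are assumptions inferred from the package name in the FAQ, not confirmed APIs, so verify them against the traceAI documentation.

```python
import os

# The FAQ lists three required environment variables; placeholder values here.
# In practice, set real credentials in your shell or deployment environment.
for var in ("FI_API_KEY", "FI_SECRET_KEY", "OPENAI_API_KEY"):
    os.environ.setdefault(var, "set-me")

# With credentials in place, the remaining steps from the FAQ would be roughly
# (import paths and signatures are assumptions; verify against the docs):
#
#   from fi_instrumentation import register            # hypothetical path
#   from traceai_openai import OpenAIInstrumentor      # from traceai-openai
#
#   tracer_provider = register(project_name="my-app")  # hypothetical signature
#   OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```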

Source: github.com