How does Latitude help reduce AI behavior drift and unexpected product breaks from prompt changes?
Latitude addresses AI behavior drift and unexpected breaks through its observability, issue-discovery, and evaluation features. It captures real inputs, outputs, and context from live traffic, producing full traces of your AI's behavior. Teams can then pinpoint exactly where and why a small prompt change is causing issues, turning production failures into actionable signals for fixes rather than guesswork.
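The capture side of this can be illustrated with a minimal sketch. The decorator below is hypothetical (it is not Latitude's API); it only shows the kind of signal an observability layer records from live traffic: inputs, output, and latency per call.

```python
import time
import functools

TRACE_LOG = []  # stand-in for a trace store; a real platform persists these

def traced(fn):
    """Record inputs, output, and latency for each call — the raw
    material for spotting behavior drift after a prompt change."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = fn(*args, **kwargs)
        TRACE_LOG.append({
            "fn": fn.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "latency_s": round(time.time() - start, 3),
        })
        return result
    return wrapper

@traced
def answer(question: str) -> str:
    return f"stub answer to: {question}"  # stand-in for a real LLM call

answer("What changed after the prompt edit?")
```

With full traces like these, a regression shows up as a concrete diff in recorded outputs rather than an anecdote.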
What is GEPA and how does Latitude leverage it for faster prompt iteration?
GEPA (Genetic-Pareto; Agrawal et al., 2025) is a prompt-optimization technique that iteratively evolves prompts, using an LLM to reflect in natural language on execution traces and propose targeted rewrites. Latitude integrates this method to automate prompt refinement, which it credits for 8x faster prompt iteration: developers can test and improve their LLM inputs for performance and reliability without extensive manual effort.
Can Latitude be deployed within a private infrastructure for high-volume or sensitive AI products?
Yes, Latitude offers an Enterprise plan that supports deployment within your own infrastructure. This option is designed for high-volume products or those with specific security and compliance requirements, providing full control over data and environment while still leveraging Latitude's reliability features.
What is the difference in log retention and trace limits between the Team and Scale plans?
The Team plan includes 90-day log retention and allows for 200,000 traces per month, with additional traces available at $10 per 100,000. The Scale plan, on the other hand, offers unlimited log retention and supports 1 million traces per month, also with extra traces at $10 per 100,000. This distinction caters to varying operational scales and data retention needs.
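The overage math is simple enough to write down. One assumption is labeled in the code: the plans state only the $10-per-100,000 rate, so billing in whole 100k blocks (rather than prorated) is a guess for illustration.

```python
import math

def monthly_trace_overage(traces_used: int, included: int,
                          rate_usd: float = 10.0, block: int = 100_000) -> float:
    """Cost of traces beyond the plan's included quota. Assumes overage
    is billed in full 100k blocks; the plans only state the $10/100k rate."""
    extra = max(0, traces_used - included)
    return math.ceil(extra / block) * rate_usd

# Team plan (200k included) at 350k traces: 150k extra -> 2 blocks -> $20
team_overage = monthly_trace_overage(350_000, included=200_000)
# Scale plan (1M included) at 350k traces: within quota -> $0
scale_overage = monthly_trace_overage(350_000, included=1_000_000)
```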
How does Latitude's 'Issue discovery' feature function to pinpoint problems in LLM applications?
Latitude's 'Issue discovery' feature works by analyzing the captured inputs, outputs, and context from live traffic, combined with human feedback and evaluation results. It automatically identifies patterns and anomalies that indicate failures or suboptimal performance in your LLM system, helping teams quickly pinpoint the root cause of problems and prioritize fixes.
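At its simplest, this kind of analysis is an aggregation over trace records. The sketch below is hypothetical (the record fields and the feature's real logic are assumptions, not Latitude's schema); it shows the basic idea of combining error signals with human feedback and ranking prompt versions by failure count.

```python
from collections import Counter

# Hypothetical trace records; real traces carry far more context.
traces = [
    {"prompt_version": "v3", "error": "timeout", "rating": None},
    {"prompt_version": "v3", "error": None, "rating": "thumbs_down"},
    {"prompt_version": "v2", "error": None, "rating": "thumbs_up"},
    {"prompt_version": "v3", "error": "timeout", "rating": None},
]

def failure_signals(records):
    """Count failure signals (runtime errors plus negative human feedback)
    per prompt version, ranking the versions most in need of a fix."""
    counts = Counter(
        r["prompt_version"]
        for r in records
        if r["error"] is not None or r["rating"] == "thumbs_down"
    )
    return counts.most_common()

ranked = failure_signals(traces)  # "v3" surfaces first with 3 signals
```

Ranking by signal density is what turns a pile of raw traces into a prioritized fix list.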
Is it possible to self-host Latitude for free, and what features are included in the open-source version?
Yes. Latitude is open source, so users can self-host the platform's primary features free of charge. Self-hosting provides full control over the deployment environment and includes the core functionality for managing and observing LLM applications. This makes it a good fit for teams that prefer an on-premise solution or want to explore the platform's capabilities without an immediate financial commitment.