Tracy, a new open-source Kotlin library from JetBrains, lets developers trace, monitor, and evaluate AI-powered features directly from their Kotlin or Java projects.
JetBrains has unveiled Tracy, an AI tracing library for the Kotlin and Java programming languages.
Released March 11 and available on GitHub, Tracy lets developers trace, monitor, and evaluate AI-powered features directly from their Kotlin or Java projects, according to JetBrains. The open-source Kotlin library provides a unified API for collecting structured traces, helping developers debug issues, measure latency, and monitor large language model (LLM) behavior across model calls, tool invocations, and custom application logic.
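The announcement does not show Tracy's actual API, but the core idea of a structured trace span — a named unit of work carrying timing information and key/value attributes — can be sketched in plain Java. All class, method, and attribute names below are illustrative, not Tracy's real API:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: a minimal "span" records the kind of data a tracing
// library captures for one unit of work (an LLM call, a tool invocation,
// or a piece of custom application logic).
class SimpleSpan {
    final String name;
    final Instant start = Instant.now();
    final Map<String, Object> attributes = new LinkedHashMap<>();
    Instant end;

    SimpleSpan(String name) { this.name = name; }

    SimpleSpan attr(String key, Object value) {
        attributes.put(key, value);
        return this;
    }

    void finish() { end = Instant.now(); }

    Duration duration() { return Duration.between(start, end); }
}

public class TraceSketch {
    public static void main(String[] args) {
        SimpleSpan span = new SimpleSpan("chat completion");
        // ... the actual model call would happen here ...
        span.attr("model", "gpt-4o")      // hypothetical model name
            .attr("input_tokens", 42)     // token consumption
            .attr("output_tokens", 128);
        span.finish();
        System.out.println(span.name + " took " + span.duration().toMillis() + " ms");
        System.out.println(span.attributes);
    }
}
```

A real tracing library would additionally nest spans into a tree (a model call inside a request, a tool invocation inside the model call) and export the finished spans to a back end rather than printing them.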
Tracy follows the OpenTelemetry Generative AI Semantic Conventions for span attributes and event naming, ensuring that its traces are compatible with any OpenTelemetry-compatible back end. JetBrains cited the following uses for Tracy:
- Monitoring AI client interactions to record messages, associated costs, token consumption, and overall execution duration.
- Observing any function’s activity to document its inputs, outputs, and how long it takes to execute.
- Manually generating and overseeing spans.
- Transferring traces to compatible back-end platforms (currently including Langfuse and Weave).
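Following the OpenTelemetry GenAI semantic conventions means a model-call span carries well-known `gen_ai.*` attribute keys, which is what makes the traces portable across back ends such as Langfuse and Weave. The attribute names below come from those conventions; the surrounding helper is a sketch, not Tracy's API, and the model name is a placeholder:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class GenAiAttributes {
    // Builds the attribute map a chat-call span would carry under the
    // OpenTelemetry GenAI semantic conventions. The gen_ai.* keys are
    // convention-defined names; this helper itself is illustrative.
    static Map<String, Object> chatCallAttributes(
            String system, String model, long inputTokens, long outputTokens) {
        Map<String, Object> attrs = new LinkedHashMap<>();
        attrs.put("gen_ai.operation.name", "chat");
        attrs.put("gen_ai.system", system);                    // e.g. "openai", "anthropic"
        attrs.put("gen_ai.request.model", model);
        attrs.put("gen_ai.usage.input_tokens", inputTokens);   // token consumption
        attrs.put("gen_ai.usage.output_tokens", outputTokens);
        return attrs;
    }

    public static void main(String[] args) {
        chatCallAttributes("openai", "gpt-4o", 42, 128)
                .forEach((k, v) -> System.out.println(k + " = " + v));
    }
}
```

Because any OpenTelemetry-compatible back end recognizes these keys, a trace recorded this way can be queried for cost and token usage without back-end-specific instrumentation.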
Released under the Apache 2.0 License, Tracy supports Kotlin 2.0.0 and later and Java 17 and later, and offers integrations with the OpenAI, Anthropic, and Gemini SDKs. JetBrains said the library also works out of the box with popular Kotlin/LLM stacks, including OkHttp and Ktor clients.