
llm-observability

LLM Observability: trace every LLM call with input/output logging, token counts, latency percentiles, and cost attribution.
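The catalog entry does not document the plugin's API, but the tracing it describes can be sketched as a wrapper that records prompt/output text, token counts, latency, and cost for each call. Everything below is illustrative: the `traced` decorator, `TraceRecord` fields, whitespace tokenization, and per-token prices are assumptions, not the plugin's actual interface.

```python
import time
from dataclasses import dataclass

@dataclass
class TraceRecord:
    prompt: str
    output: str
    prompt_tokens: int
    completion_tokens: int
    latency_ms: float
    cost_usd: float

TRACES: list[TraceRecord] = []

# Hypothetical per-token prices; real cost attribution would use provider pricing.
PRICE_PER_PROMPT_TOKEN = 0.000001
PRICE_PER_COMPLETION_TOKEN = 0.000002

def traced(llm_fn):
    """Wrap an LLM call so every invocation logs tokens, latency, and cost."""
    def wrapper(prompt: str) -> str:
        start = time.perf_counter()
        output = llm_fn(prompt)
        latency_ms = (time.perf_counter() - start) * 1000
        # Naive whitespace split stands in for a real tokenizer.
        p_tok, c_tok = len(prompt.split()), len(output.split())
        cost = p_tok * PRICE_PER_PROMPT_TOKEN + c_tok * PRICE_PER_COMPLETION_TOKEN
        TRACES.append(TraceRecord(prompt, output, p_tok, c_tok, latency_ms, cost))
        return output
    return wrapper

@traced
def fake_llm(prompt: str) -> str:
    # Stub model so the sketch is self-contained.
    return "echo: " + prompt

fake_llm("hello world")
```

From the accumulated `TRACES`, latency percentiles can then be computed with e.g. `statistics.quantiles([t.latency_ms for t in TRACES], n=100)`.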

Overview

Property   Value
Type       Plugin
Version    1.0.0
Author     N/A
License    N/A

Source


Auto-generated from the FrootAI primitive catalog.
