⏰ Observability
Last updated
Note: BeyondLLM currently supports observability only for OpenAI models.
Observability is required to monitor and evaluate the performance and behaviour of your pipeline. Some key features that observability offers are:
Tracking metrics: This includes response time, token usage, and the kind of API call (embedding, LLM, etc.).
Analyzing input and output: Looking at the prompts users provide and the responses the LLM generates can provide valuable insights.
Overall, LLM observability is a crucial practice for anyone developing or using large language models. It helps to ensure that these powerful tools are reliable, effective, and continuously monitored.
BeyondLLM offers an observability layer with the help of Phoenix. We have integrated Phoenix within our library, so you can run the dashboard with just a single command.
First, import the observe module from beyondllm.
Then, create an instance of observe.Observer().
Finally, run the Observer object and voila, your dashboard is running. Every API call you make will be reflected on your dashboard.