Can we get end-to-end tracing of our LLM requests?

To get end-to-end tracing of your LLM requests, you can use Middleware’s LLM Observability. Here are the steps to set it up:

  1. Install the Traceloop or OpenLIT SDK, depending on your tech stack.

  2. Refer to the LLM Observability documentation for guidance.

  3. Navigate to the platform and select the LLM Observability section from the left-hand menu.

  4. You will find all the traces displayed on the LLM Traces dashboard.

  5. Click on any trace to view detailed insights, including the request duration, total tokens used, the type of LLM request, and the model used.

  6. Additionally, you can explore the flame graph, map, and waterfall views of the entire request, along with the spans.
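For step 1, a minimal sketch of instrumenting a Python app with the Traceloop SDK is shown below. The app name and the import-guard structure are illustrative assumptions, not values from Middleware’s documentation; consult the LLM Observability docs for the exact configuration (e.g. endpoint and API key) your setup requires.

```python
# Hedged sketch: initialize the Traceloop SDK so LLM calls are traced.
# "my-llm-app" is a placeholder app name, not a required value.
def init_llm_tracing(app_name: str = "my-llm-app") -> bool:
    """Try to enable LLM tracing; return True if the SDK initialized."""
    try:
        from traceloop.sdk import Traceloop
    except ImportError:
        # SDK not installed yet: `pip install traceloop-sdk`
        return False
    # After init, supported LLM client calls (OpenAI, Anthropic, etc.)
    # are auto-instrumented and exported as traces.
    Traceloop.init(app_name=app_name)
    return True

ready = init_llm_tracing()
print("LLM tracing enabled:", ready)
```

Once the SDK is initialized and your app makes LLM requests, the resulting traces appear on the LLM Traces dashboard described in steps 3–6.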

(Screenshot: LLM Traces dashboard)