Self-hosted models via API
Hello, could you please tell me how to evaluate models that are self-hosted and exposed via an API?
Solution
You can use any model with Langfuse. You can log the usage either via the low-level SDKs or, for example, via the Python decorator. Find an example here: https://langfuse.com/docs/sdk/python/decorators#log-any-llm-call
Decorator-based Python Integration - Langfuse
A decorator-based integration to give you powerful tracing, evals, and analytics for your LLM application
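A minimal sketch of that decorator pattern for a self-hosted model, based on the linked docs. The model name, the placeholder completion, and the word-count "usage" numbers are illustrative assumptions; the sketch falls back to no-op stubs if `langfuse` is not installed so you can see the shape without credentials.

```python
try:
    # Real integration, per the Langfuse Python decorator docs.
    from langfuse.decorators import observe, langfuse_context
except ImportError:
    # No-op stubs so this sketch runs without langfuse installed.
    def observe(*d_args, **d_kwargs):
        if d_args and callable(d_args[0]):  # used as bare @observe
            return d_args[0]
        def wrap(fn):
            return fn
        return wrap

    class langfuse_context:
        @staticmethod
        def update_current_observation(**kwargs):
            pass


@observe(as_type="generation")
def call_self_hosted_model(prompt: str) -> str:
    # Hypothetical: replace this with an HTTP call to your own
    # endpoint, e.g. POST http://localhost:8000/v1/completions.
    completion = f"echo: {prompt}"  # placeholder response
    # Report model name and token usage so Langfuse can attribute
    # the generation to your self-hosted model.
    langfuse_context.update_current_observation(
        model="my-self-hosted-model",  # assumption: any name you choose
        usage={"input": len(prompt.split()), "output": len(completion.split())},
    )
    return completion


@observe()
def main() -> str:
    # Wrapping the entry point groups the generation into one trace.
    return call_self_hosted_model("Hello from a self-hosted model")


if __name__ == "__main__":
    print(main())
```

With real credentials set (`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`), the traced generations then show up in the Langfuse UI, where you can run evaluations on them.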