Debug Langchain integration
Hi all, I still haven't managed to solve this. Any idea what could be wrong, please?
I can see in the console that the LLM is outputting (see below). I am running v1.24.2.
I am passing the LangfuseCallback to every model I create and every chain, but I'm getting no output.
Hi @faileon, happy to help debug this
do you pass the callback to the invoke?
usually with LCEL that's the only thing that's necessary
Hi, thanks mate, appreciate it.
I am passing the handler to all my invokes and chains,
for example
So I am attaching the handler to every possible call, including LLM creation; I don't know if that is even necessary.
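Since the original snippet didn't come through, here is a dependency-free sketch of the pattern being discussed. `TracingHandler` and `FakeChain` are stand-ins (not langfuse's or langchain's real classes) so it runs without either library installed; only the calling convention, `config={"callbacks": [handler]}` at invoke time, mirrors the LCEL/Langfuse shape. It illustrates why passing the handler per-invoke is usually enough: the chain forwards the config-supplied callbacks to the whole run, so attaching it again at model construction shouldn't be required.

```python
# Dependency-free sketch: TracingHandler and FakeChain are hypothetical
# stand-ins for langfuse's CallbackHandler and an LCEL chain.

class TracingHandler:
    """Records start/end events, the way a tracing callback would."""
    def __init__(self):
        self.events = []

    def on_chain_start(self, name, inputs):
        self.events.append(("start", name, inputs))

    def on_chain_end(self, name, outputs):
        self.events.append(("end", name, outputs))


class FakeChain:
    """Minimal chain of steps that reads callbacks from the per-invoke
    config and reports the run to each of them."""
    def __init__(self, name, steps):
        self.name = name
        self.steps = steps  # list of (step_name, fn) pairs

    def invoke(self, value, config=None):
        callbacks = (config or {}).get("callbacks", [])
        for cb in callbacks:
            cb.on_chain_start(self.name, value)
        for _step_name, fn in self.steps:
            value = fn(value)
        for cb in callbacks:
            cb.on_chain_end(self.name, value)
        return value


handler = TracingHandler()
chain = FakeChain("qa", [("upper", str.upper), ("exclaim", lambda s: s + "!")])

# Handler is passed once, at invoke time, via config:
result = chain.invoke("hi", config={"callbacks": [handler]})
```

The point of the sketch: the handler never has to be baked into the chain's steps, because `invoke` threads it through from the config.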
My only guess at this point is to try updating Langfuse and/or LangChain; I noticed there were some improvements regarding token tracking and null-output issues.
I am currently running the following package versions:
and my langfuse server is
Hi, I am also facing a similar problem. I followed the LCEL implementation from the documentation, including the handler under `config` when invoking the chain. However, in the Langfuse traces UI I am getting null for every output.