Langchain Python AzureOpenAI
I'm trying to get Langfuse to work using the first step of https://langfuse.com/docs/integrations/langchain/python
I am using AzureOpenAI as the LLM.
The problem is that I don't get the generation as the last step, and also this output.
I've tried to set up a generation manually and it worked, but not with the CallbackHandler.
Langchain integration (Python) - Langfuse
Langchain users can integrate with Langfuse in seconds using the integration
Common issue of langchain, unfortunately. You can easily overcome it by adding the handler as a callback to the LLM as well.
AzureOpenAI also takes a callbacks=[handler] argument
So you first initialize the CallbackHandler and then the llm?
yes, you init the callback handler and then pass the same handler to
1. AzureOpenAI
2. your RetrievalQA chain
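The two steps above might look roughly like this. A minimal sketch against the langchain 0.0.x / langfuse 1.x APIs discussed in this thread; the deployment name, keys, and `retriever` are placeholders, not values from this conversation:

```python
from langfuse.callback import CallbackHandler
from langchain.llms import AzureOpenAI
from langchain.chains import RetrievalQA

# Placeholder Langfuse keys — use your project's public/secret keys.
handler = CallbackHandler("pk-lf-...", "sk-lf-...")

# 1. Pass the same handler to the LLM itself.
llm = AzureOpenAI(
    deployment_name="my-deployment",  # placeholder Azure deployment
    callbacks=[handler],
)

# 2. ...and to the chain (retriever is assumed to exist already).
qa = RetrievalQA.from_chain_type(llm=llm, retriever=retriever)
result = qa.run("What is Langfuse?", callbacks=[handler])
```

The key point is that it is the *same* handler instance in both places, so the LLM generation and the chain run end up in the same trace.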
are you on the latest version of langchain and langfuse?
langfuse==1.1.15
langchain==0.0.339
Edit: Updated to the latest version of both
Initialize like this?
I get this Error
If this doesn't work, I can also do it with CreateTrace and InitialGeneration. The issue there was that the generation was shown at the top and not at the bottom; maybe I missed something that lets you link it to the LLMChain.
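For reference, the manual fallback mentioned here can be sketched with the langfuse 1.x SDK models. All names, model strings, and contents below are placeholders, not taken from this thread:

```python
from langfuse import Langfuse
from langfuse.model import CreateTrace, InitialGeneration

# Placeholder Langfuse keys.
langfuse = Langfuse("pk-lf-...", "sk-lf-...")

# Create a trace, then attach a generation to it manually.
trace = langfuse.trace(CreateTrace(name="qa-run"))
trace.generation(InitialGeneration(
    name="azure-llm-call",
    model="gpt-35-turbo",          # assumed Azure deployment/model name
    prompt="question text",
    completion="model answer",
))

langfuse.flush()  # ensure events are sent before the process exits
```

Because the generation is created by hand, its position in the trace depends on where it is attached; nesting it under the right observation (rather than directly under the trace) is what controls whether it shows up at the bottom.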
Hi, I just had a look and it seems langchain changed something about their internals which broke our handler. Can you check out our latest SDK? https://github.com/langfuse/langfuse-python/releases/tag/v1.7.1
GitHub
Release v1.7.1 Ā· langfuse/langfuse-python
What's Changed
support new azure langchain integration by @maxdeichmann in #185
Full Changelog: v1.7.0...v1.7.1
I upgraded the langfuse version with pip, but unfortunately the problem remains.
I also tried to add the handler to the callbacks of AzureOpenAI, but I get the same error as mentioned above.
Hi, sorry for the late response. Are you sure you upgraded to the latest library? I just tested again on my side and it seems to work with langchain 0.0.339
I will check tomorrow.
pip install langfuse --upgrade
should do it?
yes
Thank you for your help
Sure
So I updated langfuse and tried it with this setup:
But I still get this error:
If I don't add it as a handler to the Azure LLM, I still get the problem with
KeyError: 'engine'
and run not found
Probably I have to create a new project and test it out there, in case the problem is in the project and not on the Langfuse side.
I get the trace when I don't add the handler to AzureChatOpenAI, and only the generation is missing.
Is this all of the stacktrace?
Not initializing Azure callback:
Also, the output is the same if I call the chain like this:
result = qa.invoke({"question": prompt, "chat_history": chat_history}, config={"callbacks": [handler]})
or like this
result = qa({"question": prompt, "chat_history": chat_history}, callbacks=[handler])
Thanks, I am going through this right now.