Rise
Rise • 11mo ago

Langchain Python AzureOpenAi

Trying to get langfuse to work using the first step of https://langfuse.com/docs/integrations/langchain/python. I am using AzureOpenAI as the LLM. The problem is that I don't get the generation as the last step, and I also don't get this output. I've tried to set up a generation manually and it worked, but not with the CallbackHandler.
Langchain integration (Python) - Langfuse
Langchain users can integrate with Langfuse in seconds using the integration
51 Replies
Marc
Marc • 11mo ago
Common issue of langchain, unfortunately. You can easily overcome it by adding the handler as a callback to the LLM as well; AzureOpenAI also takes a callbacks=[handler] argument.
Rise
Rise • 11mo ago
So you first initialize the CallbackHandler and then the llm?
Marc
Marc • 11mo ago
Yes, you init the callback handler and then pass the same handler to 1. AzureOpenAI and 2. your RetrievalQA chain. Are you on the latest versions of langchain and langfuse?
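The wiring described above can be sketched with stdlib-only stubs; StubHandler, StubLLM, and StubChain are invented stand-ins for illustration, not real langchain or langfuse classes. The point is that a single handler instance is created first and then passed to both the LLM and the chain, so both levels report into the same trace:

```python
# Illustrative stubs only; the real AzureChatOpenAI and chain classes from
# langchain are assumed but not used here. What matters is the wiring: one
# handler instance shared by both the LLM and the chain.
class StubHandler:
    def __init__(self):
        self.events = []

    def record(self, source):
        self.events.append(source)


class StubLLM:
    def __init__(self, callbacks):
        self.callbacks = callbacks

    def generate(self, prompt):
        for cb in self.callbacks:
            cb.record("llm")
        return prompt.upper()


class StubChain:
    def __init__(self, llm, callbacks):
        self.llm = llm
        self.callbacks = callbacks

    def invoke(self, prompt):
        for cb in self.callbacks:
            cb.record("chain")
        return self.llm.generate(prompt)


handler = StubHandler()                      # init the handler first...
llm = StubLLM(callbacks=[handler])           # ...then pass it to the LLM...
chain = StubChain(llm, callbacks=[handler])  # ...and to the chain

chain.invoke("hello")
print(handler.events)  # ['chain', 'llm'] - both levels hit the same handler
```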
Rise
Rise • 11mo ago
langfuse==1.1.15, langchain==0.0.339. Edit: updated to the latest versions of both. Initialize like this?
from langchain.chat_models import AzureChatOpenAI

llm = AzureChatOpenAI(
    temperature=0.5,
    deployment_name="gpt-35-turbo",
    max_tokens=700,
    callback=[handler],
)
I get this error:
'engine'
Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 498, in __on_llm_action
model_name = kwargs["invocation_params"]["engine"]
~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^
KeyError: 'engine'
UUID('2e33c0b5-08e0-4607-8e81-af1784885be6')
Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langchain\chat_models\base.py", line 339, in generate
self._generate_with_cache(
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langchain\chat_models\base.py", line 492, in _generate_with_cache
return self._generate(
^^^^^^^^^^^^^^^
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langchain\chat_models\openai.py", line 422, in _generate
response = self.completion_with_retry(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langchain\chat_models\openai.py", line 344, in completion_with_retry
return self.client.create(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\openai\_utils\_utils.py", line 299, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: Completions.create() got an unexpected keyword argument 'callback'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 585, in on_llm_error
self.runs[run_id] = self.runs[run_id].update(UpdateGeneration(endTime=datetime.now(), statusMessage=str(error), level=ObservationLevel.ERROR, version=self.version))
~~~~~~~~~^^^^^^^^
KeyError: UUID('cfcdd27e-a76c-4941-a4f8-ee8ff5fa1996')
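The TypeError at the bottom of this traceback is the key clue: langchain forwards unrecognized constructor kwargs through to the OpenAI client, which rejects keys it does not know. The accepted parameter on langchain model classes is callbacks (plural); a singular callback falls through as a model kwarg. A stdlib-only sketch of that mechanism (create and completion_with_retry here are simplified stand-ins, not the real openai/langchain functions):

```python
# Stand-in for the client's create(): accepts only known keyword arguments.
def create(*, model, temperature=1.0, max_tokens=None):
    return {"model": model, "temperature": temperature, "max_tokens": max_tokens}

# Stand-in for completion_with_retry(): forwards kwargs untouched, so an
# extra key like "callback" reaches create() and raises TypeError there.
def completion_with_retry(**kwargs):
    return create(**kwargs)

try:
    completion_with_retry(model="gpt-35-turbo", callback=["handler"])
except TypeError as exc:
    print(exc)  # create() got an unexpected keyword argument 'callback'
```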
If this doesn't work, I can also build it with CreateTrace and InitialGeneration. The issue there was that the generation was shown at the top and not at the bottom. Maybe I missed something about how you can link it to the LLMChain.
Max
Max • 11mo ago
Hi, I just had a look and it seems langchain changed something about their internals which broke our handler. Can you check out our latest SDK? https://github.com/langfuse/langfuse-python/releases/tag/v1.7.1
GitHub
Release v1.7.1 Ā· langfuse/langfuse-python
What's Changed: support new azure langchain integration by @maxdeichmann in #185. Full Changelog: v1.7.0...v1.7.1
Rise
Rise • 11mo ago
I upgraded the langfuse version with pip but unfortunately the problem remains. I also tried to add the handler to the callback of AzureOpenAI, but I get the same error as mentioned above.
Max
Max • 11mo ago
Hi, sorry for the late response. Are you sure you upgraded to the latest library? I just tested again on my side and it seems to work with langchain 0.0.339.
Rise
Rise • 11mo ago
I will check tomorrow. pip install langfuse --upgrade should do it?
Max
Max • 11mo ago
yes
Rise
Rise • 11mo ago
Thank you for your help 🙂
Max
Max • 11mo ago
Sure 🙂
Rise
Rise • 11mo ago
So I updated langfuse and tried it with this setup:
config = configparser.ConfigParser()
config.read(FILE_PATH)
env_public_key = config["Data"]["PUBLIC_KEY"]
env_secret_key = config["Data"]["SECRET_KEY"]

handler = CallbackHandler(env_public_key, env_secret_key)

llm = AzureChatOpenAI(
    temperature=0.5,
    deployment_name="gpt-35-turbo",
    max_tokens=700,
    callback=[handler],
)

qa = self.initialise_conversational(
    llm=llm,
    retriever=retriever,
)

result = qa.invoke({"question": prompt, "chat_history": chat_history}, config={"callbacks": [handler]})
But I still get this error:
Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 589, in on_llm_error
self.runs[run_id] = self.runs[run_id].update(UpdateGeneration(endTime=datetime.now(), statusMessage=str(error), level=ObservationLevel.ERROR, version=self.version))
~~~~~~~~~^^^^^^^^
KeyError: UUID('b353fe98-0007-417a-a433-deb63c15f9f5')
If I don't add it as a handler to the Azure LLM, I still get the problem with KeyError: 'engine' and "run not found". Probably I should create a new project and test it out there, in case the problem is in the project and not on the langfuse side. I get the trace when I don't add the handler to AzureChatOpenAI; only the generation is missing.
Max
Max • 11mo ago
Is this all of the stacktrace?
Rise
Rise • 11mo ago
KeyError: UUID('8719e5fb-c465-4f10-9a76-79a5c2f44737')
Loaded vectordb successfully
WARNING! callback is not default parameter.
callback was transferred to model_kwargs.
Please confirm that callback is what you intended.
WARNING! callback is not default parameter.
callback was transferred to model_kwargs.
Please confirm that callback is what you intended.


> Entering new LLMChain chain...
Prompt after formatting:
Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question, in its original language.

Chat History:

Assistant: How can I help?
response = self.completion_with_retry(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langchain\chat_models\openai.py", line 344, in completion_with_retry
return self.client.create(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\openai\_utils\_utils.py", line 299, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: Completions.create() got an unexpected keyword argument 'callback'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 589, in on_llm_error
self.runs[run_id] = self.runs[run_id].update(UpdateGeneration(endTime=datetime.now(), statusMessage=str(error), level=ObservationLevel.ERROR, version=self.version))
~~~~~~~~~^^^^^^^^
KeyError: UUID('b353fe98-0007-417a-a433-deb63c15f9f5')
Stopping...
(venv) PS C:\Users\gerlor\Code\DataLake4LLM>
Not initializing Azure callback:
'engine'
Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 493, in __on_llm_action
model_name = kwargs["invocation_params"]["engine"]
~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^
KeyError: 'engine'
run not found
Traceback (most recent call last):
File "C:\Users\gerlor\Code\DataLake4LLM\venv\Lib\site-packages\langfuse\callback.py", line 568, in on_llm_end
raise Exception("run not found")
Exception: run not found

> Finished chain.
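The KeyError: 'engine' in this log comes from the handler reading invocation_params with a single hard-coded key, while newer langchain versions apparently report the Azure model under a different key (which is what the v1.7.1 fix targets). A stdlib-only sketch of the failure and a more defensive lookup follows; the alternative key names "model" and "model_name" are assumptions for illustration, not confirmed langchain keys:

```python
# invocation_params as a newer langchain might emit them: no "engine" key.
invocation_params = {"model": "gpt-35-turbo", "temperature": 0.5}

try:
    model_name = invocation_params["engine"]  # the hard-coded lookup that failed
except KeyError as exc:
    print("missing key:", exc)  # missing key: 'engine'

# Defensive variant: try several plausible keys before falling back.
model_name = next(
    (invocation_params[key]
     for key in ("engine", "model", "model_name")
     if key in invocation_params),
    "unknown",
)
print(model_name)  # gpt-35-turbo
```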
Also, the output is the same whether I call the chain like this: result = qa.invoke({"question": prompt, "chat_history": chat_history}, config={"callbacks": [handler]}) or like this: result = qa({"question": prompt, "chat_history": chat_history}, callbacks=[handler])
Max
Max • 11mo ago
Thanks, I am going through this rn.