elsatch
Langfuse
•Created by elsatch on 7/26/2024 in #get-support
Null values when using Haystack integration
18 replies

elsatch:
Not done yet! Things I've discovered:
- I was able to create traces when NOT using the Haystack integration. In particular, I managed to trace OpenAI, Ollama, and LiteLLM without problems using Langfuse's OpenAI SDK compatibility. So it looks like the problem lies in the integration.
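Roughly what that non-Haystack path looked like — a sketch, not my exact code: it assumes Langfuse's OpenAI drop-in wrapper (`langfuse.openai`), the keys and model name are placeholders, and `langfuse_env` plus the `RUN_LIVE` guard are helpers I made up for illustration:

```python
import os


def langfuse_env(public_key: str, secret_key: str,
                 host: str = "https://cloud.langfuse.com") -> dict:
    """Environment variables the Langfuse SDK reads on startup."""
    return {
        "LANGFUSE_PUBLIC_KEY": public_key,
        "LANGFUSE_SECRET_KEY": secret_key,
        "LANGFUSE_HOST": host,
    }


if os.environ.get("RUN_LIVE"):  # set RUN_LIVE=1 to make a real traced call
    # Placeholder keys -- substitute your own project keys.
    os.environ.update(langfuse_env("pk-lf-your-key", "sk-lf-your-key"))

    # Langfuse's drop-in replacement for the OpenAI client: calls made
    # through it are traced automatically, no Haystack integration involved.
    from langfuse.openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(response.choices[0].message.content)
```

The same pattern worked against Ollama and LiteLLM by pointing the client's base URL at their OpenAI-compatible endpoints.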

elsatch:
I am done for today. I tried two computers, different OSs, my own code, two cookbook examples, self-hosted Langfuse, and cloud Langfuse, and got a total of zero traces containing info using the Haystack-Langfuse integration.

elsatch:
Second computer, running Windows instead of Linux. Installed the environment from scratch using:
Torch 2.4.0 returns an error about the version not being found, but 2.3.1 works.
When launching the default script from the Haystack example:

elsatch:
Advances and checks so far:
- I have tried adding flushing to Langfuse to see if it made any difference. Still returns nulls.
- I have tried switching from LlamaCppChatGenerator to OpenAIChatGenerator, querying local Ollama as an OpenAI-compatible endpoint. Still returns nulls.
- I have tried switching from OpenAIChatGenerator calling Ollama to OpenAIChatGenerator calling GPT-4o Mini at the OpenAI endpoint. Still returns nulls in the traces.
- I have tried switching from local Langfuse to Cloud Langfuse. I am still getting nulls in my traces.
So, at this point I am quite sure there is something wrong in my code 🙂
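For reference, this is roughly how I pointed OpenAIChatGenerator at local Ollama — a sketch assuming Haystack 2.x; the model name, URL, and the `ollama_openai_config` helper plus the `RUN_LIVE` guard are mine, not part of the library:

```python
import os


def ollama_openai_config(model: str = "llama3",
                         base_url: str = "http://localhost:11434/v1") -> dict:
    """Settings for pointing an OpenAI-style client at a local Ollama server."""
    return {"model": model, "api_base_url": base_url}


if os.environ.get("RUN_LIVE"):  # set RUN_LIVE=1 to call a running Ollama server
    from haystack.components.generators.chat import OpenAIChatGenerator
    from haystack.dataclasses import ChatMessage
    from haystack.utils import Secret

    cfg = ollama_openai_config()
    generator = OpenAIChatGenerator(
        # Ollama ignores the key, but the OpenAI client requires one.
        api_key=Secret.from_token("ollama"),
        model=cfg["model"],
        api_base_url=cfg["api_base_url"],
    )
    result = generator.run([ChatMessage.from_user("Say hello")])
    print(result["replies"][0])
```

Swapping the base URL and model for the real OpenAI endpoint (the GPT-4o Mini case above) only changes the config, so it surprised me that both paths produced the same null traces.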

elsatch:
Just to clarify, I am still getting nulls in the output

elsatch:
I have modified my code to add flushing, without any significant difference. For reference, to add manual flushing:
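The shape of the change was roughly this — a sketch where `run_and_flush` is a wrapper I made up; the `flush()` call itself is the low-level Langfuse client method from the SDK docs, and the guard keeps the live part from running accidentally:

```python
import os


def run_and_flush(run_pipeline, flush):
    """Run the pipeline, then flush so batched trace events are sent
    even if the process exits right afterwards."""
    result = run_pipeline()
    flush()
    return result


if os.environ.get("RUN_LIVE"):  # set RUN_LIVE=1 to exercise the real SDK
    from langfuse import Langfuse  # low-level Langfuse client

    langfuse = Langfuse()  # reads LANGFUSE_* environment variables

    # `pipeline` stands for the Haystack pipeline built earlier (not shown).
    result = run_and_flush(
        run_pipeline=lambda: pipeline.run({}),
        flush=langfuse.flush,  # send any batched events immediately
    )
```

The idea is that a short-lived script can exit before the SDK's background thread sends its batch, so flushing explicitly after the run rules that out.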

elsatch:
I will try flushing then. Copying the reference information from the tracing docs:
If you want to send a batch immediately, you can call the flush method on the client. In case of network issues, flush will log an error and retry the batch, it will never throw an exception.
Decorator:
from langfuse.decorators import langfuse_context
langfuse_context.flush()
Low-level SDK:
langfuse.flush()
If you exit the application, use shutdown method to make sure all requests are flushed and pending requests are awaited before the process exits. On success of this function, no more events will be sent to Langfuse API.
langfuse.shutdown()

elsatch:
If anyone has faced this issue or could offer any guidance about how to solve it, I would appreciate it.

elsatch:
Outputs are printed to the console properly, and traces are recorded on the Langfuse side... but empty.

elsatch:
Code continues:

elsatch:
This is the code I'm using: