David Alonso
Does Langfuse require being run in a Node environment?
I'll try to dig a bit, though my question is really that the guide using vercel/otel lives in the Next.js section, while in my case the LLM call doesn't run on a Next.js server, so I'm not sure how to get around that.
13 replies
The issue I'm having is that I want to run Vercel AI + Langfuse inside a Convex action (see the link I shared in the original post). When I visited these docs: https://langfuse.com/docs/integrations/vercel-ai-sdk I leaned towards the Node.js guide, but that comes with some performance losses on the Convex side. Ideally I'd use the Next.js guide, but I'm not sure how to get that to work with Convex; the next.config.js part in particular feels like it wouldn't work well.
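For reference, a minimal sketch of the Node.js path from the linked docs, i.e. registering Langfuse's OTel exporter with the generic OpenTelemetry NodeSDK instead of relying on @vercel/otel and next.config.js. The model and prompt are placeholders, and whether this setup/teardown pattern is cheap enough inside a Convex action is exactly the open question above:

```typescript
// Instrumentation sketch (assumes LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY,
// and LANGFUSE_BASEURL are set in the environment).
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseExporter } from "langfuse-vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

// Register the Langfuse exporter with a plain Node OTel SDK —
// no @vercel/otel, no next.config.js instrumentation hook.
const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter(),
});
sdk.start();

const { text } = await generateText({
  model: openai("gpt-4o-mini"), // placeholder model
  prompt: "Hello",
  // Opt in to AI SDK telemetry so spans reach the exporter.
  experimental_telemetry: { isEnabled: true },
});
console.log(text);

// Flush pending spans before the process (or action) ends.
await sdk.shutdown();
```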
This might be relevant: https://github.com/evanderkoogh/otel-cf-workers
Traces take too long to show up
This is an example of the trace: https://cloud.langfuse.com/project/clzgvmdsq0016dca36phntcd5/traces/30186174433271e650b5b0473481da4e?observation=5da0dd0d2118b149
Not sure I understand the structure in the right sidebar. When I click into it, it looks like there's duplication, unless the LLM is actually going through the flow twice...
6 replies