David Alonso
Langfuse
Created by David Alonso on 8/21/2024 in #support
Does Langfuse require being run in a Node environment?
I'll try to dig a bit, though my question is more about the fact that the guide using vercel/otel is in the Next.js section, while in my case the LLM call doesn't run on a Next.js server, so I'm not sure how to get around that.
13 replies
FYI, as much as we like Langfuse, this is the main reason for us to consider something like Helicone, which we know works easily in Convex's edge runtime.
any thoughts @Marc ?
The issue I'm having is that I want to run the Vercel AI SDK + Langfuse inside a Convex action (see the link I shared in the original post). When I visited these docs: https://langfuse.com/docs/integrations/vercel-ai-sdk I leaned towards the Node.js guide, but that comes with some performance losses on the Convex side. Ideally I'd use the Next.js guide, but I'm not sure how to get that working with Convex; specifically, the next.config.js part feels like it wouldn't work well.
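For reference, the Next.js guide essentially boils down to registering the exporter once in instrumentation.ts via @vercel/otel; a minimal sketch of what I mean, assuming the langfuse-vercel package from the docs linked above (the serviceName is just an arbitrary label):

```typescript
// instrumentation.ts at the project root (Next.js loads this automatically)
import { registerOTel } from "@vercel/otel";
import { LangfuseExporter } from "langfuse-vercel";

export function register() {
  // Routes all OpenTelemetry spans (including Vercel AI SDK telemetry
  // emitted via experimental_telemetry) to Langfuse, without managing
  // a NodeSDK instance yourself.
  registerOTel({
    serviceName: "langfuse-vercel-ai-example",
    traceExporter: new LangfuseExporter(),
  });
}
```

and if I read the guide right, the next.config.js part just enables Next.js's experimental instrumentationHook flag, which is exactly the piece that doesn't map onto Convex's runtime.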
Langfuse
Created by David Alonso on 8/13/2024 in #support
Traces take too long to show up
This is an example of the trace: https://cloud.langfuse.com/project/clzgvmdsq0016dca36phntcd5/traces/30186174433271e650b5b0473481da4e?observation=5da0dd0d2118b149 Not sure I understand the structure in the right sidebar; when I click into it, it looks like there's duplication, unless the LLM is actually going through the flow twice.
6 replies
hmm okay. Doing something like this:
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { LangfuseExporter } from "langfuse-vercel";
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";
import { v } from "convex/values";
// authenticatedAction, zodToConvex, zWorkspaceCollectionsInfo,
// createBlockFromPromptAls, and FireviewAITools are project-local modules.

const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter({
    debug: process.env.NODE_ENV === "development",
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

const groq = createOpenAI({
  baseURL: "https://api.groq.com/openai/v1",
  apiKey: process.env.GROQ_API_KEY,
});

export const createBlockFromPromptVercel = authenticatedAction({
  args: {
    userPrompt: v.string(),
    clerkOrgId: v.string(), // referenced in the handler below, so it must be declared
    blockId: v.id("blocks"),
    workspaceCollectionInfo: zodToConvex(zWorkspaceCollectionsInfo),
  },
  handler: async (ctx, args) => {
    const result = await createBlockFromPromptAls.run(
      {
        ctx: ctx,
        clerkOrgId: args.clerkOrgId,
        blockId: args.blockId,
        workspaceCollectionInfo: args.workspaceCollectionInfo,
      },
      () =>
        generateText({
          model: groq("llama-3.1-70b-versatile"),
          prompt: args.userPrompt,
          maxToolRoundtrips: 5, // allow up to 5 tool roundtrips
          experimental_telemetry: {
            isEnabled: true,
            functionId: "createBlockFromPrompt",
            metadata: {
              environment: process.env.NODE_ENV,
            },
          }, // langfuse telemetry
          tools: {
            getFields: FireviewAITools.getFields,
            createTableBlockOneShot: FireviewAITools.createTableBlockOneShot,
          },
        })
    );

    // Save messages, etc

    console.log(result.text);

    await sdk.shutdown(); // Flushes the trace to Langfuse
  },
});