Murali
Murali3w ago

Vercel AI SDK with Svelte

Hi, I am using the Vercel AI SDK with Svelte. To configure tracing I used the NodeSDK and created an instrumentation.ts file:

```ts
export const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter({
    secretKey: env.LANGFUSE_SECRET_KEY,
    publicKey: env.LANGFUSE_PUBLIC_KEY,
    baseUrl: env.LANGFUSE_BASE_URL,
    debug: true,
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});
```

For the traces to work, how should I start and shut down this SDK? If I do this before I trigger AI SDK functions, it sometimes captures the traces and sometimes it doesn't. Any help is much appreciated.
11 Replies
Murali
Murali2w ago
@Marc - can you help me with this query?
Marc
Marc2w ago
Hi @Murali, how do you deploy the application, or is this on localhost? Many inconsistencies are caused by the shutdown behavior of serverless environments. `await sdk.shutdown()` via the OTel Node SDK should solve this, though.
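Marc's point can be sketched without any OpenTelemetry dependency. The stub `StubSdk` below is a hypothetical stand-in for the NodeSDK instance from the instrumentation file, and `handler` for a serverless request handler; the key idea is that the flush must complete before the function returns:

```typescript
// Stub illustrating "await the flush before the handler returns".
type Span = { name: string };

class StubSdk {
  private pending: Span[] = [];
  record(name: string) {
    this.pending.push({ name });
  }
  // Like NodeSDK.shutdown(): exports whatever is still buffered.
  async shutdown(): Promise<Span[]> {
    const exported = this.pending;
    this.pending = [];
    return exported;
  }
}

async function handler(sdk: StubSdk): Promise<number> {
  sdk.record("generateText");            // work that produces spans
  const exported = await sdk.shutdown(); // flush completes before return
  return exported.length;
}
```

If the handler returned without awaiting, a serverless platform could freeze the process with spans still sitting in the buffer, which would match the intermittent traces described above.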
Murali
Murali2w ago
Hi @Marc - thanks for the response. `sdk.shutdown()` didn't work for flushing. What I realized is that the LangfuseExporter has a `forceFlush` function, so I changed my instrumentation file like this:

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { LangfuseExporter } from "langfuse-vercel";
import { env } from "$env/dynamic/private";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";

const exporter = new LangfuseExporter({
  secretKey: env.LANGFUSE_SECRET_KEY,
  publicKey: env.LANGFUSE_PUBLIC_KEY,
  baseUrl: env.LANGFUSE_BASE_URL,
  debug: true,
});

// Create a BatchSpanProcessor with more frequent exports
const spanProcessor = new BatchSpanProcessor(exporter, {
  maxExportBatchSize: 3,     // export after 3 spans (adjust as needed)
  scheduledDelayMillis: 500, // export every 500 ms (adjust as needed)
});

export const sdk = new NodeSDK({
  spanProcessor, // wire in the batch processor (it wraps the exporter)
  instrumentations: [getNodeAutoInstrumentations()],
});

export async function flush() {
  await exporter.forceFlush();
  console.log("from instrumentation - flush");
}

export function initializeTracing() {
  sdk.start();
  console.log("from instrumentation - tracing initialized");
}

export function shutdownTracing() {
  return sdk.shutdown()
    .then(() => console.log("Tracing shut down"))
    .catch((error) => console.error("Error shutting down tracing", error));
}
```

Now I call `flush()` wherever I need to push the data. Do you see any issues with this?
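The batching knobs in that config can be illustrated with a self-contained stub (no OpenTelemetry imports; `StubBatchProcessor` is a hypothetical stand-in for BatchSpanProcessor's size-triggered export):

```typescript
// Stub mirroring maxExportBatchSize: a batch is exported as soon as it
// reaches the configured size; leftovers go out on an explicit flush.
type Span = { name: string };

class StubBatchProcessor {
  private batch: Span[] = [];
  public exports: Span[][] = []; // each entry is one export call

  constructor(private maxExportBatchSize: number) {}

  onEnd(span: Span) {
    this.batch.push(span);
    if (this.batch.length >= this.maxExportBatchSize) this.forceFlush();
  }

  forceFlush() {
    if (this.batch.length === 0) return;
    this.exports.push(this.batch);
    this.batch = [];
  }
}

const proc = new StubBatchProcessor(3);
["a", "b", "c", "d"].forEach((name) => proc.onEnd({ name }));
// Three spans triggered one export; the fourth waits in the buffer
// until the explicit flush, like the flush() helper above.
proc.forceFlush();
```

With `maxExportBatchSize: 3`, spans can still sit in the buffer for up to `scheduledDelayMillis` (or until a flush), which is why an explicit flush is needed before a process is frozen or killed.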
Marc
Marc2w ago
Yes, generally you only want to flush on shutdown, otherwise this can add latency to the application. Can you open a GitHub issue with this context? langfuse.com/issue. We plan to invest in this integration as they release a ton of updates, and an issue helps us keep track of this behavior. Also: are you on the latest AI SDK and Langfuse versions? That might help fix the issue as well, as the integration is still experimental.
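The tradeoff Marc describes can be made concrete with a stub that counts exporter flushes under the two strategies (all names hypothetical; in reality each flush is a network round-trip that the request would wait on):

```typescript
// Compare: flushing inside every request vs. once at shutdown.
class StubExporter {
  flushCount = 0;
  async forceFlush() {
    this.flushCount++; // in reality: an HTTP call, adding request latency
  }
}

async function handleRequest(exporter: StubExporter, flushPerRequest: boolean) {
  // ... work that produces spans ...
  if (flushPerRequest) await exporter.forceFlush();
}

async function simulate(flushPerRequest: boolean): Promise<number> {
  const exporter = new StubExporter();
  for (let i = 0; i < 5; i++) await handleRequest(exporter, flushPerRequest);
  if (!flushPerRequest) await exporter.forceFlush(); // once, at shutdown
  return exporter.flushCount;
}
```

Five requests cost five in-path flushes under the per-request strategy, versus a single flush outside the request path at shutdown.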
Murali
Murali2w ago
Yes I am on the latest on both.
Marc
Marc2w ago
thanks for confirming
Murali
Murali2w ago
If you only do it on shutdown, does that mean my localhost has to completely shut down? I am not sure how to trigger the push of traces, because in the example provided on the Langfuse website, shutdown is called in the instrumentation file itself. So every time I try to capture traces, I see the server shut down in the debug output.
Marc
Marc2w ago
Good questions, can we move this to a GitHub issue? I think generally on localhost we should make sure to process these as soon as possible.
Murali
Murali2w ago
sure let me create an issue
Murali
Murali2w ago
GitHub
Vercel AI SDK and Svelte. How to push traces intermittently back to...
In langfuse document, this is the suggestion for using with NodeSDK import { openai } from "@ai-sdk/openai"; import { generateText } from "ai"; import { NodeSDK } from "@op...
Marc
Marc2w ago
thank you