Langfuse
Created by faileon on 1/5/2024 in #get-support
Debug Langchain integration
I am currently running the following package versions:
"langchain": "^0.0.203",
"langfuse-langchain": "^2.0.0-alpha.1",
"langchain": "^0.0.203",
"langfuse-langchain": "^2.0.0-alpha.1",
and my langfuse server is
ghcr.io/langfuse/langfuse:1.24.2
8 replies
Langfuse
Created by faileon on 1/5/2024 in #get-support
Debug Langchain integration
So I am attaching the handler to every possible call, including LLM creation; I don't know if that is even necessary. My only guess at this point is to try updating Langfuse and/or Langchain, as I have noticed there were some improvements regarding token tracking and null output issues.
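For instance, my understanding is that config callbacks propagate to nested runs, so passing the handler once at the top-level invoke should already cover the inner LLM calls; a minimal sketch with placeholder keys/model (not my real setup):

import { CallbackHandler } from "langfuse-langchain";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { PromptTemplate } from "langchain/prompts";
import { RunnableSequence } from "langchain/schema/runnable";

// Placeholder configuration, illustrative only.
const handler = new CallbackHandler({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
  baseUrl: "http://localhost:3000",
});

const chain = RunnableSequence.from([
  PromptTemplate.fromTemplate("Answer briefly: {question}"),
  new ChatOpenAI({ modelName: "gpt-3.5-turbo", openAIApiKey: process.env.OPENAI_API_KEY }),
]);

// Passing the handler only here should propagate it to the nested prompt/LLM runs,
// so attaching it to each individual call would not be needed.
const answer = await chain.invoke(
  { question: "What is Langfuse?" },
  { callbacks: [handler] }
);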
8 replies
Langfuse
Created by faileon on 1/5/2024 in #get-support
Debug Langchain integration
I am passing the handler to all my invokes and chains,
const langfuseHandler = new CallbackHandler({
  secretKey: this.langfuseConfig.secretKey,
  publicKey: this.langfuseConfig.publicKey,
  baseUrl: this.langfuseConfig.baseUrl,
  userId,
});

const llm = new ChatOpenAI({
  maxTokens: -1,
  temperature: 0.25,
  verbose: true,
  frequencyPenalty: 0,
  presencePenalty: 0,
  topP: 1,
  streaming: true,
  openAIApiKey: this.openAiConfig.key,
  modelName,
  callbacks: [langfuseHandler],
});

const performQuestionAnswering = async (input: { question: string; context: string; history: Array<BaseMessage> }) => {
  const formattedMessage = await this.promptTemplate.format({
    question: input.question,
    context: input.context,
  });

  return llm.stream([...input.history, new HumanMessage(formattedMessage)], { callbacks: [langfuseHandler] });
};

const chain = RunnableSequence.from([
  {
    question: (input: { question: string; chatHistory?: string }) => {
      return input.question;
    },
  },
  performQuestionRephrasing,
  performContextRetrieval,
  performQuestionAnswering,
]);

// create iterable stream
const iterableStream = chain.invoke(
  {
    question,
  },
  {
    callbacks: [langfuseHandler],
  }
);
for example
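To be complete, this is roughly how I consume the stream afterwards; since the handler sends events in the background, I believe it also needs a flush before the surrounding request finishes (assuming the 2.x CallbackHandler exposes flushAsync()):

// Continuation of the snippet above.
let answer = "";
for await (const chunk of await iterableStream) {
  answer += chunk.content;
}

// Without flushing, the trace/generation can show up incomplete (e.g. null output)
// if the process moves on before the batched events are sent.
await langfuseHandler.flushAsync();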
8 replies
Langfuse
Created by faileon on 1/5/2024 in #get-support
Debug Langchain integration
Hi, thanks mate, appreciate it.
8 replies
Langfuse
Created by faileon on 12/31/2023 in #get-support
Hi all, I switched from langchain
No description
1 replies
Langfuse
Created by faileon on 12/11/2023 in #get-support
Get tokens by users
or is that out of scope currently?
10 replies
Langfuse
Created by faileon on 12/11/2023 in #get-support
Get tokens by users
Cool, thanks. Would you accept a PR if I went ahead and implemented it? :]
10 replies
Langfuse
Created by faileon on 12/11/2023 in #get-support
Get tokens by users
Hi, just got my hands on it, I think I can work with that for now. Eventually I would love to see an endpoint to get tokens for a specific user in a given timeframe (start, end). I guess that is not possible for now?
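Something along these lines is what I have in mind; the endpoint path, query params and response shape below are purely wishful thinking on my part, not the current API (only the basic auth with public:secret key is how I authenticate against the public API today):

// Hypothetical "tokens per user in a timeframe" call; this endpoint does not exist (yet).
const baseUrl = "http://localhost:3000";
const auth =
  "Basic " +
  Buffer.from(`${process.env.LANGFUSE_PUBLIC_KEY}:${process.env.LANGFUSE_SECRET_KEY}`).toString("base64");

const res = await fetch(
  `${baseUrl}/api/public/users/some-user-id/usage?start=2023-12-01&end=2023-12-11`,
  { headers: { Authorization: auth } }
);
const usage = await res.json(); // e.g. { promptTokens, completionTokens, totalTokens }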
10 replies
Langfuse
Created by faileon on 12/11/2023 in #get-support
Get tokens by users
I'll check it out when I get to my PC and let you know, cheers.
10 replies
Langfuse
Created by faileon on 11/22/2023 in #get-support
JS SDK (Bundling)
Is it Node 20 and above only now?
15 replies
Langfuse
Created by faileon on 11/22/2023 in #get-support
JS SDK (Bundling)
Amazing, thank you! I tried switching to "langfuse-langchain": "^2.0.0-alpha.1" and TypeScript is now happy, and so am I, because I don't have to cast it to any anymore :]
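For example, this now type-checks directly (illustrative keys/model):

import { CallbackHandler } from "langfuse-langchain";
import { ChatOpenAI } from "langchain/chat_models/openai";

const handler = new CallbackHandler({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY!,
  secretKey: process.env.LANGFUSE_SECRET_KEY!,
});

// Previously I had to write `callbacks: [handler as any]`; with 2.0.0-alpha.1 the
// handler satisfies LangChain's callback type directly.
const llm = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  openAIApiKey: process.env.OPENAI_API_KEY,
  callbacks: [handler],
});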
15 replies
Langfuse
Created by faileon on 11/22/2023 in #get-support
JS SDK (Bundling)
No description
15 replies
Langfuse
Created by faileon on 11/22/2023 in #get-support
JS SDK (Bundling)
Splendid, thanks for the quick reply, looking forward to the PR then 👍
15 replies