byron mackay · 6mo ago

My local instance of Langfuse is showing null for input, output, and metadata on the top-level trace. If I dig down, I see the outputs showing. I'm using a simple LangChain implementation to run an LLM (code in thread). Am I missing something?
6 Replies
byron mackay · 6mo ago
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

template = PromptTemplate(template="{query} Context: {results}", input_variables=["query", "results"])

chain = LLMChain(llm=llm, prompt=template)

# llm and results are defined elsewhere; a Langfuse handler is attached per run
print(chain.invoke(
    {"query": "What are the largest cost drivers in 2023?", "results": results},
    config={
        "callbacks": [langfuse_callback(["rag"])]
    },
))
Marc · 6mo ago
how did you create the langfuse_callback?
Marc · 6mo ago
Tagging traces - Langfuse
Tags help to filter and organize traces in Langfuse based on use case, functions/apis used, environment and other criteria.
byron mackay · 6mo ago
from langfuse import Langfuse

def langfuse_callback(tags):
    # Creates a new trace on every call and returns its LangChain handler,
    # so each chain run is nested under that manually created trace
    langfuse = Langfuse()
    trace = langfuse.trace(tags=tags)
    return trace.get_langchain_handler()
Marc · 6mo ago
ok, this is the expected behavior. if you only want one trace for each langchain run, with input/output included on it, go for the other implementation:
from langfuse.callback import CallbackHandler

handler = CallbackHandler(
    tags=["tag-1", "tag-2"]
)
Without creating a trace manually
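For reference, a minimal sketch of wiring that handler into the chain from earlier in the thread (assuming the same chain, results, and "rag" tag shown above):

from langfuse.callback import CallbackHandler

# One trace per chain run, with input/output captured on the top-level trace
handler = CallbackHandler(tags=["rag"])

print(chain.invoke(
    {"query": "What are the largest cost drivers in 2023?", "results": results},
    config={"callbacks": [handler]},
))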
byron mackay · 6mo ago
Working now. Thanks, @Marc!