jeecee
jeecee14mo ago

Langchain Integration [Huggingface]

The first 10-minute test works, which is nice. I have a small sample using Langchain to connect to OpenAI and HuggingFace. The HuggingFace version does not seem to work completely: I get an error and the generation is not logged like it is with OpenAI. This is the error:
ERROR:root:'model_name'
ERROR:root:run not found
ERROR:root:'model_name'
ERROR:root:run not found
Below is the code I use:
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceHub
from langchain.prompts import PromptTemplate

def initialize_huggingface_llm(prompt: PromptTemplate, temperature: float, max_length: int) -> LLMChain:
    repo_id = "google/flan-t5-xxl"

    # Experiment with the max_length parameter and temperature
    llm = HuggingFaceHub(
        repo_id=repo_id, model_kwargs={"temperature": temperature, "max_length": max_length}
    )
    return LLMChain(prompt=prompt, llm=llm)

def generate_prompt() -> PromptTemplate:
    # You can play around with the prompt; see how the results change if you make small changes to it
    template = """Given the name of the country, give the languages that are spoken in that country.
Start with the official languages of the country and continue with the other languages of that country.
Country: {country}?
Languages:
"""

    return PromptTemplate(template=template, input_variables=["country"])
3 Replies
Marc
Marc14mo ago
Thanks for the pointer, we'll look into it! Copying your second message:
import os

from dotenv import load_dotenv
from langfuse.callback import CallbackHandler

if __name__ == '__main__':
    load_dotenv()

    handler = CallbackHandler(os.getenv('LANGFUSE_PUBLIC_KEY'),
                              os.getenv('LANGFUSE_SECRET_KEY'),
                              os.getenv('LANGFUSE_HOST'))

    # Try other values to see the impact on the results
    country = "belgium"
    country_max_length = 100
    country_temperature = 0.1

    country_prompt = generate_prompt()

    hugging_chain = initialize_huggingface_llm(prompt=country_prompt,
                                               temperature=country_temperature,
                                               max_length=country_max_length)

    print("HuggingFace")
    print(hugging_chain.run(country, callbacks=[handler]))
Marc
Marc14mo ago
Created a new issue for this problem as @Dev Khant (Mem0) was interested to also have a look at this: https://github.com/langfuse/langfuse-python/issues/16
Marc
Marc14mo ago
Could be a minor fix in the Handler. I’ll give it a try
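For context on what such a fix might look like: `ERROR:root:'model_name'` reads like a `KeyError` raised when the handler pulls the model name out of the serialized LLM config, which OpenAI LLMs expose as `model_name` but `HuggingFaceHub` serializes differently (its identifying field is `repo_id`). The following is a stdlib-only sketch of a defensive lookup, not the actual Langfuse handler code; the key names and the `extract_model_name` helper are assumptions for illustration.

```python
# Hypothetical sketch: a handler that indexes serialized["model_name"]
# directly raises KeyError for LLMs (like HuggingFaceHub) that serialize
# their identifier under a different key. A fallback chain avoids that.
def extract_model_name(serialized: dict) -> str:
    # Prefer "model_name" (OpenAI-style), then "repo_id"
    # (HuggingFaceHub-style), then a placeholder instead of raising.
    for key in ("model_name", "repo_id"):
        if key in serialized:
            return serialized[key]
    return "unknown-model"

print(extract_model_name({"model_name": "gpt-3.5-turbo"}))   # gpt-3.5-turbo
print(extract_model_name({"repo_id": "google/flan-t5-xxl"}))  # google/flan-t5-xxl
```

With a fallback like this, the run would still be created (and the generation logged) even when the model name is missing, rather than aborting with `'model_name'` followed by `run not found`.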