OpenAI integration envs
hey langfuse team, could you help upgrade the implementation of the Python OpenAI import logic? Currently, the environment variables have to be defined before the import:
In practice, those variables can be set inside a class or loaded from a config file. I'd rather not put the import inside the class; instead I want to do
from langfuse.openai import openai
at the beginning of the file, but this throws an error. Could you suggest the correct way of importing the openai
module?
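For context, a minimal sketch of what "defined before the import" means in practice. The key values are placeholders, and the try/except fallback is only there so the snippet runs in environments without langfuse installed:

```python
import os

# Placeholder credentials; real values come from your Langfuse project settings.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."

# The integration reads the env vars at import time, so the import
# has to come after they are set.
try:
    from langfuse.openai import openai  # traced drop-in for `import openai`
except ModuleNotFoundError:
    openai = None  # langfuse not installed; the sketch still runs
```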
Thx for the feedback on this
In your application context, don't you load the environment variables globally, e.g. using poetry?
Or do you manually set them via os.environ?
I use os.environ
I created an open-source tool, and I'd like users to choose whether to enable Langfuse logging. For that, they pass a "use-langfuse-logging" flag to the parser, so the import of "langfuse.openai" happens in the __init__ of the main class instead of at the beginning of the file.
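One way to keep the top of the file clean is to defer the import until the flag is known. A rough sketch (the helper name is made up, not part of the Langfuse API):

```python
import importlib


def load_openai(use_langfuse: bool):
    """Return the traced langfuse.openai client or the plain openai module.

    Deferring the import to call time means the LANGFUSE_* env vars (or the
    user's opt-out flag) can be handled first, e.g. inside __init__.
    """
    if use_langfuse:
        module = importlib.import_module("langfuse.openai")
        # Same object you'd get via `from langfuse.openai import openai`.
        return module.openai
    return importlib.import_module("openai")
```

Inside the main class, __init__ could then do something like `self.openai = load_openai(args.use_langfuse_logging)`.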
Also, does the current Langfuse version support gpt-3.5-turbo-16k?
Seems like I always encounter errors when I try to set this model.
makes sense, thanks for the additional context
will check with @Max later who built this part of the Python SDK
Can you share a code snippet? It should work, as we parse the different gpt-3.5 variants
Does the tokenization or cost calculation not work in your case? Happy to make a quick fix
Sorry, I think that was a false positive on my end. Just worked it out and everything is quite smooth now :)
Awesome 🙂