alexrosen
The example on Prompt Management -> Get Started shows how to get a compiled chat prompt, but it doesn't show how that prompt is then used with the client to get a completion. Is there a `messages` property on the compiled prompt?
My suggestion is to add an example to Get Started that shows a chat prompt being compiled and then used to get a completion.
You can pass the chat prompt returned by the compile method directly to the OpenAI SDK or other LLM SDKs.
I've made a note to add an example of this to the Get Started docs as well.
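For anyone landing here before the docs are updated, here is a minimal sketch of the idea. The `compile_chat_prompt` helper and the template below are illustrative stand-ins (not the actual prompt-management SDK API): the point is that compiling a chat prompt yields a list of `{role, content}` messages, which is exactly the shape the OpenAI SDK's `messages` parameter expects.

```python
# Illustrative stand-in for a prompt-management compile step: it substitutes
# variables into each message's content and returns OpenAI-style chat messages.
def compile_chat_prompt(template, **variables):
    return [
        {"role": m["role"], "content": m["content"].format(**variables)}
        for m in template
    ]

# A hypothetical chat prompt template with placeholder variables.
template = [
    {"role": "system", "content": "You are a {persona}."},
    {"role": "user", "content": "Summarize the following text: {text}"},
]

messages = compile_chat_prompt(
    template, persona="helpful assistant", text="Prompt management keeps prompts versioned."
)

# The compiled messages plug straight into the OpenAI SDK (assumes the
# `openai` package is installed and OPENAI_API_KEY is set):
#
# from openai import OpenAI
# client = OpenAI()
# completion = client.chat.completions.create(model="gpt-4o", messages=messages)
# print(completion.choices[0].message.content)
```

So rather than a separate `messages` property, the compiled result itself is the messages list you hand to the completion call.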