BasePrompt
__init__
substitute_constants
get_prompt_variables
format
escape
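The method names above suggest a template-style prompt class. As a minimal sketch (the class below is hypothetical, not the library's actual implementation; it assumes `format` performs `str.format`-style substitution and `escape` protects literal braces):

```python
from string import Formatter


class SimplePrompt:
    """Hypothetical stand-in for a BasePrompt-like class (assumed behavior)."""

    def __init__(self, source: str):
        self.source = source

    def get_prompt_variables(self) -> list:
        # Collect the {placeholder} names found in the template text.
        return [name for _, name, _, _ in Formatter().parse(self.source) if name]

    def format(self, **kwargs) -> str:
        # Substitute the given variables into the template.
        return self.source.format(**kwargs)

    @staticmethod
    def escape(text: str) -> str:
        # Double curly braces so literal text survives str.format.
        return text.replace("{", "{{").replace("}", "}}")
```

For example, `SimplePrompt("Summarize: {document}").format(document="...")` fills the single template variable, while `escape` lets user-supplied text containing braces pass through formatting untouched.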
Prompt
format
Instructions
format
PromptCallableBase
LLMResponse
- `output` (str) - The output from the LLM.
- `stream_output` (Optional[Iterator]) - A stream of output from the LLM. Default None.
- `async_stream_output` (Optional[AsyncIterator]) - An async stream of output from the LLM. Default None.
- `prompt_token_count` (Optional[int]) - The number of tokens in the prompt. Default None.
- `response_token_count` (Optional[int]) - The number of tokens in the response. Default None.
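The fields listed here map naturally onto a dataclass. The sketch below mirrors the documented field names and defaults; the class name `LLMResponseSketch` is an illustrative stand-in, not the library's own definition:

```python
from dataclasses import dataclass
from typing import AsyncIterator, Iterator, Optional


@dataclass
class LLMResponseSketch:
    """Illustrative mirror of the LLMResponse fields described above."""

    output: str                                        # the LLM's text output (required)
    stream_output: Optional[Iterator] = None           # sync stream of chunks, if streaming
    async_stream_output: Optional[AsyncIterator] = None  # async stream of chunks, if streaming
    prompt_token_count: Optional[int] = None           # tokens consumed by the prompt
    response_token_count: Optional[int] = None         # tokens produced in the response


# Only `output` is required; token counts and streams default to None.
resp = LLMResponseSketch(output="Hello!", prompt_token_count=12, response_token_count=3)
```

Keeping the stream fields optional lets the same container serve both streaming and non-streaming calls.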