Class for representing a prompt entry.
BasePrompt
Base class for representing an LLM prompt.
__init__
def __init__(source: str, output_schema: Optional[str] = None, *, xml_output_schema: Optional[str] = None)
Initialize and substitute constants in the prompt.
substitute_constants
def substitute_constants(text: str) -> str
Substitute constants in the prompt.
get_prompt_variables
def get_prompt_variables() -> List[str]
Return the names of the variables in the prompt.
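Variable names can be recovered from a curly-brace template with the standard library's `string.Formatter`. A minimal sketch of what this method might do (the standalone function below is an illustration, not the library's actual implementation):

```python
from string import Formatter
from typing import List

def get_prompt_variables(source: str) -> List[str]:
    # Formatter().parse yields (literal_text, field_name, format_spec, conversion)
    # tuples; field_name is None for literal-only segments, so filter those out.
    return [field for _, field, _, _ in Formatter().parse(source) if field]

print(get_prompt_variables("Summarize {document} in {num_words} words."))
# -> ['document', 'num_words']
```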
format
def format(**kwargs) -> "BasePrompt"
Format the prompt using the given keyword arguments.
escape
Escape single curly braces into double curly braces.
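Doubling a brace is how `str.format` is told to treat it as a literal character, which is why escaping matters for prompts that contain JSON. A one-line sketch of the behavior described above (standalone function, illustrative only):

```python
def escape(text: str) -> str:
    # Double each brace so str.format emits it literally instead of
    # interpreting it as a substitution field.
    return text.replace("{", "{{").replace("}", "}}")

print(escape('{"answer": 1}'))           # -> {{"answer": 1}}
print(escape('{"answer": 1}').format())  # formatting restores the single braces
```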
The LLM prompt.
Prompt
class Prompt(BasePrompt)
Prompt class.
The prompt is passed to the LLM as primary instructions.
def format(**kwargs) -> "Prompt"
Format the prompt using the given keyword arguments.
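Assuming `format` delegates to Python's `str.format` and returns a new prompt object rather than mutating in place, usage might look like the following sketch (the class body is a minimal stand-in, not the library's implementation):

```python
class Prompt:
    """Minimal stand-in for the documented Prompt class (illustrative only)."""

    def __init__(self, source: str):
        self.source = source

    def format(self, **kwargs) -> "Prompt":
        # Substitute the keyword arguments into the template and return
        # a new Prompt, leaving this one unchanged.
        return Prompt(self.source.format(**kwargs))

prompt = Prompt("Translate {text} into {language}.")
print(prompt.format(text="hello", language="French").source)
# -> Translate hello into French.
```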
Instructions to the LLM, to be passed in the prompt.
Instructions
class Instructions(BasePrompt)
Instructions class.
The instructions are passed to the LLM as secondary input. Different models may use these differently; for example, chat models may receive the instructions in the system prompt.
def format(**kwargs) -> "Instructions"
Format the prompt using the given keyword arguments.
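For chat models, the instructions typically land in the system message while the prompt becomes the user message. A sketch of that mapping (the message shape follows the common chat-completions convention and is not necessarily this library's internal representation):

```python
from typing import Dict, List

def to_chat_messages(instructions: str, prompt: str) -> List[Dict[str, str]]:
    # Secondary guidance goes in the "system" message; the primary
    # prompt goes in the "user" message.
    return [
        {"role": "system", "content": instructions},
        {"role": "user", "content": prompt},
    ]

messages = to_chat_messages("Answer concisely.", "What is 2 + 2?")
print(messages[0]["role"], messages[1]["role"])  # -> system user
```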
PromptCallableBase
LLMResponse
class LLMResponse(ILLMResponse)
Standard collection of information from LLM responses, used to feed the validation loop.
Attributes:
output (str): The output from the LLM.
stream_output (Optional[Iterator]): A stream of output from the LLM. Defaults to None.
async_stream_output (Optional[AsyncIterator]): An async stream of output from the LLM. Defaults to None.
prompt_token_count (Optional[int]): The number of tokens in the prompt. Defaults to None.
response_token_count (Optional[int]): The number of tokens in the response. Defaults to None.
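The attributes above map naturally onto a dataclass; a minimal sketch (the field names and defaults come from the documentation, while the class body itself is illustrative):

```python
from dataclasses import dataclass
from typing import AsyncIterator, Iterator, Optional

@dataclass
class LLMResponse:
    """Collects the pieces of an LLM response needed by the validation loop."""
    output: str
    stream_output: Optional[Iterator] = None
    async_stream_output: Optional[AsyncIterator] = None
    prompt_token_count: Optional[int] = None
    response_token_count: Optional[int] = None

resp = LLMResponse(output="4", prompt_token_count=12, response_token_count=1)
print(resp.output, resp.response_token_count)  # -> 4 1
```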