Helpers for LLM Interactions

Class for representing a prompt entry.

BasePrompt

class BasePrompt()

Base class for representing an LLM prompt.

__init__

def __init__(source: str,
output_schema: Optional[str] = None,
*,
xml_output_schema: Optional[str] = None)

Initialize and substitute constants in the prompt.

substitute_constants

def substitute_constants(text: str) -> str

Substitute constants in the prompt.
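As a sketch of what this means in practice: the library ships named constants that can be referenced inside a prompt and are expanded when the prompt is created. The constant name and import path below follow the documented ${gr....} convention, but treat the details as illustrative.

from guardrails.prompt import Prompt

# ${gr.complete_json_suffix} is assumed to be one of the library's named
# constants; it is expanded into its full text when the prompt is constructed.
prompt = Prompt(
    "Extract the fields described in the schema as JSON.\n\n"
    "${gr.complete_json_suffix}"
)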

get_prompt_variables

def get_prompt_variables() -> List[str]
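A minimal sketch of inspecting a prompt's variables, assuming ${...} placeholder syntax; the exact ordering of the returned names is an assumption:

from guardrails.prompt import Prompt

prompt = Prompt("Answer the question ${question} using ${context}.")
print(prompt.get_prompt_variables())  # e.g. ["question", "context"]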

format

def format(**kwargs) -> "BasePrompt"

escape

def escape() -> str

Escape single curly braces into double curly braces.
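This matters when a prompt contains literal braces (embedded JSON, for example) that should not be treated as placeholders. A hedged sketch of the expected behavior:

from guardrails.prompt import Prompt

prompt = Prompt('Respond with JSON like {"status": "ok"}.')
# Per the docstring, single braces are doubled so they survive formatting:
# 'Respond with JSON like {{"status": "ok"}}.'
escaped = prompt.escape()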

The LLM prompt.

Prompt

class Prompt(BasePrompt)

Prompt class.

The prompt is passed to the LLM as primary instructions.

format

def format(**kwargs) -> "Prompt"

Format the prompt using the given keyword arguments.
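A minimal usage sketch, assuming ${variable} placeholder syntax. Note that format returns a new Prompt rather than a plain string:

from guardrails.prompt import Prompt

prompt = Prompt("Summarize the following document:\n\n${document}")
filled = prompt.format(document="Transformers process whole sequences in parallel.")
# filled is a new Prompt with ${document} substituted.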

Instructions to the LLM, to be passed in the prompt.

Instructions

class Instructions(BasePrompt)

Instructions class.

The instructions are passed to the LLM as secondary input. Different models may use them differently; for example, chat models may receive the instructions in the system prompt.

format

def format(**kwargs) -> "Instructions"

Format the prompt using the given keyword arguments.
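A sketch of how Instructions might pair with a Prompt for a chat model. Routing the two strings into system and user messages is the caller's responsibility here, and the .source attribute used to read back the formatted text is an assumption:

from guardrails.prompt import Instructions, Prompt

instructions = Instructions("You are a careful assistant. Answer in ${language}.")
prompt = Prompt("Translate the following text: ${text}")

# Assumed routing, per the note above: instructions become the system
# message and the prompt becomes the user message.
messages = [
    {"role": "system", "content": instructions.format(language="French").source},
    {"role": "user", "content": prompt.format(text="Hello, world!").source},
]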

PromptCallableBase

LLMResponse

class LLMResponse(ILLMResponse)

Standard container for the information collected from an LLM response, used to feed the validation loop.

Attributes:

  • output (str) - The output from the LLM.
  • stream_output (Optional[Iterator]) - A stream of output from the LLM. Defaults to None.
  • async_stream_output (Optional[AsyncIterator]) - An async stream of output from the LLM. Defaults to None.
  • prompt_token_count (Optional[int]) - The number of tokens in the prompt. Defaults to None.
  • response_token_count (Optional[int]) - The number of tokens in the response. Defaults to None.
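A sketch of constructing an LLMResponse for a non-streaming call; the import path and keyword-argument constructor are assumptions based on the attribute list above:

from guardrails.classes.llm.llm_response import LLMResponse  # path assumed

response = LLMResponse(
    output='{"status": "ok"}',
    prompt_token_count=42,
    response_token_count=7,
)
# stream_output and async_stream_output keep their None defaults here.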