monoai.prompts
This module provides classes for handling different types of prompts in the COICOI framework. Prompts can be provided as strings or as XML-like `.prompt` files.
1""" 2This module provides classes for handling different types of prompts in the COICOI framework. 3Prompt can be provided as a string or as a xml-like .prompt file. 4""" 5 6from .prompt_chain import PromptChain 7from .prompt import Prompt 8from .iterative_prompt import IterativePrompt 9 10__all__ = ['Prompt', 'PromptChain', 'IterativePrompt']
````python
class Prompt:
    """
    A class to handle text prompts with optional response type specification.

    The Prompt class can be initialized either with a direct prompt string or by loading
    a prompt from a .prompt file. It supports response type specification and
    data formatting through Python's string formatting.

    .prompt format
    --------------
    Simple prompt
    ```
    <prompt>
    Hello, coicoi!
    </prompt>
    ```
    In this case you could also use plain text without the prompt tag:
    ```
    Hello, coicoi!
    ```
    Prompt with data formatting
    ```
    <prompt>
    Hello, {name}!
    </prompt>
    ```
    Prompt with response type
    ```
    <prompt response_type="int">
    What is 2+2?
    </prompt>
    ```

    Examples
    --------
    Simple prompt with response type:
    ```
    prompt = Prompt(prompt="What is 2+2?", response_type=int)
    ```
    Prompt with data formatting:
    ```
    prompt = Prompt(
        prompt="What is the capital of {country}?",
        prompt_data={"country": "France"},
        response_type=str
    )
    ```
    Load prompt from file:
    ```
    prompt = Prompt(prompt_id="math_question")
    ```
    math_question.prompt
    ```
    <prompt response_type="int">
    What is 2+2?
    </prompt>
    ```
    """

    def __init__(self,
                 prompt_id: str | None = None,
                 prompt_data: dict | None = None,
                 prompt: str | None = None,
                 response_type: type | None = None):
        """
        Initialize a new Prompt instance.

        Parameters
        ----------
        prompt_id : str, optional
            A .prompt file name for loading a prompt from file
        prompt_data : dict, optional
            Dictionary of values for formatting the prompt
        prompt : str, optional
            Direct prompt text if prompt_id is not provided
        response_type : type | BaseModel, optional
            Expected type of the response or a Pydantic BaseModel for JSON schema response

        Raises
        ------
        ValueError
            If neither prompt_id nor prompt is provided
        """
        prompt_response_type = None
        if prompt_id is not None:
            self._prompt, prompt_response_type = _PromptParser().parse(prompt_id)
        elif prompt is not None:
            self._prompt = prompt
        else:
            raise ValueError("Either prompt_id or prompt must be provided")

        if prompt_data is not None:
            self._prompt = self._prompt.format(**prompt_data)

        # A response type declared in the .prompt file takes precedence
        # over the response_type argument.
        if prompt_response_type is not None:
            self.response_type = prompt_response_type
        else:
            self.response_type = response_type

    def __str__(self) -> str:
        """
        Get the string representation of the prompt.

        Returns
        -------
        str
            The prompt text
        """
        return self._prompt

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt.

        Returns
        -------
        str
            The prompt text
        """
        return self.__str__()
````
A class to handle text prompts with optional response type specification.
The Prompt class can be initialized either with a direct prompt string or by loading a prompt from a .prompt file. It supports response type specification and data formatting through Python's string formatting.
.prompt format

Simple prompt:
```
<prompt>
Hello, coicoi!
</prompt>
```
In this case you could also use plain text without the prompt tag:
```
Hello, coicoi!
```
Prompt with data formatting:
```
<prompt>
Hello, {name}!
</prompt>
```
Prompt with response type:
```
<prompt response_type="int">
What is 2+2?
</prompt>
```
Examples

Simple prompt with response type:
```python
prompt = Prompt(prompt="What is 2+2?", response_type=int)
```
Prompt with data formatting:
```python
prompt = Prompt(
    prompt="What is the capital of {country}?",
    prompt_data={"country": "France"},
    response_type=str
)
```
Load prompt from file:
```python
prompt = Prompt(prompt_id="math_question")
```
math_question.prompt:
```
<prompt response_type="int">
What is 2+2?
</prompt>
```
Initialize a new Prompt instance.
Parameters
- prompt_id (str, optional): A .prompt file name for loading a prompt from file
- prompt_data (dict, optional): Dictionary of values for formatting the prompt
- prompt (str, optional): Direct prompt text if prompt_id is not provided
- response_type (type | BaseModel, optional): Expected type of the response, or a Pydantic BaseModel for a JSON schema response
Raises
- ValueError: If neither prompt_id nor prompt is provided
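The prompt_data formatting described above is plain Python `str.format` substitution. A standalone sketch of that step, outside the framework (`template` and `data` are illustrative names, not part of the API):

```python
# Sketch of the formatting step Prompt performs internally:
# placeholders like {country} are filled from the prompt_data
# dictionary via str.format.
template = "What is the capital of {country}?"
data = {"country": "France"}

formatted = template.format(**data)
print(formatted)  # What is the capital of France?
```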
````python
class PromptChain(Prompt):
    """
    PromptChain class for handling sequences of prompts, where the output of
    one prompt is used as context for the next.

    .prompt file example
    --------------------
    ```
    <promptchain>
        <prompt>
        What is the capital of {country}?
        </prompt>
        <prompt>
        What is its population?
        </prompt>
    </promptchain>
    ```

    Examples
    --------
    Create a chain from a sequence of prompts:
    ```
    prompts = [
        Prompt(prompt="What is the capital of France?"),
        Prompt(prompt="What is its population?")
    ]
    chain = PromptChain(prompts=prompts)
    ```

    Create a chain from a file:
    ```
    chain = PromptChain(
        promptchain_id="city_analysis",
        prompts_data=[{"country": "France"}]
    )
    ```
    """

    def __init__(self,
                 promptchain_id: str | None = None,
                 prompts_data: list[dict] | None = None,
                 prompts: list[Prompt] | None = None,
                 response_type: type | None = None):
        """
        Initialize a new PromptChain instance.

        Parameters
        ----------
        promptchain_id : str, optional
            A .prompt file name for loading a prompt chain from file
        prompts_data : list[dict], optional
            List of dictionaries containing formatting data for each prompt
        prompts : list[Prompt], optional
            Direct list of Prompt objects to form the chain if promptchain_id is not provided
        response_type : type, optional
            Expected type of the final response

        Raises
        ------
        ValueError
            If neither promptchain_id nor prompts is provided
        """
        parsed_response_type = None
        if promptchain_id is not None:
            self._prompts, parsed_response_type = _PromptChainParser().parse(promptchain_id)
            for i in range(len(self._prompts)):
                # prompts_data may be omitted when the prompts need no formatting
                prompt_data = prompts_data[i] if prompts_data is not None else None
                self._prompts[i] = Prompt(prompt=self._prompts[i], prompt_data=prompt_data)
        elif prompts is not None:
            self._prompts = prompts
        else:
            raise ValueError("Either promptchain_id or prompts must be provided")
        self._size = len(self._prompts)
        # A response type declared in the .prompt file takes precedence,
        # mirroring Prompt's behaviour.
        if parsed_response_type is not None:
            self.response_type = parsed_response_type
        else:
            self.response_type = response_type

    def _format(self, index: int, context: str | None = None) -> str:
        """
        Format a specific prompt in the chain with optional context.

        This method formats the prompt at the specified index, optionally including
        context from previous prompts' responses.

        Parameters
        ----------
        index : int
            Index of the prompt to format
        context : str, optional
            Context from previous prompts' responses to include

        Returns
        -------
        str
            The formatted prompt text with optional context
        """
        if context is None:
            return str(self._prompts[index])
        else:
            return str(self._prompts[index]) + "\n\n" + context

    def __str__(self) -> str:
        """
        Get the string representation of the prompt chain.

        Returns
        -------
        str
            All prompts in the chain joined by newlines
        """
        return "\n".join([str(prompt) for prompt in self._prompts])

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt chain.

        Returns
        -------
        str
            All prompts in the chain joined by newlines
        """
        return self.__str__()
````
PromptChain class for handling sequences of prompts, where the output of one prompt is used as context for the next.
.prompt file example:
```
<promptchain>
    <prompt>
    What is the capital of {country}?
    </prompt>
    <prompt>
    What is its population?
    </prompt>
</promptchain>
```
Examples

Create a chain from a sequence of prompts:
```python
prompts = [
    Prompt(prompt="What is the capital of France?"),
    Prompt(prompt="What is its population?")
]
chain = PromptChain(prompts=prompts)
```
Create a chain from a file:
```python
chain = PromptChain(
    promptchain_id="city_analysis",
    prompts_data=[{"country": "France"}]
)
```
Initialize a new PromptChain instance.
Parameters
- promptchain_id (str, optional): A .prompt file name for loading a prompt chain from file
- prompts_data (list[dict], optional): List of dictionaries containing formatting data for each prompt
- prompts (List[Prompt], optional): Direct list of Prompt objects to form the chain if promptchain_id is not provided
- response_type (type, optional): Expected type of the final response
Raises
- ValueError: If neither promptchain_id nor prompts is provided
````python
class IterativePrompt(Prompt):
    """
    IterativePrompt class for handling prompts that iterate over a sequence of data,
    with optional memory of previous iterations' responses. This allows for the
    generation of longer and structured responses.

    .prompt file example
    --------------------
    ```
    <iterativeprompt>
        <prompt>
        Generate the content of a chapter of a book about {topic}
        The chapters are {chapters}. Generate the chapter {{data}}.
        </prompt>
        <prompt_memory>
        Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters:
        {{data}}
        </prompt_memory>
    </iterativeprompt>
    ```

    Examples
    --------
    Iterative prompt with memory:
    ```
    data = ["data types", "conditional statements", "iterative statements"]
    prompt = IterativePrompt(
        prompt="Generate the content of a chapter of a book about {topic}. The chapters are {chapters}. Generate the chapter {{data}}.",
        prompt_data={"topic": "python programming", "chapters": data},
        iter_data=data,
        prompt_memory="Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters: {data}"
    )
    ```
    Iterative prompt with memory from a .prompt file:
    ```
    data = ["data types", "conditional statements", "iterative statements"]
    prompt = IterativePrompt(
        prompt_id="book_generation",
        prompt_data={"topic": "python programming", "chapters": data},
        iter_data=data
    )
    ```
    """

    def __init__(self,
                 prompt_id: str | None = None,
                 prompt: str | None = None,
                 prompt_data: dict | None = None,
                 iter_data: List[str] | None = None,
                 prompt_memory: str = "",
                 retain_all: bool = False):
        """
        Initialize a new IterativePrompt instance.

        Parameters
        ----------
        prompt_id : str, optional
            .prompt file name for loading a prompt from file
        prompt : str, optional
            Direct prompt text with {{data}} placeholder if prompt_id is not provided
        prompt_data : dict, optional
            Dictionary of values for formatting the base prompt
        iter_data : List[str], optional
            Sequence of data items to iterate over
        prompt_memory : str, optional
            Template for including memory of previous iterations
        retain_all : bool, optional
            If True, all responses are retained in memory, otherwise only the last response is retained

        Raises
        ------
        ValueError
            If neither prompt_id nor prompt is provided
        """
        if prompt_id is not None:
            self._prompt, prompt_memory = _IterativePromptParser().parse(prompt_id)
        elif prompt is not None:
            self._prompt = prompt
        else:
            raise ValueError("Either prompt_id or prompt must be provided")

        if prompt_data is not None:
            self._prompt = self._prompt.format(**prompt_data)

        self._iter_data = iter_data
        # iter_data is effectively required: len() below fails if it is None
        self._size = len(iter_data)
        # Collapse the {{data}} escape so the memory template can be
        # filled with str.format at each iteration.
        self._prompt_memory = prompt_memory.replace("{{", "{").replace("}}", "}")
        self._has_memory = prompt_memory != ""
        self._retain_all = retain_all

    def _format(self, index: int, context: str = "") -> str:
        """
        Format the prompt for a specific iteration with optional context.

        This method formats the prompt for the data item at the specified index,
        optionally including context from previous iterations if memory is enabled.

        Parameters
        ----------
        index : int
            Index of the current data item
        context : str, optional
            Context from previous iterations, default ""

        Returns
        -------
        str
            The formatted prompt text with current data and optional memory

        Examples
        --------
        Format without memory:
        >>> prompt = IterativePrompt(
        ...     prompt="Analyze {data}",
        ...     iter_data=["item1", "item2"]
        ... )
        >>> formatted = prompt._format(0)

        Format with memory:
        >>> prompt = IterativePrompt(
        ...     prompt="Compare {data}",
        ...     iter_data=["item1", "item2"],
        ...     prompt_memory="Previous: {data}"
        ... )
        >>> formatted = prompt._format(1, "Analysis of item1")
        """
        prompt = self._prompt.format(data=self._iter_data[index])
        if self._has_memory and index > 0:
            prompt += "\n\n" + self._prompt_memory.format(data=context)
        return prompt

    def __str__(self) -> str:
        """
        Get the string representation of the prompt.

        Returns
        -------
        str
            The base prompt template
        """
        return self._prompt

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt.

        Returns
        -------
        str
            The base prompt template
        """
        return self.__str__()
````
IterativePrompt class for handling prompts that iterate over a sequence of data, with optional memory of previous iterations' responses. This allows for the generation of longer and structured responses.
.prompt file example:
```
<iterativeprompt>
    <prompt>
    Generate the content of a chapter of a book about {topic}
    The chapters are {chapters}. Generate the chapter {{data}}.
    </prompt>
    <prompt_memory>
    Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters:
    {{data}}
    </prompt_memory>
</iterativeprompt>
```
Examples

Iterative prompt with memory (prompt_data is needed to resolve {topic} and {chapters} in the template):
```python
data = ["data types", "conditional statements", "iterative statements"]
prompt = IterativePrompt(
    prompt="Generate the content of a chapter of a book about {topic}. The chapters are {chapters}. Generate the chapter {{data}}.",
    prompt_data={"topic": "python programming", "chapters": data},
    iter_data=data,
    prompt_memory="Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters: {data}"
)
```
Iterative prompt with memory from a .prompt file:
```python
data = ["data types", "conditional statements", "iterative statements"]
prompt = IterativePrompt(
    prompt_id="book_generation",
    prompt_data={"topic": "python programming", "chapters": data},
    iter_data=data
)
```
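The {{data}} placeholder works because doubled braces are `str.format` escapes: the first formatting pass with prompt_data rewrites {{data}} to {data}, which is then filled with the current item at each iteration. A standalone sketch of the two passes (variable names are illustrative):

```python
# First pass: fill {topic} and {chapters}; doubled braces collapse to single.
template = "A book about {topic}. Chapters: {chapters}. Write chapter {{data}}."
base = template.format(topic="python", chapters="a, b")
# base now ends with the single-brace {data} placeholder

# Second pass: fill the per-iteration placeholder with the current item.
step = base.format(data="a")
print(step)
```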
Initialize a new IterativePrompt instance.
Parameters
- prompt_id (str, optional): .prompt file name for loading a prompt from file
- prompt (str, optional): Direct prompt text with {{data}} placeholder if prompt_id is not provided
- prompt_data (dict, optional): Dictionary of values for formatting the base prompt
- iter_data (List[str], optional): Sequence of data items to iterate over
- prompt_memory (str, optional): Template for including memory of previous iterations
- retain_all (bool, optional): If True, all responses are retained in memory, otherwise only the last response is retained
Raises
- ValueError: If neither prompt_id nor prompt is provided
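The retain_all behaviour described above can be sketched as follows: with retain_all=True every previous response is carried forward as memory, otherwise only the latest one is kept (`build_context` is an illustrative helper, not part of the API):

```python
# Illustrative sketch of memory accumulation across iterations.
def build_context(responses: list[str], retain_all: bool) -> str:
    if not responses:
        return ""
    if retain_all:
        return "\n\n".join(responses)  # keep everything seen so far
    return responses[-1]               # keep only the last response

history = ["chapter 1 text", "chapter 2 text"]
print(build_context(history, retain_all=True))   # both chapters
print(build_context(history, retain_all=False))  # only chapter 2
```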