monoai.prompts
This module provides classes for handling different types of prompts in the COICOI framework. Prompts can be provided as a string or as an XML-like `.prompt` file.
1""" 2This module provides classes for handling different types of prompts in the COICOI framework. 3Prompt can be provided as a string or as a xml-like .prompt file. 4""" 5 6from .prompt_chain import PromptChain 7from .prompt import Prompt 8from .iterative_prompt import IterativePrompt 9from .system_prompt import SystemPrompt 10 11__all__ = ['Prompt', 'PromptChain', 'IterativePrompt', 'SystemPrompt']
````python
class Prompt:
    """
    A class to handle text prompts with optional response type specification.

    The Prompt class can be initialized either with a direct prompt string or by loading
    a prompt from a .prompt file. It supports response type specification and
    data formatting through Python's string formatting.

    .prompt format
    --------
    Simple prompt
    ```
    <prompt>
    Hello, coicoi!
    </prompt>
    ```
    In this case you could also use plain text without the prompt tag:
    ```
    Hello, coicoi!
    ```
    Prompt with data formatting
    ```
    <prompt>
    Hello, {name}!
    </prompt>
    ```
    Prompt with response type
    ```
    <prompt response_type="int">
    What is 2+2?
    </prompt>
    ```

    Examples
    --------
    Simple prompt with response type:
    ```
    prompt = Prompt(prompt="What is 2+2?", response_type=int)
    ```
    Prompt with data formatting:
    ```
    prompt = Prompt(
        prompt="What is the capital of {country}?",
        prompt_data={"country": "France"},
        response_type=str
    )
    ```
    Load prompt from file:
    ```
    prompt = Prompt(prompt_id="math_question")
    ```
    math_question.prompt
    ```
    <prompt response_type="int">
    What is 2+2?
    </prompt>
    ```
    """

    def __init__(self,
                 prompt_id: str | None = None,
                 prompt_data: dict | None = None,
                 prompt: str | None = None,
                 image: str | None = None,
                 is_system: bool = False,
                 response_type: type | None = None):
        """
        Initialize a new Prompt instance.

        Parameters
        ----------
        prompt_id : str, optional
            A .prompt file name for loading a prompt from file
        prompt_data : dict, optional
            Dictionary of values for formatting the prompt
        prompt : str, optional
            Direct prompt text if prompt_id is not provided
        is_system : bool, optional
            If True, the prompt is a system prompt
        response_type : type | BaseModel, optional
            Expected type of the response or a Pydantic BaseModel for json schema response

        Raises
        ------
        ValueError
            If neither prompt_id nor prompt is provided
        """

        self.is_system = is_system
        self._image = image
        prompt_response_type = None
        if prompt_id is not None:
            if prompt_id == "system" or prompt_id == "system.prompt":
                self.is_system = True
            self._prompt, prompt_response_type = _PromptParser().parse(prompt_id)
        elif prompt is not None:
            self._prompt = prompt
        else:
            raise ValueError("Either prompt_id or prompt must be provided")

        if prompt_data is not None:
            self._prompt = self._prompt.format(**prompt_data)

        if prompt_response_type is not None:
            self.response_type = prompt_response_type
        else:
            self.response_type = response_type

    def __str__(self) -> str:
        """
        Get the string representation of the prompt.

        Returns
        -------
        str
            The prompt text
        """
        return self._prompt

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt.

        Returns
        -------
        str
            The prompt text
        """
        return self.__str__()

    def _is_url(self, s: str) -> bool:
        """
        Check whether a string is a valid URL.

        Args:
            s (str): the string to check

        Returns:
            bool: True if it is a valid URL, False otherwise
        """
        from urllib.parse import urlparse

        try:
            result = urlparse(s)
            return all([result.scheme in ("http", "https"), result.netloc])
        except ValueError:
            return False

    def _image_to_base64_data_uri(self, image_path: str) -> str:
        """
        Convert an image to a base64 data URI in the format expected by the OpenAI API.

        Args:
            image_path (str): path to the image file

        Returns:
            str: a "data:image/<ext>;base64,<data>" string
        """
        import base64
        import mimetypes

        mime_type, _ = mimetypes.guess_type(image_path)
        if mime_type is None:
            raise ValueError(f"Could not determine the MIME type for {image_path}")

        with open(image_path, "rb") as f:
            encoded = base64.b64encode(f.read()).decode("utf-8")

        return f"data:{mime_type};base64,{encoded}"

    def as_dict(self) -> dict:
        if self._image is None:
            return {"type": "user" if not self.is_system else "system", "content": self.__str__()}
        else:
            return {"type": "user" if not self.is_system else "system",
                    "content": [
                        {"type": "image_url",
                         "image_url": {"url": self._image if self._is_url(self._image) else self._image_to_base64_data_uri(self._image)}},
                        {"type": "text", "text": self.__str__()}]}
````
A class to handle text prompts with optional response type specification.
The Prompt class can be initialized either with a direct prompt string or by loading a prompt from a .prompt file. It supports response type specification and data formatting through Python's string formatting.
.prompt format

Simple prompt:
```xml
<prompt>
Hello, coicoi!
</prompt>
```
In this case you could also use plain text without the prompt tag:
```
Hello, coicoi!
```
Prompt with data formatting:
```xml
<prompt>
Hello, {name}!
</prompt>
```
Prompt with response type:
```xml
<prompt response_type="int">
What is 2+2?
</prompt>
```

Examples

Simple prompt with response type:
```python
prompt = Prompt(prompt="What is 2+2?", response_type=int)
```
Prompt with data formatting:
```python
prompt = Prompt(
    prompt="What is the capital of {country}?",
    prompt_data={"country": "France"},
    response_type=str
)
```
Load prompt from file:
```python
prompt = Prompt(prompt_id="math_question")
```
math_question.prompt:
```xml
<prompt response_type="int">
What is 2+2?
</prompt>
```
Initialize a new Prompt instance.
Parameters
- prompt_id (str, optional): A .prompt file name for loading a prompt from file
- prompt_data (dict, optional): Dictionary of values for formatting the prompt
- prompt (str, optional): Direct prompt text if prompt_id is not provided
- image (str, optional): Path or URL of an image to attach to the prompt
- is_system (bool, optional): If True, the prompt is a system prompt
- response_type (type | BaseModel, optional): Expected type of the response, or a Pydantic BaseModel for a JSON schema response (see the sketch below)
Raises
- ValueError: If neither prompt_id nor prompt is provided
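Because `response_type` also accepts a Pydantic `BaseModel`, a prompt can request a JSON-schema-shaped response. A hedged sketch; the `CityInfo` model is hypothetical and not part of monoai:

```python
from pydantic import BaseModel
from monoai.prompts import Prompt

class CityInfo(BaseModel):
    # Hypothetical schema, used only for illustration
    name: str
    population: int

prompt = Prompt(
    prompt="Describe the capital of {country}.",
    prompt_data={"country": "France"},
    response_type=CityInfo,  # stored on prompt.response_type for the caller to consume
)
```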
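The `as_dict` method (shown in the class source above) converts the prompt into a message-style dict; when an `image` is attached, the content becomes a multimodal list, using the URL directly for http/https links or a base64 data URI for local files. A small sketch, assuming a local file `cat.png` exists:

```python
from monoai.prompts import Prompt

text_only = Prompt(prompt="Hello, coicoi!")
print(text_only.as_dict())
# {'type': 'user', 'content': 'Hello, coicoi!'}

with_image = Prompt(prompt="What is in this picture?", image="cat.png")
message = with_image.as_dict()
# message['content'] is a list: an 'image_url' part (a base64 data URI for the
# local file) followed by a 'text' part with the prompt text
```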
````python
class PromptChain(Prompt):

    """
    PromptChain class for handling sequences of prompts, the output of one prompt is used as context for the next one.

    .prompt file example:
    --------
    ```
    <promptchain>
    <prompt>
    What is the capital of {country}?
    </prompt>
    <prompt>
    What is its population?
    </prompt>
    </promptchain>
    ```

    Examples
    --------
    Create a chain from a sequence of prompts:
    ```
    prompts = [
        Prompt(prompt="What is the capital of France?"),
        Prompt(prompt="What is its population?")
    ]
    chain = PromptChain(prompts=prompts)
    ```

    Create a chain from a file:
    ```
    chain = PromptChain(
        promptchain_id="city_analysis",
        prompts_data=[{"country": "France"}]
    )
    ```
    """

    def __init__(self,
                 promptchain_id: str = None,
                 prompts_data: list[dict] = None,
                 prompts: List[Prompt] = None,
                 response_type: type | None = None):

        """
        Initialize a new PromptChain instance.

        Parameters
        ----------
        promptchain_id : str, optional
            A .prompt file name for loading a prompt chain from file
        prompts_data : list[dict], optional
            List of dictionaries containing formatting data for each prompt
        prompts : List[Prompt], optional
            Direct list of Prompt objects to form the chain if promptchain_id is not provided

        Raises
        ------
        ValueError
            If neither promptchain_id nor prompts is provided
        """
        if promptchain_id is not None:
            self._prompts, self._response_type = _PromptChainParser().parse(promptchain_id)
            for i in range(len(self._prompts)):
                self._prompts[i] = Prompt(prompt=self._prompts[i], prompt_data=prompts_data[i])
        elif prompts is not None:
            self._prompts = prompts
        else:
            raise ValueError("Either promptchain_id or prompts must be provided")
        self._size = len(self._prompts)
        self.response_type = response_type

    def _format(self, index: int, context: str | None = None) -> str:
        """
        Format a specific prompt in the chain with optional context.

        This method formats the prompt at the specified index, optionally including
        context from previous prompts' responses.

        Parameters
        ----------
        index : int
            Index of the prompt to format
        context : str, optional
            Context from previous prompts' responses to include

        Returns
        -------
        str
            The formatted prompt text with optional context
        """
        if context is None:
            return str(self._prompts[index])
        else:
            return str(self._prompts[index]) + "\n\n" + context

    def __str__(self) -> str:
        """
        Get the string representation of the prompt chain.

        Returns
        -------
        str
            All prompts in the chain joined by newlines
        """
        return "\n".join([str(prompt) for prompt in self._prompts])

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt chain.

        Returns
        -------
        str
            All prompts in the chain joined by newlines
        """
        return self.__str__()
````
PromptChain class for handling sequences of prompts; the output of one prompt is used as context for the next one.
.prompt file example:
```xml
<promptchain>
<prompt>
What is the capital of {country}?
</prompt>
<prompt>
What is its population?
</prompt>
</promptchain>
```

Examples

Create a chain from a sequence of prompts:
```python
prompts = [
    Prompt(prompt="What is the capital of France?"),
    Prompt(prompt="What is its population?")
]
chain = PromptChain(prompts=prompts)
```
Create a chain from a file:
```python
chain = PromptChain(
    promptchain_id="city_analysis",
    prompts_data=[{"country": "France"}]
)
```
Initialize a new PromptChain instance.
Parameters
- promptchain_id (str, optional): A .prompt file name for loading a prompt chain from file
- prompts_data (list[dict], optional): List of dictionaries containing formatting data for each prompt
- prompts (List[Prompt], optional): Direct list of Prompt objects to form the chain if promptchain_id is not provided
Raises
- ValueError: If neither promptchain_id nor prompts is provided
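Internally, the chain is consumed one prompt at a time: the private `_format` method returns the prompt at a given index and, when a previous response is supplied, appends it as context. A short sketch of that behaviour (note that `_format` is an internal method):

```python
from monoai.prompts import Prompt, PromptChain

chain = PromptChain(prompts=[
    Prompt(prompt="What is the capital of France?"),
    Prompt(prompt="What is its population?"),
])

print(chain._format(0))                   # "What is the capital of France?"
print(chain._format(1, context="Paris"))  # second prompt followed by "Paris" as appended context
```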
````python
class IterativePrompt(Prompt):
    """
    IterativePrompt class for handling prompts that iterate over a sequence of data, with optional memory
    of previous iterations' responses. This allow for the generation of longer and structured responses.

    .prompt file example:
    --------
    ```
    <iterativeprompt>
    <prompt>
    Generate the content of a chapter of a book about {topic}
    The chapters are {chapters}. Generate the chapter {{data}}.
    </prompt>
    <prompt_memory>
    Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters:
    {{data}}
    </prompt_memory>
    </iterativeprompt>
    ```

    Examples
    --------

    Iterative prompt with memory:
    ```
    data = ["data types", "conditional statements", "iterative statements"]
    prompt = IterativePrompt(
        prompt="Generate the content of a chapter of a book about {topic}. The chapters are {chapters}. Generate the chapter {{data}}.",
        iter_data=data,
        prompt_memory="Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters: {data}"
    )
    ```
    Iterative prompt with memory from a .prompt file:
    ```
    data = ["data types", "conditional statements", "iterative statements"]
    prompt = IterativePrompt(
        prompt_id="book_generation",
        prompt_data={"topic": "python programming", "chapters": data},
        iter_data=data
    )
    ```
    """

    def __init__(self,
                 prompt_id: str = None,
                 prompt: str = None,
                 prompt_data: dict = None,
                 iter_data: List[str] = None,
                 prompt_memory: str = "",
                 retain_all: bool = False):
        """
        Initialize a new IterativePrompt instance.

        Parameters
        ----------
        prompt_id : str, optional
            .prompt file name for loading a prompt from file
        prompt : str, optional
            Direct prompt text with {{data}} placeholder if prompt_id is not provided
        prompt_data : dict, optional
            Dictionary of values for formatting the base prompt
        iter_data : List[str], optional
            Sequence of data items to iterate over
        prompt_memory : str, optional
            Template for including memory of previous iterations
        retain_all : bool, optional
            If True, all responses are retained in memory, otherwise only the last response is retained

        Raises
        ------
        ValueError
            If neither prompt_id nor prompt is provided
        """
        if prompt_id is not None:
            self._prompt, prompt_memory = _IterativePromptParser().parse(prompt_id)
        elif prompt is not None:
            self._prompt = prompt
        else:
            raise ValueError("Either prompt_id or prompt must be provided")

        if prompt_data is not None:
            self._prompt = self._prompt.format(**prompt_data)

        self._iter_data = iter_data
        self._size = len(iter_data)
        self._prompt_memory = prompt_memory.replace("{{", "{").replace("}}", "}")
        self._has_memory = prompt_memory != ""
        self._retain_all = retain_all

    def _format(self, index: int, context: str = "") -> str:
        """
        Format the prompt for a specific iteration with optional context.

        This method formats the prompt for the data item at the specified index,
        optionally including context from previous iterations if memory is enabled.

        Parameters
        ----------
        index : int
            Index of the current data item
        context : str, optional
            Context from previous iterations, default ""

        Returns
        -------
        str
            The formatted prompt text with current data and optional memory

        Examples
        --------
        Format without memory:
        >>> prompt = IterativePrompt(
        ...     prompt="Analyze {data}",
        ...     iter_data=["item1", "item2"]
        ... )
        >>> formatted = prompt.format(0)

        Format with memory:
        >>> prompt = IterativePrompt(
        ...     prompt="Compare {data}",
        ...     iter_data=["item1", "item2"],
        ...     prompt_memory="Previous: {data}"
        ... )
        >>> formatted = prompt.format(1, "Analysis of item1")
        """
        prompt = self._prompt.format(data=self._iter_data[index])
        if self._has_memory and index > 0:
            prompt += "\n\n" + self._prompt_memory.format(data=context)
        return prompt

    def __str__(self) -> str:
        """
        Get the string representation of the prompt.

        Returns
        -------
        str
            The base prompt template
        """
        return self._prompt

    def __repr__(self) -> str:
        """
        Get the official string representation of the prompt.

        Returns
        -------
        str
            The base prompt template
        """
        return self.__str__()
````
IterativePrompt class for handling prompts that iterate over a sequence of data, with optional memory of previous iterations' responses. This allows for the generation of longer, more structured responses.
.prompt file example:
```xml
<iterativeprompt>
<prompt>
Generate the content of a chapter of a book about {topic}
The chapters are {chapters}. Generate the chapter {{data}}.
</prompt>
<prompt_memory>
Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters:
{{data}}
</prompt_memory>
</iterativeprompt>
```

Examples

Iterative prompt with memory:
```python
data = ["data types", "conditional statements", "iterative statements"]
prompt = IterativePrompt(
    prompt="Generate the content of a chapter of a book about {topic}. The chapters are {chapters}. Generate the chapter {{data}}.",
    iter_data=data,
    prompt_memory="Be sure that the chapter is coherent with the previous chapters, this is the content of the previous chapters: {data}"
)
```
Iterative prompt with memory from a .prompt file:
```python
data = ["data types", "conditional statements", "iterative statements"]
prompt = IterativePrompt(
    prompt_id="book_generation",
    prompt_data={"topic": "python programming", "chapters": data},
    iter_data=data
)
```
Initialize a new IterativePrompt instance.
Parameters
- prompt_id (str, optional): .prompt file name for loading a prompt from file
- prompt (str, optional): Direct prompt text with {{data}} placeholder if prompt_id is not provided
- prompt_data (dict, optional): Dictionary of values for formatting the base prompt
- iter_data (List[str], optional): Sequence of data items to iterate over
- prompt_memory (str, optional): Template for including memory of previous iterations
- retain_all (bool, optional): If True, all responses are retained in memory, otherwise only the last response is retained
Raises
- ValueError: If neither prompt_id nor prompt is provided
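A small sketch of how the iteration works, calling the private `_format` method directly (a model runner would normally drive this loop). Note that the `{{data}}` placeholder in the base prompt is unescaped to `{data}` when `prompt_data` is applied:

```python
from monoai.prompts import IterativePrompt

chapters = ["data types", "conditional statements"]
book = IterativePrompt(
    prompt="Write the chapter on {{data}} for a book about {topic}.",
    prompt_data={"topic": "Python"},
    iter_data=chapters,
    prompt_memory="Previous chapter:\n{data}",
)

print(book._format(0))
# Write the chapter on data types for a book about Python.

print(book._format(1, context="...summary of the first chapter..."))
# Write the chapter on conditional statements for a book about Python.
#
# Previous chapter:
# ...summary of the first chapter...
```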
```python
class SystemPrompt(Prompt):
    """
    A specialized Prompt class that is always a system prompt.

    This class extends Prompt but automatically sets is_system=True,
    making it convenient for creating system prompts without having
    to specify the is_system parameter every time.
    """

    def __init__(self, **kwargs):
        """
        Initialize a new SystemPrompt instance.

        All parameters are passed directly to the parent Prompt class,
        with is_system automatically set to True.

        Parameters
        ----------
        **kwargs
            All parameters accepted by the parent Prompt class:
            - prompt_id: str, optional - A .prompt file name
            - prompt_data: dict, optional - Data for prompt formatting
            - prompt: str, optional - Direct prompt text
            - response_type: type, optional - Expected response type
        """
        # Always set is_system=True, but allow it to be overridden if needed
        kwargs.setdefault('is_system', True)
        super().__init__(**kwargs)
```
A specialized Prompt class that is always a system prompt.
This class extends Prompt but automatically sets is_system=True, making it convenient for creating system prompts without having to specify the is_system parameter every time.
Initialize a new SystemPrompt instance.
All parameters are passed directly to the parent Prompt class, with is_system automatically set to True.
Parameters
- `**kwargs`: All parameters accepted by the parent Prompt class:
  - prompt_id (str, optional): A .prompt file name
  - prompt_data (dict, optional): Data for prompt formatting
  - prompt (str, optional): Direct prompt text
  - response_type (type, optional): Expected response type
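A minimal sketch of SystemPrompt in use, based on the behaviour shown above:

```python
from monoai.prompts import SystemPrompt

system = SystemPrompt(prompt="You are a concise assistant.")
print(system.is_system)  # True
print(system.as_dict())  # {'type': 'system', 'content': 'You are a concise assistant.'}
```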