LLMs
GatedBranchingPipeline ¶
Bases: SimpleBranchingPipeline
A simple gated branching pipeline for executing multiple branches based on a condition.
This class extends the SimpleBranchingPipeline class and executes its branches in order, stopping at the first branch that returns a non-empty output satisfying a condition.
Attributes:
Name | Type | Description
---|---|---
branches | List[BaseComponent] | The list of branches to be executed.
Example
Source code in libs/kotaemon/kotaemon/llms/branching.py
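The example block for this class is missing from the page. As a rough illustration of the behavior described above (plain Python, not the actual kotaemon API; the `run_gated` function and keyword-based gates are assumptions for illustration):

```python
# Sketch of gated branching: run branches in order and return the first
# non-empty output whose gate accepts the condition text.
from typing import Callable, Optional

def run_gated(branches: list[Callable[[str], Optional[str]]],
              condition_text: str) -> Optional[str]:
    if condition_text is None:
        raise ValueError("condition_text must be provided")
    for branch in branches:
        output = branch(condition_text)
        if output:          # first non-empty output wins
            return output
    return None             # no branch satisfied the condition

# Each "branch" here is gated on a keyword in the condition text.
branches = [
    lambda text: "math answer" if "math" in text else None,
    lambda text: "history answer" if "history" in text else None,
]
print(run_gated(branches, "a history question"))  # history answer
```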
run ¶
Execute the pipeline by running each branch and return the output of the first branch that returns a non-empty output based on the provided condition.
Parameters:
Name | Type | Description | Default
---|---|---|---
condition_text | str | The condition text to evaluate for each branch. Defaults to None. | None
**prompt_kwargs | | Keyword arguments for the branches. | {}
Returns:
Type | Description
---|---
Union[OutputType, None] | The output of the first branch that satisfies the condition, or None if no branch satisfies the condition.
Raises:
Type | Description
---|---
ValueError | If condition_text is None
Source code in libs/kotaemon/kotaemon/llms/branching.py
SimpleBranchingPipeline ¶
Bases: BaseComponent
A simple branching pipeline for executing multiple branches.
Attributes:
Name | Type | Description
---|---|---
branches | List[BaseComponent] | The list of branches to be executed.
Example
Source code in libs/kotaemon/kotaemon/llms/branching.py
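The example block for this class is also missing. A minimal sketch of the behavior (plain Python, not the actual kotaemon API; `run_branches` and the callable branches are illustrative stand-ins):

```python
# Sketch of simple branching: every branch receives the same prompt
# kwargs, and the outputs are collected into a list.
from typing import Callable

def run_branches(branches: list[Callable[..., str]], **prompt_kwargs) -> list[str]:
    return [branch(**prompt_kwargs) for branch in branches]

branches = [
    lambda term: f"definition of {term}",
    lambda term: f"examples of {term}",
]
print(run_branches(branches, term="recursion"))
```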
add_branch ¶
Add a new branch to the pipeline.
Parameters:
Name | Type | Description | Default
---|---|---|---
component | BaseComponent | The branch component to be added. | required
run ¶
Execute the pipeline by running each branch and return the outputs as a list.
Parameters:
Name | Type | Description | Default
---|---|---|---
**prompt_kwargs | | Keyword arguments for the branches. | {}
Returns:
Name | Type | Description
---|---|---
List | | The outputs of each branch as a list.
Source code in libs/kotaemon/kotaemon/llms/branching.py
AzureChatOpenAI ¶
Bases: BaseChatOpenAI
OpenAI chat model provided by Microsoft Azure
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
prepare_client ¶
Get the OpenAI client
Parameters:
Name | Type | Description | Default
---|---|---|---
async_version | bool | Whether to get the async version of the client | False
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
openai_response ¶
Get the OpenAI response
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
ChatOpenAI ¶
Bases: BaseChatOpenAI
OpenAI chat model
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
prepare_client ¶
Get the OpenAI client
Parameters:
Name | Type | Description | Default
---|---|---|---
async_version | bool | Whether to get the async version of the client | False
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
openai_response ¶
Get the OpenAI response
Source code in libs/kotaemon/kotaemon/llms/chats/openai.py
EndpointChatLLM ¶
Bases: ChatLLM
A ChatLLM that uses an endpoint to generate responses. This expects an OpenAI API compatible endpoint.
Attributes:
Name | Type | Description
---|---|---
endpoint_url | str | The URL of an OpenAI API compatible endpoint.
Source code in libs/kotaemon/kotaemon/llms/chats/endpoint_based.py
run ¶
Generate a response from messages.
Parameters:
Name | Type | Description | Default
---|---|---|---
messages | str \| BaseMessage \| list[BaseMessage] | History of messages to generate a response from. | required
**kwargs | | Additional arguments to pass to the OpenAI API. | {}
Returns:
Name | Type | Description
---|---|---
LLMInterface | | The generated response.
Source code in libs/kotaemon/kotaemon/llms/chats/endpoint_based.py
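The run method above sends an OpenAI-style chat request to endpoint_url. A sketch of building such a request body (the field names follow the OpenAI chat-completions format; the `build_chat_payload` helper is illustrative, not kotaemon's actual code):

```python
# Build an OpenAI-compatible chat-completions request body from a flexible
# "messages" argument (plain string or a list of role/content dicts).
def build_chat_payload(messages, **kwargs) -> dict:
    if isinstance(messages, str):
        # A bare string is treated as a single user message.
        messages = [{"role": "user", "content": messages}]
    return {"messages": messages, **kwargs}

payload = build_chat_payload("Hello!", temperature=0.2)
print(payload)
# {'messages': [{'role': 'user', 'content': 'Hello!'}], 'temperature': 0.2}
```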
LlamaCppChat ¶
Bases: ChatLLM
Wrapper around llama-cpp-python's Llama model
Source code in libs/kotaemon/kotaemon/llms/chats/llamacpp.py
client_object ¶
Get the llama-cpp-python client object
Source code in libs/kotaemon/kotaemon/llms/chats/llamacpp.py
AzureOpenAI ¶
Bases: LCCompletionMixin, LLM
Wrapper around Langchain's AzureOpenAI class, focusing on key parameters
Source code in libs/kotaemon/kotaemon/llms/completions/langchain_based.py
LlamaCpp ¶
Bases: LCCompletionMixin, LLM
Wrapper around Langchain's LlamaCpp class, focusing on key parameters
Source code in libs/kotaemon/kotaemon/llms/completions/langchain_based.py
OpenAI ¶
Bases: LCCompletionMixin, LLM
Wrapper around Langchain's OpenAI class, focusing on key parameters
Source code in libs/kotaemon/kotaemon/llms/completions/langchain_based.py
ManualSequentialChainOfThought ¶
Bases: BaseComponent
Perform sequential chain-of-thought with manual pre-defined prompts
This class supports a variable number of steps. Each step corresponds to a kotaemon.pipelines.cot.Thought; please refer to that section for details on Thought. This section is about chaining thoughts together.
Usage:
Create and run a chain of thought without the "+" operator. Please refer to the kotaemon.pipelines.cot.Thought section for examples.
This chain-of-thought optionally takes a termination check callback function. This function will be called after each thought is executed. It takes in a dictionary of all thought outputs so far, and it returns True or False. If True, the chain-of-thought will terminate. If unset, the default callback always returns False.
Source code in libs/kotaemon/kotaemon/llms/cot.py
run ¶
Run the manual chain of thought
Source code in libs/kotaemon/kotaemon/llms/cot.py
Thought ¶
Bases: BaseComponent
A thought in the chain of thought
- Input: **kwargs pairs, where each key is a placeholder in the prompt and each value is the value to fill it with.
- Output: an output dictionary.
Usage:
Create and run a thought:
When a thought is run, it will:
- Populate the prompt template with the input **kwargs.
- Run the LLM with the populated prompt.
- Post-process the LLM output with the post-processor.
A Thought can be chained sequentially with the + operator. Under the hood, when the + operator is used, a ManualSequentialChainOfThought is created.
Source code in libs/kotaemon/kotaemon/llms/cot.py
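The example blocks for Thought are missing from this page. The steps above can be sketched in plain Python (MiniThought and MiniChain are illustrative stand-ins, not kotaemon's actual classes; the deterministic fake_llm replaces a real model call):

```python
# Sketch of a Thought-like unit: populate a prompt template, call an LLM,
# post-process, and support "+" chaining into a sequential chain.
from dataclasses import dataclass
from typing import Callable

@dataclass
class MiniThought:
    prompt: str                              # template with {placeholders}
    llm: Callable[[str], str]                # stand-in for the LLM call
    post_process: Callable[[str], dict]      # maps raw output to a dict

    def run(self, **kwargs) -> dict:
        populated = self.prompt.format(**kwargs)
        return self.post_process(self.llm(populated))

    def __add__(self, other: "MiniThought") -> "MiniChain":
        # "+" builds a sequential chain, mirroring how Thought + Thought
        # produces a ManualSequentialChainOfThought.
        return MiniChain([self, other])

@dataclass
class MiniChain:
    thoughts: list

    def run(self, **kwargs) -> dict:
        # Each thought sees the accumulated outputs of earlier thoughts.
        context = dict(kwargs)
        for thought in self.thoughts:
            context.update(thought.run(**context))
        return context

fake_llm = lambda prompt: prompt.upper()     # deterministic stand-in LLM
t1 = MiniThought("summarize: {text}", fake_llm, lambda out: {"summary": out})
t2 = MiniThought("translate: {summary}", fake_llm, lambda out: {"translation": out})
chain = t1 + t2
result = chain.run(text="hello")
print(result["translation"])  # TRANSLATE: SUMMARIZE: HELLO
```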
GatedLinearPipeline ¶
Bases: SimpleLinearPipeline
A pipeline that extends the SimpleLinearPipeline class and adds a condition attribute.
Attributes:
Name | Type | Description
---|---|---
condition | Callable[[IO_Type], Any] | A callable function that represents the condition.
Usage
Source code in libs/kotaemon/kotaemon/llms/linear.py
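The usage block for this class is missing. A minimal sketch of the gating behavior (plain Python, not the actual kotaemon API; `run_gated_linear` and the lambda components are illustrative):

```python
# Sketch of a gated linear pipeline: run prompt -> llm -> post-processor
# only when the condition accepts the condition text; otherwise skip.
from typing import Any, Callable, Optional

def run_gated_linear(prompt: str,
                     llm: Callable[[str], str],
                     post_processor: Callable[[str], str],
                     condition: Callable[[str], Any],
                     condition_text: Optional[str] = None,
                     **prompt_kwargs) -> Optional[str]:
    if condition_text is None:
        raise ValueError("condition_text must be provided")
    if not condition(condition_text):
        return None                      # gate closed: skip the pipeline
    populated = prompt.format(**prompt_kwargs)
    return post_processor(llm(populated))

result = run_gated_linear(
    "Q: {question}",
    llm=lambda p: p + " A: 42",          # deterministic stand-in LLM
    post_processor=str.strip,
    condition=lambda text: "math" in text,
    condition_text="a math question",
    question="6 * 7?",
)
print(result)  # Q: 6 * 7? A: 42
```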
run ¶
Run the pipeline with the given arguments and return the final output as a Document object.
Parameters:
Name | Type | Description | Default
---|---|---|---
condition_text | str | The condition text to evaluate. Defaults to None. | None
llm_kwargs | dict | Additional keyword arguments for the language model call. | {}
post_processor_kwargs | dict | Additional keyword arguments for the post-processor. | {}
**prompt_kwargs | | Keyword arguments for populating the prompt. | {}
Returns:
Name | Type | Description
---|---|---
Document | Document | The final output of the pipeline as a Document object.
Raises:
Type | Description
---|---
ValueError | If condition_text is None
Source code in libs/kotaemon/kotaemon/llms/linear.py
SimpleLinearPipeline ¶
Bases: BaseComponent
A simple pipeline for running a function with a prompt, a language model, and an optional post-processor.
Attributes:
Name | Type | Description
---|---|---
prompt | BasePromptComponent | The prompt component used to generate the initial input.
llm | Union[ChatLLM, LLM] | The language model component used to generate the output.
post_processor | Union[BaseComponent, Callable[[IO_Type], IO_Type]] | An optional post-processor component or function.
Example Usage
Source code in libs/kotaemon/kotaemon/llms/linear.py
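The example usage block is missing from this page. The prompt → llm → post-processor flow can be sketched in plain Python (not the actual kotaemon API; `run_linear` and the lambda components are illustrative):

```python
# Sketch of the linear flow: populate the prompt, call the llm, then
# apply the optional post-processor to produce the final output.
from typing import Callable, Optional

def run_linear(prompt: str,
               llm: Callable[[str], str],
               post_processor: Optional[Callable[[str], str]] = None,
               **prompt_kwargs) -> str:
    populated = prompt.format(**prompt_kwargs)
    output = llm(populated)
    return post_processor(output) if post_processor else output

out = run_linear(
    "Translate to French: {text}",
    llm=lambda p: "  bonjour  ",   # deterministic stand-in for an LLM
    post_processor=str.strip,
    text="hello",
)
print(out)  # bonjour
```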
run ¶
Run the function with the given arguments and return the final output as a Document object.
Parameters:
Name | Type | Description | Default
---|---|---|---
llm_kwargs | dict | Keyword arguments for the llm call. | {}
post_processor_kwargs | dict | Keyword arguments for the post_processor. | {}
**prompt_kwargs | | Keyword arguments for populating the prompt. | {}
|
Returns:
Name | Type | Description
---|---|---
Document | | The final output of the function as a Document object.
Source code in libs/kotaemon/kotaemon/llms/linear.py
BasePromptComponent ¶
Bases: BaseComponent
Base class for prompt components.
Parameters:
Name | Type | Description | Default
---|---|---|---
template | PromptTemplate | The prompt template. | required
**kwargs | | Any additional keyword arguments that will be used to populate the given template. | {}
Source code in libs/kotaemon/kotaemon/llms/prompts/base.py
set_value ¶
Similar to __set but for external use.
Set the values of the attributes in the object based on the provided keyword arguments.
Parameters:
Name | Type | Description | Default
---|---|---|---
kwargs | dict | A dictionary with the attribute names as keys and the new values as values. | {}
Returns:
Type | Description
---|---
None |
Source code in libs/kotaemon/kotaemon/llms/prompts/base.py
run ¶
Run the function with the given keyword arguments.
Parameters:
Name | Type | Description | Default
---|---|---|---
**kwargs | | The keyword arguments to pass to the function. | {}
Returns:
Type | Description
---|---
The result of calling the function with the given keyword arguments.
Source code in libs/kotaemon/kotaemon/llms/prompts/base.py
PromptTemplate ¶
Base class for prompt templates.
Source code in libs/kotaemon/kotaemon/llms/prompts/template.py
check_missing_kwargs ¶
Check if all the placeholders in the template are set.
This function checks if all the expected placeholders in the template are set as
attributes of the object. If any placeholders are missing, a ValueError
is raised with the names of the missing keys.
Returns:
Type | Description |
---|---|
None |
Source code in libs/kotaemon/kotaemon/llms/prompts/template.py
check_redundant_kwargs ¶
Check if any of the provided keyword arguments are redundant.
This function checks whether the keyword arguments set on the object correspond to placeholders in the template, and reports any keys that do not match a placeholder.
Returns:
Type | Description |
---|---|
None |
Source code in libs/kotaemon/kotaemon/llms/prompts/template.py
populate ¶
Strictly populate the template with the given keyword arguments.
Parameters:
Name | Type | Description | Default
---|---|---|---
**kwargs | | The keyword arguments to populate the template. Each keyword corresponds to a placeholder in the template. | {}
Returns:
Type | Description
---|---
str | The populated template.
Raises:
Type | Description
---|---
ValueError | If an unknown placeholder is provided.
Source code in libs/kotaemon/kotaemon/llms/prompts/template.py
partial_populate ¶
Partially populate the template with the given keyword arguments.
Parameters:
Name | Type | Description | Default
---|---|---|---
**kwargs | | The keyword arguments to populate the template. Each keyword corresponds to a placeholder in the template. | {}
Returns:
Name | Type | Description
---|---|---
str | | The populated template.