OpenAI
BaseOpenAIEmbeddings ¶
Bases: BaseEmbeddings
Base interface for OpenAI embedding model, using the openai library.
This class exposes the parameters in resources.Chat. To subclass this class:
- Implement the `prepare_client` method to return the OpenAI client
- Implement the `openai_response` method to return the OpenAI response
- Implement the parameters related to the OpenAI client
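For example, a minimal subclass could look like the sketch below. It is only a sketch: the class name `CustomOpenAIEmbeddings` is hypothetical, and the `api_key`/`model` attributes are assumed here rather than taken from the documented interface.

```python
from openai import AsyncOpenAI, OpenAI

from kotaemon.embeddings.openai import BaseOpenAIEmbeddings


class CustomOpenAIEmbeddings(BaseOpenAIEmbeddings):
    """Hypothetical subclass wiring the two required hooks to the openai SDK."""

    api_key: str = ""                      # assumed attribute, not documented above
    model: str = "text-embedding-3-small"  # assumed attribute, not documented above

    def prepare_client(self, async_version: bool = False):
        # Return the async client when requested, otherwise the sync one.
        if async_version:
            return AsyncOpenAI(api_key=self.api_key)
        return OpenAI(api_key=self.api_key)

    def openai_response(self, client, **kwargs):
        # Forward this component's parameters to the embeddings endpoint.
        params = {"model": self.model, **kwargs}
        return client.embeddings.create(**params)
```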
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
prepare_client ¶
Get the OpenAI client
Parameters:

Name | Type | Description | Default
---|---|---|---
`async_version` | `bool` | Whether to get the async version of the client | `False`
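Since this method only switches between the sync and async flavors of the SDK client, callers that need non-blocking requests can ask for the async client explicitly. A sketch follows; the model name and the direct use of the openai SDK call are assumptions, not part of the documented interface.

```python
async def embed_async(embedder, texts: list[str]) -> list[list[float]]:
    # Request the async flavor of the client, then await the openai SDK call directly.
    client = embedder.prepare_client(async_version=True)
    resp = await client.embeddings.create(
        model="text-embedding-3-small",  # assumed model name
        input=texts,
    )
    return [item.embedding for item in resp.data]


# Run with, e.g.: asyncio.run(embed_async(embedder, ["hello", "world"]))
```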
OpenAIEmbeddings ¶
Bases: BaseOpenAIEmbeddings
OpenAI embedding model
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
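A typical call site might look like the following sketch. The constructor keywords (`api_key`, `model`) and the shape of the returned documents are assumptions based on kotaemon's other OpenAI wrappers, so check them against the source above.

```python
from kotaemon.embeddings.openai import OpenAIEmbeddings

embedder = OpenAIEmbeddings(
    api_key="sk-...",                  # assumed parameter name
    model="text-embedding-3-small",    # assumed parameter name
)

# kotaemon embedding components are callable; each returned document
# is expected to carry the embedding vector alongside the original text.
docs = embedder(["first sentence", "second sentence"])
for doc in docs:
    print(doc.text, len(doc.embedding))
```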
prepare_client ¶
Get the OpenAI client
Parameters:

Name | Type | Description | Default
---|---|---|---
`async_version` | `bool` | Whether to get the async version of the client | `False`
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
openai_response ¶
Get the OpenAI response
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
AzureOpenAIEmbeddings ¶
Bases: BaseOpenAIEmbeddings
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
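Configuration differs from the plain OpenAI variant mainly in the Azure connection details. The parameter names in the sketch below mirror the openai SDK's `AzureOpenAI` client and are assumptions here rather than documented fields.

```python
from kotaemon.embeddings.openai import AzureOpenAIEmbeddings

embedder = AzureOpenAIEmbeddings(
    azure_endpoint="https://<resource>.openai.azure.com/",  # assumed parameter name
    api_key="...",                                          # assumed parameter name
    api_version="2024-02-01",                               # assumed parameter name
    azure_deployment="my-embedding-deployment",             # assumed parameter name
)

docs = embedder("some text to embed")
```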
prepare_client ¶
Get the OpenAI client
Parameters:

Name | Type | Description | Default
---|---|---|---
`async_version` | `bool` | Whether to get the async version of the client | `False`
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
openai_response ¶
Get the OpenAI response
Source code in libs/kotaemon/kotaemon/embeddings/openai.py
split_text_by_chunk_size ¶
Split the text into chunks of a given size
Parameters:

Name | Type | Description | Default
---|---|---|---
`text` | `str` | text to split | *required*
`chunk_size` | `int` | size of each chunk | *required*

Returns:

Type | Description
---|---
`list[list[int]]` | list of chunks (as tokens)
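Because the return value is token IDs rather than substrings, the chunks can be passed on to the embeddings endpoint, which accepts token arrays as input. A usage sketch follows; the 8191-token chunk size is the commonly cited input limit for OpenAI's text-embedding models and is an assumption here, not a documented default.

```python
from kotaemon.embeddings.openai import split_text_by_chunk_size

long_text = "some very long document " * 2000

# Split into token chunks that each fit the embedding model's context window.
token_chunks = split_text_by_chunk_size(long_text, chunk_size=8191)

print(len(token_chunks))     # number of chunks
print(len(token_chunks[0]))  # tokens in the first chunk, at most chunk_size
```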