llm-chat 0.0.0
LLM-Chat
Public Member Functions

| Return type | Member |
|---|---|
| llm_chat::ThreadList * | getThreadList () |
| llm_chat::ThreadProxyList * | getProxyList () |
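These two accessors appear to mirror the inherited threadList()/threadProxyList() getters below. A minimal usage sketch follows; the name of the class documented on this page is not shown here, so ChatFrontend and its header are hypothetical stand-ins.

```cpp
// Sketch only: "ChatFrontend" is a hypothetical stand-in for the class
// documented on this page; its real name and header are not shown here.
#include <QtGlobal>
#include "chat_frontend.h"  // hypothetical header

void useLists(ChatFrontend &chat) {
    // Both accessors are assumed to return non-owning pointers to the
    // backend's thread models.
    llm_chat::ThreadList *threads = chat.getThreadList();
    llm_chat::ThreadProxyList *sorted = chat.getProxyList();
    Q_UNUSED(threads);
    Q_UNUSED(sorted);
}
```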
Public Member Functions inherited from llm_chat::ChatBackend

| Return type | Member | Description |
|---|---|---|
| | ChatBackend (QObject *parent=nullptr) | Constructs a new ChatBackend object. |
| ThreadList * | threadList () const | Returns the chat threads. |
| ThreadProxyList * | threadProxyList () const | Returns the sorted chat threads. |
| QString | model () const | Returns the name of the current model. |
| QStringList | modelList () const | Returns the list of available models. |
| QString | systemPrompt () const | Returns the system prompt. |
| QString | ollamaServerUrl () const | Returns the Ollama server URL. |
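A minimal sketch of constructing a backend and reading the accessors listed above from C++; the header name is a guess, since only the class name llm_chat::ChatBackend appears on this page.

```cpp
#include <QCoreApplication>
#include <QDebug>
#include "chat_backend.h"  // assumed header name for llm_chat::ChatBackend

int main(int argc, char *argv[]) {
    QCoreApplication app(argc, argv);

    // The constructor takes an optional QObject parent (nullptr by default).
    llm_chat::ChatBackend backend;

    // Read the current configuration through the const accessors.
    qDebug() << "Model:         " << backend.model();
    qDebug() << "Models:        " << backend.modelList();
    qDebug() << "System prompt: " << backend.systemPrompt();
    qDebug() << "Server URL:    " << backend.ollamaServerUrl();

    // The thread models are returned as raw pointers; ownership is assumed
    // to remain with the backend.
    llm_chat::ThreadList *threads = backend.threadList();
    llm_chat::ThreadProxyList *sorted = backend.threadProxyList();
    Q_UNUSED(threads);
    Q_UNUSED(sorted);

    return 0;
}
```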
Additional Inherited Members

Public Slots inherited from llm_chat::ChatBackend

| Return type | Slot | Description |
|---|---|---|
| void | setModel (const QString &model) | Sets the model name. |
| void | fetchModelList () | Fetches the list of available models from the Ollama server. |
| Thread * | getThread (const int index) | Returns the thread at the given index. |
| void | deleteThread (const int index) | Removes the thread at the given index. |
| void | clearThreads () | Removes all threads. |
| void | sendMessage (const int index, const QString &prompt) | Sends a message to the Ollama server. |
| void | setSystemPrompt (const QString &prompt) | Sets the system prompt. |
| void | setOllamaServerUrl (const QString &url) | Sets the Ollama server URL. |
| void | retryLatestMessage (const int index) | Retries the latest message. |
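Slots in Qt are ordinary member functions, so the inherited slots above can also be called directly from C++. A sketch of a typical configure-and-send sequence; the server URL, model name, and thread index are example values, and the header name is assumed.

```cpp
#include <QString>
#include "chat_backend.h"  // assumed header name

void configureAndSend(llm_chat::ChatBackend &backend) {
    // Example values: 11434 is Ollama's default port, "llama3" is only an
    // illustrative model name.
    backend.setOllamaServerUrl(QStringLiteral("http://localhost:11434"));
    backend.setSystemPrompt(QStringLiteral("You are a helpful assistant."));
    backend.setModel(QStringLiteral("llama3"));

    // The model list arrives via the modelListFetched() signal rather than
    // as a return value.
    backend.fetchModelList();

    // Index 0 is assumed to refer to an existing thread.
    backend.sendMessage(0, QStringLiteral("Hello!"));

    // If the reply failed or was unsatisfactory, resend the latest message.
    backend.retryLatestMessage(0);
}
```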
Signals inherited from llm_chat::ChatBackend

| Return type | Signal | Description |
|---|---|---|
| void | modelListFetched () | Emitted when the list of models has been fetched. |
| void | modelChanged () | Emitted when the model is changed. |
| void | newThreadCreated () | Emitted when a new thread is created. |
| void | systemPromptChanged () | Emitted when the system prompt is changed. |
| void | ollamaServerUrlChanged () | Emitted when the Ollama server URL is changed. |
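A sketch of observing these notifications from C++ with QObject::connect; only the signals listed above are used, and the header name is again assumed.

```cpp
#include <QDebug>
#include <QObject>
#include "chat_backend.h"  // assumed header name

void watchBackend(llm_chat::ChatBackend &backend) {
    // Re-read the model list once the fetch completes.
    QObject::connect(&backend, &llm_chat::ChatBackend::modelListFetched,
                     &backend, [&backend]() {
        qDebug() << "Models fetched:" << backend.modelList();
    });

    // React to configuration changes.
    QObject::connect(&backend, &llm_chat::ChatBackend::modelChanged,
                     &backend, [&backend]() {
        qDebug() << "Model is now" << backend.model();
    });

    // Track thread creation.
    QObject::connect(&backend, &llm_chat::ChatBackend::newThreadCreated,
                     &backend, []() {
        qDebug() << "A new thread was created";
    });
}
```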
Properties inherited from llm_chat::ChatBackend

| Type | Property |
|---|---|
| ThreadProxyList * | sortedThreads |
| QString | model |
| QStringList | modelList |
| QString | systemPrompt |
| QString | ollamaServerUrl |
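The property names match the accessors and notify signals listed above, which suggests the backend is intended to be consumed from QML through property bindings. From C++ the same values are reachable through the generic Qt property system; a sketch, with write behaviour left open since the Q_PROPERTY declarations are not shown on this page.

```cpp
#include <QDebug>
#include <QVariant>
#include "chat_backend.h"  // assumed header name

void dumpProperties(llm_chat::ChatBackend &backend) {
    // Q_PROPERTY values are also readable via the meta-object system,
    // which is the mechanism QML bindings use.
    qDebug() << backend.property("model").toString();
    qDebug() << backend.property("modelList").toStringList();
    qDebug() << backend.property("systemPrompt").toString();
    qDebug() << backend.property("ollamaServerUrl").toString();

    // Whether a property is writable depends on its (unshown) Q_PROPERTY
    // declaration; the documented way to change these values is through
    // the slots such as setModel() and setSystemPrompt().
}
```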