llm-chat 0.0.0
LLM-Chat
llm_chat::ChatBackend Class Reference

The ChatBackend class handles the communication with the Ollama server. More...

#include <backend.h>

Inheritance diagram for llm_chat::ChatBackend:
Collaboration diagram for llm_chat::ChatBackend:

Public Slots

void setModel (const QString &model)
 Sets the model name.
 
void fetchModelList ()
 Fetches the list of available models from the Ollama server.
 
Thread * getThread (const int index)
 Returns the thread at the given index.
 
void deleteThread (const int index)
 Removes the thread at the given index.
 
void clearThreads ()
 Removes all the threads.
 
void sendMessage (const int index, const QString &prompt)
 Sends a message to the Ollama server.
 
void setSystemPrompt (const QString &prompt)
 Sets the system prompt.
 
void setOllamaServerUrl (const QString &url)
 Sets the Ollama server URL.
 
void retryLatestMessage (const int index)
 Retries the latest message.
 

Signals

void modelListFetched ()
 Emitted when the list of models is fetched.
 
void modelChanged ()
 Emitted when the model is changed.
 
void newThreadCreated ()
 Emitted when a new thread is created.
 
void systemPromptChanged ()
 Emitted when the system prompt is changed.
 
void ollamaServerUrlChanged ()
 Emitted when the Ollama server URL is changed.
 

Public Member Functions

 ChatBackend (QObject *parent=nullptr)
 Constructs a new ChatBackend object.
 
ThreadList * threadList () const
 Returns the chat threads.
 
ThreadProxyList * threadProxyList () const
 Returns the sorted chat threads.
 
QString model () const
 Get the name of the model.
 
QStringList modelList () const
 Get the list of available models.
 
QString systemPrompt () const
 Get the system prompt.
 
QString ollamaServerUrl () const
 Get the Ollama server URL.
 

Properties

ThreadProxyList * sortedThreads
 
QString model
 
QStringList modelList
 
QString systemPrompt
 
QString ollamaServerUrl
 

Detailed Description

The ChatBackend class handles the communication with the Ollama server.
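As a usage illustration (not part of the generated documentation), a minimal configuration sketch might look like the following. The server URL and model name are placeholder values, not defaults documented by this class:

```cpp
#include <QCoreApplication>
#include <QString>
#include "backend.h"  // llm_chat::ChatBackend

int main(int argc, char *argv[]) {
  QCoreApplication app(argc, argv);

  llm_chat::ChatBackend backend;  // parent defaults to nullptr

  // Point the backend at an Ollama instance and pick a model.
  // Both values below are illustrative placeholders.
  backend.setOllamaServerUrl(QStringLiteral("http://localhost:11434"));
  backend.setModel(QStringLiteral("llama3"));

  return app.exec();
}
```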

Constructor & Destructor Documentation

◆ ChatBackend()

llm_chat::ChatBackend::ChatBackend ( QObject * parent = nullptr)
explicit

Constructs a new ChatBackend object.

Parameters
parent: The parent object.

Member Function Documentation

◆ clearThreads

void llm_chat::ChatBackend::clearThreads ( )
slot

Removes all the threads.


◆ deleteThread

void llm_chat::ChatBackend::deleteThread ( const int index)
slot

Removes the thread at the given index.

Parameters
index: The index of the thread in the proxy model.

◆ fetchModelList

void llm_chat::ChatBackend::fetchModelList ( )
slot

Fetches the list of available models from the Ollama server.

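A sketch of the asynchronous fetch pattern implied by this slot and the modelListFetched() signal (assuming a `backend` instance already exists; the helper function name is illustrative):

```cpp
#include <QDebug>
#include <QObject>
#include "backend.h"

// Connect first, then trigger the fetch: modelList() is only
// up to date after modelListFetched() has been emitted.
void watchModels(llm_chat::ChatBackend &backend) {
  QObject::connect(&backend, &llm_chat::ChatBackend::modelListFetched,
                   [&backend] {
                     qDebug() << "Available models:" << backend.modelList();
                   });
  backend.fetchModelList();
}
```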

◆ getThread

Thread * llm_chat::ChatBackend::getThread ( const int index)
slot

Returns the thread at the given index.

Parameters
index: The index of the thread in the proxy model.
Returns
The thread at the given index.

◆ model()

QString llm_chat::ChatBackend::model ( ) const
nodiscard

Get the name of the model.

Returns
The name of the model.

◆ modelChanged

void llm_chat::ChatBackend::modelChanged ( )
signal

Emitted when the model is changed.


◆ modelList()

QStringList llm_chat::ChatBackend::modelList ( ) const
inlinenodiscard

Get the list of available models.

◆ modelListFetched

void llm_chat::ChatBackend::modelListFetched ( )
signal

Emitted when the list of models is fetched.


◆ newThreadCreated

void llm_chat::ChatBackend::newThreadCreated ( )
signal

Emitted when a new thread is created.


◆ ollamaServerUrl()

QString llm_chat::ChatBackend::ollamaServerUrl ( ) const
nodiscard

Get the Ollama server URL.

Returns
The Ollama server URL.

◆ ollamaServerUrlChanged

void llm_chat::ChatBackend::ollamaServerUrlChanged ( )
signal

Emitted when the Ollama server URL is changed.

◆ retryLatestMessage

void llm_chat::ChatBackend::retryLatestMessage ( const int index)
slot

Retries the latest message.

Parameters
index: The index of the thread in the proxy model.

◆ sendMessage

void llm_chat::ChatBackend::sendMessage ( const int index,
const QString & prompt )
slot

Sends a message to the Ollama server.

Parameters
index: The index of the thread in the proxy model.
prompt: The message to send.
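A short sketch of sending a prompt; following the sibling slots, the index is taken to be relative to the proxy model, and both the index and prompt text here are illustrative:

```cpp
#include <QString>
#include "backend.h"

void askFirstThread(llm_chat::ChatBackend &backend) {
  // Index 0 refers to the first thread in the sorted proxy model;
  // a thread must already exist at that index.
  backend.sendMessage(0, QStringLiteral("Explain RAII in one paragraph."));
}
```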

◆ setModel

void llm_chat::ChatBackend::setModel ( const QString & model)
slot

Sets the model name.

This function sets the name of the model to be used in the backend.

Parameters
model: The name of the model.

◆ setOllamaServerUrl

void llm_chat::ChatBackend::setOllamaServerUrl ( const QString & url)
slot

Sets the Ollama server URL.

Parameters
url: The Ollama server URL to set.

◆ setSystemPrompt

void llm_chat::ChatBackend::setSystemPrompt ( const QString & prompt)
slot

Sets the system prompt.

Parameters
prompt: The system prompt to set.

◆ systemPrompt()

QString llm_chat::ChatBackend::systemPrompt ( ) const
nodiscard

Get the system prompt.

Returns
The system prompt.

◆ systemPromptChanged

void llm_chat::ChatBackend::systemPromptChanged ( )
signal

Emitted when the system prompt is changed.

◆ threadList()

ThreadList * llm_chat::ChatBackend::threadList ( ) const
inlinenodiscard

Returns the chat threads.

Returns
All the chat threads.

◆ threadProxyList()

ThreadProxyList * llm_chat::ChatBackend::threadProxyList ( ) const
inlinenodiscard

Returns the sorted chat threads.

Returns
The sorted chat threads.

Property Documentation

◆ model

QString llm_chat::ChatBackend::model
readwrite

◆ modelList

QStringList llm_chat::ChatBackend::modelList
read

◆ ollamaServerUrl

QString llm_chat::ChatBackend::ollamaServerUrl
read

◆ sortedThreads

ThreadProxyList * llm_chat::ChatBackend::sortedThreads
read

◆ systemPrompt

QString llm_chat::ChatBackend::systemPrompt
readwrite

The documentation for this class was generated from the following files: