llm-chat 0.0.0
LLM-Chat
Name | Description
llm_chat | Main namespace of the project
llm_chat::parameters | Contains all the parameters used in the project
llm_chat::Application | The main application class
llm_chat::ChatBackend | Handles the communication with the Ollama server (sketched after this table)
llm_chat::Message | Chat message
llm_chat::Settings | Wrapper around QSettings that provides a more convenient interface for accessing and modifying application settings (sketched after this table)
llm_chat::Thread | Model for the chat interface
llm_chat::ThreadList | Model for the chat threads
llm_chat::ThreadProxyList | Sorted model for the chat threads
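
ChatBackend is the piece that talks to the Ollama server. As a rough illustration of that responsibility, and not the project's actual code, the sketch below assumes the default Ollama endpoint http://localhost:11434/api/chat and a non-streaming request; the method, signal, and member names are invented for the example.

```cpp
// Minimal sketch of a Qt class that posts a chat request to an Ollama server.
// Assumptions (not from llm-chat): default endpoint, non-streaming replies,
// and the sendMessage/replyReceived names.
#include <QJsonArray>
#include <QJsonDocument>
#include <QJsonObject>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QObject>

class ChatBackend : public QObject {
    Q_OBJECT
public:
    explicit ChatBackend(QObject *parent = nullptr) : QObject(parent) {}

    // Send the accumulated conversation to the Ollama chat endpoint.
    void sendMessage(const QString &model, const QJsonArray &messages) {
        QNetworkRequest request(QUrl(QStringLiteral("http://localhost:11434/api/chat")));
        request.setHeader(QNetworkRequest::ContentTypeHeader, QStringLiteral("application/json"));

        const QJsonObject body{{"model", model}, {"messages", messages}, {"stream", false}};

        QNetworkReply *reply = m_manager.post(request, QJsonDocument(body).toJson());
        connect(reply, &QNetworkReply::finished, this, [this, reply] {
            // A non-streaming Ollama response carries the assistant turn
            // under "message": {"role": ..., "content": ...}.
            const QJsonObject response = QJsonDocument::fromJson(reply->readAll()).object();
            emit replyReceived(response.value("message").toObject().value("content").toString());
            reply->deleteLater();
        });
    }

signals:
    void replyReceived(const QString &content);

private:
    QNetworkAccessManager m_manager;
};
```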
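
Settings is described as a wrapper around QSettings. A minimal sketch of that pattern follows: typed getters and setters in front of string keys. The server/url key and the ollamaServerUrl accessor are assumptions for illustration, not the project's real settings.

```cpp
// Minimal sketch of a typed wrapper around QSettings.
// The key name and accessor are hypothetical, not taken from llm-chat.
#include <QSettings>
#include <QString>
#include <QUrl>

class Settings {
public:
    QUrl ollamaServerUrl() const {
        // Falls back to Ollama's default local endpoint when no value is stored.
        return m_settings.value(QStringLiteral("server/url"),
                                QUrl(QStringLiteral("http://localhost:11434"))).toUrl();
    }

    void setOllamaServerUrl(const QUrl &url) {
        m_settings.setValue(QStringLiteral("server/url"), url);
    }

private:
    // Uses the organization/application name configured on QCoreApplication.
    QSettings m_settings;
};
```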