llmqore
Health — Pass
- License — MIT
- Description — Repository has a description
- Active repo — Last push 0 days ago
- Community trust — 11 GitHub stars
Code — Pass
- Code scan — Scanned 2 files during light audit; no dangerous patterns found
Permissions — Pass
- Permissions — No dangerous permissions requested
Security Assessment: Overall risk is Medium. As an LLM and MCP client/server framework, the tool inherently makes external network requests to interact with AI provider APIs and local servers. It also actively facilitates the execution of arbitrary shell commands (like running `npx`) based on JSON configuration files via its MCP client and bridge features. The automated code scan (2 files reviewed) found no dangerous static patterns or hardcoded secrets, but the dynamic nature of launching subprocesses and passing data requires careful configuration.
Quality Assessment: The project is actively maintained, with its most recent update pushed today. It is backed by the permissive MIT license, making it suitable for broad integration. Community trust is currently limited given its early stage, reflected by a modest 11 GitHub stars. The repository is well-documented and includes continuous integration testing for builds.
Verdict: Use with caution.
Qt C++ library for working with AI/LLM Providers and MCP
LLMQore
Qt/C++ library for working with LLM providers and MCP servers. Streaming chat, tool calling, and a full MCP 2025-11-25 client/server — all in one library.
LLM clients — unified streaming API across six providers:
auto *client = new LLMQore::ClaudeClient(url, apiKey, model, this);
client->ask("What is Qt?", cb);
MCP server — expose tools, resources and prompts over stdio or HTTP:
// stdio (stdin/stdout, e.g. for Claude Desktop)
auto *transport = new LLMQore::McpStdioServerTransport(&app);
// or Streamable HTTP
auto *transport = new LLMQore::McpHttpServerTransport({.port = 8080, .path = "/mcp"}, &app);
auto *server = new LLMQore::McpServer(transport, cfg, &app);
server->addTool(new MyTool(server));
server->start();
MCP client — connect to MCP servers and bind their tools into LLM clients:
// Add servers one by one
client->tools()->addMcpServer({.name = "filesystem", .command = "npx",
.arguments = {"-y", "@modelcontextprotocol/server-filesystem", "/home/user"}});
// Or load from a JSON config
client->tools()->loadMcpServers(QJsonDocument::fromJson(configData).object());
loadMcpServers accepts:
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
}
}
}
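The config-loading route above can be sketched end to end — reading the JSON from disk and handing it to `loadMcpServers`. This is a minimal sketch, not library code: the `mcp.json` path, the `loadMcpConfig` helper name, and the error handling are illustrative assumptions; only `tools()->loadMcpServers(...)` comes from the README.

```cpp
#include <QFile>
#include <QJsonDocument>
#include <QJsonObject>
#include <QJsonParseError>

// Hypothetical helper (not part of llmqore): read an mcpServers config
// file from disk and pass it to the client's tool registry.
template <typename Client>
bool loadMcpConfig(Client *client, const QString &path)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return false; // config file missing or unreadable

    QJsonParseError err;
    const QJsonDocument doc = QJsonDocument::fromJson(file.readAll(), &err);
    if (err.error != QJsonParseError::NoError || !doc.isObject())
        return false; // malformed JSON

    client->tools()->loadMcpServers(doc.object());
    return true;
}

// Usage (assuming a client constructed as shown earlier):
//   loadMcpConfig(client, "mcp.json");
```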
See Quick Start for complete examples.
MCP Bridge
A standalone CLI tool built on llmqore that aggregates multiple MCP servers (stdio or SSE) and re-exposes them either behind a single HTTP/SSE endpoint or as one stdio server — useful when the upstreams and the client disagree on transport.
mcp-bridge bridge.json # HTTP endpoint
mcp-bridge --stdio bridge.json # stdio (for Claude Desktop and friends)
Config uses the familiar mcpServers schema:
{
"port": 8808,
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user"]
}
}
}
Prebuilt binaries for Linux/macOS/Windows (with Qt runtime bundled) are published to GitHub Releases. See MCP Bridge docs for full usage, config reference, and build instructions.
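In stdio mode the bridge registers in Claude Desktop's `claude_desktop_config.json` like any other stdio server. A sketch, assuming `mcp-bridge` is on `PATH` and the bridge config lives at `/home/user/bridge.json` (both paths are assumptions):

```json
{
  "mcpServers": {
    "bridge": {
      "command": "mcp-bridge",
      "args": ["--stdio", "/home/user/bridge.json"]
    }
  }
}
```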
Supported Providers
| Provider | Client class | Streaming | Tools | Thinking |
|---|---|---|---|---|
| Anthropic Claude | ClaudeClient | ✓ | ✓ | ✓ |
| OpenAI (Chat Completions) | OpenAIClient | ✓ | ✓ | — |
| OpenAI (Responses API) | OpenAIResponsesClient | ✓ | ✓ | ✓ |
| Ollama | OllamaClient | ✓ | ✓ | ✓ |
| Google AI | GoogleAIClient | ✓ | ✓ | ✓ |
| llama.cpp | LlamaCppClient | ✓ | ✓ | ✓ |
MCP (Model Context Protocol)
Client and server implementation of the MCP 2025-11-25 spec:
- Transports: stdio, Streamable HTTP
- Server: tools, resources, resource templates, prompts, completions, sampling, elicitation
- Client: tools, resources, prompts, completions, sampling, elicitation, roots
See MCP Protocol Coverage for the full spec-conformance matrix.
Requirements
- C++20
- Qt 6.5+
- CMake 3.21+
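The Integration docs cover FetchContent setup in full; as a hedged sketch (the repository URL placeholder, the `llmqore` target name, and the `myapp` target are assumptions here), pulling the library into a CMake project might look like:

```cmake
include(FetchContent)

FetchContent_Declare(
  llmqore
  GIT_REPOSITORY https://github.com/<owner>/llmqore.git  # placeholder URL
  GIT_TAG main
)
FetchContent_MakeAvailable(llmqore)

# Link the library into your application target
target_link_libraries(myapp PRIVATE llmqore)
```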
Documentation
- Quick Start — examples for LLM clients, tools, MCP server and client
- Integration — FetchContent and installed setup
- MCP Bridge — aggregate stdio MCP servers behind one HTTP/SSE endpoint
- MCP Protocol Coverage — spec-conformance matrix
- Architecture — internals, for contributors
Support
- Report Issues: open an issue on GitHub
- Contribute: pull requests with bug fixes or new features are welcome
- Spread the Word: star the repository and share with fellow developers
- Financial Support:
  - Bitcoin (BTC): bc1qndq7f0mpnlya48vk7kugvyqj5w89xrg4wzg68t
  - Ethereum (ETH): 0xA5e8c37c94b24e25F9f1f292a01AF55F03099D8D
  - Litecoin (LTC): ltc1qlrxnk30s2pcjchzx4qrxvdjt5gzuervy5mv0vy
  - USDT (TRC20): THdZrE7d6epW6ry98GA3MLXRjha1DjKtUx
License
MIT — see LICENSE.