feat: make fully standalone — remove local LoRA Manager dependency

Proxy JS widget files from the remote instance via /extensions/ComfyUI-Lora-Manager/
prefix, and handle send_sync routes locally instead of falling through to the
original package. This removes the need to install ComfyUI-Lora-Manager
alongside this package.

- Add /extensions/ComfyUI-Lora-Manager/ to proxy prefixes
- Replace _SEND_SYNC_SKIP_ROUTES with local handler functions that fetch
  data from remote and broadcast events via PromptServer.send_sync()
- Add lm_remote_bootstrap.js to load Vue widget bundle from remote
- Update docstrings and README to reflect standalone operation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 01:21:45 +01:00
parent 5fffb3b0a6
commit 1ba04c6d88
4 changed files with 180 additions and 38 deletions

README.md

@@ -14,22 +14,21 @@ When ComfyUI runs on a GPU workstation and LoRA Manager runs in Docker on a NAS
 ComfyUI Workstation                   NAS (Docker)
 +--------------------------+          +------------------------+
 |  ComfyUI                 |          |  LoRA Manager          |
-|  +- ComfyUI-Lora-Manager |  HTTP API  |  +- SQLite metadata DB |
-|  |  (widgets + types)    |<---------->|  +- CivitAI sync       |
-|  +- ComfyUI-LM-Remote    |          |  +- Port 8188          |
-|  |  (this package)       |          +------------------------+
-|  +- /mnt/loras/ (NFS)    |
-+--------------------------+
+|  +- ComfyUI-LM-Remote    |  HTTP API  |  +- SQLite metadata DB |
+|  |  (this package)       |<---------->|  +- CivitAI sync       |
+|  +- /mnt/loras/ (NFS)    |          |  +- Port 8188          |
++--------------------------+          +------------------------+
     |                                     |
     +------- Shared NFS/SMB storage ------+
 ```
 ## Prerequisites
-- [ComfyUI-Lora-Manager](https://github.com/willmiao/ComfyUI-Lora-Manager) must be installed alongside -- it provides widget JS files and custom widget types
 - A running LoRA Manager instance accessible over the network (e.g., in Docker)
 - Shared storage so both machines see the same LoRA files at compatible paths
+> **Note:** The original [ComfyUI-Lora-Manager](https://github.com/willmiao/ComfyUI-Lora-Manager) package is **not required**. Widget JS files and Vue widget types are served from the remote instance via the proxy. You may still install it alongside if you want the original (non-remote) nodes available too.
 ## Installation
 Clone into your ComfyUI `custom_nodes/` directory:
@@ -103,16 +102,17 @@ All nodes appear under the **Lora Manager** category in the ComfyUI node menu, w
 An aiohttp middleware is registered at startup that intercepts requests to LoRA Manager endpoints and forwards them to the remote instance:
 **Proxied routes:**
-- `/api/lm/*` -- all REST API endpoints
+- `/api/lm/*` -- all REST API endpoints (except send_sync routes below)
+- `/extensions/ComfyUI-Lora-Manager/*` -- widget JS files and Vue widget bundle
 - `/loras_static/*`, `/locales/*`, `/example_images_static/*` -- static assets
 - `/loras`, `/checkpoints`, `/embeddings`, `/loras/recipes`, `/statistics` -- web UI pages
 - `/ws/fetch-progress`, `/ws/download-progress`, `/ws/init-progress` -- WebSocket connections
-**Not proxied** (handled locally to preserve `send_sync` event broadcasting):
-- `/api/lm/loras/get_trigger_words`
-- `/api/lm/update-lora-code`
-- `/api/lm/update-node-widget`
-- `/api/lm/register-nodes`
+**Handled locally** (events broadcast to local browser via `send_sync`):
+- `/api/lm/loras/get_trigger_words` -- fetches trigger words from remote, broadcasts `trigger_word_update`
+- `/api/lm/update-lora-code` -- broadcasts `lora_code_update`
+- `/api/lm/update-node-widget` -- broadcasts `lm_widget_update`
+- `/api/lm/register-nodes` -- no-op in remote mode
 ### Remote Metadata
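The routing precedence this README section describes (exact-match local routes first, then prefix-based proxying, then fall-through to ComfyUI's own router) can be sketched as a small path classifier. The constant and function names below are illustrative stand-ins, not the actual `proxy.py` symbols:

```python
# Illustrative sketch of the routing precedence described above.
# PROXY_PREFIXES / LOCAL_ROUTES are stand-ins, not the real module constants.
PROXY_PREFIXES = ("/api/lm/", "/extensions/ComfyUI-Lora-Manager/", "/loras_static/")
LOCAL_ROUTES = {
    "/api/lm/loras/get_trigger_words",
    "/api/lm/update-lora-code",
    "/api/lm/update-node-widget",
    "/api/lm/register-nodes",
}

def classify(path: str) -> str:
    """Exact local-handler match wins over prefix proxying."""
    if path in LOCAL_ROUTES:
        return "local"        # handled locally, events via send_sync
    if path.startswith(PROXY_PREFIXES):
        return "proxy"        # forwarded to the remote instance
    return "fallthrough"      # regular ComfyUI router
```

Checking exact routes before prefixes matters because every local route also matches the `/api/lm/` proxy prefix.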

__init__.py

@@ -3,13 +3,15 @@ ComfyUI-LM-Remote — Remote LoRA Manager integration for ComfyUI.
 Provides:
 1. A reverse-proxy middleware that forwards all LoRA Manager API/UI/WS
-   requests to a remote Docker instance.
+   requests to a remote Docker instance (including JS widget files).
 2. Remote-aware node classes that fetch metadata via HTTP instead of the
    local ServiceRegistry, while still loading LoRA files from local
    NFS/SMB-mounted paths.
+3. Local send_sync handlers so widget events (trigger words, lora code
+   updates) are broadcast to the local ComfyUI frontend.
 
-Requires the original ComfyUI-Lora-Manager package to be installed alongside
-for its widget JS files and custom widget types.
+Does NOT require the original ComfyUI-Lora-Manager package to be installed.
+Widget JS files and Vue widget types are served from the remote instance.
 """
 from __future__ import annotations
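Point 2 of the docstring (metadata over HTTP, weights from local storage) reduces to mapping a name from the remote catalog onto the shared mount. A minimal sketch, assuming a `/mnt/loras` mount root and `.safetensors` files; both the function name and the layout are illustrative, not this package's actual API:

```python
def resolve_local_path(lora_name: str, mount_root: str = "/mnt/loras") -> str:
    """Map a LoRA name served by the remote catalog onto the shared
    NFS/SMB mount where the actual weight file lives (hypothetical helper)."""
    return f"{mount_root}/{lora_name}.safetensors"
```

The real node classes resolve paths against their configured LoRA roots; the point is only that metadata lookup and file access use two different channels.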

proxy.py

@@ -5,9 +5,8 @@ Registered as an aiohttp middleware on PromptServer.instance.app. It intercepts
 requests matching known LoRA Manager URL prefixes and proxies them to the remote
 Docker instance. Non-matching requests fall through to the regular ComfyUI router.
-Routes that use ``PromptServer.instance.send_sync()`` are explicitly excluded
-from proxying so the local original LoRA Manager handler can broadcast events
-to the local ComfyUI frontend.
+Routes that use ``send_sync`` are handled locally so that events are broadcast
+to the local ComfyUI frontend (the remote instance has no connected browsers).
 """
 from __future__ import annotations
@@ -18,6 +17,7 @@ import aiohttp
 from aiohttp import web, WSMsgType
 from .config import remote_config
+from .remote_client import RemoteLoraClient
 
 logger = logging.getLogger(__name__)
@@ -29,6 +29,7 @@ _PROXY_PREFIXES = (
"/loras_static/", "/loras_static/",
"/locales/", "/locales/",
"/example_images_static/", "/example_images_static/",
"/extensions/ComfyUI-Lora-Manager/",
) )
# Page routes served by the standalone LoRA Manager web UI # Page routes served by the standalone LoRA Manager web UI
@@ -47,21 +48,134 @@ _WS_ROUTES = {
"/ws/init-progress", "/ws/init-progress",
} }
# Routes that call send_sync on the remote side — these are NOT proxied. # ---------------------------------------------------------------------------
# Instead they fall through to the local original LoRA Manager handler, # Local handlers for routes that need send_sync (event broadcasting)
# which broadcasts events to the local ComfyUI frontend. The remote # ---------------------------------------------------------------------------
# handler would broadcast to its own (empty) frontend, which is useless. # These routes are NOT proxied. They are handled locally so that events
# # reach the local ComfyUI frontend via PromptServer.send_sync().
# These routes:
# /api/lm/loras/get_trigger_words -> trigger_word_update event
# /api/lm/update-lora-code -> lora_code_update event def _get_prompt_server():
# /api/lm/update-node-widget -> lm_widget_update event """Lazily import PromptServer to avoid circular imports at module level."""
# /api/lm/register-nodes -> lora_registry_refresh event from server import PromptServer # type: ignore
_SEND_SYNC_SKIP_ROUTES = { return PromptServer.instance
"/api/lm/loras/get_trigger_words",
"/api/lm/update-lora-code",
"/api/lm/update-node-widget", def _parse_node_id(entry):
"/api/lm/register-nodes", """Parse a node ID entry that can be int, string, or dict.
Returns (parsed_id, graph_id_or_None).
"""
node_identifier = entry
graph_identifier = None
if isinstance(entry, dict):
node_identifier = entry.get("node_id")
graph_identifier = entry.get("graph_id")
try:
parsed_id = int(node_identifier)
except (TypeError, ValueError):
parsed_id = node_identifier
return parsed_id, graph_identifier
async def _handle_get_trigger_words(request: web.Request) -> web.Response:
"""Fetch trigger words from remote and broadcast via send_sync."""
try:
data = await request.json()
lora_names = data.get("lora_names", [])
node_ids = data.get("node_ids", [])
client = RemoteLoraClient.get_instance()
server = _get_prompt_server()
# Collect trigger words for ALL loras into a single combined list,
# then broadcast the same combined text to ALL node_ids.
all_trigger_words = []
for lora_name in lora_names:
_, trigger_words = await client.get_lora_info(lora_name)
all_trigger_words.extend(trigger_words)
trigger_words_text = ",, ".join(all_trigger_words) if all_trigger_words else ""
for entry in node_ids:
parsed_id, graph_id = _parse_node_id(entry)
payload = {"id": parsed_id, "message": trigger_words_text}
if graph_id is not None:
payload["graph_id"] = str(graph_id)
server.send_sync("trigger_word_update", payload)
return web.json_response({"success": True})
except Exception as exc:
logger.error("[LM-Remote] Error getting trigger words: %s", exc)
return web.json_response(
{"success": False, "error": str(exc)}, status=500
)
async def _handle_update_lora_code(request: web.Request) -> web.Response:
"""Parse lora code update and broadcast via send_sync."""
data = await request.json()
node_ids = data.get("node_ids")
lora_code = data.get("lora_code", "")
mode = data.get("mode", "append")
server = _get_prompt_server()
if node_ids is None:
# Broadcast to all nodes
server.send_sync(
"lora_code_update",
{"id": -1, "lora_code": lora_code, "mode": mode},
)
else:
for entry in node_ids:
parsed_id, graph_id = _parse_node_id(entry)
payload = {"id": parsed_id, "lora_code": lora_code, "mode": mode}
if graph_id is not None:
payload["graph_id"] = str(graph_id)
server.send_sync("lora_code_update", payload)
return web.json_response({"success": True})
async def _handle_update_node_widget(request: web.Request) -> web.Response:
"""Parse widget update and broadcast via send_sync."""
data = await request.json()
widget_name = data.get("widget_name")
value = data.get("value")
node_ids = data.get("node_ids")
if not widget_name or value is None or not node_ids:
return web.json_response(
{"error": "widget_name, value, and node_ids are required"},
status=400,
)
server = _get_prompt_server()
for entry in node_ids:
parsed_id, graph_id = _parse_node_id(entry)
payload = {"id": parsed_id, "widget_name": widget_name, "value": value}
if graph_id is not None:
payload["graph_id"] = str(graph_id)
server.send_sync("lm_widget_update", payload)
return web.json_response({"success": True})
async def _handle_register_nodes(request: web.Request) -> web.Response:
"""No-op handler — node registration is not needed in remote mode."""
return web.json_response({"success": True, "message": "No-op in remote mode"})
# Dispatch table for send_sync routes
_SEND_SYNC_HANDLERS = {
"/api/lm/loras/get_trigger_words": _handle_get_trigger_words,
"/api/lm/update-lora-code": _handle_update_lora_code,
"/api/lm/update-node-widget": _handle_update_node_widget,
"/api/lm/register-nodes": _handle_register_nodes,
} }
# Shared HTTP session for proxied requests (connection pooling) # Shared HTTP session for proxied requests (connection pooling)
@@ -206,10 +320,11 @@ async def lm_remote_proxy_middleware(request: web.Request, handler):
     path = request.path
-    # Routes that use send_sync must NOT be proxied — let the local
-    # original LoRA Manager handle them so events reach the local browser.
-    if path in _SEND_SYNC_SKIP_ROUTES:
-        return await handler(request)
+    # Routes that need send_sync are handled locally so events reach
+    # the local browser (the remote instance has no connected browsers).
+    local_handler = _SEND_SYNC_HANDLERS.get(path)
+    if local_handler is not None:
+        return await local_handler(request)
 
     # WebSocket routes
     if _is_ws_route(path):
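The `_parse_node_id` helper in the proxy.py hunk above normalizes the three node-reference shapes the endpoints accept. A standalone copy of the same logic, runnable outside aiohttp for quick verification:

```python
def parse_node_id(entry):
    """Same normalization as proxy.py's _parse_node_id: entry may be an
    int, a numeric string, a non-numeric string, or a dict carrying
    node_id/graph_id keys."""
    node_identifier, graph_identifier = entry, None
    if isinstance(entry, dict):
        node_identifier = entry.get("node_id")
        graph_identifier = entry.get("graph_id")
    try:
        parsed_id = int(node_identifier)
    except (TypeError, ValueError):
        parsed_id = node_identifier
    return parsed_id, graph_identifier
```

Numeric strings collapse to ints so the frontend can compare ids directly; anything unparseable passes through unchanged rather than raising.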

lm_remote_bootstrap.js

@@ -0,0 +1,25 @@
+/**
+ * Bootstrap loader for LoRA Manager Vue widget bundle.
+ *
+ * When the original ComfyUI-Lora-Manager package is NOT installed locally,
+ * the Vue widget types (AUTOCOMPLETE_TEXT_LORAS, LORAS, LORA_POOL_CONFIG,
+ * RANDOMIZER_CONFIG, CYCLER_CONFIG) would never be registered and nodes
+ * wouldn't render.
+ *
+ * This script loads the Vue widget bundle from the remote instance via the
+ * proxy at /extensions/ComfyUI-Lora-Manager/vue-widgets/. If the original
+ * package IS installed, the bundle is already loaded and we skip the import.
+ */
+import { app } from "../../scripts/app.js";
+
+const alreadyLoaded = app.extensions?.some(
+    ext => ext.name === "LoraManager.VueWidgets"
+);
+
+if (!alreadyLoaded) {
+    try {
+        await import("/extensions/ComfyUI-Lora-Manager/vue-widgets/lora-manager-widgets.js");
+    } catch (err) {
+        console.warn("[LM-Remote] Failed to load Vue widget bundle:", err);
+    }
+}