21 Commits

b042fe4368 Fix 25+ bugs across rounds 4-8 of comprehensive code review
history_tree.py:
- Cycle protection in generate_graph() parent walk
- KeyError → .get() for malformed node data in commit() and generate_graph()
- UUID collision check with for/else raise in commit() and _migrate_legacy()
- RuntimeError → ValueError for consistent exception handling

tab_timeline_ng.py:
- Re-parent children walks to surviving ancestor for batch deletes
- Branch tip deletion re-points to parent instead of removing branch
- Cycle protection in _walk_branch_nodes and _find_branch_for_node
- Full data.clear() restore instead of merge in _restore_node
- Safe .get('data', {}) in restore and preview
- Reset stale branch selection after node deletion
- json.dumps for safe JS string escaping in graphviz renderer

tab_batch_ng.py:
- NaN/inf rejection in dict_number with math.isfinite()
- _safe_int used in recalc_vace, update_mode_label, frame_to_skip
- Uncaught ValueError from htree.commit() caught with user notification

tab_comfy_ng.py:
- asyncio.get_event_loop() → get_running_loop()

utils.py:
- Atomic writes for save_config and save_snippets
- save_config extra_data can't override explicit last_dir/favorites

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:23:14 +01:00
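The "UUID collision check with for/else raise" above relies on Python's for/else: the else branch runs only when the loop exhausts without hitting break. A minimal sketch of the pattern (the helper name and retry count are illustrative, not the project's exact code):

```python
import uuid

def generate_unique_id(existing: set, attempts: int = 10) -> str:
    # Retry on collision; the else clause fires only if every attempt collided.
    for _ in range(attempts):
        candidate = str(uuid.uuid4())[:8]
        if candidate not in existing:
            break
    else:
        raise ValueError(f"Failed to generate unique node ID after {attempts} attempts")
    return candidate
```

Truncating a UUID4 to 8 hex characters makes collisions rare but possible, which is why the bounded retry plus explicit raise is safer than assuming uniqueness.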
04b9ed0e27 Fix 4 bugs: SQL conflict handling, HTML escaping, backup cap, safe int cast
- sync_to_db: use ON CONFLICT for duplicate sequence numbers
- history_tree: html.escape() for Graphviz DOT labels
- tab_timeline_ng: cap history_tree_backup to 10 entries
- tab_batch_ng: add _safe_int() helper for VACE settings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 21:03:13 +01:00
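The ON CONFLICT fix for duplicate sequence numbers can be illustrated with a minimal SQLite upsert. The table and columns below are hypothetical stand-ins, not the project's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sequences (project TEXT, seq INTEGER, prompt TEXT, "
    "UNIQUE(project, seq))"
)
upsert = (
    "INSERT INTO sequences (project, seq, prompt) VALUES (?, ?, ?) "
    "ON CONFLICT(project, seq) DO UPDATE SET prompt = excluded.prompt"
)
conn.execute(upsert, ("demo", 1, "first"))
# Same (project, seq) pair: the row is updated in place
# instead of raising sqlite3.IntegrityError.
conn.execute(upsert, ("demo", 1, "second"))
row = conn.execute(
    "SELECT prompt FROM sequences WHERE project = ? AND seq = ?", ("demo", 1)
).fetchone()
```

ON CONFLICT ... DO UPDATE requires SQLite 3.24+, which ships with all currently supported CPython releases.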
1b8d13f7c4 Fix 7 bugs: bounds checks, deepcopy, time import, JS keys, unused import
- Add bounds check on src_batch index in add_from_source and copy_source
- Guard delete callback against stale index after rapid clicks
- Replace __import__('time').time() with time.time() in sync_to_db
- Use deepcopy(DEFAULTS) consistently in utils.py and main.py
- Use JSON.stringify in JS onConfigure fallback path for key storage
- Read state.show_comfy_monitor for checkbox initial value
- Remove unused KEY_BATCH_DATA import from tab_projects_ng

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:41:00 +01:00
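The switch to deepcopy(DEFAULTS) matters whenever DEFAULTS contains nested mutables: a shallow .copy() shares the inner dicts, so editing one item silently corrupts the shared template. A sketch with an illustrative DEFAULTS (the real one is defined elsewhere in the project):

```python
import copy

# Illustrative defaults with a nested dict.
DEFAULTS = {"prompt": "", "vace": {"frame_to_skip": 81}}

shallow = DEFAULTS.copy()
shallow["vace"]["frame_to_skip"] = 5        # also mutates DEFAULTS["vace"]!
corrupted = DEFAULTS["vace"]["frame_to_skip"]

DEFAULTS["vace"]["frame_to_skip"] = 81      # restore the template
deep = copy.deepcopy(DEFAULTS)
deep["vace"]["frame_to_skip"] = 5           # independent nested dict
intact = DEFAULTS["vace"]["frame_to_skip"]
```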
497e6b06fb Fix 7 bugs: async proxies, mode default, JS key serialization, validation
- Use asyncio.to_thread for proxy endpoints to avoid blocking event loop
- Add mode to DEFAULTS so it doesn't silently insert 0
- Use JSON serialization for keys in project_dynamic.js (with comma fallback)
- Validate path exists in change_path, friendly error on duplicate rename
- Remove unused exp param from rename closure
- Use deepcopy for DEFAULTS consistently

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:29:24 +01:00
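asyncio.to_thread (Python 3.9+) runs a blocking callable on a worker thread and awaits its result, so synchronous HTTP fetches no longer stall the event loop. A minimal sketch with a stand-in for the blocking fetch (the sleep simulates network I/O; the URL is illustrative):

```python
import asyncio
import time

def fetch_json_blocking(url: str) -> dict:
    # Stand-in for a blocking urllib-style fetch.
    time.sleep(0.05)
    return {"url": url, "ok": True}

async def proxy_handler() -> dict:
    # While the worker thread sleeps, the event loop stays free
    # to serve other requests.
    return await asyncio.to_thread(
        fetch_json_blocking, "http://localhost:8080/api/projects"
    )

result = asyncio.run(proxy_handler())
```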
993fc86070 Auto-detect change notes when saving without a manual note
Compares current batch data against the last snapshot to generate
descriptive notes like "Added seq 3; Changed: prompt, seed".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:24:07 +01:00
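A note like "Added seq 3; Changed: prompt, seed" could be produced by diffing snapshots keyed on sequence number, along these lines (a sketch, not the project's actual implementation):

```python
def describe_changes(old: list, new: list) -> str:
    """Compare two batch snapshots and build a human-readable change note."""
    old_by_seq = {item.get("sequence_number"): item for item in old}
    new_by_seq = {item.get("sequence_number"): item for item in new}
    parts = []
    added = sorted(set(new_by_seq) - set(old_by_seq))
    if added:
        parts.append("Added seq " + ", ".join(str(s) for s in added))
    changed_keys = set()
    for seq in set(old_by_seq) & set(new_by_seq):
        for key, val in new_by_seq[seq].items():
            if old_by_seq[seq].get(key) != val:
                changed_keys.add(key)
    if changed_keys:
        parts.append("Changed: " + ", ".join(sorted(changed_keys)))
    return "; ".join(parts) or "Snapshot"
```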
c9bcc735f4 Change refresh input from INT to combo dropdown (off/on)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 14:29:30 +01:00
dc8f44f02b Add refresh input to ProjectLoaderDynamic to force re-fetch
Changing the refresh value triggers the node to re-fetch keys from
the API, picking up any new or modified fields in the data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 14:27:51 +01:00
2a6b4f5245 Add mode integer field above sequence number in batch cards
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 01:07:09 +01:00
60d1162700 Fix async callbacks: make rename/change_path directly async
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:42:00 +01:00
204fc4ea85 Add rename for sequences and projects, per-sub color cycling, project path editing
- Sequences: add rename button with name shown in expansion header
- Subsequences: cycle through 6 distinct border colors by sub-index
- Projects: add rename and change path buttons with DB methods

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:40:35 +01:00
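Cycling through 6 distinct border colors reduces to indexing the palette modulo its length. The hex values below match the accent colors this branch adds to main.py's CSS; the helper name is illustrative:

```python
# Six accent colors, one per subsequence index, cycling.
SUB_COLORS = ["#06B6D4", "#A78BFA", "#34D399", "#F472B6", "#FBBF24", "#FB923C"]

def sub_color(sub_index: int) -> str:
    # The 7th subsequence reuses the 1st color, and so on.
    return SUB_COLORS[sub_index % len(SUB_COLORS)]
```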
033b3415c2 Merge branch 'feature/sqlite-project-db' 2026-03-01 00:32:07 +01:00
2ccc3821d6 Add visual distinction for subsequence cards with teal border
The subsegment-card CSS class was not being applied to subsequence
expansion items. Add the class conditionally and include the teal
accent CSS rules with a 6px left border.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:31:00 +01:00
187b85b054 Clean up: remove unnecessary info.outputs logic, set label on reused slots
The actual fix was setting slot.label alongside slot.name. Reverted
onConfigure to read from widget values (which work correctly) and
ensured label is set on both new and reused output slots.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:12:30 +01:00
a0d8cb8bbf Fix: set output label alongside name for LiteGraph rendering
LiteGraph renders slot.label over slot.name — we were updating name
but the display uses label.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:10:50 +01:00
d55b3198e8 Fix output names not surviving page refresh in ProjectLoaderDynamic
Read output names from info.outputs (serialized node data) instead of
hidden widget values, which ComfyUI may not persist across reloads.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 23:43:34 +01:00
bf2fca53e0 Remove JSONLoaderDynamic, handled by ComfyUI-JSON-Dynamic extension
The separate ComfyUI-JSON-Dynamic extension provides the same node.
Removes json_loader.py, web/json_dynamic.js, and their tests. Only
ProjectLoaderDynamic remains in this extension.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 23:18:39 +01:00
5b71d1b276 Fix output name persistence: use comma-separated like reference impl
JSON.stringify format for hidden widget values didn't survive ComfyUI's
serialization round-trip. Switch to comma-separated strings matching
the proven ComfyUI-JSON-Dynamic implementation. Remove properties-based
approach in favor of the simpler, working pattern.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 23:11:04 +01:00
027ef8e78a Fix ProjectLoaderDynamic output names lost on page reload
Hidden widget values for output_keys/output_types were not reliably
restored by ComfyUI on workflow reload. Store keys/types in
node.properties (always persisted by LiteGraph) as primary storage,
with hidden widgets as fallback.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 23:04:14 +01:00
86693f608a Remove 9 redundant JSON loader nodes, keep only JSONLoaderDynamic
JSONLoaderDynamic auto-discovers keys at runtime, making the hardcoded
Standard, Batch, and Custom nodes unnecessary.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 22:36:33 +01:00
615755ba44 Revert "Remove 9 redundant JSON loader nodes, keep only JSONLoaderDynamic"
This reverts commit 4b09491242.
2026-02-28 22:36:25 +01:00
4b09491242 Remove 9 redundant JSON loader nodes, keep only JSONLoaderDynamic
JSONLoaderDynamic auto-discovers keys at runtime, making the hardcoded
Standard, Batch, and Custom nodes unnecessary.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 22:33:51 +01:00
14 changed files with 317 additions and 809 deletions

__init__.py

@@ -1,8 +1,7 @@
-from .json_loader import NODE_CLASS_MAPPINGS, NODE_DISPLAY_NAME_MAPPINGS
 from .project_loader import PROJECT_NODE_CLASS_MAPPINGS, PROJECT_NODE_DISPLAY_NAME_MAPPINGS
-NODE_CLASS_MAPPINGS.update(PROJECT_NODE_CLASS_MAPPINGS)
-NODE_DISPLAY_NAME_MAPPINGS.update(PROJECT_NODE_DISPLAY_NAME_MAPPINGS)
+NODE_CLASS_MAPPINGS = PROJECT_NODE_CLASS_MAPPINGS
+NODE_DISPLAY_NAME_MAPPINGS = PROJECT_NODE_DISPLAY_NAME_MAPPINGS
 WEB_DIRECTORY = "./web"

db.py

@@ -98,6 +98,24 @@ class ProjectDB:
         ).fetchone()
         return dict(row) if row else None

+    def rename_project(self, old_name: str, new_name: str) -> bool:
+        now = time.time()
+        cur = self.conn.execute(
+            "UPDATE projects SET name = ?, updated_at = ? WHERE name = ?",
+            (new_name, now, old_name),
+        )
+        self.conn.commit()
+        return cur.rowcount > 0
+
+    def update_project_path(self, name: str, folder_path: str) -> bool:
+        now = time.time()
+        cur = self.conn.execute(
+            "UPDATE projects SET folder_path = ?, updated_at = ? WHERE name = ?",
+            (folder_path, now, name),
+        )
+        self.conn.commit()
+        return cur.rowcount > 0
+
     def delete_project(self, name: str) -> bool:
         cur = self.conn.execute("DELETE FROM projects WHERE name = ?", (name,))
         self.conn.commit()

history_tree.py

@@ -1,3 +1,4 @@
+import html
 import time
 import uuid
 from typing import Any
@@ -17,7 +18,10 @@ class HistoryTree:
     def _migrate_legacy(self, old_list: list[dict[str, Any]]) -> None:
         parent = None
         for item in reversed(old_list):
+            for _ in range(10):
-            node_id = str(uuid.uuid4())[:8]
+                node_id = str(uuid.uuid4())[:8]
+                if node_id not in self.nodes:
+                    break
             self.nodes[node_id] = {
                 "id": node_id, "parent": parent, "timestamp": time.time(),
                 "data": item, "note": item.get("note", "Legacy Import")
@@ -27,7 +31,13 @@
         self.head_id = parent

     def commit(self, data: dict[str, Any], note: str = "Snapshot") -> str:
+        # Generate unique node ID with collision check
+        for _ in range(10):
-        new_id = str(uuid.uuid4())[:8]
+            new_id = str(uuid.uuid4())[:8]
+            if new_id not in self.nodes:
+                break
+        else:
+            raise ValueError("Failed to generate unique node ID after 10 attempts")
         # Cycle detection: walk parent chain from head to verify no cycle
         if self.head_id:
@@ -38,7 +48,7 @@
                     raise ValueError(f"Cycle detected in history tree at node {current}")
                 visited.add(current)
                 node = self.nodes.get(current)
-                current = node["parent"] if node else None
+                current = node.get("parent") if node else None

         active_branch = None
         for b_name, tip_id in self.branches.items():
@@ -114,8 +124,12 @@
         # Build reverse lookup: node_id -> branch name (walk each branch ancestry)
         node_to_branch: dict[str, str] = {}
         for b_name, tip_id in self.branches.items():
+            visited = set()
             current = tip_id
             while current and current in self.nodes:
+                if current in visited:
+                    break
+                visited.add(current)
                 if current not in node_to_branch:
                     node_to_branch[current] = b_name
                 current = self.nodes[current].get('parent')
@@ -154,13 +168,14 @@
             full_note = n.get('note', 'Step')
             display_note = (full_note[:max_note_len] + '..') if len(full_note) > max_note_len else full_note
+            display_note = html.escape(display_note)
             ts = time.strftime('%b %d %H:%M', time.localtime(n['timestamp']))

             # Branch label for tip nodes
             branch_label = ""
             if nid in tip_to_branches:
-                branch_label = ", ".join(tip_to_branches[nid])
+                branch_label = html.escape(", ".join(tip_to_branches[nid]))

             # COLORS — per-branch tint, overridden for HEAD and tips
             b_name = node_to_branch.get(nid)
@@ -190,11 +205,18 @@
                 + '</TABLE>>'
             )

-            safe_tooltip = full_note.replace('"', "'")
+            safe_tooltip = (full_note
+                            .replace('\\', '\\\\')
+                            .replace('"', '\\"')
+                            .replace('\n', ' ')
+                            .replace('\r', '')
+                            .replace(']', '&#93;'))
+            safe_nid = nid.replace('"', '_')
-            dot.append(f' "{nid}" [label={label}, tooltip="{safe_tooltip}"];')
+            dot.append(f' "{safe_nid}" [label={label}, tooltip="{safe_tooltip}"];')

-            if n["parent"] and n["parent"] in self.nodes:
-                dot.append(f' "{n["parent"]}" -> "{nid}";')
+            if n.get("parent") and n["parent"] in self.nodes:
+                safe_parent = n["parent"].replace('"', '_')
+                dot.append(f' "{safe_parent}" -> "{safe_nid}";')
         dot.append("}")
         return "\n".join(dot)

json_loader.py (deleted)

@@ -1,386 +0,0 @@
import json
import os
import logging
from typing import Any
logger = logging.getLogger(__name__)
KEY_BATCH_DATA = "batch_data"
MAX_DYNAMIC_OUTPUTS = 32
class AnyType(str):
"""Universal connector type that matches any ComfyUI type."""
def __ne__(self, __value: object) -> bool:
return False
any_type = AnyType("*")
try:
from server import PromptServer
from aiohttp import web
except ImportError:
PromptServer = None
def to_float(val: Any) -> float:
try:
return float(val)
except (ValueError, TypeError):
return 0.0
def to_int(val: Any) -> int:
try:
return int(float(val))
except (ValueError, TypeError):
return 0
def get_batch_item(data: dict[str, Any], sequence_number: int) -> dict[str, Any]:
"""Resolve batch item by sequence_number field, falling back to array index."""
if KEY_BATCH_DATA in data and isinstance(data[KEY_BATCH_DATA], list) and len(data[KEY_BATCH_DATA]) > 0:
# Search by sequence_number field first
for item in data[KEY_BATCH_DATA]:
if int(item.get("sequence_number", 0)) == sequence_number:
return item
# Fallback to array index
idx = max(0, min(sequence_number - 1, len(data[KEY_BATCH_DATA]) - 1))
logger.warning(f"No item with sequence_number={sequence_number}, falling back to index {idx}")
return data[KEY_BATCH_DATA][idx]
return data
# --- Shared Helper ---
def read_json_data(json_path: str) -> dict[str, Any]:
if not os.path.exists(json_path):
logger.warning(f"File not found at {json_path}")
return {}
try:
with open(json_path, 'r') as f:
data = json.load(f)
except (json.JSONDecodeError, IOError) as e:
logger.warning(f"Error reading {json_path}: {e}")
return {}
if not isinstance(data, dict):
logger.warning(f"Expected dict from {json_path}, got {type(data).__name__}")
return {}
return data
# --- API Route ---
if PromptServer is not None:
@PromptServer.instance.routes.get("/json_manager/get_keys")
async def get_keys_route(request):
json_path = request.query.get("path", "")
try:
seq = int(request.query.get("sequence_number", "1"))
except (ValueError, TypeError):
seq = 1
data = read_json_data(json_path)
if not data:
return web.json_response({"keys": [], "types": [], "error": "file_not_found"})
target = get_batch_item(data, seq)
keys = []
types = []
if isinstance(target, dict):
for k, v in target.items():
keys.append(k)
if isinstance(v, bool):
types.append("STRING")
elif isinstance(v, int):
types.append("INT")
elif isinstance(v, float):
types.append("FLOAT")
else:
types.append("STRING")
return web.json_response({"keys": keys, "types": types})
# ==========================================
# 0. DYNAMIC NODE
# ==========================================
class JSONLoaderDynamic:
@classmethod
def INPUT_TYPES(s):
return {
"required": {
"json_path": ("STRING", {"default": "", "multiline": False}),
"sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
},
"optional": {
"output_keys": ("STRING", {"default": ""}),
"output_types": ("STRING", {"default": ""}),
},
}
RETURN_TYPES = tuple(any_type for _ in range(MAX_DYNAMIC_OUTPUTS))
RETURN_NAMES = tuple(f"output_{i}" for i in range(MAX_DYNAMIC_OUTPUTS))
FUNCTION = "load_dynamic"
CATEGORY = "utils/json"
OUTPUT_NODE = False
def load_dynamic(self, json_path, sequence_number, output_keys="", output_types=""):
data = read_json_data(json_path)
target = get_batch_item(data, sequence_number)
keys = [k.strip() for k in output_keys.split(",") if k.strip()] if output_keys else []
results = []
for key in keys:
val = target.get(key, "")
if isinstance(val, bool):
results.append(str(val).lower())
elif isinstance(val, int):
results.append(val)
elif isinstance(val, float):
results.append(val)
else:
results.append(str(val))
# Pad to MAX_DYNAMIC_OUTPUTS
while len(results) < MAX_DYNAMIC_OUTPUTS:
results.append("")
return tuple(results)
# ==========================================
# 1. STANDARD NODES (Single File)
# ==========================================
class JSONLoaderLoRA:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "STRING")
RETURN_NAMES = ("lora_1_high", "lora_1_low", "lora_2_high", "lora_2_low", "lora_3_high", "lora_3_low")
FUNCTION = "load_loras"
CATEGORY = "utils/json"
def load_loras(self, json_path):
data = read_json_data(json_path)
return (
str(data.get("lora 1 high", "")), str(data.get("lora 1 low", "")),
str(data.get("lora 2 high", "")), str(data.get("lora 2 low", "")),
str(data.get("lora 3 high", "")), str(data.get("lora 3 low", ""))
)
class JSONLoaderStandard:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "FLOAT", "INT", "STRING", "STRING", "STRING")
RETURN_NAMES = ("general_prompt", "general_negative", "current_prompt", "negative", "camera", "flf", "seed", "video_file_path", "reference_image_path", "flf_image_path")
FUNCTION = "load_standard"
CATEGORY = "utils/json"
def load_standard(self, json_path):
data = read_json_data(json_path)
return (
str(data.get("general_prompt", "")), str(data.get("general_negative", "")),
str(data.get("current_prompt", "")), str(data.get("negative", "")),
str(data.get("camera", "")), to_float(data.get("flf", 0.0)),
to_int(data.get("seed", 0)), str(data.get("video file path", "")),
str(data.get("reference image path", "")), str(data.get("flf image path", ""))
)
class JSONLoaderVACE:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "FLOAT", "INT", "INT", "INT", "INT", "STRING", "INT", "INT", "STRING", "STRING")
RETURN_NAMES = ("general_prompt", "general_negative", "current_prompt", "negative", "camera", "flf", "seed", "frame_to_skip", "input_a_frames", "input_b_frames", "reference_path", "reference_switch", "vace_schedule", "video_file_path", "reference_image_path")
FUNCTION = "load_vace"
CATEGORY = "utils/json"
def load_vace(self, json_path):
data = read_json_data(json_path)
return (
str(data.get("general_prompt", "")), str(data.get("general_negative", "")),
str(data.get("current_prompt", "")), str(data.get("negative", "")),
str(data.get("camera", "")), to_float(data.get("flf", 0.0)),
to_int(data.get("seed", 0)),
to_int(data.get("frame_to_skip", 81)), to_int(data.get("input_a_frames", 16)),
to_int(data.get("input_b_frames", 16)), str(data.get("reference path", "")),
to_int(data.get("reference switch", 1)), to_int(data.get("vace schedule", 1)),
str(data.get("video file path", "")), str(data.get("reference image path", ""))
)
# ==========================================
# 2. BATCH NODES
# ==========================================
class JSONLoaderBatchLoRA:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False}), "sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "STRING")
RETURN_NAMES = ("lora_1_high", "lora_1_low", "lora_2_high", "lora_2_low", "lora_3_high", "lora_3_low")
FUNCTION = "load_batch_loras"
CATEGORY = "utils/json"
def load_batch_loras(self, json_path, sequence_number):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (
str(target_data.get("lora 1 high", "")), str(target_data.get("lora 1 low", "")),
str(target_data.get("lora 2 high", "")), str(target_data.get("lora 2 low", "")),
str(target_data.get("lora 3 high", "")), str(target_data.get("lora 3 low", ""))
)
class JSONLoaderBatchI2V:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False}), "sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "FLOAT", "INT", "STRING", "STRING", "STRING")
RETURN_NAMES = ("general_prompt", "general_negative", "current_prompt", "negative", "camera", "flf", "seed", "video_file_path", "reference_image_path", "flf_image_path")
FUNCTION = "load_batch_i2v"
CATEGORY = "utils/json"
def load_batch_i2v(self, json_path, sequence_number):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (
str(target_data.get("general_prompt", "")), str(target_data.get("general_negative", "")),
str(target_data.get("current_prompt", "")), str(target_data.get("negative", "")),
str(target_data.get("camera", "")), to_float(target_data.get("flf", 0.0)),
to_int(target_data.get("seed", 0)), str(target_data.get("video file path", "")),
str(target_data.get("reference image path", "")), str(target_data.get("flf image path", ""))
)
class JSONLoaderBatchVACE:
@classmethod
def INPUT_TYPES(s):
return {"required": {"json_path": ("STRING", {"default": "", "multiline": False}), "sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999})}}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "FLOAT", "INT", "INT", "INT", "INT", "STRING", "INT", "INT", "STRING", "STRING")
RETURN_NAMES = ("general_prompt", "general_negative", "current_prompt", "negative", "camera", "flf", "seed", "frame_to_skip", "input_a_frames", "input_b_frames", "reference_path", "reference_switch", "vace_schedule", "video_file_path", "reference_image_path")
FUNCTION = "load_batch_vace"
CATEGORY = "utils/json"
def load_batch_vace(self, json_path, sequence_number):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (
str(target_data.get("general_prompt", "")), str(target_data.get("general_negative", "")),
str(target_data.get("current_prompt", "")), str(target_data.get("negative", "")),
str(target_data.get("camera", "")), to_float(target_data.get("flf", 0.0)),
to_int(target_data.get("seed", 0)), to_int(target_data.get("frame_to_skip", 81)),
to_int(target_data.get("input_a_frames", 16)), to_int(target_data.get("input_b_frames", 16)),
str(target_data.get("reference path", "")), to_int(target_data.get("reference switch", 1)),
to_int(target_data.get("vace schedule", 1)), str(target_data.get("video file path", "")),
str(target_data.get("reference image path", ""))
)
# ==========================================
# 3. UNIVERSAL CUSTOM NODES (1, 3, 6 Slots)
# ==========================================
class JSONLoaderCustom1:
@classmethod
def INPUT_TYPES(s):
return {
"required": {
"json_path": ("STRING", {"default": "", "multiline": False}),
"sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
},
"optional": { "key_1": ("STRING", {"default": "", "multiline": False}) }
}
RETURN_TYPES = ("STRING",)
RETURN_NAMES = ("val_1",)
FUNCTION = "load_custom"
CATEGORY = "utils/json"
def load_custom(self, json_path, sequence_number, key_1=""):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (str(target_data.get(key_1, "")),)
class JSONLoaderCustom3:
@classmethod
def INPUT_TYPES(s):
return {
"required": {
"json_path": ("STRING", {"default": "", "multiline": False}),
"sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
},
"optional": {
"key_1": ("STRING", {"default": "", "multiline": False}),
"key_2": ("STRING", {"default": "", "multiline": False}),
"key_3": ("STRING", {"default": "", "multiline": False})
}
}
RETURN_TYPES = ("STRING", "STRING", "STRING")
RETURN_NAMES = ("val_1", "val_2", "val_3")
FUNCTION = "load_custom"
CATEGORY = "utils/json"
def load_custom(self, json_path, sequence_number, key_1="", key_2="", key_3=""):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (
str(target_data.get(key_1, "")),
str(target_data.get(key_2, "")),
str(target_data.get(key_3, ""))
)
class JSONLoaderCustom6:
@classmethod
def INPUT_TYPES(s):
return {
"required": {
"json_path": ("STRING", {"default": "", "multiline": False}),
"sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
},
"optional": {
"key_1": ("STRING", {"default": "", "multiline": False}),
"key_2": ("STRING", {"default": "", "multiline": False}),
"key_3": ("STRING", {"default": "", "multiline": False}),
"key_4": ("STRING", {"default": "", "multiline": False}),
"key_5": ("STRING", {"default": "", "multiline": False}),
"key_6": ("STRING", {"default": "", "multiline": False})
}
}
RETURN_TYPES = ("STRING", "STRING", "STRING", "STRING", "STRING", "STRING")
RETURN_NAMES = ("val_1", "val_2", "val_3", "val_4", "val_5", "val_6")
FUNCTION = "load_custom"
CATEGORY = "utils/json"
def load_custom(self, json_path, sequence_number, key_1="", key_2="", key_3="", key_4="", key_5="", key_6=""):
data = read_json_data(json_path)
target_data = get_batch_item(data, sequence_number)
return (
str(target_data.get(key_1, "")), str(target_data.get(key_2, "")),
str(target_data.get(key_3, "")), str(target_data.get(key_4, "")),
str(target_data.get(key_5, "")), str(target_data.get(key_6, ""))
)
# --- Mappings ---
NODE_CLASS_MAPPINGS = {
"JSONLoaderDynamic": JSONLoaderDynamic,
"JSONLoaderLoRA": JSONLoaderLoRA,
"JSONLoaderStandard": JSONLoaderStandard,
"JSONLoaderVACE": JSONLoaderVACE,
"JSONLoaderBatchLoRA": JSONLoaderBatchLoRA,
"JSONLoaderBatchI2V": JSONLoaderBatchI2V,
"JSONLoaderBatchVACE": JSONLoaderBatchVACE,
"JSONLoaderCustom1": JSONLoaderCustom1,
"JSONLoaderCustom3": JSONLoaderCustom3,
"JSONLoaderCustom6": JSONLoaderCustom6
}
NODE_DISPLAY_NAME_MAPPINGS = {
"JSONLoaderDynamic": "JSON Loader (Dynamic)",
"JSONLoaderLoRA": "JSON Loader (LoRAs Only)",
"JSONLoaderStandard": "JSON Loader (Standard/I2V)",
"JSONLoaderVACE": "JSON Loader (VACE Full)",
"JSONLoaderBatchLoRA": "JSON Batch Loader (LoRAs)",
"JSONLoaderBatchI2V": "JSON Batch Loader (I2V)",
"JSONLoaderBatchVACE": "JSON Batch Loader (VACE)",
"JSONLoaderCustom1": "JSON Loader (Custom 1)",
"JSONLoaderCustom3": "JSON Loader (Custom 3)",
"JSONLoaderCustom6": "JSON Loader (Custom 6)"
}

main.py

@@ -1,3 +1,4 @@
+import copy
 import json
 import logging
 from pathlib import Path
@@ -156,6 +157,20 @@ def index():
             background: rgba(255,255,255,0.2);
         }
+        /* Sub-sequence accent colors (per sub-index, cycling) */
+        .body--dark .subsegment-color-0 > .q-expansion-item__container > .q-item { border-left: 6px solid #06B6D4; padding-left: 10px; }
+        .body--dark .subsegment-color-0 .q-expansion-item__toggle-icon { color: #06B6D4 !important; }
+        .body--dark .subsegment-color-1 > .q-expansion-item__container > .q-item { border-left: 6px solid #A78BFA; padding-left: 10px; }
+        .body--dark .subsegment-color-1 .q-expansion-item__toggle-icon { color: #A78BFA !important; }
+        .body--dark .subsegment-color-2 > .q-expansion-item__container > .q-item { border-left: 6px solid #34D399; padding-left: 10px; }
+        .body--dark .subsegment-color-2 .q-expansion-item__toggle-icon { color: #34D399 !important; }
+        .body--dark .subsegment-color-3 > .q-expansion-item__container > .q-item { border-left: 6px solid #F472B6; padding-left: 10px; }
+        .body--dark .subsegment-color-3 .q-expansion-item__toggle-icon { color: #F472B6 !important; }
+        .body--dark .subsegment-color-4 > .q-expansion-item__container > .q-item { border-left: 6px solid #FBBF24; padding-left: 10px; }
+        .body--dark .subsegment-color-4 .q-expansion-item__toggle-icon { color: #FBBF24 !important; }
+        .body--dark .subsegment-color-5 > .q-expansion-item__container > .q-item { border-left: 6px solid #FB923C; padding-left: 10px; }
+        .body--dark .subsegment-color-5 .q-expansion-item__toggle-icon { color: #FB923C !important; }
         /* Secondary pane teal accent */
         .pane-secondary .q-field--outlined.q-field--focused .q-field__control:after {
             border-color: #06B6D4 !important;
@@ -467,7 +482,7 @@ def render_sidebar(state: AppState, dual_pane: dict):
             if not fn.endswith('.json'):
                 fn += '.json'
             path = state.current_dir / fn
-            first_item = DEFAULTS.copy()
+            first_item = copy.deepcopy(DEFAULTS)
             first_item[KEY_SEQUENCE_NUMBER] = 1
             save_json(path, {KEY_BATCH_DATA: [first_item]})
             new_fn_input.set_value('')
@@ -500,7 +515,7 @@ def render_sidebar(state: AppState, dual_pane: dict):
                 state.show_comfy_monitor = e.value
                 state._render_main.refresh()
-            ui.checkbox('Show Comfy Monitor', value=True, on_change=on_monitor_toggle)
+            ui.checkbox('Show Comfy Monitor', value=state.show_comfy_monitor, on_change=on_monitor_toggle)
     # Register REST API routes for ComfyUI connectivity (uses the shared DB instance)

project_loader.py

@@ -1,3 +1,4 @@
import asyncio
import json import json
import logging import logging
import urllib.parse import urllib.parse
@@ -88,7 +89,7 @@ if PromptServer is not None:
async def list_projects_proxy(request): async def list_projects_proxy(request):
manager_url = request.query.get("url", "http://localhost:8080") manager_url = request.query.get("url", "http://localhost:8080")
url = f"{manager_url.rstrip('/')}/api/projects" url = f"{manager_url.rstrip('/')}/api/projects"
data = _fetch_json(url) data = await asyncio.to_thread(_fetch_json, url)
return web.json_response(data) return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/list_project_files") @PromptServer.instance.routes.get("/json_manager/list_project_files")
@@ -96,7 +97,7 @@ if PromptServer is not None:
manager_url = request.query.get("url", "http://localhost:8080") manager_url = request.query.get("url", "http://localhost:8080")
project = urllib.parse.quote(request.query.get("project", ""), safe='') project = urllib.parse.quote(request.query.get("project", ""), safe='')
url = f"{manager_url.rstrip('/')}/api/projects/{project}/files" url = f"{manager_url.rstrip('/')}/api/projects/{project}/files"
data = _fetch_json(url) data = await asyncio.to_thread(_fetch_json, url)
return web.json_response(data) return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/list_project_sequences") @PromptServer.instance.routes.get("/json_manager/list_project_sequences")
@@ -105,7 +106,7 @@ if PromptServer is not None:
    project = urllib.parse.quote(request.query.get("project", ""), safe='')
    file_name = urllib.parse.quote(request.query.get("file", ""), safe='')
    url = f"{manager_url.rstrip('/')}/api/projects/{project}/files/{file_name}/sequences"
-    data = _fetch_json(url)
+    data = await asyncio.to_thread(_fetch_json, url)
    return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/get_project_keys")
@@ -117,7 +118,7 @@ if PromptServer is not None:
        seq = int(request.query.get("seq", "1"))
    except (ValueError, TypeError):
        seq = 1
-    data = _fetch_keys(manager_url, project, file_name, seq)
+    data = await asyncio.to_thread(_fetch_keys, manager_url, project, file_name, seq)
    if data.get("error") in ("http_error", "network_error", "parse_error"):
        status = data.get("status", 502)
        return web.json_response(data, status=status)
@@ -138,6 +139,7 @@ class ProjectLoaderDynamic:
    "project_name": ("STRING", {"default": "", "multiline": False}),
    "file_name": ("STRING", {"default": "", "multiline": False}),
    "sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
+    "refresh": (["off", "on"],),
},
"optional": {
    "output_keys": ("STRING", {"default": ""}),
@@ -152,7 +154,7 @@ class ProjectLoaderDynamic:
OUTPUT_NODE = False
def load_dynamic(self, manager_url, project_name, file_name, sequence_number,
-                 output_keys="", output_types=""):
+                 refresh="off", output_keys="", output_types=""):
    # Fetch keys metadata (includes total_sequences count)
    keys_meta = _fetch_keys(manager_url, project_name, file_name, sequence_number)
    if keys_meta.get("error") in ("http_error", "network_error", "parse_error"):
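The repeated `_fetch_json` / `_fetch_keys` change above moves blocking HTTP calls off the event loop. A minimal sketch of the pattern, where `fetch_json_blocking` is a hypothetical stand-in for the real fetch helper:

```python
import asyncio
import time

def fetch_json_blocking(url: str) -> dict:
    # Stand-in for the diff's _fetch_json: pretend this is a slow network call.
    time.sleep(0.05)
    return {"url": url, "ok": True}

async def handler() -> dict:
    # asyncio.to_thread runs the blocking call in a worker thread,
    # so the event loop stays free to serve other requests meanwhile.
    return await asyncio.to_thread(
        fetch_json_blocking, "http://localhost:8080/api/projects")

result = asyncio.run(handler())
```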

View File

@@ -1,4 +1,6 @@
import copy
+import json
+import math
import random
from pathlib import Path
@@ -13,6 +15,7 @@ from history_tree import HistoryTree
IMAGE_EXTENSIONS = {'.png', '.jpg', '.jpeg', '.webp', '.bmp', '.gif'}
SUB_SEGMENT_MULTIPLIER = 1000
+SUB_SEGMENT_NUM_COLORS = 6
FRAME_TO_SKIP_DEFAULT = DEFAULTS['frame_to_skip']
VACE_MODES = [
@@ -76,6 +79,48 @@ def find_insert_position(batch_list, parent_index, parent_seq_num):
    return pos
+# --- Auto change note ---
+def _auto_change_note(htree, batch_list):
+    """Compare current batch_list against last snapshot and describe changes."""
+    # Get previous batch data from the current head
+    if not htree.head_id or htree.head_id not in htree.nodes:
+        return f'Initial save ({len(batch_list)} sequences)'
+    prev_data = htree.nodes[htree.head_id].get('data', {})
+    prev_batch = prev_data.get(KEY_BATCH_DATA, [])
+    prev_by_seq = {int(s.get(KEY_SEQUENCE_NUMBER, 0)): s for s in prev_batch}
+    curr_by_seq = {int(s.get(KEY_SEQUENCE_NUMBER, 0)): s for s in batch_list}
+    added = sorted(set(curr_by_seq) - set(prev_by_seq))
+    removed = sorted(set(prev_by_seq) - set(curr_by_seq))
+    changed_keys = set()
+    for seq_num in sorted(set(curr_by_seq) & set(prev_by_seq)):
+        old, new = prev_by_seq[seq_num], curr_by_seq[seq_num]
+        all_keys = set(old) | set(new)
+        for k in all_keys:
+            if old.get(k) != new.get(k):
+                changed_keys.add(k)
+    parts = []
+    if added:
+        parts.append(f'Added seq {", ".join(str(s) for s in added)}')
+    if removed:
+        parts.append(f'Removed seq {", ".join(str(s) for s in removed)}')
+    if changed_keys:
+        # Show up to 4 changed field names
+        keys_list = sorted(changed_keys)
+        if len(keys_list) > 4:
+            keys_str = ', '.join(keys_list[:4]) + f' +{len(keys_list) - 4} more'
+        else:
+            keys_str = ', '.join(keys_list)
+        parts.append(f'Changed: {keys_str}')
+    return '; '.join(parts) if parts else 'No changes detected'
# --- Helper for repetitive dict-bound inputs ---
def dict_input(element_fn, label, seq, key, **kwargs):
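The snapshot-diff idea behind `_auto_change_note` can be reduced to a self-contained sketch (simplified field names, no history-tree dependency):

```python
# Index both batch lists by sequence number, then report added/removed
# sequences and any keys whose values differ between the two snapshots.
def describe_changes(prev_batch: list, curr_batch: list) -> str:
    prev = {s.get("sequence_number", 0): s for s in prev_batch}
    curr = {s.get("sequence_number", 0): s for s in curr_batch}
    added = sorted(set(curr) - set(prev))
    removed = sorted(set(prev) - set(curr))
    changed = set()
    for n in set(curr) & set(prev):
        for k in set(prev[n]) | set(curr[n]):
            if prev[n].get(k) != curr[n].get(k):
                changed.add(k)
    parts = []
    if added:
        parts.append(f"Added seq {', '.join(map(str, added))}")
    if removed:
        parts.append(f"Removed seq {', '.join(map(str, removed))}")
    if changed:
        parts.append(f"Changed: {', '.join(sorted(changed))}")
    return "; ".join(parts) if parts else "No changes detected"

note = describe_changes(
    [{"sequence_number": 1, "seed": 1}, {"sequence_number": 2, "seed": 2}],
    [{"sequence_number": 1, "seed": 9}, {"sequence_number": 3, "seed": 3}],
)
```

This yields a human-readable commit note like "Added seq 3; Removed seq 2; Changed: seed" when no explicit note is typed.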
@@ -99,6 +144,8 @@ def dict_number(label, seq, key, default=0, **kwargs):
try:
    # Try float first to handle "1.5" strings, then check if it's a clean int
    fval = float(val)
+    if not math.isfinite(fval):
+        fval = float(default)
    val = int(fval) if fval == int(fval) else fval
except (ValueError, TypeError, OverflowError):
    val = default
@@ -109,6 +156,9 @@ def dict_number(label, seq, key, default=0, **kwargs):
if v is None:
    v = d
elif isinstance(v, float):
+    if not math.isfinite(v):
+        v = d
+    else:
        try:
            v = int(v) if v == int(v) else v
        except (OverflowError, ValueError):
@@ -223,13 +273,13 @@ def render_batch_processor(state: AppState):
with ui.row().classes('q-mt-sm'):
    def add_empty():
-        _add_sequence(DEFAULTS.copy())
+        _add_sequence(copy.deepcopy(DEFAULTS))
    def add_from_source():
        item = copy.deepcopy(DEFAULTS)
        src_batch = _src_cache['batch']
        sel_idx = src_seq_select.value
-        if src_batch and sel_idx is not None:
+        if src_batch and sel_idx is not None and int(sel_idx) < len(src_batch):
            item.update(copy.deepcopy(src_batch[int(sel_idx)]))
        elif _src_cache['data']:
            item.update(copy.deepcopy(_src_cache['data']))
@@ -242,7 +292,7 @@ def render_batch_processor(state: AppState):
lora_keys = ['lora 1 high', 'lora 1 low', 'lora 2 high', 'lora 2 low',
             'lora 3 high', 'lora 3 low']
standard_keys = {
-    'general_prompt', 'general_negative', 'current_prompt', 'negative', 'prompt',
+    'name', 'mode', 'general_prompt', 'general_negative', 'current_prompt', 'negative', 'prompt',
    'seed', 'cfg', 'camera', 'flf', KEY_SEQUENCE_NUMBER,
    'frame_to_skip', 'end_frame', 'transition', 'vace_length',
    'input_a_frames', 'input_b_frames', 'reference switch', 'vace schedule',
@@ -291,8 +341,12 @@ def render_batch_processor(state: AppState):
htree = HistoryTree(tree_data)
snapshot_payload = copy.deepcopy(data)
snapshot_payload.pop(KEY_HISTORY_TREE, None)
-note = commit_input.value if commit_input.value else 'Batch Update'
-htree.commit(snapshot_payload, note=note)
+note = commit_input.value if commit_input.value else _auto_change_note(htree, batch_list)
+try:
+    htree.commit(snapshot_payload, note=note)
+except ValueError as e:
+    ui.notify(f'Save failed: {e}', type='negative')
+    return
data[KEY_HISTORY_TREE] = htree.to_dict()
save_json(file_path, data)
if state.db_enabled and state.current_project and state.db:
@@ -321,21 +375,40 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
refresh_list.refresh()
seq_num = seq.get(KEY_SEQUENCE_NUMBER, i + 1)
+seq_name = seq.get('name', '')
if is_subsegment(seq_num):
    label = f'Sub #{parent_of(seq_num)}.{sub_index_of(seq_num)} ({int(seq_num)})'
else:
    label = f'Sequence #{seq_num}'
+if seq_name:
+    label += f'{seq_name}'
-with ui.expansion(label, icon='movie').classes('w-full'):
+if is_subsegment(seq_num):
+    color_idx = (sub_index_of(seq_num) - 1) % SUB_SEGMENT_NUM_COLORS
+    exp_classes = f'w-full subsegment-color-{color_idx}'
+else:
+    exp_classes = 'w-full'
+with ui.expansion(label, icon='movie').classes(exp_classes) as expansion:
    # --- Action row ---
    with ui.row().classes('w-full q-gutter-sm action-row'):
+        # Rename
+        async def rename(s=seq):
+            result = await ui.run_javascript(
+                f'prompt("Rename sequence:", {json.dumps(s.get("name", ""))})',
+                timeout=30.0,
+            )
+            if result is not None:
+                s['name'] = result
+                commit('Renamed!')
+        ui.button('Rename', icon='edit', on_click=rename).props('outline')
        # Copy from source
        def copy_source(idx=i, sn=seq_num):
            item = copy.deepcopy(DEFAULTS)
            src_batch = src_cache['batch']
            sel_idx = src_seq_select.value
-            if src_batch and sel_idx is not None:
+            if src_batch and sel_idx is not None and int(sel_idx) < len(src_batch):
                item.update(copy.deepcopy(src_batch[int(sel_idx)]))
            elif src_cache['data']:
                item.update(copy.deepcopy(src_cache['data']))
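The rename handler above embeds a user-controlled string into a JavaScript `prompt()` call via `json.dumps`, which is the safe way to build a JS string literal from Python. A sketch of why naive quoting fails:

```python
import json

# A name containing quotes and a backslash would break a hand-quoted
# f-string, but json.dumps emits a valid, fully escaped JS string literal.
name = 'clip "final" \\ take 2'
js = f'prompt("Rename sequence:", {json.dumps(name)})'
```

`json.dumps` escapes the inner quotes and backslash, so the resulting snippet parses as a single string argument in the browser.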
@@ -390,6 +463,7 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
# Delete
def delete(idx=i):
+    if idx < len(batch_list):
        batch_list.pop(idx)
        commit()
@@ -410,6 +484,9 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
    'w-full q-mt-sm').props('outlined rows=2')
with splitter.after:
+    # Mode
+    dict_number('Mode', seq, 'mode').props('outlined').classes('w-full')
    # Sequence number
    sn_label = (
        f'Seq Number (Sub #{parent_of(seq_num)}.{sub_index_of(seq_num)})'
@@ -541,7 +618,13 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_list):
    # VACE Schedule (needed early for both columns)
-    sched_val = max(0, min(int(seq.get('vace schedule', 1)), len(VACE_MODES) - 1))
+    def _safe_int(val, default=0):
+        try:
+            return int(float(val))
+        except (ValueError, TypeError, OverflowError):
+            return default
+    sched_val = max(0, min(_safe_int(seq.get('vace schedule', 1), 1), len(VACE_MODES) - 1))
    # Mode reference dialog
    with ui.dialog() as ref_dlg, ui.card():
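The `_safe_int` helper introduced above replaces bare `int(...)` casts throughout the VACE settings. Its behavior in isolation (same body as the diff's helper):

```python
def safe_int(val, default=0):
    # Accept "3", "3.7", 3.7, None, or garbage without raising;
    # going through float() first lets "3.7" strings coerce cleanly.
    try:
        return int(float(val))
    except (ValueError, TypeError, OverflowError):
        return default

print(safe_int("3.7"))       # 3
print(safe_int(None, 16))    # 16
print(safe_int("junk", 49))  # 49
```

Note that `float("nan")` also ends up on the fallback path, because `int(nan)` raises `ValueError`.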
@@ -562,18 +645,18 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
fts_input = dict_number('Frame to Skip', seq, 'frame_to_skip').classes(
    'col').props('outlined')
-_original_fts = int(seq.get('frame_to_skip', FRAME_TO_SKIP_DEFAULT))
+_original_fts = _safe_int(seq.get('frame_to_skip', FRAME_TO_SKIP_DEFAULT), FRAME_TO_SKIP_DEFAULT)
def shift_fts(idx=i, orig=_original_fts):
-    new_fts = int(fts_input.value) if fts_input.value is not None else orig
+    new_fts = _safe_int(fts_input.value, orig)
    delta = new_fts - orig
    if delta == 0:
        ui.notify('No change to shift', type='info')
        return
    shifted = 0
    for j in range(idx + 1, len(batch_list)):
-        batch_list[j]['frame_to_skip'] = int(
-            batch_list[j].get('frame_to_skip', FRAME_TO_SKIP_DEFAULT)) + delta
+        batch_list[j]['frame_to_skip'] = _safe_int(
+            batch_list[j].get('frame_to_skip', FRAME_TO_SKIP_DEFAULT), FRAME_TO_SKIP_DEFAULT) + delta
        shifted += 1
data[KEY_BATCH_DATA] = batch_list
save_json(file_path, data)
@@ -597,7 +680,7 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
ui.button(icon='help', on_click=ref_dlg.open).props('flat dense round')
def update_mode_label(e):
-    idx = int(e.sender.value) if e.sender.value is not None else 0
+    idx = _safe_int(e.sender.value, 0)
    idx = max(0, min(idx, len(VACE_MODES) - 1))
    mode_label.set_text(VACE_MODES[idx])
@@ -611,10 +694,10 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
    'outlined').classes('w-full q-mt-sm')
# VACE Length + output calculation
-input_a = int(seq.get('input_a_frames', 16))
-input_b = int(seq.get('input_b_frames', 16))
-stored_total = int(seq.get('vace_length', 49))
-mode_idx = int(seq.get('vace schedule', 1))
+input_a = _safe_int(seq.get('input_a_frames', 16), 16)
+input_b = _safe_int(seq.get('input_b_frames', 16), 16)
+stored_total = _safe_int(seq.get('vace_length', 49), 49)
+mode_idx = _safe_int(seq.get('vace schedule', 1), 1)
if mode_idx == 0:
    base_length = max(stored_total - input_a, 1)
@@ -633,10 +716,10 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
# Recalculate VACE output when any input changes
def recalc_vace(*_args):
-    mi = int(vs_input.value) if vs_input.value is not None else 0
-    ia = int(ia_input.value) if ia_input.value is not None else 16
-    ib = int(ib_input.value) if ib_input.value is not None else 16
-    nb = int(vl_input.value) if vl_input.value is not None else 1
+    mi = _safe_int(vs_input.value, 0)
+    ia = _safe_int(ia_input.value, 16)
+    ib = _safe_int(ib_input.value, 16)
+    nb = _safe_int(vl_input.value, 1)
    if mi == 0:
        raw = nb + ia
@@ -721,7 +804,11 @@ def _render_mass_update(batch_list, data, file_path, state: AppState, refresh_li
htree = HistoryTree(data.get(KEY_HISTORY_TREE, {}))
snapshot = copy.deepcopy(data)
snapshot.pop(KEY_HISTORY_TREE, None)
-htree.commit(snapshot, f"Mass update: {', '.join(selected_keys)}")
+try:
+    htree.commit(snapshot, f"Mass update: {', '.join(selected_keys)}")
+except ValueError as e:
+    ui.notify(f'Mass update failed: {e}', type='negative')
+    return
data[KEY_HISTORY_TREE] = htree.to_dict()
save_json(file_path, data)
if state.db_enabled and state.current_project and state.db:

View File

@@ -139,7 +139,7 @@ def _render_single_instance(state: AppState, instance_config: dict, index: int,
async def refresh_status():
    status_container.clear()
-    loop = asyncio.get_event_loop()
+    loop = asyncio.get_running_loop()
    res, err = await loop.run_in_executor(
        None, lambda: _fetch_blocking(f'{comfy_url}/queue'))
    with status_container:
@@ -237,7 +237,7 @@ def _render_single_instance(state: AppState, instance_config: dict, index: int,
async def check_image():
    img_container.clear()
-    loop = asyncio.get_event_loop()
+    loop = asyncio.get_running_loop()
    res, err = await loop.run_in_executor(
        None, lambda: _fetch_blocking(f'{comfy_url}/history', timeout=2))
    with img_container:

View File

@@ -1,11 +1,13 @@
+import json
import logging
+import sqlite3
from pathlib import Path
from nicegui import ui
from state import AppState
from db import ProjectDB
-from utils import save_config, sync_to_db, KEY_BATCH_DATA
+from utils import save_config, sync_to_db
logger = logging.getLogger(__name__)
@@ -109,6 +111,49 @@ def render_projects_tab(state: AppState):
ui.button('Deactivate', icon='cancel',
          on_click=deactivate).props('flat dense')
+async def rename_proj(name=proj['name']):
+    new_name = await ui.run_javascript(
+        f'prompt("Rename project:", {json.dumps(name)})',
+        timeout=30.0,
+    )
+    if new_name and new_name.strip() and new_name.strip() != name:
+        new_name = new_name.strip()
+        try:
+            state.db.rename_project(name, new_name)
+            if state.current_project == name:
+                state.current_project = new_name
+                state.config['current_project'] = new_name
+                save_config(state.current_dir,
+                            state.config.get('favorites', []),
+                            state.config)
+            ui.notify(f'Renamed to "{new_name}"', type='positive')
+            render_project_list.refresh()
+        except sqlite3.IntegrityError:
+            ui.notify(f'A project named "{new_name}" already exists',
+                      type='warning')
+        except Exception as e:
+            ui.notify(f'Error: {e}', type='negative')
+ui.button('Rename', icon='edit',
+          on_click=rename_proj).props('flat dense')
+async def change_path(name=proj['name'], path=proj['folder_path']):
+    new_path = await ui.run_javascript(
+        f'prompt("New path for project:", {json.dumps(path)})',
+        timeout=30.0,
+    )
+    if new_path and new_path.strip() and new_path.strip() != path:
+        new_path = new_path.strip()
+        if not Path(new_path).is_dir():
+            ui.notify(f'Warning: "{new_path}" does not exist',
+                      type='warning')
+        state.db.update_project_path(name, new_path)
+        ui.notify(f'Path updated to "{new_path}"', type='positive')
+        render_project_list.refresh()
+ui.button('Path', icon='folder',
+          on_click=change_path).props('flat dense')
def import_folder(pid=proj['id'], pname=proj['name']):
    _import_folder(state, pid, pname, render_project_list)
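The `sqlite3.IntegrityError` branch above relies on a UNIQUE constraint turning a duplicate rename into a catchable exception. A self-contained sketch with an in-memory table (the schema here is illustrative, not the project's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projects (name TEXT UNIQUE)")
conn.execute("INSERT INTO projects VALUES ('alpha'), ('beta')")

def rename_project(old: str, new: str) -> str:
    # The UNIQUE constraint makes a duplicate rename raise IntegrityError,
    # which the UI layer can surface as a warning instead of a crash.
    try:
        conn.execute("UPDATE projects SET name = ? WHERE name = ?", (new, old))
        return f'Renamed to "{new}"'
    except sqlite3.IntegrityError:
        return f'A project named "{new}" already exists'

dup_msg = rename_project("alpha", "beta")   # collision path
ok_msg = rename_project("alpha", "gamma")   # success path
```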

View File

@@ -1,4 +1,5 @@
import copy
+import json
import time
from nicegui import ui
@@ -9,14 +10,36 @@ from utils import save_json, sync_to_db, KEY_BATCH_DATA, KEY_HISTORY_TREE
def _delete_nodes(htree, data, file_path, node_ids):
-    """Delete nodes with backup, branch cleanup, and head fallback."""
+    """Delete nodes with backup, branch cleanup, re-parenting, and head fallback."""
    if 'history_tree_backup' not in data:
        data['history_tree_backup'] = []
    data['history_tree_backup'].append(copy.deepcopy(htree.to_dict()))
+    data['history_tree_backup'] = data['history_tree_backup'][-10:]
+    # Save deleted node parents before removal (needed for branch re-pointing)
+    deleted_parents = {}
+    for nid in node_ids:
+        deleted_node = htree.nodes.get(nid)
+        if deleted_node:
+            deleted_parents[nid] = deleted_node.get('parent')
+    # Re-parent children of deleted nodes — walk up to find a surviving ancestor
+    for nid in node_ids:
+        surviving_parent = deleted_parents.get(nid)
+        while surviving_parent in node_ids:
+            surviving_parent = deleted_parents.get(surviving_parent)
+        for child in htree.nodes.values():
+            if child.get('parent') == nid:
+                child['parent'] = surviving_parent
    for nid in node_ids:
        htree.nodes.pop(nid, None)
+    # Re-point branches whose tip was deleted to a surviving ancestor
    for b, tip in list(htree.branches.items()):
        if tip in node_ids:
+            new_tip = deleted_parents.get(tip)
+            while new_tip in node_ids:
+                new_tip = deleted_parents.get(new_tip)
+            if new_tip and new_tip in htree.nodes:
+                htree.branches[b] = new_tip
+            else:
                del htree.branches[b]
    if htree.head_id in node_ids:
        if htree.nodes:
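The re-parenting logic above has a subtlety: when several nodes are deleted in one batch, a child must climb past *all* deleted ancestors, not just its immediate parent. A minimal sketch of that walk (hypothetical names, plain dicts instead of the tree object):

```python
# Climb the parent chain until we reach a node that is not being deleted
# (or run off the root, in which case the child becomes a root itself).
def surviving_ancestor(node_id, parents, deleted):
    p = parents.get(node_id)
    while p in deleted:
        p = parents.get(p)
    return p

# Chain a <- b <- c <- d; deleting b and c should re-parent d onto a.
parents = {"a": None, "b": "a", "c": "b", "d": "c"}
deleted = {"b", "c"}
new_parent = surviving_ancestor("d", parents, deleted)
```

The same walk is reused for branch tips, so a branch whose tip is deleted gets re-pointed to the first surviving ancestor instead of vanishing.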
@@ -152,8 +175,12 @@ def _render_batch_delete(htree, data, file_path, state, refresh_fn):
def _walk_branch_nodes(htree, tip_id):
    """Walk parent pointers from tip, returning nodes newest-first."""
    nodes = []
+    visited = set()
    current = tip_id
    while current and current in htree.nodes:
+        if current in visited:
+            break
+        visited.add(current)
        nodes.append(htree.nodes[current])
        current = htree.nodes[current].get('parent')
    return nodes
@@ -172,10 +199,14 @@ def _find_active_branch(htree):
def _find_branch_for_node(htree, node_id):
    """Return the branch name whose ancestry contains node_id, or None."""
    for b_name, tip_id in htree.branches.items():
+        visited = set()
        current = tip_id
        while current and current in htree.nodes:
+            if current in visited:
+                break
            if current == node_id:
                return b_name
+            visited.add(current)
            current = htree.nodes[current].get('parent')
    return None
@@ -310,6 +341,10 @@ def _render_node_manager(all_nodes, htree, data, file_path, restore_fn, refresh_
_delete_nodes(htree, data, file_path, {sel_id})
if state and state.db_enabled and state.current_project and state.db:
    sync_to_db(state.db, state.current_project, file_path, data)
+# Reset selection if branch was removed
+if selected['branch'] not in htree.branches:
+    selected['branch'] = next(iter(htree.branches), None)
+    selected['node_id'] = htree.head_id
ui.notify('Node Deleted', type='positive')
refresh_fn()
@@ -433,7 +468,7 @@ def _render_graphviz(dot_source: str, selected_node_id: str | None = None):
src = graphviz.Source(dot_source)
svg = src.pipe(format='svg').decode('utf-8')
-sel_escaped = selected_node_id.replace("'", "\\'") if selected_node_id else ''
+sel_escaped = json.dumps(selected_node_id or '')[1:-1]  # strip quotes, get JS-safe content
# CSS inline (allowed), JS via run_javascript (script tags blocked)
css = '''<style>
@@ -490,11 +525,18 @@ def _render_graphviz(dot_source: str, selected_node_id: str | None = None):
def _restore_node(data, node, htree, file_path, state: AppState):
-    """Restore a history node as the current version."""
-    node_data = copy.deepcopy(node['data'])
-    if KEY_BATCH_DATA not in node_data and KEY_BATCH_DATA in data:
-        del data[KEY_BATCH_DATA]
+    """Restore a history node as the current version (full replace, not merge)."""
+    node_data = copy.deepcopy(node.get('data', {}))
+    # Preserve the history tree before clearing
+    preserved_tree = data.get(KEY_HISTORY_TREE)
+    preserved_backup = data.get('history_tree_backup')
+    data.clear()
    data.update(node_data)
+    # Re-attach history tree (not part of snapshot data)
+    if preserved_tree is not None:
+        data[KEY_HISTORY_TREE] = preserved_tree
+    if preserved_backup is not None:
+        data['history_tree_backup'] = preserved_backup
    htree.head_id = node['id']
    data[KEY_HISTORY_TREE] = htree.to_dict()
    save_json(file_path, data)
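The full-replace restore above matters because a merge (`data.update` alone) would leave behind keys that were added *after* the snapshot was taken. A reduced sketch of the clear-then-update pattern, with the live tree preserved across the wipe (`restore_snapshot` is an illustrative name):

```python
import copy

def restore_snapshot(data: dict, snapshot: dict, tree_key: str = "history_tree") -> None:
    # Keep the live history tree aside, wipe the document, then restore
    # the snapshot; keys that postdate the snapshot must disappear.
    preserved_tree = data.get(tree_key)
    data.clear()
    data.update(copy.deepcopy(snapshot))
    if preserved_tree is not None:
        data[tree_key] = preserved_tree

data = {"title": "new", "extra_key": 1, "history_tree": {"head": "n2"}}
restore_snapshot(data, {"title": "old"})
```

After the call, `extra_key` is gone, `title` reverts, and the history tree survives untouched.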
@@ -511,7 +553,7 @@ def _render_data_preview(nid, htree):
    ui.label('No node selected.').classes('text-caption')
    return
-node_data = htree.nodes[nid]['data']
+node_data = htree.nodes[nid].get('data', {})
batch_list = node_data.get(KEY_BATCH_DATA, [])
if batch_list and isinstance(batch_list, list) and len(batch_list) > 0:

View File

@@ -1,165 +0,0 @@
-import json
-import os
-import pytest
-from json_loader import (
-    to_float, to_int, get_batch_item, read_json_data,
-    JSONLoaderDynamic, MAX_DYNAMIC_OUTPUTS,
-)
-class TestToFloat:
-    def test_valid(self):
-        assert to_float("3.14") == 3.14
-        assert to_float(5) == 5.0
-    def test_invalid(self):
-        assert to_float("abc") == 0.0
-    def test_none(self):
-        assert to_float(None) == 0.0
-class TestToInt:
-    def test_valid(self):
-        assert to_int("7") == 7
-        assert to_int(3.9) == 3
-    def test_invalid(self):
-        assert to_int("xyz") == 0
-    def test_none(self):
-        assert to_int(None) == 0
-class TestGetBatchItem:
-    def test_lookup_by_sequence_number_field(self):
-        data = {"batch_data": [
-            {"sequence_number": 1, "a": "first"},
-            {"sequence_number": 5, "a": "fifth"},
-            {"sequence_number": 3, "a": "third"},
-        ]}
-        assert get_batch_item(data, 5) == {"sequence_number": 5, "a": "fifth"}
-        assert get_batch_item(data, 3) == {"sequence_number": 3, "a": "third"}
-    def test_fallback_to_index(self):
-        data = {"batch_data": [{"a": 1}, {"a": 2}, {"a": 3}]}
-        assert get_batch_item(data, 2) == {"a": 2}
-    def test_clamp_high(self):
-        data = {"batch_data": [{"a": 1}, {"a": 2}]}
-        assert get_batch_item(data, 99) == {"a": 2}
-    def test_clamp_low(self):
-        data = {"batch_data": [{"a": 1}, {"a": 2}]}
-        assert get_batch_item(data, 0) == {"a": 1}
-    def test_no_batch_data(self):
-        data = {"key": "val"}
-        assert get_batch_item(data, 1) == data
-class TestReadJsonData:
-    def test_missing_file(self, tmp_path):
-        assert read_json_data(str(tmp_path / "nope.json")) == {}
-    def test_invalid_json(self, tmp_path):
-        p = tmp_path / "bad.json"
-        p.write_text("{broken")
-        assert read_json_data(str(p)) == {}
-    def test_non_dict_json(self, tmp_path):
-        p = tmp_path / "list.json"
-        p.write_text(json.dumps([1, 2, 3]))
-        assert read_json_data(str(p)) == {}
-    def test_valid(self, tmp_path):
-        p = tmp_path / "ok.json"
-        p.write_text(json.dumps({"key": "val"}))
-        assert read_json_data(str(p)) == {"key": "val"}
-class TestJSONLoaderDynamic:
-    def _make_json(self, tmp_path, data):
-        p = tmp_path / "test.json"
-        p.write_text(json.dumps(data))
-        return str(p)
-    def test_known_keys(self, tmp_path):
-        path = self._make_json(tmp_path, {"name": "alice", "age": 30, "score": 9.5})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="name,age,score")
-        assert result[0] == "alice"
-        assert result[1] == 30
-        assert result[2] == 9.5
-    def test_empty_output_keys(self, tmp_path):
-        path = self._make_json(tmp_path, {"name": "alice"})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="")
-        assert len(result) == MAX_DYNAMIC_OUTPUTS
-        assert all(v == "" for v in result)
-    def test_pads_to_max(self, tmp_path):
-        path = self._make_json(tmp_path, {"a": "1", "b": "2"})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="a,b")
-        assert len(result) == MAX_DYNAMIC_OUTPUTS
-        assert result[0] == "1"
-        assert result[1] == "2"
-        assert all(v == "" for v in result[2:])
-    def test_type_preservation_int(self, tmp_path):
-        path = self._make_json(tmp_path, {"count": 42})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="count")
-        assert result[0] == 42
-        assert isinstance(result[0], int)
-    def test_type_preservation_float(self, tmp_path):
-        path = self._make_json(tmp_path, {"rate": 3.14})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="rate")
-        assert result[0] == 3.14
-        assert isinstance(result[0], float)
-    def test_type_preservation_str(self, tmp_path):
-        path = self._make_json(tmp_path, {"label": "hello"})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="label")
-        assert result[0] == "hello"
-        assert isinstance(result[0], str)
-    def test_bool_becomes_string(self, tmp_path):
-        path = self._make_json(tmp_path, {"flag": True, "off": False})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="flag,off")
-        assert result[0] == "true"
-        assert result[1] == "false"
-        assert isinstance(result[0], str)
-    def test_missing_key_returns_empty_string(self, tmp_path):
-        path = self._make_json(tmp_path, {"a": "1"})
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 1, output_keys="a,nonexistent")
-        assert result[0] == "1"
-        assert result[1] == ""
-    def test_missing_file_returns_all_empty(self, tmp_path):
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(str(tmp_path / "nope.json"), 1, output_keys="a,b")
-        assert len(result) == MAX_DYNAMIC_OUTPUTS
-        assert result[0] == ""
-        assert result[1] == ""
-    def test_batch_data(self, tmp_path):
-        path = self._make_json(tmp_path, {
-            "batch_data": [
-                {"sequence_number": 1, "x": "first"},
-                {"sequence_number": 2, "x": "second"},
-            ]
-        })
-        loader = JSONLoaderDynamic()
-        result = loader.load_dynamic(path, 2, output_keys="x")
-        assert result[0] == "second"

View File

@@ -1,3 +1,4 @@
+import copy
import json
import logging
import os
@@ -30,6 +31,7 @@ DEFAULTS = {
    "cfg": 1.5,
    # --- Settings ---
+    "mode": 0,
    "camera": "static",
    "flf": 0.0,
@@ -112,14 +114,17 @@ def save_config(current_dir, favorites, extra_data=None):
existing = load_config()
data.update(existing)
-data["last_dir"] = str(current_dir)
-data["favorites"] = favorites
if extra_data:
    data.update(extra_data)
+# Force-set explicit params last so extra_data can't override them
+data["last_dir"] = str(current_dir)
+data["favorites"] = favorites
-with open(CONFIG_FILE, 'w') as f:
+tmp = CONFIG_FILE.with_suffix('.json.tmp')
+with open(tmp, 'w') as f:
    json.dump(data, f, indent=4)
+os.replace(tmp, CONFIG_FILE)
def load_snippets():
    if SNIPPETS_FILE.exists():
@@ -131,8 +136,10 @@ def load_snippets():
    return {}
def save_snippets(snippets):
-    with open(SNIPPETS_FILE, 'w') as f:
+    tmp = SNIPPETS_FILE.with_suffix('.json.tmp')
+    with open(tmp, 'w') as f:
        json.dump(snippets, f, indent=4)
+    os.replace(tmp, SNIPPETS_FILE)
def load_json(path: str | Path) -> tuple[dict[str, Any], float]:
    path = Path(path)
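The write-to-temp-then-`os.replace` pattern used in both `save_config` and `save_snippets` above is the standard atomic-save recipe: `os.replace` is atomic on POSIX and on Windows, so a crash mid-write never leaves a truncated config behind. A self-contained sketch (`save_json_atomic` is an illustrative name):

```python
import json
import os
import tempfile
from pathlib import Path

def save_json_atomic(path: Path, payload: dict) -> None:
    # Write the full document to a sibling temp file first; only when the
    # write completes does the rename swap it into place in one step.
    tmp = path.with_name(path.name + ".tmp")
    with open(tmp, "w") as f:
        json.dump(payload, f, indent=4)
    os.replace(tmp, path)

target = Path(tempfile.mkdtemp()) / "config.json"
save_json_atomic(target, {"favorites": []})
```

The temp file must live on the same filesystem as the target (a sibling path guarantees this), since `os.replace` cannot rename across filesystems.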
@@ -181,7 +188,7 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
top_level = {k: v for k, v in data.items()
             if k not in (KEY_BATCH_DATA, KEY_HISTORY_TREE)}
if not df:
-    now = __import__('time').time()
+    now = time.time()
    cur = db.conn.execute(
        "INSERT INTO data_files (project_id, name, data_type, top_level, created_at, updated_at) "
        "VALUES (?, ?, ?, ?, ?, ?)",
@@ -191,7 +198,7 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
     else:
         df_id = df["id"]
         # Update top_level metadata
-        now = __import__('time').time()
+        now = time.time()
         db.conn.execute(
             "UPDATE data_files SET top_level = ?, updated_at = ? WHERE id = ?",
             (json.dumps(top_level), now, df_id),
@@ -205,17 +212,18 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
         if not isinstance(item, dict):
             continue
         seq_num = int(item.get(KEY_SEQUENCE_NUMBER, 0))
-        now = __import__('time').time()
+        now = time.time()
         db.conn.execute(
             "INSERT INTO sequences (data_file_id, sequence_number, data, updated_at) "
-            "VALUES (?, ?, ?, ?)",
+            "VALUES (?, ?, ?, ?) "
+            "ON CONFLICT(data_file_id, sequence_number) DO UPDATE SET data=excluded.data, updated_at=excluded.updated_at",
             (df_id, seq_num, json.dumps(item), now),
         )
 
     # Sync history tree
     history_tree = data.get(KEY_HISTORY_TREE)
     if history_tree and isinstance(history_tree, dict):
-        now = __import__('time').time()
+        now = time.time()
         db.conn.execute(
             "INSERT INTO history_trees (data_file_id, tree_data, updated_at) "
             "VALUES (?, ?, ?) "
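The `ON CONFLICT ... DO UPDATE` clause added above turns the blind INSERT into an upsert, which is what fixes the duplicate-sequence-number failure from the commit message. A standalone sketch with a simplified stand-in schema (the real table has more columns); note the conflict target must be backed by a unique constraint, and this SQLite syntax requires SQLite 3.24 or newer:

```python
import json
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sequences ("
    " data_file_id INTEGER,"
    " sequence_number INTEGER,"
    " data TEXT,"
    " updated_at REAL,"
    " UNIQUE(data_file_id, sequence_number))"  # conflict target for the upsert
)

def upsert_sequence(df_id: int, seq_num: int, item: dict) -> None:
    # Insert, or overwrite the row that already holds this sequence number.
    conn.execute(
        "INSERT INTO sequences (data_file_id, sequence_number, data, updated_at) "
        "VALUES (?, ?, ?, ?) "
        "ON CONFLICT(data_file_id, sequence_number) "
        "DO UPDATE SET data=excluded.data, updated_at=excluded.updated_at",
        (df_id, seq_num, json.dumps(item), time.time()),
    )

upsert_sequence(1, 2, {"x": "first"})
upsert_sequence(1, 2, {"x": "second"})  # same key: updates instead of raising
```

`excluded.data` refers to the row that the rejected INSERT would have created, so the update always carries the newest payload.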
@@ -236,10 +244,10 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
 
 def generate_templates(current_dir: Path) -> None:
     """Creates batch template files if folder is empty."""
-    first = DEFAULTS.copy()
+    first = copy.deepcopy(DEFAULTS)
     first[KEY_SEQUENCE_NUMBER] = 1
     save_json(current_dir / "batch_prompt_i2v.json", {KEY_BATCH_DATA: [first]})
-    first2 = DEFAULTS.copy()
+    first2 = copy.deepcopy(DEFAULTS)
     first2[KEY_SEQUENCE_NUMBER] = 1
     save_json(current_dir / "batch_prompt_vace_extend.json", {KEY_BATCH_DATA: [first2]})
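The `DEFAULTS.copy()` to `copy.deepcopy(DEFAULTS)` change in the hunk above matters whenever `DEFAULTS` contains nested mutables: `dict.copy()` is shallow, so both template entries would share the same inner objects. A toy demonstration; the nested `"vace"` key here is a made-up stand-in, not the real `DEFAULTS` layout:

```python
import copy

def make_defaults():
    # Hypothetical defaults with one nested dict.
    return {"cfg": 1.5, "vace": {"strength": 1.0}}

DEFAULTS = make_defaults()
shallow_a = DEFAULTS.copy()
shallow_b = DEFAULTS.copy()
shallow_a["vace"]["strength"] = 0.5   # mutates the inner dict shared by all three

DEFAULTS = make_defaults()            # fresh defaults for the deep-copy case
deep_a = copy.deepcopy(DEFAULTS)
deep_b = copy.deepcopy(DEFAULTS)
deep_a["vace"]["strength"] = 0.25     # deep_b keeps its own independent inner dict
```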


@@ -1,168 +0,0 @@
import { app } from "../../scripts/app.js";
import { api } from "../../scripts/api.js";
app.registerExtension({
name: "json.manager.dynamic",
async beforeRegisterNodeDef(nodeType, nodeData, app) {
if (nodeData.name !== "JSONLoaderDynamic") return;
const origOnNodeCreated = nodeType.prototype.onNodeCreated;
nodeType.prototype.onNodeCreated = function () {
origOnNodeCreated?.apply(this, arguments);
// Hide internal widgets (managed by JS)
for (const name of ["output_keys", "output_types"]) {
const w = this.widgets?.find(w => w.name === name);
if (w) { w.type = "hidden"; w.computeSize = () => [0, -4]; }
}
// Do NOT remove default outputs synchronously here.
// During graph loading, ComfyUI creates all nodes (firing onNodeCreated)
// before configuring them. Other nodes (e.g. Kijai Set/Get) may resolve
// links to our outputs during their configure step. If we remove outputs
// here, those nodes find no output slot and error out.
//
// Instead, defer cleanup: for loaded workflows onConfigure sets _configured
// before this runs; for new nodes the defaults are cleaned up.
this._configured = false;
// Add Refresh button
this.addWidget("button", "Refresh Outputs", null, () => {
this.refreshDynamicOutputs();
});
queueMicrotask(() => {
if (!this._configured) {
// New node (not loading) — remove the 32 Python default outputs
while (this.outputs.length > 0) {
this.removeOutput(0);
}
this.setSize(this.computeSize());
app.graph?.setDirtyCanvas(true, true);
}
});
};
nodeType.prototype.refreshDynamicOutputs = async function () {
const pathWidget = this.widgets?.find(w => w.name === "json_path");
const seqWidget = this.widgets?.find(w => w.name === "sequence_number");
if (!pathWidget?.value) return;
try {
const resp = await api.fetchApi(
`/json_manager/get_keys?path=${encodeURIComponent(pathWidget.value)}&sequence_number=${seqWidget?.value || 1}`
);
const data = await resp.json();
const { keys, types } = data;
// If the file wasn't found, keep existing outputs and links intact
if (data.error === "file_not_found") {
console.warn("[JSONLoaderDynamic] File not found, keeping existing outputs:", pathWidget.value);
return;
}
// Store keys and types in hidden widgets for persistence
const okWidget = this.widgets?.find(w => w.name === "output_keys");
if (okWidget) okWidget.value = keys.join(",");
const otWidget = this.widgets?.find(w => w.name === "output_types");
if (otWidget) otWidget.value = types.join(",");
// Build a map of current output names to slot indices
const oldSlots = {};
for (let i = 0; i < this.outputs.length; i++) {
oldSlots[this.outputs[i].name] = i;
}
// Build new outputs, reusing existing slots to preserve links
const newOutputs = [];
for (let k = 0; k < keys.length; k++) {
const key = keys[k];
const type = types[k] || "*";
if (key in oldSlots) {
// Reuse existing slot object (keeps links intact)
const slot = this.outputs[oldSlots[key]];
slot.type = type;
newOutputs.push(slot);
delete oldSlots[key];
} else {
// New key — create a fresh slot
newOutputs.push({ name: key, type: type, links: null });
}
}
// Disconnect links on slots that are being removed
for (const name in oldSlots) {
const idx = oldSlots[name];
if (this.outputs[idx]?.links?.length) {
for (const linkId of [...this.outputs[idx].links]) {
this.graph?.removeLink(linkId);
}
}
}
// Reassign the outputs array and fix link slot indices
this.outputs = newOutputs;
if (this.graph) {
for (let i = 0; i < this.outputs.length; i++) {
const links = this.outputs[i].links;
if (!links) continue;
for (const linkId of links) {
const link = this.graph.links[linkId];
if (link) link.origin_slot = i;
}
}
}
this.setSize(this.computeSize());
app.graph.setDirtyCanvas(true, true);
} catch (e) {
console.error("[JSONLoaderDynamic] Refresh failed:", e);
}
};
// Restore state on workflow load
const origOnConfigure = nodeType.prototype.onConfigure;
nodeType.prototype.onConfigure = function (info) {
origOnConfigure?.apply(this, arguments);
this._configured = true;
// Hide internal widgets
for (const name of ["output_keys", "output_types"]) {
const w = this.widgets?.find(w => w.name === name);
if (w) { w.type = "hidden"; w.computeSize = () => [0, -4]; }
}
const okWidget = this.widgets?.find(w => w.name === "output_keys");
const otWidget = this.widgets?.find(w => w.name === "output_types");
const keys = okWidget?.value
? okWidget.value.split(",").filter(k => k.trim())
: [];
const types = otWidget?.value
? otWidget.value.split(",")
: [];
if (keys.length > 0) {
// On load, LiteGraph already restored serialized outputs with links.
// Rename and set types to match stored state (preserves links).
for (let i = 0; i < this.outputs.length && i < keys.length; i++) {
this.outputs[i].name = keys[i].trim();
if (types[i]) this.outputs[i].type = types[i];
}
// Remove any extra outputs beyond the key count
while (this.outputs.length > keys.length) {
this.removeOutput(this.outputs.length - 1);
}
} else if (this.outputs.length > 0) {
// Widget values empty but serialized outputs exist — sync widgets
// from the outputs LiteGraph already restored (fallback).
if (okWidget) okWidget.value = this.outputs.map(o => o.name).join(",");
if (otWidget) otWidget.value = this.outputs.map(o => o.type).join(",");
}
this.setSize(this.computeSize());
};
},
});
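The refresh logic in the deleted file above rebuilds the node's output list while reusing existing slot objects by name, so live links survive a key reorder. The same reconcile-by-name idea, sketched in Python with plain dicts standing in for LiteGraph slot objects (names and fields are illustrative, not the LiteGraph API):

```python
def reconcile_outputs(outputs, keys, types):
    """Rebuild the output list for `keys`, reusing existing slot objects
    (object identity preserved, so anything holding a reference to a slot
    stays valid) and reporting which old slots were dropped."""
    old = {slot["name"]: slot for slot in outputs}
    rebuilt, dropped = [], []
    for key, typ in zip(keys, types):
        slot = old.pop(key, None)
        if slot is not None:
            slot["type"] = typ  # reuse: same object, updated type
        else:
            slot = {"name": key, "type": typ, "links": None}
        rebuilt.append(slot)
    dropped.extend(old.values())  # slots whose key disappeared from the file
    return rebuilt, dropped
```

In the real extension the `dropped` slots are the ones whose links get disconnected, and remaining slots then have their `origin_slot` indices rewritten to match their new positions.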


@@ -34,7 +34,7 @@ app.registerExtension({
            // Auto-refresh with 500ms debounce on widget changes
            this._refreshTimer = null;
-            const autoRefreshWidgets = ["project_name", "file_name", "sequence_number"];
+            const autoRefreshWidgets = ["project_name", "file_name", "sequence_number", "refresh"];
            for (const widgetName of autoRefreshWidgets) {
                const w = this.widgets?.find(w => w.name === widgetName);
                if (w) {
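The 500 ms debounce mentioned above collapses a burst of widget callbacks into a single refresh. A minimal timer-based sketch of the same idea in Python (the JS version uses `setTimeout`/`clearTimeout` on `this._refreshTimer`; this class is an illustrative analogue, not project code):

```python
import threading

class Debouncer:
    """Run `fn` once, `delay` seconds after the most recent trigger()."""

    def __init__(self, fn, delay):
        self.fn = fn
        self.delay = delay
        self._timer = None

    def trigger(self):
        if self._timer is not None:
            self._timer.cancel()  # restart the countdown on every call
        self._timer = threading.Timer(self.delay, self.fn)
        self._timer.start()
```

Rapid repeated triggers keep cancelling the pending timer, so only the last one fires.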
@@ -117,7 +117,7 @@ app.registerExtension({
                return;
            }
-            // Store keys and types in hidden widgets for persistence (JSON-encoded)
+            // Store keys and types in hidden widgets for persistence (JSON)
            const okWidget = this.widgets?.find(w => w.name === "output_keys");
            if (okWidget) okWidget.value = JSON.stringify(keys);
            const otWidget = this.widgets?.find(w => w.name === "output_types");
@@ -143,10 +143,11 @@ app.registerExtension({
                if (key in oldSlots) {
                    const slot = this.outputs[oldSlots[key]];
                    slot.type = type;
+                    slot.label = key;
                    newOutputs.push(slot);
                    delete oldSlots[key];
                } else {
-                    newOutputs.push({ name: key, type: type, links: null });
+                    newOutputs.push({ name: key, label: key, type: type, links: null });
                }
            }
@@ -197,36 +198,28 @@ app.registerExtension({
            const okWidget = this.widgets?.find(w => w.name === "output_keys");
            const otWidget = this.widgets?.find(w => w.name === "output_types");
-            // Parse keys/types — try JSON array first, fall back to comma-split
            let keys = [];
-            let types = [];
            if (okWidget?.value) {
                try { keys = JSON.parse(okWidget.value); } catch (_) {
-                    keys = okWidget.value.split(",").map(k => k.trim()).filter(Boolean);
+                    keys = okWidget.value.split(",").filter(k => k.trim());
                }
            }
+            let types = [];
            if (otWidget?.value) {
                try { types = JSON.parse(otWidget.value); } catch (_) {
-                    types = otWidget.value.split(",").map(t => t.trim()).filter(Boolean);
+                    types = otWidget.value.split(",");
                }
            }
            // Ensure slot 0 is total_sequences (INT)
            if (this.outputs.length === 0 || this.outputs[0].name !== "total_sequences") {
                this.outputs.unshift({ name: "total_sequences", type: "INT", links: null });
-                // LiteGraph restores links AFTER onConfigure, so graph.links is
-                // empty here. Defer link fixup to a microtask that runs after the
-                // synchronous graph.configure() finishes (including link restoration).
-                // We must also rebuild output.links arrays because LiteGraph will
-                // place link IDs on the wrong outputs (shifted by the unshift above).
                const node = this;
                queueMicrotask(() => {
                    if (!node.graph) return;
-                    // Clear all output.links — they were populated at old indices
                    for (const output of node.outputs) {
                        output.links = null;
                    }
-                    // Rebuild from graph.links with corrected origin_slot (+1)
                    for (const linkId in node.graph.links) {
                        const link = node.graph.links[linkId];
                        if (!link || link.origin_id !== node.id) continue;
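The hunk above parses the hidden widget values as a JSON array first and falls back to the legacy comma-separated form for workflows saved before the format change. The same tolerant parse, sketched in Python (the helper name is made up for illustration):

```python
import json

def parse_keys(raw):
    """Parse keys saved either as a JSON array ('["a","b"]') or as the
    older comma-separated form ('a, b'); blank entries are dropped."""
    if not raw:
        return []
    try:
        parsed = json.loads(raw)
        if isinstance(parsed, list):
            return [str(k) for k in parsed]
    except ValueError:  # JSONDecodeError is a ValueError subclass
        pass
    # Legacy format: comma-split, trim, skip empties
    return [k.strip() for k in raw.split(",") if k.strip()]
```

Trying the strict format first and only then the loose one means old and new serializations round-trip through the same reader.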
@@ -244,23 +237,19 @@ app.registerExtension({
            this.outputs[0].name = "total_sequences";
            if (keys.length > 0) {
-                // On load, LiteGraph already restored serialized outputs with links.
-                // Dynamic outputs start at slot 1. Rename and set types to match stored state.
                for (let i = 0; i < keys.length; i++) {
-                    const slotIdx = i + 1; // offset by 1 for total_sequences
+                    const slotIdx = i + 1;
                    if (slotIdx < this.outputs.length) {
-                        this.outputs[slotIdx].name = keys[i];
+                        this.outputs[slotIdx].name = keys[i].trim();
+                        this.outputs[slotIdx].label = keys[i].trim();
                        if (types[i]) this.outputs[slotIdx].type = types[i];
                    }
                }
-                // Remove any extra outputs beyond keys + total_sequences
                while (this.outputs.length > keys.length + 1) {
                    this.removeOutput(this.outputs.length - 1);
                }
            } else if (this.outputs.length > 1) {
                // Widget values empty but serialized dynamic outputs exist — sync widgets
-                // from the outputs LiteGraph already restored (fallback, skip slot 0).
                const dynamicOutputs = this.outputs.slice(1);
                if (okWidget) okWidget.value = JSON.stringify(dynamicOutputs.map(o => o.name));
                if (otWidget) otWidget.value = JSON.stringify(dynamicOutputs.map(o => o.type));