14 Commits

Author SHA1 Message Date
b042fe4368 Fix 25+ bugs across rounds 4-8 of comprehensive code review
history_tree.py:
- Cycle protection in generate_graph() parent walk
- KeyError → .get() for malformed node data in commit() and generate_graph()
- UUID collision check with for/else raise in commit() and _migrate_legacy()
- RuntimeError → ValueError for consistent exception handling
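The collision check above relies on Python's `for`/`else`: the `else` branch runs only when the loop finishes without a `break`, i.e. when every candidate ID collided. A minimal standalone sketch of the pattern (a hypothetical `nodes` dict stands in for `self.nodes`):

```python
import uuid

def new_node_id(nodes: dict, attempts: int = 10) -> str:
    """Return an 8-char uuid4 prefix not already present in `nodes`."""
    for _ in range(attempts):
        node_id = str(uuid.uuid4())[:8]
        if node_id not in nodes:
            break  # found a free ID; skips the else branch
    else:
        # Loop exhausted without a break -> every candidate collided.
        raise ValueError(f"Failed to generate unique node ID after {attempts} attempts")
    return node_id
```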

tab_timeline_ng.py:
- Re-parent children walks to surviving ancestor for batch deletes
- Branch tip deletion re-points to parent instead of removing branch
- Cycle protection in _walk_branch_nodes and _find_branch_for_node
- Full data.clear() restore instead of merge in _restore_node
- Safe .get('data', {}) in restore and preview
- Reset stale branch selection after node deletion
- json.dumps for safe JS string escaping in graphviz renderer
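The cycle protection mentioned above is a visited set on the parent-pointer walk: if corrupt data makes a node's ancestry loop back on itself, the walk stops instead of hanging. A condensed sketch in the shape of `_walk_branch_nodes` (hypothetical `nodes` dict of `{'id', 'parent'}` records):

```python
def walk_branch_nodes(nodes: dict, tip_id: str) -> list:
    """Walk parent pointers from tip, newest-first; stop if the chain cycles."""
    out, visited, current = [], set(), tip_id
    while current and current in nodes:
        if current in visited:  # parent chain loops back -> corrupt tree, bail out
            break
        visited.add(current)
        out.append(nodes[current])
        current = nodes[current].get('parent')
    return out
```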

tab_batch_ng.py:
- NaN/inf rejection in dict_number with math.isfinite()
- _safe_int used in recalc_vace, update_mode_label, frame_to_skip
- Uncaught ValueError from htree.commit() caught with user notification

tab_comfy_ng.py:
- asyncio.get_event_loop() → get_running_loop()

utils.py:
- Atomic writes for save_config and save_snippets
- save_config extra_data can't override explicit last_dir/favorites
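The atomic-write fix follows the standard write-temp-then-rename pattern: `os.replace` is atomic on the same filesystem, so a crash mid-write can never leave a truncated config on disk. A self-contained sketch (file names are illustrative; the real code targets `CONFIG_FILE` and `SNIPPETS_FILE`):

```python
import json
import os
import tempfile
from pathlib import Path

def save_config_atomic(path: Path, data: dict) -> None:
    """Write JSON to a sibling temp file, then rename it over the target."""
    tmp = path.with_suffix('.json.tmp')
    with open(tmp, 'w') as f:
        json.dump(data, f, indent=4)
    os.replace(tmp, path)  # atomic swap; readers see old or new file, never half of either

cfg_path = Path(tempfile.mkdtemp()) / 'config.json'
save_config_atomic(cfg_path, {'last_dir': '/tmp/projects', 'favorites': []})
```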

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 10:23:14 +01:00
04b9ed0e27 Fix 4 bugs: SQL conflict handling, HTML escaping, backup cap, safe int casts
- sync_to_db: use ON CONFLICT for duplicate sequence numbers
- history_tree: html.escape() for Graphviz DOT labels
- tab_timeline_ng: cap history_tree_backup to 10 entries
- tab_batch_ng: add _safe_int() helper for VACE settings
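The ON CONFLICT fix turns the sequence insert into an upsert: re-syncing a file with the same `(data_file_id, sequence_number)` key updates the row in place instead of raising an integrity error. A runnable sketch against an in-memory SQLite database (the schema is a guess from the column names in the diff; upsert syntax requires SQLite ≥ 3.24):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sequences (
    data_file_id INTEGER, sequence_number INTEGER, data TEXT, updated_at REAL,
    UNIQUE(data_file_id, sequence_number))""")

def upsert_sequence(conn, df_id, seq_num, payload, now):
    # A second write with the same (file, seq) key updates in place.
    conn.execute(
        "INSERT INTO sequences (data_file_id, sequence_number, data, updated_at) "
        "VALUES (?, ?, ?, ?) "
        "ON CONFLICT(data_file_id, sequence_number) "
        "DO UPDATE SET data=excluded.data, updated_at=excluded.updated_at",
        (df_id, seq_num, payload, now))

upsert_sequence(conn, 1, 3, '{"seed": 1}', 100.0)
upsert_sequence(conn, 1, 3, '{"seed": 2}', 200.0)  # no IntegrityError; row replaced
```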

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 21:03:13 +01:00
1b8d13f7c4 Fix 7 bugs: bounds checks, deepcopy, time import, JS keys, unused import
- Add bounds check on src_batch index in add_from_source and copy_source
- Guard delete callback against stale index after rapid clicks
- Replace __import__('time').time() with time.time() in sync_to_db
- Use deepcopy(DEFAULTS) consistently in utils.py and main.py
- Use JSON.stringify in JS onConfigure fallback path for key storage
- Read state.show_comfy_monitor for checkbox initial value
- Remove unused KEY_BATCH_DATA import from tab_projects_ng
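The deepcopy change matters because `dict.copy()` is shallow: nested dicts inside the template are shared, so editing one card mutates the defaults for every later card. A small demonstration (the nested structure here is hypothetical; the real `DEFAULTS` lives in utils.py):

```python
import copy

DEFAULTS = {"prompt": "", "loras": {"lora 1 high": 1.0}}  # hypothetical nested default

shallow = DEFAULTS.copy()              # top-level copy only; "loras" dict is shared
shallow["loras"]["lora 1 high"] = 0.5
corrupted = DEFAULTS["loras"]["lora 1 high"]   # the shared template mutated

DEFAULTS["loras"]["lora 1 high"] = 1.0  # reset the template
deep = copy.deepcopy(DEFAULTS)          # fully independent nested structures
deep["loras"]["lora 1 high"] = 0.5
intact = DEFAULTS["loras"]["lora 1 high"]      # template untouched this time
```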

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:41:00 +01:00
497e6b06fb Fix 7 bugs: async proxies, mode default, JS key serialization, validation
- Use asyncio.to_thread for proxy endpoints to avoid blocking event loop
- Add mode to DEFAULTS so it doesn't silently insert 0
- Use JSON serialization for keys in project_dynamic.js (with comma fallback)
- Validate path exists in change_path, friendly error on duplicate rename
- Remove unused exp param from rename closure
- Use deepcopy for DEFAULTS consistently
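`asyncio.to_thread` moves a synchronous urllib-style fetch onto a worker thread, so the aiohttp event loop keeps serving other requests while the proxy waits on the network. A minimal sketch (the blocking fetch is a stand-in; the real handlers call `_fetch_json`/`_fetch_keys`):

```python
import asyncio
import time

def fetch_json_blocking(url: str) -> dict:
    # Stand-in for a urllib-based fetch; sleep simulates network latency.
    time.sleep(0.05)
    return {"url": url, "ok": True}

async def list_projects_proxy(manager_url: str) -> dict:
    # Offload the blocking call so the event loop stays responsive.
    url = f"{manager_url.rstrip('/')}/api/projects"
    return await asyncio.to_thread(fetch_json_blocking, url)

result = asyncio.run(list_projects_proxy("http://localhost:8080/"))
```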

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:29:24 +01:00
993fc86070 Auto-detect change notes when saving without a manual note
Compares current batch data against the last snapshot to generate
descriptive notes like "Added seq 3; Changed: prompt, seed".
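The idea condenses to a dict diff keyed by sequence number: compare the previous and current snapshots, collect added/removed sequence numbers and changed field names, and join the parts. A simplified standalone version (the `'sequence_number'` key name is illustrative; the real code uses `KEY_SEQUENCE_NUMBER` and caps the field list at 4):

```python
def auto_change_note(prev_batch: list, curr_batch: list, key: str = 'sequence_number') -> str:
    """Describe added/removed sequences and changed fields between two snapshots."""
    prev = {int(s.get(key, 0)): s for s in prev_batch}
    curr = {int(s.get(key, 0)): s for s in curr_batch}
    added = sorted(set(curr) - set(prev))
    removed = sorted(set(prev) - set(curr))
    changed = set()
    for n in set(curr) & set(prev):
        # Union of keys catches fields that were added or dropped on one side.
        changed |= {k for k in set(prev[n]) | set(curr[n]) if prev[n].get(k) != curr[n].get(k)}
    parts = []
    if added:
        parts.append('Added seq ' + ', '.join(map(str, added)))
    if removed:
        parts.append('Removed seq ' + ', '.join(map(str, removed)))
    if changed:
        parts.append('Changed: ' + ', '.join(sorted(changed)))
    return '; '.join(parts) or 'No changes detected'
```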

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 20:24:07 +01:00
c9bcc735f4 Change refresh input from INT to combo dropdown (off/on)
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 14:29:30 +01:00
dc8f44f02b Add refresh input to ProjectLoaderDynamic to force re-fetch
Changing the refresh value triggers the node to re-fetch keys from
the API, picking up any new or modified fields in the data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 14:27:51 +01:00
2a6b4f5245 Add mode integer field above sequence number in batch cards
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 01:07:09 +01:00
60d1162700 Fix async callbacks: make rename/change_path directly async
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:42:00 +01:00
204fc4ea85 Add rename for sequences and projects, per-sub color cycling, project path editing
- Sequences: add rename button with name shown in expansion header
- Subsequences: cycle through 6 distinct border colors by sub-index
- Projects: add rename and change path buttons with DB methods

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:40:35 +01:00
033b3415c2 Merge branch 'feature/sqlite-project-db' 2026-03-01 00:32:07 +01:00
2ccc3821d6 Add visual distinction for subsequence cards with teal border
The subsegment-card CSS class was not being applied to subsequence
expansion items. Add the class conditionally and include the teal
accent CSS rules with a 6px left border.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-01 00:31:00 +01:00
615755ba44 Revert "Remove 9 redundant JSON loader nodes, keep only JSONLoaderDynamic"
This reverts commit 4b09491242.
2026-02-28 22:36:25 +01:00
4b09491242 Remove 9 redundant JSON loader nodes, keep only JSONLoaderDynamic
JSONLoaderDynamic auto-discovers keys at runtime, making the hardcoded
Standard, Batch, and Custom nodes unnecessary.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-28 22:33:51 +01:00
10 changed files with 323 additions and 78 deletions

db.py

@@ -98,6 +98,24 @@ class ProjectDB:
).fetchone()
return dict(row) if row else None
def rename_project(self, old_name: str, new_name: str) -> bool:
now = time.time()
cur = self.conn.execute(
"UPDATE projects SET name = ?, updated_at = ? WHERE name = ?",
(new_name, now, old_name),
)
self.conn.commit()
return cur.rowcount > 0
def update_project_path(self, name: str, folder_path: str) -> bool:
now = time.time()
cur = self.conn.execute(
"UPDATE projects SET folder_path = ?, updated_at = ? WHERE name = ?",
(folder_path, now, name),
)
self.conn.commit()
return cur.rowcount > 0
def delete_project(self, name: str) -> bool:
cur = self.conn.execute("DELETE FROM projects WHERE name = ?", (name,))
self.conn.commit()

history_tree.py

@@ -1,3 +1,4 @@
import html
import time
import uuid
from typing import Any
@@ -17,7 +18,10 @@ class HistoryTree:
def _migrate_legacy(self, old_list: list[dict[str, Any]]) -> None:
parent = None
for item in reversed(old_list):
for _ in range(10):
node_id = str(uuid.uuid4())[:8]
if node_id not in self.nodes:
break
self.nodes[node_id] = {
"id": node_id, "parent": parent, "timestamp": time.time(),
"data": item, "note": item.get("note", "Legacy Import")
@@ -27,7 +31,13 @@ class HistoryTree:
self.head_id = parent
def commit(self, data: dict[str, Any], note: str = "Snapshot") -> str:
# Generate unique node ID with collision check
for _ in range(10):
new_id = str(uuid.uuid4())[:8]
if new_id not in self.nodes:
break
else:
raise ValueError("Failed to generate unique node ID after 10 attempts")
# Cycle detection: walk parent chain from head to verify no cycle
if self.head_id:
@@ -38,7 +48,7 @@ class HistoryTree:
raise ValueError(f"Cycle detected in history tree at node {current}")
visited.add(current)
node = self.nodes.get(current)
- current = node["parent"] if node else None
+ current = node.get("parent") if node else None
active_branch = None
for b_name, tip_id in self.branches.items():
@@ -114,8 +124,12 @@ class HistoryTree:
# Build reverse lookup: node_id -> branch name (walk each branch ancestry)
node_to_branch: dict[str, str] = {}
for b_name, tip_id in self.branches.items():
visited = set()
current = tip_id
while current and current in self.nodes:
if current in visited:
break
visited.add(current)
if current not in node_to_branch:
node_to_branch[current] = b_name
current = self.nodes[current].get('parent')
@@ -154,13 +168,14 @@ class HistoryTree:
full_note = n.get('note', 'Step')
display_note = (full_note[:max_note_len] + '..') if len(full_note) > max_note_len else full_note
display_note = html.escape(display_note)
ts = time.strftime('%b %d %H:%M', time.localtime(n['timestamp']))
# Branch label for tip nodes
branch_label = ""
if nid in tip_to_branches:
- branch_label = ", ".join(tip_to_branches[nid])
+ branch_label = html.escape(", ".join(tip_to_branches[nid]))
# COLORS — per-branch tint, overridden for HEAD and tips
b_name = node_to_branch.get(nid)
@@ -190,11 +205,18 @@ class HistoryTree:
+ '</TABLE>>'
)
- safe_tooltip = full_note.replace('"', "'")
- dot.append(f' "{nid}" [label={label}, tooltip="{safe_tooltip}"];')
+ safe_tooltip = (full_note
+ .replace('\\', '\\\\')
+ .replace('"', '\\"')
+ .replace('\n', ' ')
+ .replace('\r', '')
+ .replace(']', '&#93;'))
+ safe_nid = nid.replace('"', '_')
+ dot.append(f' "{safe_nid}" [label={label}, tooltip="{safe_tooltip}"];')
- if n["parent"] and n["parent"] in self.nodes:
- dot.append(f' "{n["parent"]}" -> "{nid}";')
+ if n.get("parent") and n["parent"] in self.nodes:
+ safe_parent = n["parent"].replace('"', '_')
+ dot.append(f' "{safe_parent}" -> "{safe_nid}";')
dot.append("}")
return "\n".join(dot)

main.py

@@ -1,3 +1,4 @@
import copy
import json
import logging
from pathlib import Path
@@ -156,6 +157,20 @@ def index():
background: rgba(255,255,255,0.2);
}
/* Sub-sequence accent colors (per sub-index, cycling) */
.body--dark .subsegment-color-0 > .q-expansion-item__container > .q-item { border-left: 6px solid #06B6D4; padding-left: 10px; }
.body--dark .subsegment-color-0 .q-expansion-item__toggle-icon { color: #06B6D4 !important; }
.body--dark .subsegment-color-1 > .q-expansion-item__container > .q-item { border-left: 6px solid #A78BFA; padding-left: 10px; }
.body--dark .subsegment-color-1 .q-expansion-item__toggle-icon { color: #A78BFA !important; }
.body--dark .subsegment-color-2 > .q-expansion-item__container > .q-item { border-left: 6px solid #34D399; padding-left: 10px; }
.body--dark .subsegment-color-2 .q-expansion-item__toggle-icon { color: #34D399 !important; }
.body--dark .subsegment-color-3 > .q-expansion-item__container > .q-item { border-left: 6px solid #F472B6; padding-left: 10px; }
.body--dark .subsegment-color-3 .q-expansion-item__toggle-icon { color: #F472B6 !important; }
.body--dark .subsegment-color-4 > .q-expansion-item__container > .q-item { border-left: 6px solid #FBBF24; padding-left: 10px; }
.body--dark .subsegment-color-4 .q-expansion-item__toggle-icon { color: #FBBF24 !important; }
.body--dark .subsegment-color-5 > .q-expansion-item__container > .q-item { border-left: 6px solid #FB923C; padding-left: 10px; }
.body--dark .subsegment-color-5 .q-expansion-item__toggle-icon { color: #FB923C !important; }
/* Secondary pane teal accent */
.pane-secondary .q-field--outlined.q-field--focused .q-field__control:after {
border-color: #06B6D4 !important;
@@ -467,7 +482,7 @@ def render_sidebar(state: AppState, dual_pane: dict):
if not fn.endswith('.json'):
fn += '.json'
path = state.current_dir / fn
- first_item = DEFAULTS.copy()
+ first_item = copy.deepcopy(DEFAULTS)
first_item[KEY_SEQUENCE_NUMBER] = 1
save_json(path, {KEY_BATCH_DATA: [first_item]})
new_fn_input.set_value('')
@@ -500,7 +515,7 @@ def render_sidebar(state: AppState, dual_pane: dict):
state.show_comfy_monitor = e.value
state._render_main.refresh()
- ui.checkbox('Show Comfy Monitor', value=True, on_change=on_monitor_toggle)
+ ui.checkbox('Show Comfy Monitor', value=state.show_comfy_monitor, on_change=on_monitor_toggle)
# Register REST API routes for ComfyUI connectivity (uses the shared DB instance)

(ComfyUI custom node file)

@@ -1,3 +1,4 @@
import asyncio
import json
import logging
import urllib.parse
@@ -88,7 +89,7 @@ if PromptServer is not None:
async def list_projects_proxy(request):
manager_url = request.query.get("url", "http://localhost:8080")
url = f"{manager_url.rstrip('/')}/api/projects"
- data = _fetch_json(url)
+ data = await asyncio.to_thread(_fetch_json, url)
return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/list_project_files")
@@ -96,7 +97,7 @@ if PromptServer is not None:
manager_url = request.query.get("url", "http://localhost:8080")
project = urllib.parse.quote(request.query.get("project", ""), safe='')
url = f"{manager_url.rstrip('/')}/api/projects/{project}/files"
- data = _fetch_json(url)
+ data = await asyncio.to_thread(_fetch_json, url)
return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/list_project_sequences")
@@ -105,7 +106,7 @@ if PromptServer is not None:
project = urllib.parse.quote(request.query.get("project", ""), safe='')
file_name = urllib.parse.quote(request.query.get("file", ""), safe='')
url = f"{manager_url.rstrip('/')}/api/projects/{project}/files/{file_name}/sequences"
- data = _fetch_json(url)
+ data = await asyncio.to_thread(_fetch_json, url)
return web.json_response(data)
@PromptServer.instance.routes.get("/json_manager/get_project_keys")
@@ -117,7 +118,7 @@ if PromptServer is not None:
seq = int(request.query.get("seq", "1"))
except (ValueError, TypeError):
seq = 1
- data = _fetch_keys(manager_url, project, file_name, seq)
+ data = await asyncio.to_thread(_fetch_keys, manager_url, project, file_name, seq)
if data.get("error") in ("http_error", "network_error", "parse_error"):
status = data.get("status", 502)
return web.json_response(data, status=status)
@@ -138,6 +139,7 @@ class ProjectLoaderDynamic:
"project_name": ("STRING", {"default": "", "multiline": False}),
"file_name": ("STRING", {"default": "", "multiline": False}),
"sequence_number": ("INT", {"default": 1, "min": 1, "max": 9999}),
"refresh": (["off", "on"],),
},
"optional": {
"output_keys": ("STRING", {"default": ""}),
@@ -152,7 +154,7 @@ class ProjectLoaderDynamic:
OUTPUT_NODE = False
def load_dynamic(self, manager_url, project_name, file_name, sequence_number,
- output_keys="", output_types=""):
+ refresh="off", output_keys="", output_types=""):
# Fetch keys metadata (includes total_sequences count)
keys_meta = _fetch_keys(manager_url, project_name, file_name, sequence_number)
if keys_meta.get("error") in ("http_error", "network_error", "parse_error"):

tab_batch_ng.py

@@ -1,4 +1,6 @@
import copy
import json
import math
import random
from pathlib import Path
@@ -13,6 +15,7 @@ from history_tree import HistoryTree
IMAGE_EXTENSIONS = {'.png', '.jpg', '.jpeg', '.webp', '.bmp', '.gif'}
SUB_SEGMENT_MULTIPLIER = 1000
SUB_SEGMENT_NUM_COLORS = 6
FRAME_TO_SKIP_DEFAULT = DEFAULTS['frame_to_skip']
VACE_MODES = [
@@ -76,6 +79,48 @@ def find_insert_position(batch_list, parent_index, parent_seq_num):
return pos
# --- Auto change note ---
def _auto_change_note(htree, batch_list):
"""Compare current batch_list against last snapshot and describe changes."""
# Get previous batch data from the current head
if not htree.head_id or htree.head_id not in htree.nodes:
return f'Initial save ({len(batch_list)} sequences)'
prev_data = htree.nodes[htree.head_id].get('data', {})
prev_batch = prev_data.get(KEY_BATCH_DATA, [])
prev_by_seq = {int(s.get(KEY_SEQUENCE_NUMBER, 0)): s for s in prev_batch}
curr_by_seq = {int(s.get(KEY_SEQUENCE_NUMBER, 0)): s for s in batch_list}
added = sorted(set(curr_by_seq) - set(prev_by_seq))
removed = sorted(set(prev_by_seq) - set(curr_by_seq))
changed_keys = set()
for seq_num in sorted(set(curr_by_seq) & set(prev_by_seq)):
old, new = prev_by_seq[seq_num], curr_by_seq[seq_num]
all_keys = set(old) | set(new)
for k in all_keys:
if old.get(k) != new.get(k):
changed_keys.add(k)
parts = []
if added:
parts.append(f'Added seq {", ".join(str(s) for s in added)}')
if removed:
parts.append(f'Removed seq {", ".join(str(s) for s in removed)}')
if changed_keys:
# Show up to 4 changed field names
keys_list = sorted(changed_keys)
if len(keys_list) > 4:
keys_str = ', '.join(keys_list[:4]) + f' +{len(keys_list) - 4} more'
else:
keys_str = ', '.join(keys_list)
parts.append(f'Changed: {keys_str}')
return '; '.join(parts) if parts else 'No changes detected'
# --- Helper for repetitive dict-bound inputs ---
def dict_input(element_fn, label, seq, key, **kwargs):
@@ -99,6 +144,8 @@ def dict_number(label, seq, key, default=0, **kwargs):
try:
# Try float first to handle "1.5" strings, then check if it's a clean int
fval = float(val)
if not math.isfinite(fval):
fval = float(default)
val = int(fval) if fval == int(fval) else fval
except (ValueError, TypeError, OverflowError):
val = default
@@ -109,6 +156,9 @@ def dict_number(label, seq, key, default=0, **kwargs):
if v is None:
v = d
elif isinstance(v, float):
if not math.isfinite(v):
v = d
else:
try:
v = int(v) if v == int(v) else v
except (OverflowError, ValueError):
@@ -223,13 +273,13 @@ def render_batch_processor(state: AppState):
with ui.row().classes('q-mt-sm'):
def add_empty():
- _add_sequence(DEFAULTS.copy())
+ _add_sequence(copy.deepcopy(DEFAULTS))
def add_from_source():
item = copy.deepcopy(DEFAULTS)
src_batch = _src_cache['batch']
sel_idx = src_seq_select.value
- if src_batch and sel_idx is not None:
+ if src_batch and sel_idx is not None and int(sel_idx) < len(src_batch):
item.update(copy.deepcopy(src_batch[int(sel_idx)]))
elif _src_cache['data']:
item.update(copy.deepcopy(_src_cache['data']))
@@ -242,7 +292,7 @@ def render_batch_processor(state: AppState):
lora_keys = ['lora 1 high', 'lora 1 low', 'lora 2 high', 'lora 2 low',
'lora 3 high', 'lora 3 low']
standard_keys = {
- 'general_prompt', 'general_negative', 'current_prompt', 'negative', 'prompt',
+ 'name', 'mode', 'general_prompt', 'general_negative', 'current_prompt', 'negative', 'prompt',
'seed', 'cfg', 'camera', 'flf', KEY_SEQUENCE_NUMBER,
'frame_to_skip', 'end_frame', 'transition', 'vace_length',
'input_a_frames', 'input_b_frames', 'reference switch', 'vace schedule',
@@ -291,8 +341,12 @@ def render_batch_processor(state: AppState):
htree = HistoryTree(tree_data)
snapshot_payload = copy.deepcopy(data)
snapshot_payload.pop(KEY_HISTORY_TREE, None)
- note = commit_input.value if commit_input.value else 'Batch Update'
+ note = commit_input.value if commit_input.value else _auto_change_note(htree, batch_list)
try:
htree.commit(snapshot_payload, note=note)
except ValueError as e:
ui.notify(f'Save failed: {e}', type='negative')
return
data[KEY_HISTORY_TREE] = htree.to_dict()
save_json(file_path, data)
if state.db_enabled and state.current_project and state.db:
@@ -321,21 +375,40 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
refresh_list.refresh()
seq_num = seq.get(KEY_SEQUENCE_NUMBER, i + 1)
seq_name = seq.get('name', '')
if is_subsegment(seq_num):
label = f'Sub #{parent_of(seq_num)}.{sub_index_of(seq_num)} ({int(seq_num)})'
else:
label = f'Sequence #{seq_num}'
if seq_name:
label += f'{seq_name}'
- with ui.expansion(label, icon='movie').classes('w-full'):
+ if is_subsegment(seq_num):
+ color_idx = (sub_index_of(seq_num) - 1) % SUB_SEGMENT_NUM_COLORS
+ exp_classes = f'w-full subsegment-color-{color_idx}'
+ else:
+ exp_classes = 'w-full'
+ with ui.expansion(label, icon='movie').classes(exp_classes) as expansion:
# --- Action row ---
with ui.row().classes('w-full q-gutter-sm action-row'):
# Rename
async def rename(s=seq):
result = await ui.run_javascript(
f'prompt("Rename sequence:", {json.dumps(s.get("name", ""))})',
timeout=30.0,
)
if result is not None:
s['name'] = result
commit('Renamed!')
ui.button('Rename', icon='edit', on_click=rename).props('outline')
# Copy from source
def copy_source(idx=i, sn=seq_num):
item = copy.deepcopy(DEFAULTS)
src_batch = src_cache['batch']
sel_idx = src_seq_select.value
- if src_batch and sel_idx is not None:
+ if src_batch and sel_idx is not None and int(sel_idx) < len(src_batch):
item.update(copy.deepcopy(src_batch[int(sel_idx)]))
elif src_cache['data']:
item.update(copy.deepcopy(src_cache['data']))
@@ -390,6 +463,7 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
# Delete
def delete(idx=i):
if idx < len(batch_list):
batch_list.pop(idx)
commit()
@@ -410,6 +484,9 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
'w-full q-mt-sm').props('outlined rows=2')
with splitter.after:
# Mode
dict_number('Mode', seq, 'mode').props('outlined').classes('w-full')
# Sequence number
sn_label = (
f'Seq Number (Sub #{parent_of(seq_num)}.{sub_index_of(seq_num)})'
@@ -541,7 +618,13 @@ def _render_sequence_card(i, seq, batch_list, data, file_path, state,
def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_list):
# VACE Schedule (needed early for both columns)
- sched_val = max(0, min(int(seq.get('vace schedule', 1)), len(VACE_MODES) - 1))
+ def _safe_int(val, default=0):
+ try:
+ return int(float(val))
+ except (ValueError, TypeError, OverflowError):
+ return default
+ sched_val = max(0, min(_safe_int(seq.get('vace schedule', 1), 1), len(VACE_MODES) - 1))
# Mode reference dialog
with ui.dialog() as ref_dlg, ui.card():
@@ -562,18 +645,18 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
fts_input = dict_number('Frame to Skip', seq, 'frame_to_skip').classes(
'col').props('outlined')
- _original_fts = int(seq.get('frame_to_skip', FRAME_TO_SKIP_DEFAULT))
+ _original_fts = _safe_int(seq.get('frame_to_skip', FRAME_TO_SKIP_DEFAULT), FRAME_TO_SKIP_DEFAULT)
def shift_fts(idx=i, orig=_original_fts):
- new_fts = int(fts_input.value) if fts_input.value is not None else orig
+ new_fts = _safe_int(fts_input.value, orig)
delta = new_fts - orig
if delta == 0:
ui.notify('No change to shift', type='info')
return
shifted = 0
for j in range(idx + 1, len(batch_list)):
- batch_list[j]['frame_to_skip'] = int(
- batch_list[j].get('frame_to_skip', FRAME_TO_SKIP_DEFAULT)) + delta
+ batch_list[j]['frame_to_skip'] = _safe_int(
+ batch_list[j].get('frame_to_skip', FRAME_TO_SKIP_DEFAULT), FRAME_TO_SKIP_DEFAULT) + delta
shifted += 1
data[KEY_BATCH_DATA] = batch_list
save_json(file_path, data)
@@ -597,7 +680,7 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
ui.button(icon='help', on_click=ref_dlg.open).props('flat dense round')
def update_mode_label(e):
- idx = int(e.sender.value) if e.sender.value is not None else 0
+ idx = _safe_int(e.sender.value, 0)
idx = max(0, min(idx, len(VACE_MODES) - 1))
mode_label.set_text(VACE_MODES[idx])
@@ -611,10 +694,10 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
'outlined').classes('w-full q-mt-sm')
# VACE Length + output calculation
- input_a = int(seq.get('input_a_frames', 16))
- input_b = int(seq.get('input_b_frames', 16))
- stored_total = int(seq.get('vace_length', 49))
- mode_idx = int(seq.get('vace schedule', 1))
+ input_a = _safe_int(seq.get('input_a_frames', 16), 16)
+ input_b = _safe_int(seq.get('input_b_frames', 16), 16)
+ stored_total = _safe_int(seq.get('vace_length', 49), 49)
+ mode_idx = _safe_int(seq.get('vace schedule', 1), 1)
if mode_idx == 0:
base_length = max(stored_total - input_a, 1)
@@ -633,10 +716,10 @@ def _render_vace_settings(i, seq, batch_list, data, file_path, state, refresh_li
# Recalculate VACE output when any input changes
def recalc_vace(*_args):
- mi = int(vs_input.value) if vs_input.value is not None else 0
- ia = int(ia_input.value) if ia_input.value is not None else 16
- ib = int(ib_input.value) if ib_input.value is not None else 16
- nb = int(vl_input.value) if vl_input.value is not None else 1
+ mi = _safe_int(vs_input.value, 0)
+ ia = _safe_int(ia_input.value, 16)
+ ib = _safe_int(ib_input.value, 16)
+ nb = _safe_int(vl_input.value, 1)
if mi == 0:
raw = nb + ia
@@ -721,7 +804,11 @@ def _render_mass_update(batch_list, data, file_path, state: AppState, refresh_li
htree = HistoryTree(data.get(KEY_HISTORY_TREE, {}))
snapshot = copy.deepcopy(data)
snapshot.pop(KEY_HISTORY_TREE, None)
try:
htree.commit(snapshot, f"Mass update: {', '.join(selected_keys)}")
except ValueError as e:
ui.notify(f'Mass update failed: {e}', type='negative')
return
data[KEY_HISTORY_TREE] = htree.to_dict()
save_json(file_path, data)
if state.db_enabled and state.current_project and state.db:

tab_comfy_ng.py

@@ -139,7 +139,7 @@ def _render_single_instance(state: AppState, instance_config: dict, index: int,
async def refresh_status():
status_container.clear()
- loop = asyncio.get_event_loop()
+ loop = asyncio.get_running_loop()
res, err = await loop.run_in_executor(
None, lambda: _fetch_blocking(f'{comfy_url}/queue'))
with status_container:
@@ -237,7 +237,7 @@ def _render_single_instance(state: AppState, instance_config: dict, index: int,
async def check_image():
img_container.clear()
- loop = asyncio.get_event_loop()
+ loop = asyncio.get_running_loop()
res, err = await loop.run_in_executor(
None, lambda: _fetch_blocking(f'{comfy_url}/history', timeout=2))
with img_container:

tab_projects_ng.py

@@ -1,11 +1,13 @@
import json
import logging
import sqlite3
from pathlib import Path
from nicegui import ui
from state import AppState
from db import ProjectDB
- from utils import save_config, sync_to_db, KEY_BATCH_DATA
+ from utils import save_config, sync_to_db
logger = logging.getLogger(__name__)
@@ -109,6 +111,49 @@ def render_projects_tab(state: AppState):
ui.button('Deactivate', icon='cancel',
on_click=deactivate).props('flat dense')
async def rename_proj(name=proj['name']):
new_name = await ui.run_javascript(
f'prompt("Rename project:", {json.dumps(name)})',
timeout=30.0,
)
if new_name and new_name.strip() and new_name.strip() != name:
new_name = new_name.strip()
try:
state.db.rename_project(name, new_name)
if state.current_project == name:
state.current_project = new_name
state.config['current_project'] = new_name
save_config(state.current_dir,
state.config.get('favorites', []),
state.config)
ui.notify(f'Renamed to "{new_name}"', type='positive')
render_project_list.refresh()
except sqlite3.IntegrityError:
ui.notify(f'A project named "{new_name}" already exists',
type='warning')
except Exception as e:
ui.notify(f'Error: {e}', type='negative')
ui.button('Rename', icon='edit',
on_click=rename_proj).props('flat dense')
async def change_path(name=proj['name'], path=proj['folder_path']):
new_path = await ui.run_javascript(
f'prompt("New path for project:", {json.dumps(path)})',
timeout=30.0,
)
if new_path and new_path.strip() and new_path.strip() != path:
new_path = new_path.strip()
if not Path(new_path).is_dir():
ui.notify(f'Warning: "{new_path}" does not exist',
type='warning')
state.db.update_project_path(name, new_path)
ui.notify(f'Path updated to "{new_path}"', type='positive')
render_project_list.refresh()
ui.button('Path', icon='folder',
on_click=change_path).props('flat dense')
def import_folder(pid=proj['id'], pname=proj['name']):
_import_folder(state, pid, pname, render_project_list)

tab_timeline_ng.py

@@ -1,4 +1,5 @@
import copy
import json
import time
from nicegui import ui
@@ -9,14 +10,36 @@ from utils import save_json, sync_to_db, KEY_BATCH_DATA, KEY_HISTORY_TREE
def _delete_nodes(htree, data, file_path, node_ids):
- """Delete nodes with backup, branch cleanup, and head fallback."""
+ """Delete nodes with backup, branch cleanup, re-parenting, and head fallback."""
if 'history_tree_backup' not in data:
data['history_tree_backup'] = []
data['history_tree_backup'].append(copy.deepcopy(htree.to_dict()))
data['history_tree_backup'] = data['history_tree_backup'][-10:]
# Save deleted node parents before removal (needed for branch re-pointing)
deleted_parents = {}
for nid in node_ids:
deleted_node = htree.nodes.get(nid)
if deleted_node:
deleted_parents[nid] = deleted_node.get('parent')
# Re-parent children of deleted nodes — walk up to find a surviving ancestor
for nid in node_ids:
surviving_parent = deleted_parents.get(nid)
while surviving_parent in node_ids:
surviving_parent = deleted_parents.get(surviving_parent)
for child in htree.nodes.values():
if child.get('parent') == nid:
child['parent'] = surviving_parent
for nid in node_ids:
htree.nodes.pop(nid, None)
# Re-point branches whose tip was deleted to a surviving ancestor
for b, tip in list(htree.branches.items()):
if tip in node_ids:
new_tip = deleted_parents.get(tip)
while new_tip in node_ids:
new_tip = deleted_parents.get(new_tip)
if new_tip and new_tip in htree.nodes:
htree.branches[b] = new_tip
else:
del htree.branches[b]
if htree.head_id in node_ids:
if htree.nodes:
@@ -152,8 +175,12 @@ def _render_batch_delete(htree, data, file_path, state, refresh_fn):
def _walk_branch_nodes(htree, tip_id):
"""Walk parent pointers from tip, returning nodes newest-first."""
nodes = []
visited = set()
current = tip_id
while current and current in htree.nodes:
if current in visited:
break
visited.add(current)
nodes.append(htree.nodes[current])
current = htree.nodes[current].get('parent')
return nodes
@@ -172,10 +199,14 @@ def _find_active_branch(htree):
def _find_branch_for_node(htree, node_id):
"""Return the branch name whose ancestry contains node_id, or None."""
for b_name, tip_id in htree.branches.items():
visited = set()
current = tip_id
while current and current in htree.nodes:
if current in visited:
break
if current == node_id:
return b_name
visited.add(current)
current = htree.nodes[current].get('parent')
return None
@@ -310,6 +341,10 @@ def _render_node_manager(all_nodes, htree, data, file_path, restore_fn, refresh_
_delete_nodes(htree, data, file_path, {sel_id})
if state and state.db_enabled and state.current_project and state.db:
sync_to_db(state.db, state.current_project, file_path, data)
# Reset selection if branch was removed
if selected['branch'] not in htree.branches:
selected['branch'] = next(iter(htree.branches), None)
selected['node_id'] = htree.head_id
ui.notify('Node Deleted', type='positive')
refresh_fn()
@@ -433,7 +468,7 @@ def _render_graphviz(dot_source: str, selected_node_id: str | None = None):
src = graphviz.Source(dot_source)
svg = src.pipe(format='svg').decode('utf-8')
- sel_escaped = selected_node_id.replace("'", "\\'") if selected_node_id else ''
+ sel_escaped = json.dumps(selected_node_id or '')[1:-1]  # strip quotes, get JS-safe content
# CSS inline (allowed), JS via run_javascript (script tags blocked)
css = '''<style>
@@ -490,11 +525,18 @@ def _render_graphviz(dot_source: str, selected_node_id: str | None = None):
def _restore_node(data, node, htree, file_path, state: AppState):
- """Restore a history node as the current version."""
- node_data = copy.deepcopy(node['data'])
- if KEY_BATCH_DATA not in node_data and KEY_BATCH_DATA in data:
- del data[KEY_BATCH_DATA]
+ """Restore a history node as the current version (full replace, not merge)."""
+ node_data = copy.deepcopy(node.get('data', {}))
# Preserve the history tree before clearing
preserved_tree = data.get(KEY_HISTORY_TREE)
preserved_backup = data.get('history_tree_backup')
data.clear()
data.update(node_data)
# Re-attach history tree (not part of snapshot data)
if preserved_tree is not None:
data[KEY_HISTORY_TREE] = preserved_tree
if preserved_backup is not None:
data['history_tree_backup'] = preserved_backup
htree.head_id = node['id']
data[KEY_HISTORY_TREE] = htree.to_dict()
save_json(file_path, data)
@@ -511,7 +553,7 @@ def _render_data_preview(nid, htree):
ui.label('No node selected.').classes('text-caption')
return
- node_data = htree.nodes[nid]['data']
+ node_data = htree.nodes[nid].get('data', {})
batch_list = node_data.get(KEY_BATCH_DATA, [])
if batch_list and isinstance(batch_list, list) and len(batch_list) > 0:

utils.py

@@ -1,3 +1,4 @@
import copy
import json
import logging
import os
@@ -30,6 +31,7 @@ DEFAULTS = {
"cfg": 1.5,
# --- Settings ---
"mode": 0,
"camera": "static",
"flf": 0.0,
@@ -112,14 +114,17 @@ def save_config(current_dir, favorites, extra_data=None):
existing = load_config()
data.update(existing)
- data["last_dir"] = str(current_dir)
- data["favorites"] = favorites
if extra_data:
data.update(extra_data)
- with open(CONFIG_FILE, 'w') as f:
+ # Force-set explicit params last so extra_data can't override them
+ data["last_dir"] = str(current_dir)
+ data["favorites"] = favorites
+ tmp = CONFIG_FILE.with_suffix('.json.tmp')
+ with open(tmp, 'w') as f:
json.dump(data, f, indent=4)
+ os.replace(tmp, CONFIG_FILE)
def load_snippets():
if SNIPPETS_FILE.exists():
@@ -131,8 +136,10 @@ def load_snippets():
return {}
def save_snippets(snippets):
- with open(SNIPPETS_FILE, 'w') as f:
+ tmp = SNIPPETS_FILE.with_suffix('.json.tmp')
+ with open(tmp, 'w') as f:
json.dump(snippets, f, indent=4)
+ os.replace(tmp, SNIPPETS_FILE)
def load_json(path: str | Path) -> tuple[dict[str, Any], float]:
path = Path(path)
@@ -181,7 +188,7 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
top_level = {k: v for k, v in data.items()
if k not in (KEY_BATCH_DATA, KEY_HISTORY_TREE)}
if not df:
- now = __import__('time').time()
+ now = time.time()
cur = db.conn.execute(
"INSERT INTO data_files (project_id, name, data_type, top_level, created_at, updated_at) "
"VALUES (?, ?, ?, ?, ?, ?)",
@@ -191,7 +198,7 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
else:
df_id = df["id"]
# Update top_level metadata
now = __import__('time').time()
now = time.time()
db.conn.execute(
"UPDATE data_files SET top_level = ?, updated_at = ? WHERE id = ?",
(json.dumps(top_level), now, df_id),
@@ -205,17 +212,18 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
if not isinstance(item, dict):
continue
seq_num = int(item.get(KEY_SEQUENCE_NUMBER, 0))
now = __import__('time').time()
now = time.time()
db.conn.execute(
"INSERT INTO sequences (data_file_id, sequence_number, data, updated_at) "
"VALUES (?, ?, ?, ?)",
"VALUES (?, ?, ?, ?) "
"ON CONFLICT(data_file_id, sequence_number) DO UPDATE SET data=excluded.data, updated_at=excluded.updated_at",
(df_id, seq_num, json.dumps(item), now),
)
# Sync history tree
history_tree = data.get(KEY_HISTORY_TREE)
if history_tree and isinstance(history_tree, dict):
now = __import__('time').time()
now = time.time()
db.conn.execute(
"INSERT INTO history_trees (data_file_id, tree_data, updated_at) "
"VALUES (?, ?, ?) "
@@ -236,10 +244,10 @@ def sync_to_db(db, project_name: str, file_path: Path, data: dict) -> None:
def generate_templates(current_dir: Path) -> None:
"""Creates batch template files if folder is empty."""
first = DEFAULTS.copy()
first = copy.deepcopy(DEFAULTS)
first[KEY_SEQUENCE_NUMBER] = 1
save_json(current_dir / "batch_prompt_i2v.json", {KEY_BATCH_DATA: [first]})
first2 = DEFAULTS.copy()
first2 = copy.deepcopy(DEFAULTS)
first2[KEY_SEQUENCE_NUMBER] = 1
save_json(current_dir / "batch_prompt_vace_extend.json", {KEY_BATCH_DATA: [first2]})
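The template fix above replaces `dict.copy()` with `copy.deepcopy()`. A shallow copy duplicates only the top-level dict and shares every nested container, so if `DEFAULTS` holds any nested list or dict, mutating one template's nested value silently mutates the defaults and every other template. A minimal demonstration (the nested `loras` key is illustrative, not from the real `DEFAULTS`):

```python
import copy

DEFAULTS = {"cfg": 1.5, "loras": {"names": []}}  # nested container inside defaults

shallow = DEFAULTS.copy()
shallow["loras"]["names"].append("style.safetensors")
# The shared nested list was mutated through the shallow copy:
assert DEFAULTS["loras"]["names"] == ["style.safetensors"]

DEFAULTS["loras"]["names"].clear()
deep = copy.deepcopy(DEFAULTS)
deep["loras"]["names"].append("style.safetensors")
# A deep copy keeps the defaults independent of each template:
assert DEFAULTS["loras"]["names"] == []
```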

View File

@@ -34,7 +34,7 @@ app.registerExtension({
// Auto-refresh with 500ms debounce on widget changes
this._refreshTimer = null;
const autoRefreshWidgets = ["project_name", "file_name", "sequence_number"];
const autoRefreshWidgets = ["project_name", "file_name", "sequence_number", "refresh"];
for (const widgetName of autoRefreshWidgets) {
const w = this.widgets?.find(w => w.name === widgetName);
if (w) {
@@ -117,11 +117,11 @@ app.registerExtension({
return;
}
// Store keys and types in hidden widgets for persistence (comma-separated)
// Store keys and types in hidden widgets for persistence (JSON)
const okWidget = this.widgets?.find(w => w.name === "output_keys");
if (okWidget) okWidget.value = keys.join(",");
if (okWidget) okWidget.value = JSON.stringify(keys);
const otWidget = this.widgets?.find(w => w.name === "output_types");
if (otWidget) otWidget.value = types.join(",");
if (otWidget) otWidget.value = JSON.stringify(types);
// Slot 0 is always total_sequences (INT) — ensure it exists
if (this.outputs.length === 0 || this.outputs[0].name !== "total_sequences") {
@@ -198,12 +198,18 @@ app.registerExtension({
const okWidget = this.widgets?.find(w => w.name === "output_keys");
const otWidget = this.widgets?.find(w => w.name === "output_types");
const keys = okWidget?.value
? okWidget.value.split(",").filter(k => k.trim())
: [];
const types = otWidget?.value
? otWidget.value.split(",")
: [];
let keys = [];
let types = [];
if (okWidget?.value) {
try { keys = JSON.parse(okWidget.value); } catch (_) {
keys = okWidget.value.split(",").filter(k => k.trim());
}
}
if (otWidget?.value) {
try { types = JSON.parse(otWidget.value); } catch (_) {
types = otWidget.value.split(",");
}
}
// Ensure slot 0 is total_sequences (INT)
if (this.outputs.length === 0 || this.outputs[0].name !== "total_sequences") {
@@ -245,8 +251,8 @@ app.registerExtension({
} else if (this.outputs.length > 1) {
// Widget values empty but serialized dynamic outputs exist — sync widgets
const dynamicOutputs = this.outputs.slice(1);
if (okWidget) okWidget.value = dynamicOutputs.map(o => o.name).join(",");
if (otWidget) otWidget.value = dynamicOutputs.map(o => o.type).join(",");
if (okWidget) okWidget.value = JSON.stringify(dynamicOutputs.map(o => o.name));
if (otWidget) otWidget.value = JSON.stringify(dynamicOutputs.map(o => o.type));
}
this.setSize(this.computeSize());
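The widget hunks above switch persistence from comma-joined strings to JSON, with a `JSON.parse` attempt that falls back to splitting so workflows saved under the old format still load. The underlying problem is that a comma-separated encoding cannot round-trip a key that itself contains a comma; the failure mode and the fallback, sketched in Python (key values are illustrative):

```python
import json

keys = ["prompt", "cfg, steps"]  # a key containing a comma

# Comma-join loses the boundary between keys on the way back:
joined = ",".join(keys)
assert joined.split(",") == ["prompt", "cfg", " steps"]  # 3 keys out of 2

# JSON round-trips exactly:
encoded = json.dumps(keys)
assert json.loads(encoded) == keys

def decode_keys(value: str) -> list:
    """Mirror of the JS fallback: try JSON first, else split legacy values."""
    try:
        return json.loads(value)
    except ValueError:  # json.JSONDecodeError subclasses ValueError
        return [k for k in value.split(",") if k.strip()]
```

Legacy comma values fail to parse as JSON and hit the split branch, so both encodings remain readable after the migration.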