comfyanonymous
ad81fd682a
Fix issue with cancelling prompt.
1 year ago
space-nuko
03f2d0a764
Rename exception message field
1 year ago
space-nuko
52c9590b7b
Exception message
1 year ago
space-nuko
62bdd9d26a
Catch typecast errors
1 year ago
space-nuko
a9e7e23724
Fix
1 year ago
space-nuko
e2d080b694
Return null for value format
1 year ago
space-nuko
6b2a8a3845
Show message in the frontend if prompt execution raises an exception
1 year ago
space-nuko
ffec815257
Send back more information about exceptions that happen during execution
1 year ago
space-nuko
0d834e3a2b
Add missing input name/config
1 year ago
space-nuko
c33b7c5549
Improve invalid prompt error message
1 year ago
space-nuko
73e85fb3f4
Improve error output for failed nodes
1 year ago
comfyanonymous
48fcc5b777
Fix parsing error crash.
2 years ago
comfyanonymous
ffc56c53c9
Add node_errors to the /prompt error JSON response.
...
"node_errors" contains a dict keyed by node ids. The contents are a message
and a list of dependent outputs.
2 years ago
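For illustration, the error payload described in the commit above might look roughly like the sketch below; aside from "node_errors", the per-node message, and the dependent-output list named in the commit, the keys and values are assumptions.

    # Hypothetical shape of the /prompt error JSON response (sketch only)
    {
        "error": "Prompt has invalid inputs",           # assumed top-level message
        "node_errors": {
            "5": {                                       # keyed by node id
                "message": "Value bigger than allowed maximum",
                "dependent_outputs": ["9"],              # outputs that cannot run
            },
        },
    }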
comfyanonymous
516119ad83
Print min and max values in validation error message.
2 years ago
comfyanonymous
1dd846a7ba
Fix outputs gone from history.
2 years ago
comfyanonymous
9bf67c4c5a
Print prompt execution time.
2 years ago
comfyanonymous
44f9f9baf1
Add the prompt id to some websocket messages.
2 years ago
BlenderNeko
1201d2eae5
Make nodes map over input lists ( #579 )
...
* allow nodes to map over lists
* make it work with IS_CHANGED and VALIDATE_INPUTS
* give list outputs distinct socket shape
* add rebatch node
* add batch index logic
* add repeat latent batch
* deal with noise mask edge cases in latentfrombatch
2 years ago
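As a rough sketch of the list-mapping behaviour introduced in #579: a node can declare how it wants list inputs handled via class attributes. The attribute names below (INPUT_IS_LIST, and the related OUTPUT_IS_LIST) follow the convention this PR added, but the node itself is a made-up example.

    class CountImages:
        # When INPUT_IS_LIST is set, every input arrives as a whole Python list
        # instead of the executor calling the node once per list element.
        INPUT_IS_LIST = True
        RETURN_TYPES = ("INT",)
        FUNCTION = "count"
        CATEGORY = "example"

        @classmethod
        def INPUT_TYPES(cls):
            return {"required": {"images": ("IMAGE",)}}

        def count(self, images):
            # `images` is a list of IMAGE tensors, even if only one was connected
            return (len(images),)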
comfyanonymous
dfc74c19d9
Add the prompt_id to some websocket messages.
2 years ago
comfyanonymous
3a7c3acc72
Send websocket message with list of cached nodes right before execution.
2 years ago
comfyanonymous
602095f614
Send execution_error message on websocket on execution exception.
2 years ago
comfyanonymous
d6dee8af1d
Only validate each input once.
2 years ago
comfyanonymous
02ca1c67f8
Don't print traceback when processing is interrupted.
2 years ago
comfyanonymous
3a1f9dba20
If IS_CHANGED returns an exception, delete the output instead of crashing.
2 years ago
comfyanonymous
951c0c2bbe
Don't keep cached outputs for removed nodes.
2 years ago
comfyanonymous
0ac319fd81
Don't delete all outputs when execution gets interrupted.
2 years ago
comfyanonymous
ccad603b2e
Add a way for nodes to validate their own inputs.
2 years ago
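A short illustrative sketch of the per-node validation hook this commit adds; the VALIDATE_INPUTS name matches the commit, while the node and its check are assumed examples.

    import os

    class LoadTextFile:
        RETURN_TYPES = ("STRING",)
        FUNCTION = "load"
        CATEGORY = "example"

        @classmethod
        def INPUT_TYPES(cls):
            return {"required": {"path": ("STRING", {"default": ""})}}

        @classmethod
        def VALIDATE_INPUTS(cls, path):
            # return True when the input is acceptable, otherwise an error string
            if not os.path.isfile(path):
                return "File does not exist: {}".format(path)
            return True

        def load(self, path):
            with open(path, "r") as f:
                return (f.read(),)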
ltdrdata
f7a8218814
Add clipspace feature. ( #541 )
...
* Add clipspace feature.
* feat: copy content to clipspace
* feat: paste content from clipspace
Extend validation to allow for validating annotated_path in addition to other parameters.
Add support for annotated_filepath in folder_paths function.
Generalize the '/upload/image' API to allow for uploading images to the 'input', 'temp', or 'output' directories.
* rename contentClipboard -> clipspace
* Do deep copy for imgs on copy to clipspace.
* add original_imgs into clipspace
* Preserve the original image when 'imgs' are modified
* Robust patch and refactoring of folder_paths handling of annotated_filepath
* Only show the Paste menu if the ComfyApp.clipspace is not empty
* instant refresh on paste
force triggering 'changed' on paste action
* subfolder fix on paste logic
attach subfolder if subfolder isn't empty
---------
Co-authored-by: Lt.Dr.Data <lt.dr.data@gmail.com>
2 years ago
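To illustrate the generalized upload endpoint mentioned in the commit above, a request might look like the sketch below; the form field names ("image", "type", "subfolder") are assumptions based on how ComfyUI's /upload/image route later behaves, not verified against this exact commit.

    import requests

    # Hypothetical sketch: upload a file into ComfyUI's "input" directory
    with open("example.png", "rb") as f:
        resp = requests.post(
            "http://127.0.0.1:8188/upload/image",
            files={"image": f},
            data={"type": "input", "subfolder": ""},  # "input", "temp" or "output"
        )
    print(resp.status_code, resp.text)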
comfyanonymous
deb2b93e79
Move the code that empties the GPU cache to model_management.py
2 years ago
藍+85CD
d63705d919
Support releasing all unoccupied cached memory from XPU
2 years ago
pythongosssss
6f72c4c6ff
Allows nodes to return ui data and output data
...
Fire executed event on node when message received
2 years ago
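Illustrative sketch of a node returning both UI data and output data, which this change makes possible; the "ui"/"result" dictionary keys reflect ComfyUI's convention, and the node itself is an assumed example.

    class PreviewAndPass:
        RETURN_TYPES = ("IMAGE",)
        FUNCTION = "run"
        CATEGORY = "example"

        @classmethod
        def INPUT_TYPES(cls):
            return {"required": {"images": ("IMAGE",)}}

        def run(self, images):
            # "ui" is sent to the frontend with the node's executed event,
            # "result" becomes the outputs passed to downstream nodes
            return {"ui": {"images": []}, "result": (images,)}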
Davemane42
1e0f2b232b
Add unique_id to nodes' hidden inputs
...
@classmethod
def INPUT_TYPES(cls):
    return {
        # the executing node's id is passed to the node's function as "unique_id"
        "hidden": {"unique_id": "UNIQUE_ID"},
    }
2 years ago
comfyanonymous
bb1223d83f
Fix errors appearing more than once.
2 years ago
comfyanonymous
3444ffff3b
Fix IS_CHANGED not working on nodes with an input from another node.
2 years ago
comfyanonymous
f67c00622f
Use inference_mode instead of no_grad.
2 years ago
pythongosssss
5c55c93367
Updated to reuse session id if available
2 years ago
comfyanonymous
c8ce599a8f
Add a button to interrupt processing to the ui.
2 years ago
comfyanonymous
69cc75fbf8
Add a way to interrupt current processing in the backend.
2 years ago
comfyanonymous
5f0f97634f
Only clear cuda cache on CUDA since it causes slowdowns on ROCm.
2 years ago
comfyanonymous
cd85f876f2
Try to clear more memory at the end of each prompt execution.
2 years ago
comfyanonymous
49d2e5bb5a
Move some stuff from main.py to execution.py
2 years ago
comfyanonymous
c0fb0c848f
Update colab notebook.
2 years ago
comfyanonymous
6de6246dd4
Fix some potential issues related to threads.
2 years ago
pythongosssss
9f391ab656
Changed to store history by unique id
...
fixed removing history items
2 years ago
pythongosssss
5c5725dac0
Remove extra args
2 years ago
pythongosssss
9bd7bfa648
Added workflow history
...
Moved socket output updates to all node executions
Made image rendering on nodes more generic
2 years ago
pythongosssss
a52aa9f4b5
Moved api out to server
...
Reworked sockets to use socketio
Added progress to nodes
Added highlight to active node
Added preview to saveimage node
2 years ago
comfyanonymous
5f375f0d16
Remove my "deleted" debug print that confused people.
2 years ago
masterpiecebestquality
3a83da7281
empty cache after execute()
2 years ago
comfyanonymous
a38a30cb87
Document --highvram and enable it in colab.
2 years ago