
Workflows with Regional Conditioning Not Working and Increased Execution Load After Update #6295

Closed
LynxenAI opened this issue Dec 31, 2024 · 2 comments · Fixed by #6296
Labels: User Support (A user needs help with something, probably not a bug.)

Comments

@LynxenAI

LynxenAI commented Dec 31, 2024

Your question

After the recent update (December 30th), workflows that use regional conditioning with masks and Hook LoRA no longer work as expected. Specifically, a crash occurs when the Cond Set Default Combine node is used together with regional conditioning.

The issue seems to be tied to recent changes in how sigmas are managed, since workflows without regional conditioning (or masks) still work correctly.

Steps to Reproduce:

  1. Use the hook-test.json workflow (attached below). This workflow works correctly after the update.
  2. Modify the workflow to use Flux with masks and add the Cond Set Default Combine node. (Image example: Captura de tela 2024-12-31 141134)
  3. Run the workflow.

Expected Behavior

The workflow should execute successfully, with the Cond Set Default Combine node handling unmasked areas without issues.

Actual Behavior

  1. The workflow crashes when using Flux with masks and the Cond Set Default Combine node.
  2. An error related to model options is displayed.
  3. Additionally, the workflow takes significantly longer to process compared to previous versions.

@Kosinkadink

#6273

Logs

got prompt
loaded completely 9.5367431640625e+25 11350.067443847656 True
  0%|                                                                                           | 0/20 [00:00<?, ?it/s]
!!! Exception during processing !!! finalize_default_conds() missing 1 required positional argument: 'model_options'
Traceback (most recent call last):
  File "D:\Images_Generation\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\Images_Generation\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 906, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 875, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 859, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 108, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\utils.py", line 34, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 714, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\venv\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 839, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 842, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 183, in sampling_function
    out = orig_fn(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Images_Generation\ComfyUI\comfy\samplers.py", line 227, in _calc_cond_batch
    finalize_default_conds(model, hooked_to_run, default_conds, x_in, timestep)
TypeError: finalize_default_conds() missing 1 required positional argument: 'model_options'
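
For reference, the last call in the traceback passes five positional arguments, while finalize_default_conds() now also requires model_options. A minimal sketch of the mismatch, inferred only from this error message (the parameter order is assumed, and the actual fix is in #6296, which may differ):

    # Hedged illustration of the TypeError above, not actual ComfyUI code:
    # the function gained a required model_options parameter, but the call
    # site in _calc_cond_batch was not updated to forward it.
    def finalize_default_conds(model, hooked_to_run, default_conds, x_in, timestep, model_options):
        ...

    # Old call (as seen in the traceback) -> TypeError: missing 'model_options'
    # finalize_default_conds(model, hooked_to_run, default_conds, x_in, timestep)
    # Presumed corrected call, forwarding model_options through:
    # finalize_default_conds(model, hooked_to_run, default_conds, x_in, timestep, model_options)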

Other

Additional Notes

When Cond Set Default Combine is replaced with Cond Set Props Combine, the workflow executes without crashing. However, this alternative significantly increases processing time: it takes approximately 30 minutes to complete, compared to around 60 seconds before the update.

Screenshot: Captura de tela 2024-12-31 141437

@Kosinkadink
Collaborator

Thanks for the report; once that PR is merged, the default conds issue will be fixed.

As for the performance difference, it's unrelated to any code changes, just bad luck with other memory usage on the system. Using default conds vs. manually adding a cond with a mask makes no difference in memory usage. I will be working on adding code that automatically determines when it is better to skip caching weights that would have to be offloaded and instead calculate them on the fly (there is a slowdown either way, but one path should be faster than the other).
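
Roughly, the idea is to compare the expected cost of each path and take the cheaper one. A purely illustrative sketch, where the function and parameter names are hypothetical and are not existing ComfyUI APIs:

    # Illustrative sketch only; these names are hypothetical, not ComfyUI APIs.
    def should_cache_patched_weights(offload_transfer_seconds: float,
                                     recompute_seconds: float) -> bool:
        # Caching the patched weights means they may later need to be offloaded
        # to system RAM and reloaded (transfer cost); not caching means the
        # weights are re-patched on the fly each time (compute cost).
        # Return True only when keeping the cache is expected to be cheaper.
        return offload_transfer_seconds < recompute_seconds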

@LynxenAI
Author

Thank you! I saw your change, and it's working now!
