Is there an existing issue for this?
I have searched the existing issues and checked the recent builds/commits
What happened?
After expanding the model/checkpoint drop-down box and typing some text to filter the results, if the user clicks outside of the drop-down (to cancel), a new model is always selected. Strangely, the model that gets selected seems to be the very first model in the list, even if that model has nothing to do with whatever the user typed as the filter.
The expected behavior would be to close the drop-down box without making any changes when the user clicks outside of it (which is what happens if the user presses Escape instead of clicking outside of it).
This is a minor bug, but likely easy to fix. I searched for it, but an issue like this can be phrased a million different ways, so apologies if it has already been reported.
Thank you very much for this fantastic system/app. It has a very elegant software design. I code in C++, but I can appreciate the flexibility of Python and scripting. Thanks for sharing this amazing tool!
Steps to reproduce the problem
1. Make sure a checkpoint other than the first one is selected.
2. Click on the "Stable Diffusion checkpoint" drop-down box to expand it.
3. Type something to filter the displayed checkpoints.
4. Click somewhere else on the page, in a blank space.
5. Most likely, the first checkpoint will be loaded (one way to confirm which checkpoint actually ended up loaded is sketched below).
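To confirm which checkpoint is actually loaded after step 4, one option is to ask the running UI over its API. This is only a small sketch, assuming the --api flag and the default local URL from this report; it is not part of the reproduction itself.

    import requests

    # Query the running webui for its current options; "sd_model_checkpoint"
    # holds the title of the checkpoint that is currently loaded.
    opts = requests.get("http://127.0.0.1:7860/sdapi/v1/options", timeout=30).json()
    print("Currently loaded checkpoint:", opts.get("sd_model_checkpoint"))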
What should have happened?
The drop-down box should just close when the user clicks outside of it. The checkpoint should not change.
venv "C:\Apps\AutoSD\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: v1.5.1
Commit hash: 68f336bd994bed5442ad95bad6b6ad5564a5409a
Launching Web UI with arguments: --allow-code --opt-split-attention --medvram --opt-sdp-attention --skip-torch-cuda-test --no-half-vae --api --autolaunch --ckpt-dir E:\AIModels
No module 'xformers'. Proceeding without it.
Civitai Helper: Get Custom Model Folder
Civitai Helper: Load setting from: C:\Apps\AutoSD\extensions\Stable-Diffusion-Webui-Civitai-Helper\setting.json
Civitai Helper: No setting file, use default
[-] ADetailer initialized. version: 23.7.10, num models: 9
2023-08-20 22:05:35,245 - ControlNet - INFO - ControlNet v1.1.234
ControlNet preprocessor location: C:\Apps\AutoSD\extensions\sd-webui-controlnet\annotator\downloads
2023-08-20 22:05:35,426 - ControlNet - INFO - ControlNet v1.1.234
Loading weights [9d9156a477] from E:\AIModels\Authentic\526mixV145_v145.safetensors
Creating model from config: C:\Apps\AutoSD\configs\v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
Startup time: 21.2s (launcher: 2.9s, import torch: 4.8s, import gradio: 1.4s, setup paths: 1.0s, other imports: 1.1s, setup codeformer: 0.1s, list SD models: 1.1s, load scripts: 5.5s, create ui: 2.3s, gradio launch: 0.7s, add APIs: 0.1s).
Applying attention optimization: Doggettx... done.
Model loaded in 6.8s (load weights from disk: 1.2s, create model: 0.8s, apply weights to model: 1.9s, apply half(): 2.6s, calculate empty prompt: 0.2s).
Loading weights [feadfe3cfe] from E:\AIModels\Authentic\3moonNIReal_3moonNIRealV2.safetensors
Applying attention optimization: Doggettx... done.
Weights loaded in 1.7s (load weights from disk: 0.5s, apply weights to model: 1.2s).
Additional information
I have a ton of models, and I keep them on a separate drive that requires a custom arg to link it in. However, I believe this issue has been happening since the beginning.
catboxanon added the bug, upstream, and gradio labels and removed the bug-report and upstream labels on Aug 21, 2023.
This is actually a problem with all dropdowns. It's related to Gradio, but I'm not sure whether it's something we introduced or an upstream issue.
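One way to narrow down whether it is upstream might be to reproduce it with a bare Gradio dropdown outside the webui. The sketch below is only an illustration of that idea; the choices and labels are made up, and the Gradio version would need to be pinned to whatever the webui currently ships.

    import gradio as gr

    choices = ["model-a.safetensors", "model-b.safetensors", "model-c.safetensors"]

    with gr.Blocks() as demo:
        dd = gr.Dropdown(choices=choices, value="model-b.safetensors",
                         label="Stable Diffusion checkpoint")
        log = gr.Textbox(label="change events")
        # Log every change event; if opening the dropdown, typing a filter, and
        # clicking elsewhere fires a change to choices[0], the bug reproduces
        # without any webui code involved and is likely upstream.
        dd.change(fn=lambda x: f"changed to: {x}", inputs=dd, outputs=log)

    demo.launch()

If the first choice is not selected on click-away there, the behaviour is more likely something introduced on the webui side.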
I have this happening as well, so far only with the checkpoint menu and the Extras upscaler menu. I figured it was user error, since I recently moved to Linux and had no such issue before.
Clicking on the prompt field right after choosing a checkpoint helps avoid this most of the time for me.
I've seen this a couple of times, mostly after selecting a model and then trying to type a prompt, even with the selector moved to the Options in the main UI.
Moving it there does at least help prevent the undesired model loading, since with the checkpoint dropdown in the main UI a new model is only loaded after pressing Generate instead (also useful for having two different models queued for txt2img and img2img, for instance).
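For anyone who wants to avoid the dropdown entirely until this is fixed, switching checkpoints through the API is another route. This is a rough sketch, not tested against v1.5.1 specifically; it assumes the --api flag and default local URL from this report, and that the model title comes from the /sdapi/v1/sd-models listing.

    import requests

    base = "http://127.0.0.1:7860"

    # List the available checkpoints and pick the one to load.
    models = requests.get(f"{base}/sdapi/v1/sd-models", timeout=30).json()
    target = models[0]["title"]  # replace with the checkpoint you actually want

    # Setting sd_model_checkpoint triggers a model load, so allow a long timeout.
    requests.post(f"{base}/sdapi/v1/options",
                  json={"sd_model_checkpoint": target},
                  timeout=600)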
Version or Commit where the problem happens
v1.5.1
What Python version are you running on ?
Python 3.10.x
What platforms do you use to access the UI ?
Windows
What device are you running WebUI on?
Nvidia GPUs (GTX)
Cross attention optimization
sdp
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
--allow-code --opt-split-attention --medvram --opt-sdp-attention --skip-torch-cuda-test --no-half-vae --api --autolaunch --ckpt-dir "E:\AIModels"
List of extensions
OneButtonPrompt
Stable-Diffusion-Webui-Civitai-Helper
a1111-sd-webui-lycoris
adetailer
model_preset_manager
openpose-editor
sd-dynamic-prompts
sd-webui-3d-open-pose-editor
sd-webui-controlnet
sd-webui-infinite-image-browsing
sd_delete_button
stable-diffusion-webui-images-browser
ultimate-upscale-for-automatic1111
LDSR
Lora
ScuNET
SwinIR
canvas-zoom-and-pan
extra-options-section
mobile
prompt-bracket-checker