Immediate evaluation of variables does not reset between prompts #517

Open
tscott65 opened this issue Jun 8, 2023 · 7 comments

tscott65 commented Jun 8, 2023

I have the following prompt (and corresponding wildcard files for f_sentient and f_animal):

${animal=!{__f_animal__}} ${sentient=!{__f_sentient__}} [${animal}|${sentient}|${animal}] AND ${sentient} in an ancient forest. Dramatic lighting. dappled lighting. deeps shadows.
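
(For readers unfamiliar with wildcards: the files referenced above are plain text lists with one option per line. Purely illustrative contents, not the reporter's actual files, might look like:)

    f_animal.txt:
        fox
        owl
        deer

    f_sentient.txt:
        dryad
        forest spirit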

Every time I run with a "batch count" > 1, each resulting image uses exactly the same wildcard values. The same is true if I set batch-size > 1 and batch_count = 1. The values seem to repeat no matter what.

However, if I hit "Generate" again, the values are different.

Dynamic Prompts version
45b2137 | Sat Jun 3 10:48:47 2023

version: v1.3.2  •  python: 3.10.7  •  torch: 2.0.1+cu118  •  xformers: N/A  •  gradio: 3.32.0  •  checkpoint: 914077285245b21373 Sat Jun 3 10:48:47 2023

layluke commented Jun 13, 2023

@tscott65
Instead of using batch count and batch size, enable "Combinatorial generation" under Dynamic Prompts and use "Max generations" and "Combinatorial batches".

You may still have issues, though. A few of us are hitting problems with immediate evaluation of variables like you are:
#421

amurorei01 commented Aug 13, 2023

I encountered the same problem after updating today.
Also, the "batch count" automatically becomes 64 (16 after rebooting), regardless of the number I set.
Reinstalling doesn't help.

adieyal (Owner) commented Aug 13, 2023

@akx what are your thoughts on this? It isn't a bug per se, in that the sampler is designed to fix the value of the variable once it has been evaluated, but the expectation of randomised values within a batch is reasonable. This problem would be resolved if we created a new sampler for every batch, but then other session-based state, like cyclical samplers, wouldn't work. Perhaps we should reset variables each time we sample?
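
To make the trade-off concrete, here is a minimal self-contained sketch of the "reset variables each time we sample" idea. This is toy code with made-up names, not the dynamicprompts API: pre-existing variables are snapshotted once, and immediate-evaluation assignments are re-rolled at the top of every iteration, so per-prompt values vary while everything captured in the snapshot is left alone.

    import random

    def sample_prompts(base_variables, assignments, render):
        # Toy sketch only -- none of these names exist in dynamicprompts.
        # `assignments` maps variable names to zero-argument callables that
        # re-roll a value; `render` builds a prompt from the variable map.
        snapshot = dict(base_variables)
        while True:
            variables = dict(snapshot)            # restore pre-existing state
            for name, roll in assignments.items():
                variables[name] = roll()          # re-roll each assignment per prompt
            yield render(variables)

    gen = sample_prompts(
        base_variables={},
        assignments={"ball": lambda: random.choice(["red", "blue"])},
        render=lambda v: f"{v['ball']} {v['ball']} {v['ball']}",
    )
    print([next(gen) for _ in range(4)])          # values can differ between prompts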

@AIrtistry

Same bug for me.

akx (Collaborator) commented Aug 14, 2023

@adieyal Hm, don't we reset (or at least overwrite) them in the context for each generation? 🤔

adieyal (Owner) commented Aug 14, 2023

I'm not 100% sure. The variable assignment takes place in

https://github.com/adieyal/dynamicprompts/blob/main/src/dynamicprompts/samplers/base.py#L71-L75

    def _get_sequence(
        self,
        command: SequenceCommand,
        context: SamplingContext,
    ) -> StringGen:
        tokens, context = context.process_variable_assignments(command.tokens)
        sub_generators = [context.generator_from_command(c) for c in tokens]


        while True:
            yield rotate_and_join(sub_generators, separator=command.separator)

The variables are processed outside of the loop (on line 71), but the generators are sampled inside the loop and yielded to the caller. From my reading, I think that variables are only ever processed once.
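
The behaviour can be reduced to a tiny standalone generator (illustrative only, no dynamicprompts code involved): anything computed before the `while True` loop is fixed for the lifetime of the generator, and only the loop body re-runs on each `yield`.

    import random
    from itertools import islice

    def prompts():
        ball = random.choice(["red", "blue"])  # evaluated once per generator
        while True:
            yield f"{ball} {ball} {ball}"      # yielded repeatedly; `ball` never changes

    print(set(islice(prompts(), 40)))          # always a single distinct prompt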

Here is a unit test that fails (it expects both red and blue to show up across 40 sampled prompts, but with the current behaviour only one value is ever produced):

def test_random_variables_across_prompts(wildcard_manager: WildcardManager):
    cmd = parse("${ball=!{red|blue}} ${ball} ${ball} ${ball}")
    scon = SamplingContext(
        default_sampling_method=SamplingMethod.RANDOM,
        wildcard_manager=wildcard_manager
    )
    gen = scon.sample_prompts(cmd)
    seen = set(islice(gen, 40))
    assert len(seen) == 2

This change lets the test pass

    def _get_sequence(
        self,
        command: SequenceCommand,
        context: SamplingContext,
    ) -> StringGen:

        while True:
            tokens, context = context.process_variable_assignments(command.tokens)
            sub_generators = [context.generator_from_command(c) for c in tokens]
            yield rotate_and_join(sub_generators, separator=command.separator)

but breaks a lot of other stuff
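
A plausible reason for the breakage (my reading, not confirmed): moving `generator_from_command` inside the loop rebuilds the sub-generators on every iteration, so any sampler that keeps state between yields, such as a cyclical sampler, is restarted each time. A toy illustration of that failure mode, using only the standard library rather than the library's samplers:

    from itertools import cycle, islice

    colours = ["red", "green", "blue"]

    def build_once():
        gen = cycle(colours)        # sub-generator built once; keeps its position
        while True:
            yield next(gen)

    def build_every_iteration():
        while True:
            gen = cycle(colours)    # rebuilt each time; cycle position is lost
            yield next(gen)

    print(list(islice(build_once(), 5)))             # ['red', 'green', 'blue', 'red', 'green']
    print(list(islice(build_every_iteration(), 5)))  # ['red', 'red', 'red', 'red', 'red']

If that is indeed the cause, a fix would probably need to re-evaluate only the variable assignments each iteration while reusing the existing sub-generators.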

adieyal changed the title from "Randomization not working" to "Immediate evaluation of variables does not reset between prompts" on Aug 19, 2023
@anothertal3

Not sure if I understand this topic correctly, but I'm wondering whether #668 is related?
