This repository has been archived by the owner on Jun 24, 2024. It is now read-only.

Split up initial and per-message prompts for chat mode #216

Closed
philpax opened this issue May 12, 2023 · 2 comments
Labels
app:cli App: the `llm` CLI issue:enhancement New feature or request

Comments

@philpax
Collaborator

philpax commented May 12, 2023

At present, the chat mode will apply the same prompt template for every single user prompt. This makes sense for REPL mode, where you want to reuse the same prompt (e.g. Alpaca instructions), but it doesn't make sense for chat mode, where you want to continue a conversation:

[Initial prompt]

User: [user input]
[Chat response prompt] [bot's response]
User: [user input]
[Chat response prompt] [bot's response]

That is to say, the Initial prompt and the Chat response prompt should be separate, independently configurable templates.
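A minimal sketch of the idea in Rust (type and field names here are hypothetical, not the actual `llm` CLI API): keep two templates, render the initial one only for the first turn, and the per-message one for every turn after that. The `{{PROMPT}}` placeholder is assumed for illustration.

```rust
/// Hypothetical sketch: separate templates for the initial prompt and for
/// each subsequent user message, instead of reusing one template everywhere.
struct ChatTemplates {
    /// Rendered once, before the first user message.
    initial: String,
    /// Rendered for every user message after the first.
    per_message: String,
}

impl ChatTemplates {
    /// Fill the appropriate template with the user's input for this turn.
    fn render(&self, turn: usize, user_input: &str) -> String {
        let template = if turn == 0 { &self.initial } else { &self.per_message };
        template.replace("{{PROMPT}}", user_input)
    }
}

fn main() {
    let templates = ChatTemplates {
        initial: "Below is a conversation with an assistant.\nUser: {{PROMPT}}\nBot:".to_string(),
        per_message: "User: {{PROMPT}}\nBot:".to_string(),
    };
    // First turn carries the full instructions; later turns only the chat framing.
    println!("{}", templates.render(0, "Hello"));
    println!("{}", templates.render(1, "How are you?"));
}
```

With this split, the long instruction block (e.g. Alpaca-style instructions) is sent once at the start of the conversation, and each later turn appends only the lightweight chat framing.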

@philpax philpax added issue:enhancement New feature or request app:cli App: the `llm` CLI labels May 12, 2023
@danforbes
Contributor

Just a totally random nitpick, but I think it would be great if the top-level examples folder could be removed (or at least renamed) as a part of this effort.

@ralyodio

I agree on both counts.

@philpax philpax added this to the 0.2 milestone May 18, 2023
@philpax philpax closed this as completed in 23d2bf8 Jul 4, 2023
philpax added a commit that referenced this issue Jul 4, 2023
fix #216 - split prompts for chat mode in CLI

3 participants