feat: Add Context Length Slider for Ollama Models #1805
+42 −0
Add a new slider component to control the context length (num_ctx) parameter for Ollama models. This allows users to adjust the context window size between 8K and 128K tokens.
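A minimal sketch of what such a slider could look like, assuming a React/TypeScript front end (the component and prop names here are illustrative, not the PR's actual code):

```tsx
import { useState } from "react";

const MIN_CTX = 8 * 1024;   // 8K tokens
const MAX_CTX = 128 * 1024; // 128K tokens

// Hypothetical component matching the description above.
export function ContextLengthSlider({ onChange }: { onChange: (numCtx: number) => void }) {
  const [numCtx, setNumCtx] = useState(MIN_CTX);

  return (
    <label>
      Context length: {Math.round(numCtx / 1024)}K tokens
      <input
        type="range"
        min={MIN_CTX}
        max={MAX_CTX}
        step={1024}
        value={numCtx}
        onChange={(e) => {
          const v = Number(e.target.value);
          setNumCtx(v);
          onChange(v); // propagate the chosen num_ctx to wherever requests are built
        }}
      />
    </label>
  );
}
```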
Changes:
The context length parameter (num_ctx) determines how much prior conversation history the model can attend to when generating a response. Larger values give the model longer effective memory but consume proportionally more RAM.
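For illustration, the selected value can be forwarded to Ollama through the `options.num_ctx` field of its chat endpoint. The endpoint and field are part of Ollama's documented API; the surrounding function is a hypothetical sketch, not code from this PR:

```ts
// Send a prompt to a local Ollama server with an explicit context window size.
async function chatWithContext(numCtx: number, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "deepseek-r1:70b-llama-distill-q8_0",
      messages: [{ role: "user", content: prompt }],
      stream: false,
      options: { num_ctx: numCtx }, // context window size in tokens
    }),
  });
  const data = (await res.json()) as { message: { content: string } };
  return data.message.content;
}
```

Note that Ollama generally reloads the model when num_ctx changes, so moving the slider takes effect on the next request rather than mid-generation.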
Screenshots
[Screenshot 1: status bar — word count: 9476, token count: 15701, time: 20:02]
[Screenshot 2: status bar — word count: 945, token count: 1219, tokens used: 16922, model: Ollama (deepseek-r1:70b-llama-distill-q8_0), time: 20:08]
Contributor Agreement
By submitting this Pull Request, I confirm that I have read and agree to the following terms:
Please check the box below to confirm:
[x] I have read and agree with the above statement.