
feat: Add Context Length Slider for Ollama Models #1805

Open · wants to merge 1 commit into main
Conversation


@LeiHao0 commented Jan 29, 2025

Add a new slider component to control the context length (num_ctx) parameter for Ollama models. This allows users to adjust the context window size between 8K and 128K tokens.

Changes:

  • Add a new ContextLengthSlider component with predefined marks at 8K, 32K, 64K, and 128K
  • Set the step size to 8K increments for finer control
  • Add K-formatted value labels for better readability
  • Update the ModelSettings type to include ollamaContextLength
  • Integrate the context length control into ModelSettingTab
  • Pass the context length parameter to Ollama API calls
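The 8K stepping and K-formatted marks described above could be sketched as follows (a minimal sketch; the function and constant names here are illustrative, not necessarily the PR's actual identifiers):

```typescript
// Illustrative sketch of the slider configuration described in the PR.
const STEP = 8 * 1024; // 8K increments

// Format a token count as a "K" label, e.g. 8192 -> "8K"
function formatContextLength(tokens: number): string {
  return `${tokens / 1024}K`;
}

// Predefined marks at 8K, 32K, 64K, and 128K
const marks = [8, 32, 64, 128].map((k) => ({
  value: k * 1024,
  label: `${k}K`,
}));
```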

The context length parameter affects how much previous conversation history the model can access when generating responses. Larger values allow for longer memory but consume more RAM.
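Wiring the setting through to Ollama might look like the sketch below. The ollamaContextLength field name comes from the PR description; the request shape (an options object carrying num_ctx) follows Ollama's documented chat API, but the helper name is hypothetical:

```typescript
// Sketch: forward the user's context-length setting to Ollama's
// /api/chat endpoint via the num_ctx option.
interface ModelSettings {
  ollamaContextLength: number;
}

function buildChatRequest(
  model: string,
  messages: { role: string; content: string }[],
  settings: ModelSettings,
) {
  return {
    model,
    messages,
    options: {
      // Context window size in tokens; larger values allow longer
      // memory but consume more RAM.
      num_ctx: settings.ollamaContextLength,
    },
  };
}
```

Usage would be a POST of the serialized request body to the local Ollama server, e.g. `fetch("http://localhost:11434/api/chat", { method: "POST", body: JSON.stringify(buildChatRequest(...)) })`.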

Screenshots

Screenshot 2025-01-29 at 20 56 09

word count: 9476, token count: 15701, time: 20:02
word count: 945, token count: 1219, tokens used: 16922, model: Ollama (deepseek-r1:70b-llama-distill-q8_0), time: 20:08

Contributor Agreement

By submitting this Pull Request, I confirm that I have read and agree to the following terms:

  • I agree to contribute all code submitted in this PR to the open-source community edition licensed under GPLv3 and the proprietary official edition without compensation.
  • I grant the official edition development team the rights to freely use, modify, and distribute this code, including for commercial purposes.
  • I confirm that this code is my original work, or I have obtained the appropriate authorization from the copyright holder to submit this code under these terms.
  • I understand that the submitted code will be publicly released under the GPLv3 license, and may also be used in the proprietary official edition.

Please check the box below to confirm:

[x] I have read and agree with the above statement.

@chris-sharl

I suggest supporting an input box in addition to the slider, similar to the existing "Temperature" setting.
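One way to pair a free-form input box with the slider (a sketch only; the helper name is hypothetical) is to snap typed values to the slider's step and clamp them to its range, so the two controls stay in sync:

```typescript
const MIN_CTX = 8 * 1024;   // 8K lower bound
const MAX_CTX = 128 * 1024; // 128K upper bound
const STEP = 8 * 1024;      // 8K increments

// Snap a typed value to the nearest 8K step, then clamp to [8K, 128K].
function clampContextLength(value: number): number {
  const snapped = Math.round(value / STEP) * STEP;
  return Math.min(MAX_CTX, Math.max(MIN_CTX, snapped));
}
```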
