
Fix prompt function to respect configured LLM model #552

Merged: 3 commits merged into main on Feb 2, 2025

Conversation

github-actions[bot] (Contributor) commented on Feb 2, 2025

This message was generated by git-bob (version: 0.18.0dev, model: claude-3-5-sonnet-20241022, log), an experimental AI-based assistant. It can make mistakes and has limitations. Check its messages carefully.

I modified the prompt function assignment in _terminal.py to use functools.partial to bind the model parameter. This ensures the configured LLM model (Config.llm_name) is passed to the prompt function, fixing the issue where git-bob always used the default models and ignored the configuration. The change imports the functools module and wraps the prompt function assignment with functools.partial, binding the model parameter to the configured LLM name.

  • src/git_bob/_terminal.py Modified prompt function assignment to use functools.partial for setting the model parameter with Config.llm_name

closes #551
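The fix described above can be sketched as follows. The function and class below are hypothetical stand-ins for git-bob's internals; the actual names and signatures in src/git_bob/_terminal.py may differ.

```python
from functools import partial

# Hypothetical stand-ins for git-bob's internals:
def prompt(text, model="default-model"):
    """Placeholder LLM call; returns which model would handle the text."""
    return f"[{model}] {text}"

class Config:
    llm_name = "claude-3-5-sonnet-20241022"

# The fix: bind the configured model once, at assignment time, so later
# calls no longer fall back to the default model.
prompt_function = partial(prompt, model=Config.llm_name)

prompt_function("hello")  # returns "[claude-3-5-sonnet-20241022] hello"
```

Binding the model with partial keeps every existing call site unchanged while routing all calls through the configured model.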

@haesleinhuepf (Owner) commented:

Almost perfect! Instead of importing functools, please import partial only.

git-bob implement this

github-actions[bot] (Contributor, Author) commented on Feb 2, 2025


I modified the import statement in _terminal.py to directly import 'partial' from functools instead of importing the entire functools module, as requested in the comment. This change maintains the same functionality where the prompt function is configured with the correct LLM model using partial, but makes the import statement more specific and cleaner.

  • src/git_bob/_terminal.py Changed 'import functools' to 'from functools import partial' and updated the corresponding usage in the prompt function assignment.
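The two import styles are equivalent at runtime; the review comment is purely about style, keeping the call site shorter and the dependency explicit. A quick check:

```python
import functools
from functools import partial

# Both names are bound to the same function object, so switching from
# functools.partial(...) to partial(...) changes nothing in behavior.
assert functools.partial is partial
```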

codecov bot commented on Feb 2, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 59.00%. Comparing base (843f0f7) to head (2b1d4f6).
Report is 5 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #552   +/-   ##
=======================================
  Coverage   59.00%   59.00%           
=======================================
  Files          14       14           
  Lines        1815     1815           
=======================================
  Hits         1071     1071           
  Misses        744      744           


@haesleinhuepf haesleinhuepf merged commit 5fb6f44 into main Feb 2, 2025
4 checks passed
Successfully merging this pull request may close these issues.

git-bob always uses default-models and ignores config