
Emacs version #198

Open
ChichoSkruch opened this issue Feb 5, 2025 · 7 comments

Comments

@ChichoSkruch

Hello,

First, I should say that I have only been using Emacs for a couple of months, so forgive me if this is something I should already know :)

I see you are using setopt, which is only available in Emacs 29+. Does that mean Emacs 29+ is required to use ellama?

I ask because when I run ellama-code-review I get this error:

Debugger entered--Lisp error: (invalid-slot-type ellama-context-element-buffer name string #)
eieio--validate-slot-value(#s(eieio--class :name ellama-context-element-buffer :docstring "A structure for holding information about a contex..." :parents (#s(eieio--class :name ellama-context-element :docstring "A structu$
eieio-oset(# name #)
#f(compiled-function (obj slots) "Set slots of OBJ with SLOTS which is a list of name/value pairs.\nCalled from the constructor routine." #<bytecode -0xc921a205373af6f>)(#<ellama-context-element-buffer ellama-context-eleme$
apply(#f(compiled-function (obj slots) "Set slots of OBJ with SLOTS which is a list of name/value pairs.\nCalled from the constructor routine." #<bytecode -0xc921a205373af6f>) #<ellama-context-element-buffer ellama-context$
shared-initialize(# (:name #))
#f(compiled-function (this &optional args) "Construct the new object THIS based on ARGS.\nARGS is a property list where odd numbered elements are tags, and\neven numbered elements are the values to store in the tagged slot$
apply(#f(compiled-function (this &optional args) "Construct the new object THIS based on ARGS.\nARGS is a property list where odd numbered elements are tags, and\neven numbered elements are the values to store in the tagge$
initialize-instance(# (:name #))
#f(compiled-function (class &rest slots) "Default constructor for CLASS eieio-default-superclass'.\nSLOTS are the initialization slots used by initialize-instance'.\nThis static method is called when an object is constru$
apply(#f(compiled-function (class &rest slots) "Default constructor for CLASS eieio-default-superclass'.\nSLOTS are the initialization slots used by initialize-instance'.\nThis static method is called when an object is c$
make-instance(ellama-context-element-buffer :name #)
ellama-context-element-buffer(:name #)
ellama-context-add-buffer(#)
ellama-code-review()
funcall-interactively(ellama-code-review)
command-execute(ellama-code-review record)
execute-extended-command(nil "ellama-code-review" "ellama-code-re")
funcall-interactively(execute-extended-command nil "ellama-code-review" "ellama-code-re")
command-execute(execute-extended-command)

Which means the problem is something related to make-instance, as far as I understand.
If I am totally wrong, please point out my mistake.

@s-kostyaev
Owner

You don't need Emacs 29 to use ellama. Instead of setopt you can use setq or setq-default.
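
For example, with the ellama-language option (a sketch; setopt also calls a user option's custom setter, which plain setq skips, but for simple options like this the result is the same):

;; Emacs 29+ only:
(setopt ellama-language "English")

;; Works on older Emacs versions as well:
(setq ellama-language "English")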

@s-kostyaev
Owner

Show me your ellama configuration without keys (if applicable).

@ChichoSkruch
Author

I switched to setq. Here is my ellama.el:
(use-package ellama
  :ensure t
  :bind ("C-c e" . ellama-transient-main-menu)
  :init
  ;; setup key bindings
  (setq ellama-keymap-prefix "C-c l")
  ;; language you want ellama to translate to
  (setq ellama-language "Bulgarian")
  ;; could be llm-openai for example
  (require 'llm-ollama)
  (setq ellama-provider
        (make-llm-ollama
         ;; this model should be pulled to use it
         ;; value should be the same as you print in terminal during pull
         :chat-model "deepseek-coder:6.7b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params '(("num_ctx" . 8192))))
  (setq ellama-summarization-provider
        (make-llm-ollama
         :chat-model "deepseek-r1:14b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params '(("num_ctx" . 32768))))
  (setq ellama-coding-provider
        (make-llm-ollama
         :chat-model "deepseek-coder:6.7b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params '(("num_ctx" . 32768))))
  ;; Predefined llm providers for interactive switching.
  ;; You shouldn't add ollama providers here - it can be selected interactively
  ;; without it. It is just an example.
  (setq ellama-providers
        '(("deepseek-r1:14b" . (make-llm-ollama
                                :chat-model "deepseek-r1:14b"
                                :embedding-model "deepseek-r1:14b"))
          ("deepseek-coder:6.7b" . (make-llm-ollama
                                    :chat-model "deepseek-coder:6.7b"
                                    :embedding-model "deepseek-coder:6.7b"))
          ("llama3.2" . (make-llm-ollama
                         :chat-model "llama3.2"
                         :embedding-model "llama3.2"))))
  ;; Naming new sessions with llm
  (setq ellama-naming-provider
        (make-llm-ollama
         :chat-model "deepseek-r1:14b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params '(("stop" . ("\n")))))
  (setq ellama-naming-scheme 'ellama-generate-name-by-llm)
  ;; Translation llm provider
  (setq ellama-translation-provider
        (make-llm-ollama
         :chat-model "deepseek-r1:14b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params
         '(("num_ctx" . 32768))))
  (setq ellama-extraction-provider
        (make-llm-ollama
         :chat-model "deepseek-r1:14b"
         :embedding-model "nomic-embed-text"
         :default-chat-non-standard-params
         '(("num_ctx" . 32768))))
  ;; customize display buffer behaviour
  ;; see (info "(elisp) Buffer Display Action Functions")
  (setq ellama-chat-display-action-function #'display-buffer-full-frame)
  (setq ellama-instant-display-action-function #'display-buffer-at-bottom)
  :config
  ;; send last message in chat buffer with C-c C-c
  (add-hook 'org-ctrl-c-ctrl-c-hook #'ellama-chat-send-last-message))

@ChichoSkruch
Author

I found where the problem is. It's in this part of my ellama.el:
;; Configure naming scheme for sessions
(setq ellama-naming-provider
      (make-llm-ollama
       :chat-model "deepseek-r1:14b"
       :embedding-model "nomic-embed-text"
       :default-chat-non-standard-params '(("stop" . ("\n")))))

I can't understand the problem, but if I comment it out everything works.
If someone can explain, I'll be thankful.

@s-kostyaev
Owner

s-kostyaev commented Feb 6, 2025

Try this one:

(setq ellama-naming-provider
	(make-llm-ollama
	 :chat-model "qwen2.5:3b"
	 :embedding-model "nomic-embed-text"
	 :default-chat-non-standard-params '(("stop" . ("\n")))))

For this to work, run:

ollama pull qwen2.5:3b

Then check if it works.
deepseek-r1:14b is not good as a naming provider; we need a non-reasoning model here. But I can't explain why you have this problem.

@ChichoSkruch
Author

I made the change you proposed, but the result is the same:
Debugger entered--Lisp error: (wrong-type-argument consp nil)
json-serialize((:messages [(:role "user" :content "I will get you user query, you should return short...")] :model "qwen2.5:3b" :stream :false :options (:stop ("\n"))))

If I can help debug, give me guidance. Otherwise I will keep this block of the config commented out.
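
For reference, json-serialize only accepts JSON arrays as Lisp vectors; a bare list such as ("\n") is serialized as a JSON object and, having no value for its would-be key, fails in exactly this way. A minimal sketch, independent of ellama and llm, assuming an Emacs built with native JSON support:

;; A vector serializes as a JSON array:
(json-serialize '(:stop ["\n"]))   ;; => {"stop":["\n"]}

;; A plain list is treated as an object (plist/alist), so this signals
;; (wrong-type-argument consp nil), matching the error above:
(json-serialize '(:stop ("\n")))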

@s-kostyaev
Owner

Try to update ellama and llm to the latest versions. I can't reproduce this error.
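
One way to update both from the built-in package manager (a sketch, assuming ellama and llm were installed with package.el, e.g. from GNU ELPA):

M-x package-refresh-contents
M-x list-packages   ;; press U to mark upgradable packages, then x to install them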
