bug: Jan does not support Cyrillic (maybe any non-Latin symbols) in the path to models #1713
Comments
@Glashkoff Please try this workaround with our experimental feature while we are resolving the issue:
@Van-QA Thanks to the new option, it is now possible to change the data directory to one that contains only Latin characters, and yes, after the move the model can be launched.
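The workaround above boils down to moving the Jan data folder to a path with only Latin (ASCII) characters. A quick way to verify a candidate path is safe is sketched below; `is_ascii_safe` is a hypothetical helper for illustration, not part of Jan:

```python
from pathlib import PureWindowsPath

def is_ascii_safe(path) -> bool:
    """True if the path contains only plain ASCII characters,
    i.e. it should be safe for engines that mishandle non-Latin paths."""
    return str(path).isascii()

# The user-profile path from the bug report vs. an ASCII-only alternative
print(is_ascii_safe(PureWindowsPath(r"C:\Users\Абв\jan\models")))  # False
print(is_ascii_safe(PureWindowsPath(r"C:\jan\models")))            # True
```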
I tested with llama.cpp and encountered the same problem. It appears to be an issue with the engine, not the application. I will continue investigating.
Closing. If the issue occurs again, it should be filed upstream in the respective engine, e.g. llama.cpp.
Describe the bug
Nitro fails to load the model on Windows if there are Cyrillic characters in the path to the models folder. This can happen when the user's profile name is in a national language.
Steps to reproduce
Steps to reproduce the behavior:
Expected behavior
Jan loads the model and allows working with the local AI in chat mode.
Environment details
Logs
/jan/logs/app.log:

```
2024-01-22T23:22:36.794Z [NITRO]::CPU informations - 12
2024-01-22T23:22:36.795Z [NITRO]::Debug: Request to kill Nitro
2024-01-22T23:22:39.916Z [NITRO]::Debug: Nitro process is terminated
2024-01-22T23:22:40.432Z [NITRO]::Debug: Spawning Nitro subprocess...
2024-01-22T23:22:41.086Z [NITRO]::Debug: [ANSI-colored Nitro ASCII-art banner omitted]
2024-01-22T23:22:41.297Z [NITRO]::Debug: Loading model with params {"llama_model_path":"C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf","ctx_len":2048,"prompt_template":"<|system|>\n{system_message}<|user|>\n{prompt}<|assistant|>","system_prompt":"<|system|>\n","user_prompt":"<|user|>\n","ai_prompt":"<|assistant|>","cpu_threads":12}
2024-01-22T23:22:41.333Z [NITRO]::Error: ggml_init_cublas: GGML_CUDA_FORCE_MMQ: no
ggml_init_cublas: CUDA_USE_TENSOR_CORES: yes
ggml_init_cublas: found 1 CUDA devices:
  Device 0: NVIDIA GeForce RTX 3060, compute capability 8.6, VMM: yes
2024-01-22T23:22:41.445Z [NITRO]::Debug: [remainder of ANSI-colored banner omitted]
20240122 23:22:41.085000 UTC 32044 INFO Nitro version: - main.cc:44
20240122 23:22:41.086000 UTC 32044 INFO Server started, listening at: 127.0.0.1:3928 - main.cc:48
20240122 23:22:41.086000 UTC 32044 INFO Please load your model - main.cc:49
20240122 23:22:41.086000 UTC 32044 INFO Number of thread is:12 - main.cc:52
20240122 23:22:41.174000 UTC 32044 INFO Not found models folder, start server as usual - llamaCPP.h:2510
20240122 23:22:41.299000 UTC 33752 INFO Setting up GGML CUBLAS PARAMS - llamaCPP.cc:462
{"timestamp":1705965761,"level":"INFO","function":"loadModelImpl","line":478,"message":"system info","n_threads":12,"total_threads":12,"system_info":"AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 0 | VSX = 0 | "}
2024-01-22T23:22:41.451Z [NITRO]::Error: error loading model: failed to open C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf: No such file or directory
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model 'C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf'
2024-01-22T23:22:41.451Z [NITRO]::Debug: {"timestamp":1705965761,"level":"ERROR","function":"load_model","line":557,"message":"unable to load model","model":"C:\Users\Абв\jan\models\tinyllama-1.1b\tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf"}
20240122 23:22:41.445000 UTC 33752 ERROR Error loading the model - llamaCPP.cc:482
2024-01-22T23:22:41.486Z [NITRO]::Debug: 20240122 23:22:41.486000 UTC 29104 INFO Resolved request for task_id:0 - llamaCPP.cc:289
20240122 23:22:41.486000 UTC 29104 ERROR Sending more than 1 response for request. Ignoring later response - HttpServer.cc:355
2024-01-22T23:22:45.859Z [NITRO]::Debug: Request to kill Nitro
```
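The `No such file or directory` error in the log, for a file that does exist, is consistent with a narrow-character path API on Windows: the path is stored as UTF-8, but a plain `fopen()` interprets those bytes in the active ANSI code page, so the Cyrillic folder name is mangled into a path that really does not exist. A minimal sketch of that mangling, assuming a cp1251 (Russian) system code page (the exact code page depends on the system):

```python
# Cyrillic folder name from the log's model path
folder = "Абв"
raw = folder.encode("utf-8")      # the bytes actually handed to the C runtime
mangled = raw.decode("cp1251")    # how a cp1251 locale reads those UTF-8 bytes
print(mangled)                    # mojibake: a folder by this name does not exist
assert mangled != folder
```

This is why the error is reported by the engine (llama.cpp) rather than by Jan itself, and why moving the models to an ASCII-only path sidesteps it.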