Current behavior
If I try to import the file "D:\Софт\Windows\Базы данных\LM Studio - AI чаты общения\Модели\lmstudio-community\Meta-Llama-3.1-8B-Instruct-GGUF\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf", the import succeeds, but the model does not run.
If I import "D:\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf" instead, it works fine.
Minimum reproduction step
1. Open Settings -> Import model.
2. Drag and drop "D:\Софт\Windows\Базы данных\LM Studio - AI чаты общения\Модели\lmstudio-community\Meta-Llama-3.1-8B-Instruct-GGUF\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf".
3. Start the model from the list.
Expected behavior
Importing and loading should work for any path, including paths containing non-ASCII (e.g., Cyrillic) characters.
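A guess at the failure mode (not confirmed by the Jan/cortex source): the logs below show llama.cpp reporting "Could not find model in path" even though the file exists, which is consistent with the path being converted to a narrow (ANSI) string somewhere on Windows. The Cyrillic characters in the path cannot be represented in a Western ANSI codepage, as this small sketch illustrates (the shortened path is hypothetical):

```python
# Sketch of the suspected issue: Cyrillic path components cannot survive
# a conversion to a Western ANSI codepage such as cp1252, which is what
# a narrow-string file API on a Western-locale Windows would use.
path = r"D:\Софт\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf"  # hypothetical shortened path

try:
    path.encode("cp1252")  # what an ANSI narrow-string conversion would attempt
    print("path survives ANSI conversion")
except UnicodeEncodeError as exc:
    # The Cyrillic letters have no cp1252 representation, so the
    # conversion fails and the resulting path no longer matches the file.
    print(f"path cannot be represented in cp1252: {exc.reason}")
```

If this guess is right, the fix would be to use wide-character (UTF-16) file APIs or UTF-8 paths end to end on Windows.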
Screenshots / Logs
2024-07-25T09:42:15.846Z [CORTEX]::Debug: Killing PID 13700
2024-07-25T09:42:15.846Z [CORTEX]::Debug: Request to kill cortex
2024-07-25T09:42:15.982Z [CORTEX]::Debug: cortex process is terminated
2024-07-25T09:42:16.054Z [CORTEX]::CPU information - 7
2024-07-25T09:42:16.055Z [CORTEX]::Debug: Killing PID 13700
2024-07-25T09:42:16.054Z [CORTEX]::Debug: Request to kill cortex
2024-07-25T09:42:16.159Z [CORTEX]::Failed to kill PID - sending request to kill
2024-07-25T09:42:16.161Z [CORTEX]::Debug: cortex process is terminated
2024-07-25T09:42:16.161Z [CORTEX]::Debug: Spawning cortex subprocess...
2024-07-25T09:42:16.161Z [APP]::C:\Users\r57zone\jan\extensions\@janhq\inference-cortex-extension\dist\bin\win-cpu
2024-07-25T09:42:16.161Z [CORTEX]::Debug: Spawn cortex at path: C:\Users\r57zone\jan\extensions\@janhq\inference-cortex-extension\dist\bin\win-cpu\cortex-cpp.exe, and args: 1,127.0.0.1,3928
2024-07-25T09:42:16.261Z [CORTEX]::Debug: cortex exited with code: 1
2024-07-25T09:42:16.265Z [CORTEX]::Debug: Loading model with params {"cpu_threads":7,"ctx_len":2048,"embedding":false,"prompt_template":"{system_message}\n### Instruction: {prompt}\n### Response:","llama_model_path":"D:\\Софт\\Windows\\Базы данных\\LM Studio - AI чаты общения\\Модели\\lmstudio-community\\Meta-Llama-3.1-8B-Instruct-GGUF\\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf","system_prompt":"","user_prompt":"\n### Instruction: ","ai_prompt":"\n### Response:","model":"Meta-Llama-3.1-8B-Instruct-Q4_K_M-1","ngl":100}
2024-07-25T09:42:16.265Z [CORTEX]::Debug: cortex is ready
2024-07-25T09:42:16.268Z [CORTEX]::Debug: 20240725 09:42:16.181000 UTC 924 INFO cortex-cpp version: default_version - main.cc:73
20240725 09:42:16.181000 UTC 924 INFO cortex.llamacpp version: 0.1.20-30.06.24 - main.cc:78
20240725 09:42:16.181000 UTC 924 INFO Server started, listening at: 127.0.0.1:3928 - main.cc:81
20240725 09:42:16.181000 UTC 924 INFO Please load your model - main.cc:82
20240725 09:42:16.181000 UTC 924 INFO Number of thread is:12 - main.cc:89
20240725 09:42:16.265000 UTC 11376 INFO CPU instruction set: fpu = 1| mmx = 1| sse = 1| sse2 = 1| sse3 = 1| ssse3 = 1| sse4_1 = 1| sse4_2 = 1| pclmulqdq = 1| avx = 1| avx2 = 1| avx512_f = 0| avx512_dq = 0| avx512_ifma = 0| avx512_pf = 0| avx512_er = 0| avx512_cd = 0| avx512_bw = 0| has_avx512_vl = 0| has_avx512_vbmi = 0| has_avx512_vbmi2 = 0| avx512_vnni = 0| avx512_bitalg = 0| avx512_vpopcntdq = 0| avx512_4vnniw = 0| avx512_4fmaps = 0| avx512_vp2intersect = 0| aes = 1| f16c = 1| - server.cc:277
20240725 09:42:16.265000 UTC 11376 INFO Loaded engine: cortex.llamacpp - server.cc:305
20240725 09:42:16.265000 UTC 11376 INFO cortex.llamacpp version: default_version - llama_engine.cc:159
20240725 09:42:16.265000 UTC 11376 ERROR Could not find model in path D:\Софт\Windows\Базы данных\LM Studio - AI чаты общения\Модели\lmstudio-community\Meta-Llama-3.1-8B-Instruct-GGUF\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf - llama_engine.cc:332
2024-07-25T09:42:16.270Z [CORTEX]::Debug: 20240725 09:42:16.265000 UTC 11376 DEBUG [LoadModelImpl] cache_type: f16 - llama_engine.cc:356
20240725 09:42:16.265000 UTC 11376 DEBUG [LoadModelImpl] Enabled Flash Attention - llama_engine.cc:365
20240725 09:42:16.265000 UTC 11376 DEBUG [LoadModelImpl] stop: null
- llama_engine.cc:386
{"timestamp":1721900536,"level":"INFO","function":"LoadModelImpl","line":409,"message":"system info","n_threads":7,"total_threads":12,"system_info":"AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 1 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 0 | "}
2024-07-25T09:42:16.271Z [CORTEX]::Debug: Load model success with response {}
2024-07-25T09:42:16.271Z [CORTEX]::Debug: {"timestamp":1721900536,"level":"ERROR","function":"LoadModel","line":183,"message":"llama.cpp unable to load model","model":""}
20240725 09:42:16.265000 UTC 11376 ERROR Error loading the model - llama_engine.cc:414
2024-07-25T09:42:16.271Z [CORTEX]::Error: llama_model_load: error loading model: llama_model_loader: failed to load model from
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model ''
2024-07-25T09:42:16.272Z [CORTEX]::Debug: Validate model state with response 409
2024-07-25T09:42:16.272Z [CORTEX]::Debug: Validate model state failed with response "Conflict"
2024-07-25T09:42:16.272Z [CORTEX]::Error: Validate model status failed
Jan version
0.5.2
In which operating systems have you tested?
macOS
Windows
Linux
Environment details
No response
@r57zone,
Sorry for the inconvenience.
The issue is related to #1713. The dev team has spent time investigating a solution, but we are working on a refactored version of Jan x Cortex, so it might take a while for this issue to be resolved.
Feel free to follow up via the related issue.
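In the meantime, since the report confirms that the same file loads fine from "D:\Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf", a practical workaround is to copy the model to an ASCII-only location before importing it. A minimal sketch (the helper name is ours, not part of Jan):

```python
# Workaround sketch: copy a .gguf model to a destination whose full path
# is pure ASCII, so the import path avoids the non-ASCII handling bug.
import shutil
from pathlib import Path

def copy_to_ascii_path(model: Path, dest_dir: Path) -> Path:
    """Copy a model file into dest_dir, refusing non-ASCII destinations."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    target = dest_dir / model.name
    if not str(target).isascii():
        # The copy would not help if the destination also has non-ASCII parts.
        raise ValueError(f"destination still contains non-ASCII characters: {target}")
    shutil.copy2(model, target)  # preserves file metadata
    return target
```

After copying, import the file at the new location via Settings -> Import model as usual.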