Overview
As the SLM rollout is almost completed (#1420), there are some remaining steps to clear up the confusion.
Action Items
W0: Tutorial notebook as CI
The Colab notebooks have served as an entry point for many people new to mlc-llm, so it would also be nice to add them as tests (a rough sketch of executing a notebook in CI follows after this list):
tutorial_chat_module_getting_started.ipynb also seems to be broken
Fix notebooks
Add notebooks as tests
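One possible way to wire a tutorial notebook into CI is to execute it end to end and fail the job on any cell error. The sketch below is only an illustration, not an existing test in the repo: it assumes the notebook sits at the repository root and that the CI runner can actually execute it (the GPU-oriented tutorials may need a self-hosted runner), and it uses nbformat plus nbclient as the assumed tooling.

```python
# Minimal sketch of running a tutorial notebook as a CI test.
# Assumptions: notebook path, timeout, and kernel name are placeholders;
# a real CI job might instead use papermill or pytest-nbmake.
import nbformat
from nbclient import NotebookClient

NOTEBOOK = "tutorial_chat_module_getting_started.ipynb"  # assumed location

def test_tutorial_notebook_executes():
    nb = nbformat.read(NOTEBOOK, as_version=4)
    client = NotebookClient(nb, timeout=1800, kernel_name="python3")
    # Raises CellExecutionError if any cell fails, which fails the CI job.
    client.execute()

if __name__ == "__main__":
    test_tutorial_notebook_executes()
    print("notebook executed successfully")
```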
W1: Transition from the old mlc_chat_cli to the new SLM CLI
Afterward, we should clean up the models prefixed with mlc-chat- in https://huggingface.co/mlc-ai, as well as the model libraries not placed in a folder in https://github.com/mlc-ai/binary-mlc-llm-libs, since these outdated and unmaintained models and weights have caused much confusion in the issues (e.g. the one below, and some other instances on Discord; a small enumeration sketch follows after these links).
Function tvmjs.array.decode_storage error when deploy the model WizardCoder-15B-V1.0-q4f16_1 web-llm#273
[Tracking] Transition from the old mlc_chat_cli to the new SLM CLI #1707
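To make that cleanup reviewable, one could first enumerate the weight repos to retire. The helper below is a hypothetical sketch rather than a script in this repo; the only things it takes from the item above are the mlc-ai organization and the mlc-chat- naming convention, and it relies on the public huggingface_hub listing API.

```python
# Rough helper (not part of the repo) to enumerate outdated weight repos
# under the mlc-ai organization so they can be reviewed before removal.
from huggingface_hub import HfApi

def list_legacy_chat_repos(prefix: str = "mlc-chat-") -> list[str]:
    api = HfApi()
    legacy = []
    for model in api.list_models(author="mlc-ai"):
        # Repo ids look like "mlc-ai/mlc-chat-<model>-<quantization>".
        # (Older huggingface_hub versions expose this as model.modelId.)
        name = model.id.split("/", 1)[1]
        if name.startswith(prefix):
            legacy.append(model.id)
    return sorted(legacy)

if __name__ == "__main__":
    for repo_id in list_legacy_chat_repos():
        print(repo_id)
```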
W2: Model support parity
Comparing the two directories, python/mlc_chat/model and mlc_llm/relax_model, there are some architectures that are still not supported in SLM. While some of them may be slightly obsolete by now, others may still be good to have.
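A quick way to take stock of the parity gap is to diff the architecture names between the two directories. The snippet below is only a rough sketch and assumes one module or sub-package per architecture in each tree; naming differences between the old and new directories would still need a manual pass.

```python
# Throwaway comparison of architectures present in the old relax_model
# directory but missing from the new SLM python/mlc_chat/model directory.
from pathlib import Path

OLD_DIR = Path("mlc_llm/relax_model")
NEW_DIR = Path("python/mlc_chat/model")

def arch_names(path: Path) -> set[str]:
    names = set()
    for entry in path.iterdir():
        if entry.name.startswith("_"):
            continue  # skip __init__.py, __pycache__, private helpers
        names.add(entry.stem if entry.is_file() else entry.name)
    return names

missing_in_slm = arch_names(OLD_DIR) - arch_names(NEW_DIR)
print("Architectures not yet ported to SLM:")
for name in sorted(missing_in_slm):
    print(" -", name)
```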
W3: Other compilation errors due to batching