Insights: allenai/OLMo-core
Overview
- 12 Merged pull requests
- 2 Open pull requests
- 1 Closed issue
- 0 New issues
12 Pull requests merged by 2 people
- consolidate model parallelize code between train modules (#196, merged Mar 8, 2025)
- make initialization more robust for custom modules (#195, merged Mar 7, 2025)
- FSDP optimizations (#194, merged Mar 7, 2025)
- Some fixes (#193, merged Mar 7, 2025)
- MoE parallelism improvements (#192, merged Mar 6, 2025)
- Steps per epoch convenience (#190, merged Mar 6, 2025)
- Add support for BF16 optim state in `SkipStepAdamW` (#148, merged Mar 5, 2025)
- Add quick settings to disable checkpoints and evals (#189, merged Mar 4, 2025)
- remove use of deprecated `tensor.storage()` (#188, merged Mar 4, 2025)
- update minimum torch version to 2.6.0 (#187, merged Mar 4, 2025)
- Improve TP/CP APIs, add a fused linear loss (#185, merged Mar 4, 2025)
- Add support for context parallelism (round 2) (#175, merged Mar 2, 2025)
2 Pull requests opened by 2 people
- Add option to save data paths (#191, opened Mar 6, 2025)
- Add saving and loading of HF checkpoints (#197, opened Mar 8, 2025)
1 Issue closed by 1 person
- `get_total_norm` requires `torch==2.6.0` (#186, closed Mar 4, 2025)
2 Unresolved conversations
Conversations sometimes continue on older items that aren't yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- making converter additionally support deepseek-coder dense hf model (#167, commented on Mar 6, 2025; 11 new comments)
- Learn2code | Review of the love2code configs (#164, commented on Mar 7, 2025; 0 new comments)