v0.12.0
Released by divyashreepathihalli on 21 May (14 commits to r0.12 since this release).
Summary
Add PaliGemma, Llama 3, and Phi 3 models.
PaliGemma quickstart; see a complete usage example on Kaggle:
```python
import keras_nlp

pali_gemma_lm = keras_nlp.models.PaliGemmaCausalLM.from_preset(
    "pali_gemma_3b_224"
)
# `images` and `prompts` are your own batches of input images and text prompts.
pali_gemma_lm.generate(
    inputs={
        "images": images,
        "prompts": prompts,
    }
)
```
What's Changed
- Add CodeGemma 1.1 presets by @grasskin in #1617
- Fix rope scaling factor by @abuelnasr0 in #1605
- Fix the issue of propagating `training` argument in subclasses by @james77777778 in #1623
- Pass kwargs to tokenizer when creating preprocessor by @SamanehSaadat in #1632
- Add phi3 by @abuelnasr0 in #1597
- Add LLaMA 3 tokenizer and preset by @tirthasheshpatel in #1584
- Export missing llama 3 symbol by @mattdangerw in #1633
- PaliGemma by @mattdangerw in #1636
- Update pali_gemma_presets.py by @divyashreepathihalli in #1637
- Update version to 0.13.0 for the master branch by @mattdangerw in #1640
- Update llama3 preset versions by @mattdangerw in #1641
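As background for the RoPE scaling-factor fix in #1605: in rotary position embeddings, a linear scaling factor divides token positions before the rotation angles are computed, stretching the usable context range. The NumPy sketch below illustrates the idea only; it is not KerasNLP's implementation, and the function names are hypothetical.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, scaling_factor=1.0):
    # Per-pair inverse frequencies, as in standard RoPE.
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    # Linear rope scaling: divide positions by the scaling factor.
    scaled = np.asarray(positions, dtype=np.float64) / scaling_factor
    return np.outer(scaled, inv_freq)  # shape: (num_positions, dim // 2)

def apply_rope(x, positions, base=10000.0, scaling_factor=1.0):
    # x: (seq_len, dim) with even dim; rotates each (x[2i], x[2i+1]) pair.
    angles = rope_angles(positions, x.shape[-1], base, scaling_factor)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty(x.shape, dtype=np.float64)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

With `scaling_factor=2.0`, position 2 is rotated exactly as position 1 is without scaling, which is why the factor must be applied consistently between training and generation.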
Full Changelog: v0.11.1...v0.12.0