
Implements dual-chunk-flash-attn backend for dual chunk attention with sparse attention support #4285

Triggered via pull request: January 27, 2025 04:31
Status: Success
Total duration: 7m 44s

Workflow: lint-and-deploy.yaml (on: pull_request)
Job: lint-and-deploy (7m 33s)

Annotations

1 warning (lint-and-deploy): ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
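The warning means jobs that specify `runs-on: ubuntu-latest` will be moved to the ubuntu-24.04 image by GitHub. One way to avoid an unexpected environment change is to pin the runner image explicitly. Below is a minimal sketch of such a pin; only the `lint-and-deploy` job name and the `pull_request` trigger come from this run, and the steps shown are illustrative assumptions, not the actual workflow contents:

```yaml
# Hypothetical sketch of lint-and-deploy.yaml; steps are assumptions.
name: lint-and-deploy

on: pull_request

jobs:
  lint-and-deploy:
    # Pin the runner image instead of using ubuntu-latest, so the
    # announced migration to ubuntu-24.04 cannot change the build
    # environment underneath this job unexpectedly.
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      # Placeholder step; the real workflow's lint/deploy steps are unknown.
      - name: Lint
        run: echo "run linters here"
```

Pinning trades automatic updates for reproducibility; the pinned image must then be bumped deliberately when the old one is retired.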