
Fix attention mask type for Flash Attention + CP + THD #6419

Triggered via pull request on December 5, 2024 at 21:42
Status: Success
Total duration: 2m 6s
Artifacts: none

lint.yml
on: pull_request

PyTorch C++: 19s
PyTorch Python: 1m 56s
JAX C++: 24s
JAX Python: 25s
PaddlePaddle C++: 22s
PaddlePaddle Python: 51s

Annotations

6 warnings

All six jobs (PyTorch C++, PyTorch Python, JAX C++, JAX Python, PaddlePaddle C++, PaddlePaddle Python) reported the same warning:

ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
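
The warning is informational: GitHub is migrating the ubuntu-latest runner label to ubuntu-24.04, so jobs that rely on that label will pick up the new image automatically. If the lint jobs should stay on a fixed image, the runner can be pinned explicitly in lint.yml. A minimal sketch, assuming the jobs currently use runs-on: ubuntu-latest; the job name and lint command below are illustrative placeholders, not the actual workflow contents:

    name: Lint
    on: pull_request
    jobs:
      pytorch_cpp:
        # Pin a specific image instead of tracking the ubuntu-latest label,
        # so the ubuntu-24.04 rollout does not change the environment implicitly.
        runs-on: ubuntu-22.04  # or ubuntu-24.04 to adopt the new image deliberately
        steps:
          - uses: actions/checkout@v4
          - name: Run C++ lint
            run: ./ci/lint_cpp.sh  # placeholder for the real lint step

The same runs-on change would apply to each of the six jobs; pinning silences the warning and makes future image upgrades an explicit, reviewable change.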