
[common/PyTorch] Add FusedAttention support for SWA (left, right) #6487

Triggered via pull request: December 17, 2024 10:17
Status: Success
Total duration: 2m 18s
Artifacts: none

lint.yml
on: pull_request

Jobs:
PyTorch C++: 23s
PyTorch Python: 2m 6s
JAX C++: 20s
JAX Python: 20s
PaddlePaddle C++: 25s
PaddlePaddle Python: 51s

Annotations

6 warnings
All six jobs (JAX C++, JAX Python, PyTorch C++, PyTorch Python, PaddlePaddle C++, PaddlePaddle Python) reported the same deprecation warning:
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636