h200 tuning fused_moe_triton config for Mixtral 8x7B/8x22B and Qwen2 57BA14B #3569

Triggered via pull request December 31, 2024 14:59
@BBuf
synchronize #2689
BBuf:main
Status Success
Total duration 16m 18s
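For context, PRs like this one typically add per-device JSON tuning files for the fused-MoE Triton kernel, mapping a batch size to the Triton launch parameters that benchmarked fastest on that GPU. The sketch below is a hypothetical example of such an entry; the key names follow the common fused-MoE config convention, but the specific numbers are illustrative, not the values tuned in this PR.

```python
import json

# Hypothetical fused-MoE Triton tuning-config entry (illustrative values only).
# Keys are batch sizes; values are Triton kernel launch parameters chosen by
# benchmarking each candidate on the target GPU (here, an H200).
example_config = {
    "1": {
        "BLOCK_SIZE_M": 16,
        "BLOCK_SIZE_N": 64,
        "BLOCK_SIZE_K": 64,
        "GROUP_SIZE_M": 1,
        "num_warps": 4,
        "num_stages": 4,
    },
    "64": {
        "BLOCK_SIZE_M": 64,
        "BLOCK_SIZE_N": 128,
        "BLOCK_SIZE_K": 64,
        "GROUP_SIZE_M": 8,
        "num_warps": 8,
        "num_stages": 3,
    },
}

# Serialized to JSON, this is the shape of file a tuning PR checks in,
# one file per (expert count, hidden size, device) combination.
print(json.dumps(example_config, indent=4))
```

At runtime, the kernel wrapper loads the config file matching the model shape and device, then picks the entry whose batch-size key is closest to the actual token count per forward pass.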
Artifacts

pr-test.yml

on: pull_request
Matrix: unit-test-backend-1-gpu
unit-test-frontend: 3m 9s
unit-test-backend-2-gpu: 12m 30s
performance-test-1-gpu-part-1: 9m 41s
performance-test-1-gpu-part-2: 11m 3s
performance-test-2-gpu: 9m 17s
accuracy-test-1-gpu: 5m 24s
accuracy-test-2-gpu: 6m 20s

Annotations

1 warning
finish: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636