
[Feature] FusedMoE H200 tuning #2471

Open · 2 tasks
Labels: enhancement (New feature or request)
Assignee: zhyncs

zhyncs (Member) commented Dec 12, 2024
Checklist

Motivation

Ref #2450. Following the tuning guide in
https://github.com/sgl-project/sglang/blob/main/benchmark/kernels/fused_moe_triton/README.md,
tune the FusedMoE Triton kernel configs on H200 for DeepSeek V2, Mixtral 8x7B, Mixtral 8x22B, Qwen MoE, etc.

BTW, thanks @antferdom.
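For context, the tuning referenced above boils down to a per-batch-size grid search: benchmark each candidate Triton kernel config, keep the fastest, and dump the winners to JSON so the fused MoE kernel can load them at runtime. Below is a minimal, self-contained sketch of that loop; the function names, config fields, and the dummy kernel are illustrative assumptions, not sglang's actual tuning script API.

```python
# Hypothetical sketch of a fused-MoE config-tuning loop: for each batch
# size, time every candidate config and keep the fastest. Names and
# fields here are illustrative, not sglang's exact implementation.
import itertools
import json
import time

def bench(run, iters=10):
    """Return average wall-clock seconds for one call to `run`."""
    start = time.perf_counter()
    for _ in range(iters):
        run()
    return (time.perf_counter() - start) / iters

def tune(batch_sizes, search_space, make_kernel):
    """For each batch size, pick the config with the lowest measured time."""
    best = {}
    for m in batch_sizes:
        timings = {}
        for cfg in search_space:  # cfg is a hashable tuple of (key, value) pairs
            kernel = make_kernel(m, cfg)  # compile/specialize for this config
            timings[cfg] = bench(kernel)
        best[str(m)] = dict(min(timings, key=timings.get))
    return best

def make_dummy_kernel(m, cfg):
    # Toy stand-in for a real Triton kernel launch: runtime grows when the
    # block size is a poor fit for the batch size.
    cfg = dict(cfg)
    work = abs(cfg["BLOCK_SIZE_M"] - m) + 1
    return lambda: sum(range(work * 1000))

space = [
    tuple(dict(BLOCK_SIZE_M=bm, num_warps=w).items())
    for bm, w in itertools.product([16, 32, 64], [4, 8])
]
best = tune([16, 64], space, make_dummy_kernel)
# The real tuner writes results like this to a per-device JSON config file.
print(json.dumps(best, indent=2))
```

In the real script the dummy kernel is replaced by an actual fused MoE Triton launch on the target GPU (here, H200), and the resulting JSON is keyed by batch size with one best config per entry.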

Related resources

No response

@zhyncs zhyncs added the enhancement New feature or request label Dec 12, 2024
@zhyncs zhyncs self-assigned this Dec 12, 2024