Conversation

@0x45f commented Aug 25, 2025

PR Category

Operator

Type of Change

Refactor

Description

Support cascade attention. Reference implementation and tests from vLLM v0.8.5:
https://github.com/vllm-project/vllm/blob/v0.8.5/tests/kernels/attention/test_cascade_flash_attn.py
https://github.com/vllm-project/vllm/blob/v0.8.5/vllm/v1/attention/backends/flash_attn.py#L727
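For reviewers unfamiliar with the approach: cascade attention runs attention over the shared prefix once for all queries, runs attention over each request's suffix separately, and then merges the two partial results using their log-sum-exp (LSE) values. Below is a minimal sketch of that merge, modeled on vLLM's merge_attn_states; the shapes and names are illustrative, not this PR's kernel API.

```python
import torch

def merge_attn_states(o1, lse1, o2, lse2):
    """Merge partial attention results computed over two disjoint key sets.

    o1, o2:     [num_tokens, num_heads, head_dim]  partial outputs
    lse1, lse2: [num_tokens, num_heads]            log-sum-exp of each partial softmax
    """
    # Subtract the running max before exponentiating for numerical stability.
    max_lse = torch.maximum(lse1, lse2)
    p1 = torch.exp(lse1 - max_lse)
    p2 = torch.exp(lse2 - max_lse)
    denom = p1 + p2
    # Each partial output is weighted by its share of the total softmax mass.
    out = (p1 / denom).unsqueeze(-1) * o1 + (p2 / denom).unsqueeze(-1) * o2
    lse = max_lse + torch.log(denom)
    return out, lse
```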

Issue

Progress

  • Change is properly reviewed (1 reviewer required, 2 recommended).
  • Change responds to an issue.
  • Change is fully covered by a UT.

Performance


@tongxin left a comment

Great, benchmark passed! Next we should add more correctness tests for this patch.
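One shape such a correctness test could take (a hypothetical sketch, not this PR's actual test suite; naive_attn and the split sizes are made up here) is to check that a prefix pass plus a suffix pass, merged via LSE, matches a single pass over all keys:

```python
import torch

def naive_attn(q, k, v):
    # q: [T, H, D], k/v: [S, H, D]; returns output and per-(token, head) LSE.
    scores = torch.einsum("thd,shd->ths", q, k) / q.shape[-1] ** 0.5
    return torch.einsum("ths,shd->thd", scores.softmax(-1), v), torch.logsumexp(scores, -1)

def test_cascade_merge_matches_full_attention():
    torch.manual_seed(0)
    T, S_prefix, S_suffix, H, D = 4, 32, 16, 2, 64
    q = torch.randn(T, H, D)
    k = torch.randn(S_prefix + S_suffix, H, D)
    v = torch.randn(S_prefix + S_suffix, H, D)
    ref, _ = naive_attn(q, k, v)                           # one pass over all keys
    o1, lse1 = naive_attn(q, k[:S_prefix], v[:S_prefix])   # shared-prefix pass
    o2, lse2 = naive_attn(q, k[S_prefix:], v[S_prefix:])   # suffix pass
    # sigmoid(lse1 - lse2) == exp(lse1) / (exp(lse1) + exp(lse2)), computed stably.
    w1 = torch.sigmoid(lse1 - lse2).unsqueeze(-1)
    out = w1 * o1 + (1 - w1) * o2
    torch.testing.assert_close(out, ref, rtol=1e-4, atol=1e-5)
```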

@tongxin commented Sep 8, 2025

The LSE store mask error was the cause of the failing tests; it's fixed now.
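For context on this bug class (a minimal illustration, not the kernel from this PR): when a Triton program handles a block of query rows, the tl.store of the per-row LSE must be masked to in-range rows; otherwise the tail block writes past the end of the output buffer, which tends to surface as sporadic test failures:

```python
import torch
import triton
import triton.language as tl

@triton.jit
def row_lse_kernel(x_ptr, lse_ptr, n_rows, n_cols,
                   BLOCK_M: tl.constexpr, BLOCK_N: tl.constexpr):
    pid = tl.program_id(0)
    rows = pid * BLOCK_M + tl.arange(0, BLOCK_M)
    cols = tl.arange(0, BLOCK_N)
    row_mask = rows < n_rows
    mask = row_mask[:, None] & (cols[None, :] < n_cols)
    x = tl.load(x_ptr + rows[:, None] * n_cols + cols[None, :],
                mask=mask, other=float("-inf")).to(tl.float32)
    # Stable per-row log-sum-exp over the columns.
    m = tl.max(x, axis=1)
    lse = m + tl.log(tl.sum(tl.exp(x - m[:, None]), axis=1))
    # The store mask is the crux: without `mask=row_mask`, the tail block
    # writes garbage past the end of lse_ptr.
    tl.store(lse_ptr + rows, lse, mask=row_mask)

def row_lse(x):
    n_rows, n_cols = x.shape
    lse = torch.empty(n_rows, device=x.device, dtype=torch.float32)
    BLOCK_M, BLOCK_N = 16, triton.next_power_of_2(n_cols)
    grid = (triton.cdiv(n_rows, BLOCK_M),)
    row_lse_kernel[grid](x, lse, n_rows, n_cols, BLOCK_M=BLOCK_M, BLOCK_N=BLOCK_N)
    return lse
```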
