Issues: Dao-AILab/flash-attention
#1319: Is try_wait on barrier_Q Similar to barrier_O? Is an Additional Wait Needed? (opened Nov 6, 2024 by ziyuhuang123)
#1317: In the FlashAttention 3 (FA3) code, where is the barrier_O phase specified as 1 or 0? (opened Nov 5, 2024 by ziyuhuang123)
#1301: Does flash_attn_with_kvcache return block_lse or attention_score? (opened Oct 28, 2024 by NonvolatileMemory)
#1300: FlashSelfAttention and SelfAttention in flash_attn.modules.mha give different results (opened Oct 28, 2024 by senxiu-puleya)