I think the problem is the `tensor.expand_shape` that gets pulled into the dispatch because it operates on the attention mask; this is similar to the problem fixed by #19838.
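For illustration, the kind of op being described might look like the following. This is a hypothetical sketch, not IR taken from the failing model: the mask tensor shape, element type (`f8E4M3FNUZ`), and reassociation are assumptions chosen to resemble a llama 8b fp8 attention mask.

```mlir
// Hypothetical sketch: a rank-expanding reshape of the attention mask.
// Instead of being folded away, an op like this can be swept into the
// attention dispatch region during dispatch formation.
%expanded = tensor.expand_shape %mask [[0, 1], [2]] output_shape [1, 4096, 4096]
    : tensor<4096x4096xf8E4M3FNUZ> into tensor<1x4096x4096xf8E4M3FNUZ>
```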
What happened?
Follow-up of [ROCm][Codegen] llama 8b fp8 with attention segfault #19921.
A new codegen issue (log: llama_f8_attn_bug_log_0213.txt) appeared after I rebased IREE to the commit listed under "Version information" below.
Steps to reproduce your issue
Run the following command:
What component(s) does this issue relate to?
Compiler
Version information
commit 0ff26a7 (HEAD -> main, upstream/main)
Author: Prashant Kumar [email protected]
Date: Thu Feb 13 23:26:59 2025 +0530
[Codegen] Add support to emulate unsupported float type (#19943)
Additional context
No response