[Bug] 'tvm.relax.op.nn' has no attribute 'attention_bias' #17486

Open
Cookiee235 opened this issue Oct 23, 2024 · 0 comments
Labels: needs-triage, type: bug

Comments

@Cookiee235 (Contributor)

Actual behavior

Re-parsing the printed script fails: the printer emits R.nn.attention_bias, which tvm.relax.op.nn does not define:

@I.ir_module
class Module:
    @R.function
    def main(q: R.Tensor((4, 16, 32, 8), dtype="float32"), k: R.Tensor((4, 8, 32, 8), dtype="float32"), v: R.Tensor((4, 8, 32, 16), dtype="float32"), bias: R.Tensor((4, 32, 16, 8), dtype="float32")) -> R.Tensor((4, 16, 32, 16), dtype="float32"):
        gv: R.Tensor((4, 16, 32, 16), dtype="float32") = R.nn.attention_bias(q, k, v, bias, scale=T.float32(0.10000000000000001), causal_mask="TopLeft", window_size=None)
        return gv

error: module 'tvm.relax.op.nn' has no attribute 'attention_bias'
 --> <str>:9:58
   |  
 9 |          gv: R.Tensor((4, 16, 32, 16), dtype="float32") = R.nn.attention_bias(q, k, v, bias, scale=T.float32(0.10000000000000001), causal_mask="TopLeft", window_size=None)
   |                                                           ^^^^^^^^^^^^^^^^^^^ 

Steps to reproduce

irs= """# from tvm.script import ir as I
# from tvm.script import tir as T
# from tvm.script import relax as R

@I.ir_module
class Module:
    @R.function
    def main(q: R.Tensor((4, 16, 32, 8), dtype="float32"), k: R.Tensor((4, 8, 32, 8), dtype="float32"), v: R.Tensor((4, 8, 32, 16), dtype="float32"), bias: R.Tensor((4, 32, 16, 8), dtype="float32")) -> R.Tensor((4, 16, 32, 16), dtype="float32"):
        gv: R.Tensor((4, 16, 32, 16), dtype="float32") = R.nn.attention(q, k, v, bias, scale=T.float32(0.10000000000000001), causal_mask="TopLeft", window_size=None)
        return gv

import tvm
mod = tvm.script.from_source(irs)
mod.show()
mod_new = tvm.script.from_source(mod.script())   # crash!
"""

cc @Lunderberg @junrushao @Hzfengsy @tqchen

@Cookiee235 added the needs-triage and type: bug labels on Oct 23, 2024