Doubt in Probability Attention Module #2

Open
vidit98 opened this issue Dec 29, 2021 · 0 comments
vidit98 commented Dec 29, 2021

Thanks for the work. Could you explain the reasoning behind the condition on the value of gamma in the line below?

    if self.gamma < -0.01:
        # attention is bypassed entirely once gamma falls below -0.01
        out = x
        attention = None
    else:
        proj_value = x.view(B, -1, W * H)  # value: reshape to (B, C, H*W)

        proj_query = x.view(B, -1, W * H).permute(0, 2, 1)  # query: reshape & transpose to (B, H*W, C)
        proj_key = x.view(B, -1, W * H)  # key: reshape to (B, C, H*W)
        energy = torch.bmm(proj_query, proj_key)  # batch matrix multiplication, (B, H*W, H*W)
        attention = self.softmax(energy)  # attention weights over spatial positions

        out = torch.bmm(proj_value, attention.permute(0, 2, 1))  # (B, C, H*W)
        out = out.view(B, C, H, W)  # re-attended feature map

        out = self.gamma * out + x  # learnable residual blend of attended features and input

    return out, attention
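
For reference, here is a minimal, self-contained sketch of how I read this module. The class name, the zero initialization, and treating gamma as a learnable nn.Parameter are my assumptions for illustration, not the repo's actual code:

    import torch
    import torch.nn as nn

    class ProbAttention(nn.Module):
        # Hypothetical reconstruction of the module the snippet above comes from;
        # the class name and the gamma initialization are assumptions.
        def __init__(self, init_gamma=0.0):
            super().__init__()
            # gamma is learnable, so training can push it below -0.01,
            # at which point the forward pass skips attention entirely.
            self.gamma = nn.Parameter(torch.tensor(init_gamma))
            self.softmax = nn.Softmax(dim=-1)

        def forward(self, x):
            B, C, H, W = x.size()
            if self.gamma < -0.01:
                return x, None  # attention disabled
            proj_value = x.view(B, -1, W * H)                   # (B, C, H*W)
            proj_query = x.view(B, -1, W * H).permute(0, 2, 1)  # (B, H*W, C)
            proj_key = x.view(B, -1, W * H)                     # (B, C, H*W)
            energy = torch.bmm(proj_query, proj_key)            # (B, H*W, H*W)
            attention = self.softmax(energy)
            out = torch.bmm(proj_value, attention.permute(0, 2, 1)).view(B, C, H, W)
            return self.gamma * out + x, attention

    # Quick shape check:
    # y, attn = ProbAttention()(torch.randn(2, 16, 8, 8))
    # y.shape == (2, 16, 8, 8); attn.shape == (2, 64, 64)

If gamma starts near 0.0 as in this sketch, the guard only fires when training drives gamma below -0.01, and that is the behavior I am asking about.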