
Write tests for .log_abs_det_jacobian() methods #110

Open

fritzo opened this issue Jan 28, 2018 · 1 comment

fritzo commented Jan 28, 2018

I think we could do this by numerically approximating the Jacobian via finite differences (a sketch of that follows below). Alternatively, we could stack a bunch of grad() calls and compute the determinant, as in this suggestion:

import torch
from torch.autograd import grad

def test_jacobian(self):
    t = MyTransform()
    for x in my_test_points:
        x.requires_grad_()  # needed so grad() can differentiate through t
        y = t(x)
        # Stack one grad() call per output coordinate to build the Jacobian.
        jacobian = torch.stack([grad(yi, [x], create_graph=True)[0] for yi in y])
        # potrf is a Cholesky factorization, so this assumes the Jacobian is
        # positive definite; det(J) = prod(diag(U))**2 for J = U^T U.
        det_jacobian = torch.potrf(jacobian).diag().prod() ** 2
        log_abs_det_jacobian = det_jacobian.abs().log()
        self.assertEqual(t.log_abs_det_jacobian(x, y), log_abs_det_jacobian)

It would be nice to make this test generic like the other tests in TestTransforms.
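
For the finite-difference alternative, a minimal sketch might look like this (numeric_jacobian and the eps value are illustrative placeholders, not an existing torch API):

import torch

def numeric_jacobian(f, x, eps=1e-5):
    # Central differences: build one column of the Jacobian per input coordinate.
    x = x.detach()
    cols = []
    for i in range(x.numel()):
        dx = torch.zeros_like(x)
        dx.view(-1)[i] = eps
        cols.append((f(x + dx) - f(x - dx)).reshape(-1) / (2 * eps))
    return torch.stack(cols, dim=-1)

# e.g. compare t.log_abs_det_jacobian(x, y) against
# numeric_jacobian(t, x).det().abs().log() up to numerical tolerance.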

Note that this might not work for StickBreakingTransform, whose Jacobian is non-square because the source and target spaces have different dimensions.

fritzo changed the title from Write finite-difference tests for .log_abs_det_jacobian() methods to Write tests for .log_abs_det_jacobian() methods on Jan 29, 2018

ssnl commented Mar 27, 2018

BTW, we now have torch.slogdet :)
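
For what it's worth, that would sidestep the positive-definiteness assumption in the snippet above; a minimal sketch, reusing the jacobian tensor from that test:

# torch.slogdet returns (sign, log|det|), so no Cholesky factorization is needed.
log_abs_det_jacobian = torch.slogdet(jacobian)[1]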
