dev(svd): add support for svd #70
base: main
Conversation
gausshj commented on Oct 9, 2025
- Add support for svd
- Add test for svd

Signed-off-by: Gausshj <[email protected]>
Let's review the business logic first and fix the numerical SVD part; once you have revised that, we can discuss code style.

After the numerical SVD, the part that restores the Grassmann tensor looks fine.
grassmann_tensor/tensor.py (Outdated)

```python
tensor = tensor.reshape((left_dim, right_dim))
U, S, Vh = torch.linalg.svd(tensor.tensor, full_matrices=full_matrices)
```
Here `tensor` is a matrix with two parity blocks; you need to perform SVD on each block separately.
Otherwise, the original block matrix will no longer be block-structured after the SVD.
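The block-wise decomposition suggested above can be sketched as follows. This is a minimal illustration, not the PR's code: NumPy's `linalg.svd` is used because it shares its signature with `torch.linalg.svd`, the block sizes are arbitrary, and the `block_diag` helper is a hand-rolled stand-in for `torch.block_diag`.

```python
import numpy as np

# Two parity blocks of a block-diagonal matrix (sizes are illustrative).
rng = np.random.default_rng(0)
even = rng.standard_normal((2, 2))   # even-parity block
odd = rng.standard_normal((3, 3))    # odd-parity block

# SVD each parity block separately so the factors stay block-diagonal.
U_e, S_e, Vh_e = np.linalg.svd(even, full_matrices=False)
U_o, S_o, Vh_o = np.linalg.svd(odd, full_matrices=False)

def block_diag(a, b):
    """Assemble two blocks into one block-diagonal matrix (like torch.block_diag)."""
    out = np.zeros((a.shape[0] + b.shape[0], a.shape[1] + b.shape[1]))
    out[:a.shape[0], :a.shape[1]] = a
    out[a.shape[0]:, a.shape[1]:] = b
    return out

U = block_diag(U_e, U_o)
S = np.concatenate([S_e, S_o])
Vh = block_diag(Vh_e, Vh_o)

# The recombined factors reproduce the original block-diagonal matrix,
# and U and Vh themselves remain block-diagonal, preserving parity.
full = block_diag(even, odd)
assert np.allclose(U @ np.diag(S) @ Vh, full)
```

By contrast, an SVD of the full 5x5 matrix would generally mix the two parity sectors, which is exactly the problem the comment points out.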
grassmann_tensor/tensor.py (Outdated)

```python
U, S, Vh = torch.linalg.svd(tensor.tensor, full_matrices=full_matrices)
k = min(tensor.tensor.shape[0], tensor.tensor.shape[-1])
k_index = tensor.tensor.shape.index(k)
```
After performing the SVD on each block separately, you also need to support a cut-dimension operation, which is very common in tensor networks. Roughly: drop the smallest singular values and keep only the largest few. The number to keep should be passed in as a parameter, with no cut performed by default, and the two blocks here need to be cut separately.
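The truncation requested above might look like the sketch below. The names `truncate_block` and `cut` are hypothetical, not the PR's API; it relies on `numpy.linalg.svd` (same signature as `torch.linalg.svd`) returning singular values in descending order, so slicing keeps the largest ones. In the PR this would be applied once per parity block.

```python
import numpy as np

def truncate_block(U, S, Vh, cut=None):
    """Keep only the `cut` largest singular values; cut=None means no truncation (default)."""
    if cut is None:
        return U, S, Vh
    # Singular values come back sorted in descending order,
    # so the first `cut` columns/rows are the ones to keep.
    return U[:, :cut], S[:cut], Vh[:cut, :]

rng = np.random.default_rng(1)
block = rng.standard_normal((4, 4))   # one parity block (size is illustrative)
U, S, Vh = np.linalg.svd(block, full_matrices=False)

# Cut down to the 2 largest singular values of this block.
U2, S2, Vh2 = truncate_block(U, S, Vh, cut=2)
assert U2.shape == (4, 2) and S2.shape == (2,) and Vh2.shape == (2, 4)
```

The truncated product `U2 @ np.diag(S2) @ Vh2` is then the best rank-2 approximation of the block in the Frobenius norm, which is why this cut is the standard compression step in tensor-network algorithms.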
- Perform SVD separately on even/odd parity blocks instead of the entire rank-2 tensor
- Update the test case for the new implementation

Previously, the SVD was applied directly to the full Grassmann tensor, which ignored the parity block structure and produced incorrect decompositions. Now the tensor is split into even/odd blocks before performing SVD, then recombined via block_diag to ensure correct parity preservation.

Signed-off-by: Gausshj <[email protected]>