about the 2-warp loss #4

@horizon0408

Description

Hi,

Sorry to bother you again.

According to your paper, the 2-warp consistency loss should compare the reconstruction $\ddot{I}$ obtained via two warps with the reconstruction $\tilde{I}$ obtained directly.
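If I read the paper correctly, the intended loss would be something like the following (a sketch assuming an L1 photometric penalty over pixels $p$; the exact distance used in the paper may differ):

```latex
\mathcal{L}_{\text{2warp}} = \sum_{p} \left\lVert \ddot{I}(p) - \tilde{I}(p) \right\rVert_{1}
```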

However, in your train.py code, taking module (a) as an example:

if args.type_of_2warp == 1:
    mask_4 = [fw_mask[i][[2,3]] for i in range(4)]
    warp2_est_4 = [Resample2d()(left_est[i][[0,1]], disp_est_scale[i][[2,3]]) for i in range(4)]
    loss += 0.1 * sum([warp_2(warp2_est_4[i], left_pyramid[i][[6,7]], mask_4[i], args) for i in range(4)])
    mask_5 = [bw_mask[i][[2,3]] for i in range(4)]
    warp2_est_5 = [Resample2d()(left_est[i][[6,7]], disp_est_scale_2[i][[2,3]]) for i in range(4)]
    loss += 0.1 * sum([warp_2(warp2_est_5[i], left_pyramid[i][[0,1]], mask_5[i], args) for i in range(4)])

From my understanding, warp2_est_4 corresponds to $\ddot{L_1}$ and warp2_est_5 corresponds to $\ddot{L_2}$, but you use left_pyramid[i][[6,7]] and left_pyramid[i][[0,1]] as the comparison targets in warp_2. That means the loss is computed against the original L1 and L2, rather than against the directly reconstructed results $\tilde{L_1}$ and $\tilde{L_2}$ (I think these should be left_est[i][[6,7]] and left_est[i][[0,1]]?).
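To make the question concrete, here is a minimal toy sketch of the two alternatives. The `resample2d` and `warp_2` below are simplified stand-ins I wrote for illustration, not the repository's actual Resample2d and warp_2; only the batch indexing and the choice of comparison tensor mirror the snippet above.

```python
import numpy as np

def resample2d(img, disp):
    # Toy warp: identity resample. The real Resample2d performs a bilinear
    # warp of `img` by the displacement field `disp`.
    return img

def warp_2(est, target, mask):
    # Toy 2-warp loss: masked L1 difference between the twice-warped
    # estimate and the comparison image.
    return np.abs((est - target) * mask).mean()

# Dummy single pyramid level: batch of 8 images, 1 channel, 4x4 pixels.
left_pyramid = np.random.rand(8, 1, 4, 4)   # original images
left_est = left_pyramid + 0.1               # pretend direct reconstructions
disp = np.zeros((8, 2, 4, 4))               # dummy displacement fields
mask = np.ones((8, 1, 4, 4))                # dummy occlusion masks

# Two warps: warp the first reconstructions again, as in warp2_est_4.
warp2_est_4 = resample2d(left_est[[0, 1]], disp[[2, 3]])

# Current code: compares against the ORIGINAL images L1.
loss_current = warp_2(warp2_est_4, left_pyramid[[6, 7]], mask[[2, 3]])

# Proposed reading of the paper: compare against the directly
# reconstructed images \tilde{L_1}.
loss_proposed = warp_2(warp2_est_4, left_est[[6, 7]], mask[[2, 3]])
```

The two variants differ only in whether left_pyramid or left_est supplies the comparison target, which is exactly the point of the question.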

Is there a typo here, or am I misunderstanding something?

Looking forward to your reply. Appreciate your help!
