Support for multi-batch processing in torchsearchsorted #10

Open
1 task
mihaimorariu opened this issue Jan 25, 2021 · 0 comments
Labels
feature New feature request low priority Low priority items size-XS Very small item (a few hours to 1-2 days of work)

Comments

@mihaimorariu
Contributor

The implementation of torchsearchsorted that is currently in use does not support multi-batch processing. As a workaround, NeRF training handles batch sizes larger than one with a Python for loop, which significantly slows down training. This needs to be fixed.
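Not a prescribed fix, just a minimal sketch of what dropping the loop could look like if the code moves to PyTorch's built-in torch.searchsorted (available since PyTorch 1.6), which accepts batched inputs natively. The function names, tensor names, and shapes below are illustrative assumptions, not the actual training code.

```python
import torch

def searchsorted_loop(cdf, u):
    # Hypothetical stand-in for the current per-sample loop.
    # cdf: [B, N_bins] (sorted along the last dim), u: [B, N_samples]
    inds = [torch.searchsorted(cdf[b], u[b], right=True) for b in range(cdf.shape[0])]
    return torch.stack(inds, dim=0)

def searchsorted_batched(cdf, u):
    # torch.searchsorted handles batched inputs directly when the leading
    # dimensions of `cdf` and `u` match, so the Python loop can be dropped.
    return torch.searchsorted(cdf, u, right=True)

# Quick equivalence check on random data
cdf = torch.sort(torch.rand(4, 64), dim=-1).values
u = torch.rand(4, 128)
assert torch.equal(searchsorted_loop(cdf, u), searchsorted_batched(cdf, u))
```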

Tasks to be completed

  • TODO

Definition of Done
Training NeRF with batch size > 1 yields a similar PSNR on the evaluation set after the for loop is removed and replaced with a multi-batch-capable torchsearchsorted.

@mihaimorariu mihaimorariu transferred this issue from another repository Aug 17, 2021
@mihaimorariu mihaimorariu added feature New feature request low priority Low priority items size-XS Very small item (a few hours to 1-2 days of work) labels Aug 17, 2021