Based on the inference code provided by @vcadillog (thank you!), I encountered the following error when running the code:
```
Traceback (most recent call last):
  File "/home/sschneider/projects/hulk/.env/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 69, in _wrap
    fn(i, *args)
  File "/home/sschneider/projects/hulk/test.py", line 24, in multi_inference
    pipeline = HumanHulk(device)
  File "/home/sschneider/projects/hulk/inference_model.py", line 209, in __init__
    self.det_model = self.load_model(config, path_detect, 0)
  File "/home/sschneider/projects/hulk/inference_model.py", line 269, in load_model
    model = create_model(config, self.device)
  File "/home/sschneider/projects/hulk/inference_model.py", line 114, in create_model
    decoder_module = decoders.decoder_entry(config.decoder)
  File "/home/sschneider/projects/hulk/core/models/decoders/__init__.py", line 5, in decoder_entry
    return globals()[config['type']](**config['kwargs'])
  File "/home/sschneider/projects/hulk/core/models/decoders/mask2former/meta_arch/unihcpv2_head.py", line 79, in __init__
    self.predictor = Hulk_Decoder(in_channels=neck.vis_token_dim,
  File "/home/sschneider/projects/hulk/core/models/decoders/mask2former/transformer_decoder/hulk_decoder.py", line 690, in __init__
    anchor_points = np.load(self.peddet_cfgs.get('pre_defined_path'))
  File "/home/sschneider/projects/hulk/.env/lib/python3.10/site-packages/numpy/lib/npyio.py", line 427, in load
    fid = stack.enter_context(open(os_fspath(file), "rb"))
FileNotFoundError: [Errno 2] No such file or directory: '289_points_3d.npy'
```
Where does the file 289_points_3d.npy come from? Many thanks in advance.
Hi,
I commented out that line in my inference code, since the file was not present in the repository. The corresponding weights are contained in the released model, so whatever the anchors are initialized to gets overwritten by the complete model weights once the checkpoint is loaded (see the sketch below). The file is probably only required when training the model from scratch.
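To illustrate why skipping the file is harmless at inference time: a buffer initialized from the `.npy` is replaced wholesale when the full state dict is loaded. A minimal, hypothetical sketch (a toy module, not Hulk's actual code), assuming the anchors live in a registered buffer:

```python
import torch
import torch.nn as nn

# Toy stand-in for Hulk_Decoder: the anchor buffer would normally be
# initialized from np.load('289_points_3d.npy'); here we use zeros.
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer("anchor_points", torch.zeros(289, 2))

m = Toy()
ckpt = {"anchor_points": torch.rand(289, 2)}  # pretend full checkpoint
m.load_state_dict(ckpt)                       # overwrites the placeholder
print(torch.equal(m.anchor_points, ckpt["anchor_points"]))  # True
```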
289_points_3d.npy contains the coordinates of 17 × 17 = 289 points covering the 1 × 1 space. These coordinates serve as anchor points for the detection boxes. I have now uploaded the missing file. Sorry for the late response.
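For reference, a grid matching that description can be generated in a few lines of NumPy. This is only a sketch inferred from the comment above (a 17 × 17 grid covering the unit square); the exact ordering, dtype, and shape of the released 289_points_3d.npy (e.g. whether a third coordinate is stored, given the `_3d` suffix) are assumptions, so the uploaded file should be used rather than a regenerated one:

```python
import numpy as np

# Sketch only: a 17 x 17 = 289 grid of anchor points covering [0, 1] x [0, 1].
n = 17
xs = np.linspace(0.0, 1.0, n)                 # 17 evenly spaced coordinates
grid_y, grid_x = np.meshgrid(xs, xs, indexing="ij")
anchor_points = np.stack([grid_x, grid_y], axis=-1).reshape(-1, 2)  # (289, 2)

np.save("289_points_3d.npy", anchor_points.astype(np.float32))
print(anchor_points.shape)  # (289, 2)
```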