
[Bug] InstanceNorm2d with relax not supported #17842


Open
sidsingla opened this issue Apr 15, 2025 · 1 comment · May be fixed by #17885
Labels
needs-triage PRs or issues that need to be investigated by maintainers to find the right assignees to address it type: bug

Comments

@sidsingla

sidsingla commented Apr 15, 2025

Hi,
TVM version: 0.20.dev0

Importing a torch.fx graph that contains torch.nn.InstanceNorm2d fails in the Relax frontend. Here is the traceback:

Traceback (most recent call last):

File "/home/sidharth_modiface_com/looks_factory_tvm/looks-factory-transfer/convert_to_tvm.py", line 93, in <module>
  mod = from_fx(
        ^^^^^^^^
File "/opt/conda/envs/py311-looks-factory-tvm/lib/python3.11/site-packages/tvm/relax/frontend/torch/fx_translator.py", line 1064, in from_fx
  return TorchFXImporter().from_fx(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/envs/py311-looks-factory-tvm/lib/python3.11/site-packages/tvm/relax/frontend/torch/fx_translator.py", line 935, in from_fx
  assert (
AssertionError: Unsupported module type <class 'torch.nn.modules.instancenorm.InstanceNorm2d'>

Code snippet:

from tvm.relax.frontend.torch import from_fx

mod = from_fx(
        fx_graph,
        [((1, 3, 256, 256), "float32"), ((1, 1, 256, 256), "float32")],
        keep_params_as_input=True,
    )
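For reference, a self-contained reproduction along these lines might look like the sketch below. The TinyNet module is a hypothetical stand-in for the actual pix2pix-based model; any module containing nn.InstanceNorm2d triggers the same assertion.

```python
# Hypothetical minimal reproduction (TinyNet is a stand-in, not the real model).
# Tracing a module that contains nn.InstanceNorm2d and passing it to from_fx
# hits the "Unsupported module type" assertion on TVM 0.20.dev0.
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # affine=True gives the module learnable per-channel gamma/beta
        self.norm = nn.InstanceNorm2d(3, affine=True)

    def forward(self, x):
        return self.norm(x)


fx_graph = torch.fx.symbolic_trace(TinyNet())

# The call below is the one that fails:
# from tvm.relax.frontend.torch import from_fx
# mod = from_fx(fx_graph, [((1, 3, 256, 256), "float32")],
#               keep_params_as_input=True)
```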

The model is largely based on the pix2pix GAN architecture.
Thanks!

@sidsingla sidsingla added needs-triage PRs or issues that need to be investigated by maintainers to find the right assignees to address it type: bug labels Apr 15, 2025
@kavin-sai-krishna
Contributor

@mshr-h @tlopex @Hzfengsy
I’ve added the Relax-level instance_norm operator, following standard semantic checks and layout inference. Since topi.instance_norm already exists, I tried using it for legalization, but it produced incorrect results.

On debugging, I found that it reuses layer_norm, which expects gamma and beta to match an n-dimensional normalized_shape, whereas instance_norm requires them to be 1D with shape [C].
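To make that shape contrast concrete, here is a minimal pure-Python sketch (illustrative only, not the TVM implementation): instance_norm computes the mean and variance over H×W separately for each (n, c) plane, then scales and shifts with gamma/beta indexed by channel only, i.e. 1-D tensors of shape [C].

```python
# Illustrative sketch of instance_norm semantics (not TVM code).
# x is a nested list of shape (N, C, H, W); gamma and beta have shape [C].

def instance_norm(x, gamma, beta, eps=1e-5):
    """Normalize each (n, c) plane over its H*W elements, then apply
    per-channel gamma/beta -- the 1-D [C] shape instance_norm expects,
    unlike layer_norm's n-dimensional normalized_shape."""
    n, c, h, w = len(x), len(x[0]), len(x[0][0]), len(x[0][0][0])
    out = [[[[0.0] * w for _ in range(h)] for _ in range(c)] for _ in range(n)]
    for i in range(n):
        for j in range(c):
            plane = [v for row in x[i][j] for v in row]
            mean = sum(plane) / len(plane)
            var = sum((v - mean) ** 2 for v in plane) / len(plane)
            scale = gamma[j] / (var + eps) ** 0.5  # gamma indexed by channel only
            for y in range(h):
                for z in range(w):
                    out[i][j][y][z] = (x[i][j][y][z] - mean) * scale + beta[j]
    return out
```

With gamma all ones and beta all zeros, each (n, c) plane of the output has zero mean, which is a quick sanity check against a reference implementation.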

In this PR #17885, I've implemented the legalization assuming gamma and beta have shape [C].

Could you please confirm if this approach is correct or if I’ve missed anything? Thank you.
