[ET-VK][ez] Restrict batch norm operator to 4D input tensors only #16420
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16420
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: There is 1 currently active SEV. If your PR is affected, please view it below:
❌ 2 New Failures, 1 Unrelated Failure as of commit 45f8fa9 with merge base c730feb.
NEW FAILURES - The following jobs have failed:
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Addressed comments cc: @mergennachin
Review comment on the diff context:

    sample_inputs,
)

def test_vulkan_backend_batch_norm_after_linear(self):
Is this a positive case or a negative case?
Why not add both tests?
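For context, here is a minimal sketch of the two cases the question distinguishes, assuming conv- and linear-based modules along the lines of the existing Vulkan delegate tests (the module names, layer sizes, and sample shapes below are illustrative, not taken from the test suite). The conv module keeps a 4D activation, so its batch norm satisfies the new check; the linear module produces a 2D activation, so its batch norm should fall back to another backend, which per the change description would make `test_vulkan_backend_batch_norm_after_linear` the negative (fallback) case.

```python
# Illustrative sketch only: module and variable names are hypothetical and the
# lowering/test harness is omitted; the shapes show why one case is 4D and one is 2D.
import torch


class ConvBNModule(torch.nn.Module):
    """Positive case: conv2d output stays 4D (N, C, H, W), so batch norm can lower to Vulkan."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.bn = torch.nn.BatchNorm2d(8)

    def forward(self, x):
        return self.bn(self.conv(x))


class LinearBNModule(torch.nn.Module):
    """Negative case: linear output is 2D, so batch norm should not be lowered to Vulkan."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(16, 8)
        self.bn = torch.nn.BatchNorm1d(8)

    def forward(self, x):
        return self.bn(self.linear(x))


# Sample inputs matching each case.
conv_inputs = (torch.randn(1, 3, 16, 16),)  # 4D -> batch norm eligible for the Vulkan delegate
linear_inputs = (torch.randn(4, 16),)       # 2D -> batch norm falls back to another backend
```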
91cd545 into gh/SS-JIA/391/base
Stack from ghstack (oldest at bottom):
Batch normalization is typically used with 4D tensors (batch, channels, height, width) in convolutional neural networks. This change adds input validation to ensure batch norm is only lowered to the Vulkan backend when the input tensor is 4-dimensional. For other input shapes, the operator will fall back to other backends.
The implementation follows the same pattern as the convolution operator, using an `are_node_inputs_supported_fn` callback to validate input shapes during graph partitioning. This prevents potential issues with batch norm on unsupported tensor shapes and makes the operator requirements explicit.

Differential Revision: [D89935219](https://our.internmc.facebook.com/intern/diff/D89935219/)
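As a rough illustration of the validation described above (a sketch only, not the actual ExecuTorch source: the function name, reading the export-time shape from `node.meta["val"]`, and the surrounding registration are all assumptions here), an `are_node_inputs_supported_fn`-style callback for batch norm could check the rank of the node's first input and accept only 4D tensors:

```python
# Hypothetical sketch of a 4D-only input check for batch norm; the exact
# registration of this callback in the Vulkan partitioner is not shown.
import torch
from torch.fx import Node


def batch_norm_inputs_supported(node: Node) -> bool:
    """Return True only when the batch norm input tensor is 4D (N, C, H, W)."""
    input_arg = node.args[0]  # first arg of the batch norm call is the input tensor
    if not isinstance(input_arg, Node):
        return False
    # Exported graphs record the traced tensor (e.g. a FakeTensor) under
    # node.meta["val"]; its rank tells us whether the input is 4D.
    example_val = input_arg.meta.get("val")
    if not isinstance(example_val, torch.Tensor):
        return False
    return example_val.dim() == 4
```

Returning `False` here would simply keep the node out of the Vulkan partition during graph partitioning, so the operator falls back to another backend instead of failing later.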