🚀 The feature, motivation and pitch
Currently the XNNPACK backend only supports batch norm when it follows a convolution, because the batch norm can be fused into the preceding convolution operation. The same fusion is possible for Linear layers: the batch norm parameters can be folded into the previous linear's weight and bias. Take a look at the code for fusing with conv, and add a pass that does the same for linear:
https://github.com/pytorch/executorch/blob/main/backends/xnnpack/_passes/fuse_batch_norm_with_conv.py
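For reference, the arithmetic the pass would perform is the standard linear/batch-norm fold. This is a minimal sketch (the helper name `fuse_linear_bn` and its signature are illustrative, not part of the ExecuTorch API):

```python
import torch

def fuse_linear_bn(linear_w, linear_b, bn_mean, bn_var, bn_gamma, bn_beta, eps=1e-5):
    """Fold BatchNorm1d parameters into the preceding Linear's weight/bias.

    y = gamma * (Wx + b - mean) / sqrt(var + eps) + beta
      = (gamma / sqrt(var + eps)) * W x
        + (gamma * (b - mean) / sqrt(var + eps) + beta)
    """
    scale = bn_gamma / torch.sqrt(bn_var + eps)   # per-output-feature scale
    fused_w = linear_w * scale.unsqueeze(1)       # scale each output row of W
    fused_b = (linear_b - bn_mean) * scale + bn_beta
    return fused_w, fused_b

# Sanity check: fused linear matches linear -> batchnorm in eval mode.
lin = torch.nn.Linear(4, 3)
bn = torch.nn.BatchNorm1d(3).eval()
bn.running_mean.uniform_(-1, 1)
bn.running_var.uniform_(0.5, 2)
x = torch.randn(8, 4)
fw, fb = fuse_linear_bn(lin.weight, lin.bias, bn.running_mean,
                        bn.running_var, bn.weight, bn.bias, bn.eps)
assert torch.allclose(bn(lin(x)), torch.nn.functional.linear(x, fw, fb), atol=1e-6)
```

The actual pass would do this rewrite on the exported graph, mirroring how the conv pass matches a `linear -> batch_norm` pattern, replaces the linear's parameters with the fused values, and removes the batch norm node.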
Alternatives
No response
Additional context
No response
RFC (Optional)
No response