Similar to #1346, conv also has asymmetric padding cases, especially in frameworks that support 'SAME' padding. However, adding this support will have a much larger impact on the current code than it did for the pooling ops, so we file this RFC to discuss its necessity and implementation details.
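As background, here is a minimal sketch (not TVM code; the helper name is hypothetical) of how TensorFlow-style 'SAME' padding is computed along one axis. Whenever the total padding is odd, the pad before and the pad after the data necessarily differ, which is exactly the asymmetric case this RFC targets:

```python
def same_padding_1d(in_size, kernel, stride):
    """Return (pad_before, pad_after) for 'SAME' padding along one axis,
    following the TensorFlow convention of putting the extra pixel at the
    end when the total padding is odd."""
    out_size = (in_size + stride - 1) // stride               # ceil(in/stride)
    pad_total = max((out_size - 1) * stride + kernel - in_size, 0)
    pad_before = pad_total // 2
    pad_after = pad_total - pad_before
    return pad_before, pad_after

print(same_padding_1d(7, 3, 2))   # symmetric case:  (1, 1)
print(same_padding_1d(6, 3, 2))   # asymmetric case: (0, 1)
```

A symmetric (pad_h, pad_w) attribute cannot represent the second case, which is why frontends currently fall back to a separate pad op.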
Necessity
In the current tensorflow frontend, conv padding is supported by inserting a separate pad nnvm op. Consequently, the separate pad op cannot be fused into conv, which hurts performance. Supporting asymmetric padding natively in relay/nnvm solves this problem.
Breakdown
We propose to support it in 3 steps:
Add asymmetric padding (pad_top, pad_left, pad_bottom, pad_right) support for conv/deconv in relay/nnvm and generic topi.
Modify _get_workload behavior to accommodate the new 4-number padding style, and change the backend-specific topi schedules accordingly.
Add support in the tensorflow/keras/coreml frontends: eliminate the separate pad op and use attr['padding'] instead.
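For step 2, one backward-compatible option is to normalize whatever padding attribute arrives into the 4-number form before building the workload. A sketch, assuming a hypothetical helper (the actual _get_workload change may differ):

```python
def normalize_padding(padding):
    """Hypothetical helper: expand an int, 2-number, or 4-number padding
    attribute into the proposed (pad_top, pad_left, pad_bottom, pad_right)
    form, so existing symmetric callers keep working unchanged."""
    if isinstance(padding, int):
        return (padding,) * 4
    padding = tuple(padding)
    if len(padding) == 2:                     # legacy symmetric (pad_h, pad_w)
        ph, pw = padding
        return (ph, pw, ph, pw)
    if len(padding) == 4:                     # already the new style
        return padding
    raise ValueError("padding must have 1, 2, or 4 elements")
```

Normalizing at this single choke point would let the backend-specific schedules assume one canonical representation instead of each handling both styles.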
@srkreddy1238 @FrozenGene