This issue is to track remaining aspects and issues of the PyTorch (#1120) ONNX export.

- [x] Working script for conversion (`export_to_onnx.py`)
- [x] Fix issue with convolution
- [x] Rename script to `torch_export_to_onnx.py`
- [x] Check `model_outputs`.
- [x] Use `model_outputs` dims when not specified in `mark_as_output` (esp. in case of PT).
- [x] The input names for sizes should be better (currently they are like "classes:size1" but should be like "data:size1"). (Fixed via #1362.)
- [x] The batch dim (`data:size0`), or any scalar `dyn_size_ext`, should not be an input, as it is redundant. (Fixed via #1362.)
- [x] `model_outputs`: ignore dim-tag equality; check static dims and dyn-dims information.
- [x] Out seq lens are actually relevant and should not be filled with dummy values? Clarify. (https://github.com/rwth-i6/returnn/issues/1333#issuecomment-1625179457) (Fixed via #1362.)
- [x] Working demo-rf + test case
- [x] Test case: also perform ONNX inference. (Fixed via #1362.)
- [ ] RF Conformer works
- [ ] Some real-world pure PT model works
- [ ] Running the resulting ONNX model in RASR
- [x] Writing a sisyphus job to run ONNX export in i6_core. (Fixed via https://github.com/rwth-i6/i6_core/pull/429.)

Other things which are less important for now:

- Avoiding `TracerWarning`s for some sanity-checking code: this does not seem easily possible.