```
  File "/workplace/software/tvm/tvm_/python/tvm/driver/tvmc/frontends.py", line 169, in load
    return relay.frontend.from_onnx(model, shape=shape_dict, **kwargs)
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/onnx.py", line 7346, in from_onnx
    mod, params = g.from_onnx(graph, opset)
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/onnx.py", line 6963, in from_onnx
    self._construct_nodes(graph)
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/onnx.py", line 7078, in _construct_nodes
    op = self._convert_operator(op_name, inputs, attr, self.opset)
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/onnx.py", line 7204, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params)
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/onnx.py", line 2486, in _impl_v1
    dim = int(infer_value(dim, params).numpy())
  File "/workplace/software/tvm/tvm_/python/tvm/relay/frontend/common.py", line 547, in infer_value
    ), "All inputs to infer must be available in params."
AssertionError: All inputs to infer must be available in params.
```
For this simple ONNX model, compilation crashes at the `infer_value` line shown in the traceback above; it seems that the ONNXConverter can't parse the axis value.
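The assertion comes from `infer_value` in `relay/frontend/common.py`. A simplified plain-Python sketch of the guard that fires (this mirrors only the assertion, not TVM's actual constant-folding; the function body here is a hypothetical stand-in):

```python
# Sketch (assumption): infer_value can only constant-fold an expression
# when every free variable it references is bound in `params`. The CumSum
# converter calls infer_value on the `axis` input; if that input is not a
# constant initializer, the check below fails.

def infer_value(free_vars, params):
    """Require every input of the expression to be a known constant."""
    assert all(var in params for var in free_vars), \
        "All inputs to infer must be available in params."
    # Real TVM builds and executes a small module here; the sketch just
    # returns the bound constants.
    return {var: params[var] for var in free_vars}

constants = {"x": [1, 2, 3]}             # 'axis' is not a known constant
try:
    infer_value(["x", "axis"], constants)
except AssertionError as err:
    print(err)  # -> All inputs to infer must be available in params.
```

This matches the failure mode: a CumSum `axis` that is a runtime value rather than a baked-in constant cannot be resolved by the converter.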
Steps to reproduce
```
tvmc compile --target 'llvm' cum_sum.onnx
```

Triage
cc @KJlaccHoeUM9l
@vvchernov @tqchen Could you take a look? Is it an ONNXConverter bug?