[Bug] TVM fails to compile ONNX model: TVMError: CodeGenVM cannot handle this intrinsic now: Op(relax.concat) #17867

@coffezhou

Description

Expected behavior

TVM should build the model correctly.

Actual behavior

When compiling the model, TVM crashes as follows:

Traceback (most recent call last):
  File "/home/carla/Documents/test/test.py", line 189, in <module>
    main()
  File "/home/carla/Documents/test/test.py", line 178, in main
    check_correctness(onnx_model)
  File "/home/carla/Documents/test/test.py", line 126, in check_correctness
    ex = relax.build(tvm_model, target="llvm", relax_pipeline=relax_pipeline)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/vm_build.py", line 258, in build
    mod = _vmcodegen(builder, mod, exec_mode)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/vm_build.py", line 77, in _vmcodegen
    return _ffi_api.VMCodeGen(builder, mod)  # type:ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "tvm/_ffi/_cython/./packed_func.pxi", line 339, in tvm._ffi._cy3.core.PackedFuncBase.__call__
  File "tvm/_ffi/_cython/./packed_func.pxi", line 270, in tvm._ffi._cy3.core.FuncCall
  File "tvm/_ffi/_cython/./packed_func.pxi", line 259, in tvm._ffi._cy3.core.FuncCall3
  File "tvm/_ffi/_cython/./base.pxi", line 185, in tvm._ffi._cy3.core.CHECK_CALL
  File "/home/carla/Documents/tvm/python/tvm/_ffi/base.py", line 468, in raise_last_ffi_error
    raise py_err
tvm._ffi.base.TVMError: Traceback (most recent call last):
  7: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::relax::ExecBuilder, tvm::IRModule)>::AssignTypedLambda<tvm::IRModule (*)(tvm::relax::ExecBuilder, tvm::IRModule)>(tvm::IRModule (*)(tvm::relax::ExecBuilder, tvm::IRModule), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  6: tvm::relax::relax_vm::VMCodeGen(tvm::relax::ExecBuilder, tvm::IRModule)
  5: tvm::relax::relax_vm::CodeGenVM::Run(tvm::relax::ExecBuilder, tvm::IRModule)
  4: tvm::relax::relax_vm::CodeGenVM::Codegen(tvm::relax::Function const&)
  3: tvm::relax::ExprFunctor<tvm::runtime::relax_vm::Instruction::Arg (tvm::RelaxExpr const&)>::VisitExpr(tvm::RelaxExpr const&)
  2: tvm::relax::relax_vm::CodeGenVM::VisitExpr_(tvm::relax::SeqExprNode const*)
  1: tvm::relax::ExprFunctor<tvm::runtime::relax_vm::Instruction::Arg (tvm::RelaxExpr const&)>::VisitExpr(tvm::RelaxExpr const&)
  0: tvm::relax::relax_vm::CodeGenVM::VisitExpr_(tvm::relax::CallNode const*)
  File "/home/carla/Documents/tvm/src/relax/backend/vm/codegen_vm.cc", line 156
TVMError: CodeGenVM cannot handle this intrinsic now:
Op(relax.concat)

Environment

OS: Ubuntu 20.04
TVM: 0.21.dev0 (c00f52a)

Steps to reproduce

This bug can be reproduced with the following code and the model in the attachment. As the code shows, the model executes correctly under onnxruntime, which indicates that it is a valid model. It is strange that CodeGenVM cannot handle the concat operator.

import sys
import pickle

import numpy as np
import onnx
import onnxruntime

import tvm
from tvm import relax
from tvm.relax.frontend.onnx import from_onnx


def main():
    onnx_model = onnx.load("a317.onnx")

    with open("inputs.pkl", "rb") as fp:
        inputs = pickle.load(fp)

    # Sanity check: the model runs under onnxruntime, so it is valid.
    try:
        ort_session = onnxruntime.InferenceSession(
            onnx_model.SerializeToString(), providers=["CPUExecutionProvider"]
        )
        ort_output = ort_session.run([], inputs)
    except Exception as e:
        print(e)
        sys.exit(1)

    # Import into Relax and lower the high-level ops.
    tvm_model = from_onnx(onnx_model, keep_params_in_input=True)
    tvm_model = relax.transform.DecomposeOpsForInference()(tvm_model)
    tvm_model = relax.transform.LegalizeOps()(tvm_model)

    tvm_model, params = relax.frontend.detach_params(tvm_model)

    # Crashes here: CodeGenVM cannot handle Op(relax.concat).
    with tvm.transform.PassContext(opt_level=0):
        ex = relax.build(tvm_model, target="llvm")


if __name__ == "__main__":
    main()
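Since `relax.transform.LegalizeOps()` is expected to lower high-level ops such as `relax.concat` to TIR before VM codegen runs, a quick way to narrow the bug down is to check whether any `relax.concat` calls survive legalization. The sketch below is a hypothetical diagnostic helper (`find_leftover_ops` is not part of the repro or of TVM's API); it simply scans the printed IRModule text for a given op name.

```python
# Hypothetical diagnostic helper, not part of the repro above or of TVM:
# scan a printed IRModule for calls to a given high-level op.

def find_leftover_ops(script_text: str, op_name: str) -> list:
    """Return the (stripped) lines of a printed IRModule that mention op_name."""
    return [line.strip() for line in script_text.splitlines() if op_name in line]

# In the repro above, one would call it on the legalized module, e.g.:
#     leftover = find_leftover_ops(tvm_model.script(), "relax.concat")
#     print(leftover)
# A non-empty result would mean LegalizeOps left relax.concat in the module,
# which is exactly what CodeGenVM then refuses to handle.
```

This does not fix anything, but it distinguishes "LegalizeOps skipped the op" from "the op was reintroduced by a later pass", which should help triage.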

testcase.zip

Triage

  • needs-triage
