    Inference: add ONNXRuntime back-end (#39988) · 431afc39
    Committed by heliqi
    * add onnxruntime predictor (a usage sketch follows after this list)
    
    * Add code comments
    
    * support link paddle2onnx onnxruntime
    
    * support onnxruntime with python
    
    * support onnxruntime with python
    
    * support onnxruntime with windows
    
    * paddle2onnx compile with windows
    
    * support windows compile
    
    * support windows compile with onnxruntime
    
    * support windows compile with paddle2onnx
    
    * support mac compile
    
    * compile with mac
    
    * compile with mac
    
    * add code comments
    
    * fix reminder wording
    
    * code optimization
    
    * add test case
    
    * add test case
    
    * add inference demo_ci test case
    
    * fix compile paddle2onnx with no python
    
    * add inference demo_ci test case
    
    * add inference demo_ci test case
    
    * add inference infer_ut test case
    
    * support C/Go API and test cases
    
    * add coverage test case
    
    * add coverage test case
    
    * add capi test case
    
    * add capi test case
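    
    For orientation, here is a minimal C++ usage sketch of the new back-end. It assumes the Config switches EnableONNXRuntime() and EnableORTOptimization() introduced by this change, and uses an illustrative model path and input shape; it is a sketch, not the authoritative API reference.
    
    ```cpp
    #include <vector>
    #include "paddle_inference_api.h"
    
    int main() {
      paddle_infer::Config config;
      // Hypothetical model files; replace with a real exported Paddle model.
      config.SetModel("model.pdmodel", "model.pdiparams");
    
      // Route execution through the ONNXRuntime back-end added by this commit
      // (the model is converted via Paddle2ONNX and run by ONNXRuntime).
      config.EnableONNXRuntime();
      // Optionally let ONNXRuntime apply its own graph optimizations.
      config.EnableORTOptimization();
    
      auto predictor = paddle_infer::CreatePredictor(config);
    
      // Feed one input tensor; the 1x3x224x224 shape is illustrative only.
      auto input = predictor->GetInputHandle(predictor->GetInputNames()[0]);
      std::vector<float> data(1 * 3 * 224 * 224, 0.f);
      input->Reshape({1, 3, 224, 224});
      input->CopyFromCpu(data.data());
    
      predictor->Run();
    
      // Copy the first output back to host memory.
      auto output = predictor->GetOutputHandle(predictor->GetOutputNames()[0]);
      int num = 1;
      for (int d : output->shape()) num *= d;
      std::vector<float> result(num);
      output->CopyToCpu(result.data());
      return 0;
    }
    ```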