Error reproducing the housing-price prediction demo locally on Ubuntu 16.04
Created by: liangruofei
The server side, run inside Docker, starts without problems. Its output is as follows:
[root@1e9ea907d737 opt]# python3 -m paddle_serving_server_gpu.serve --thread 10 --model uci_housing_model --port 9292 --gpu_id 0 --use_multilang
mkdir: cannot create directory ‘workdir_0’: File exists
Going to Run Comand
/usr/local/lib/python3.6/site-packages/paddle_serving_server_gpu/serving-gpu-0.3.1/serving -enable_model_toolkit -inferservice_path workdir_0 -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 10 -port 12000 -reload_interval_s 10 -resource_path workdir_0 -resource_file resource.prototxt -workflow_path workdir_0 -workflow_file workflow.prototxt -bthread_concurrency 10 -gpuid 0 -max_body_size 536870912
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0913 12:19:05.714517 362 naming_service_thread.cpp:209] brpc::policy::ListNamingService("0.0.0.0:12000"): added 1
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralDistKVInferOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralTextReaderOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralCopyOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralDistKVQuantInferOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralReaderOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralInferOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralTextResponseOp
I0100 00:00:00.000000 380 op_repository.h:65] RAW: Succ regist op: GeneralResponseOp
I0100 00:00:00.000000 380 service_manager.h:61] RAW: Service[LoadGeneralModelService] insert successfully!
I0100 00:00:00.000000 380 load_general_model_service.pb.h:299] RAW: Success regist service[LoadGeneralModelService][PN5baidu14paddle_serving9predictor26load_general_model_service27LoadGeneralModelServiceImplE]
I0100 00:00:00.000000 380 service_manager.h:61] RAW: Service[GeneralModelService] insert successfully!
I0100 00:00:00.000000 380 general_model_service.pb.h:1473] RAW: Success regist service[GeneralModelService][PN5baidu14paddle_serving9predictor13general_model23GeneralModelServiceImplE]
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_ANALYSIS, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:27] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidGpuAnalysisCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_ANALYSIS in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_ANALYSIS_DIR, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:33] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidGpuAnalysisDirCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_ANALYSIS_DIR in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_ANALYSIS_DIR_SIGMOID, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:39] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidGpuAnalysisDirWithSigmoidCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_ANALYSIS_DIR_SIGMOID in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_NATIVE, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:44] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidGpuNativeCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_NATIVE in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_NATIVE_DIR, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:49] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidGpuNativeDirCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_NATIVE_DIR in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_GPU_NATIVE_DIR_SIGMOID, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_gpu_engine.cpp:55] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidGpuNativeDirWithSigmoidCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_GPU_NATIVE_DIR_SIGMOID in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_ANALYSIS, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:25] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidCpuAnalysisCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_ANALYSIS in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_ANALYSIS_DIR, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:31] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidCpuAnalysisDirCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_ANALYSIS_DIR in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_ANALYSIS_DIR_SIGMOID, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:37] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidCpuAnalysisDirWithSigmoidCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_ANALYSIS_DIR_SIGMOID in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_NATIVE, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:42] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidCpuNativeCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_NATIVE in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_NATIVE_DIR, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:47] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine<FluidCpuNativeDirCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_NATIVE_DIR in macro!
I0100 00:00:00.000000 380 factory.h:121] RAW: Succ insert one factory, tag: FLUID_CPU_NATIVE_DIR_SIGMOID, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 380 fluid_cpu_engine.cpp:53] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine< FluidCpuNativeDirWithSigmoidCore>->::baidu::paddle_serving::predictor::InferEngine, tag: FLUID_CPU_NATIVE_DIR_SIGMOID in macro!
The client side produces no output. The client program is:
from paddle_serving_client import MultiLangClient as Client
client = Client()
client.connect(["127.0.0.1:9292"])
data = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
        -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]
fetch_map = client.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)
After setting up the environment, running the client gives:

System-Product-Name:~$ python test2.py
None

Meanwhile, the following appears on the server side:
W0913 12:24:20.966691 393 predictor.hpp:129] inference call failed, message: [E112]1/1 channels failed, fail_limit=1 [C0][E111]Fail to connect SocketId=113@0.0.0.0:12000: Connection refused [R1][E112]Fail to select server from list://0.0.0.0:12000 lb=la [R2][E112]Fail to select server from list://0.0.0.0:12000 lb=la
E0913 12:24:20.966754 393 general_model.cpp:561] failed call predictor with req: insts { tensor_array { float_data: 0.0137 float_data: -0.1136 float_data: 0.2553 float_data: -0.0692 float_data: 0.0582 float_data: -0.0727 float_data: -0.1583 float_data: -0.0584 float_data: 0.6283 float_data: 0.4919 float_data: 0.1856 float_data: 0.0795 float_data: -0.0332 elem_type: 1 shape: 13 } } fetch_var_names: "price"
I0913 12:24:21.066982 389 socket.cpp:2370] Checking SocketId=0@0.0.0.0:12000
How can I solve this?
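What I have checked so far: the error says "Connection refused" for 0.0.0.0:12000, i.e. the internal brpc backend that the gRPC front end (started by --use_multilang on port 9292) forwards to. As a quick sanity check I used a small TCP probe to see which of the two ports is actually listening (the helper `check_port` below is my own, not part of Paddle Serving; the host and port numbers are taken from the log above):

```python
import socket

def check_port(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 9292: gRPC front end exposed by --use_multilang
    # 12000: internal brpc serving port from the launch log
    for port in (9292, 12000):
        status = "listening" if check_port("127.0.0.1", port) else "refused/unreachable"
        print(f"127.0.0.1:{port} -> {status}")
```

In my case 9292 responds but 12000 is refused, which matches the "Fail to connect SocketId=113@0.0.0.0:12000" line, so it looks like the inner serving process is not up rather than a client-side problem.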