diff --git a/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING.md b/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING.md
index 4564a97ec7117cee4f83c18a5120d04f3405a96b..dd83bc29093288df49ba5d55ea7afa081e1d8b59 100644
--- a/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING.md
+++ b/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING.md
@@ -1,5 +1,7 @@
 # Model Ensemble in Paddle Serving
 
+([简体中文](MODEL_ENSEMBLE_IN_PADDLE_SERVING_CN.md)|English)
+
 In some scenarios, multiple models with the same input may be used to predict in parallel and integrate predicted results for better prediction effect. Paddle Serving also supports this feature.
 
 Next, we will take the text classification task as an example to show model ensemble in Paddle Serving (This feature is still serial prediction for the time being. We will support parallel prediction as soon as possible).
diff --git a/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING_CN.md b/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING_CN.md
index 938a8eaf363be2a6e2951982991f5c7c00b8f3ce..12af427ed2e7bbd688ea854673dfac716a6050bb 100644
--- a/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING_CN.md
+++ b/doc/MODEL_ENSEMBLE_IN_PADDLE_SERVING_CN.md
@@ -1,5 +1,7 @@
 # Paddle Serving中的集成预测
 
+(简体中文|[English](MODEL_ENSEMBLE_IN_PADDLE_SERVING.md))
+
 在一些场景中,可能使用多个相同输入的模型并行集成预测以获得更好的预测效果,Paddle Serving提供了这项功能。
 
 下面将以文本分类任务为例,来展示Paddle Serving的集成预测功能(暂时还是串行预测,我们会尽快支持并行化)。