Training and Inference

Parameters

class paddle.v2.parameters.Parameters

Parameters is a dictionary that contains Paddle's parameters. The key of Parameters is the parameter name, and the value is a plain numpy.ndarray.

Basic usage is:

data = paddle.layer.data(...)
...
out = paddle.layer.fc(...)

parameters = paddle.parameters.create(out)

parameter_names = parameters.names()
fc_mat = parameters.get('fc')
print fc_mat
keys()

Return the names of all parameters.

Returns: list of parameter names
Return type: list
names()

Return the names of all parameters.

Returns: list of parameter names
Return type: list
has_key(key)

Return True if there is a parameter whose name equals key.

Parameters: key (basestring) – parameter name
Returns: True if a parameter named key exists
get_shape(key)

Get the shape of the parameter.

Parameters: key (basestring) – parameter name
Returns: the parameter's shape
Return type: tuple
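
As a quick illustration, a minimal sketch of inspecting the dictionary with names(), has_key() and get_shape(); the 'fc' parameter name is an assumption that depends on the network definition:

# Assuming `parameters` was created as in the example above.
for name in parameters.names():
    print name, parameters.get_shape(name)

# Check for a specific parameter before reading its shape;
# 'fc' is a hypothetical parameter name.
if parameters.has_key('fc'):
    print 'fc shape:', parameters.get_shape('fc')
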
get(parameter_name)

Get a parameter by parameter name.

Note: This always copies the parameter from the C++ side.
Parameters: parameter_name (basestring) – parameter name
Returns: the parameter matrix
Return type: np.ndarray
get_grad(key)

Get the gradient by parameter name.

Note: This always copies the gradient from the C++ side.
Parameters: key (basestring) – parameter name
Returns: the gradient matrix
Return type: np.ndarray
set(parameter_name, value)

Set a parameter by parameter name and value matrix.

Parameters:
  • parameter_name (basestring) – parameter name
  • value (np.ndarray) – parameter matrix
Returns: Nothing.
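
As an illustration, a hedged sketch of reading a parameter with get() and writing a modified copy back with set(); the 'fc' name and the scaling factor are arbitrary assumptions:

# get() returns a copy, so the model is unchanged until the modified
# array is written back with set().
fc_mat = parameters.get('fc')        # 'fc' is a hypothetical parameter name
parameters.set('fc', fc_mat * 0.5)   # arbitrary modification for illustration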

append_gradient_machine(gradient_machine)

Append a gradient machine to the parameters. This method is used internally by Trainer.train.

Parameters: gradient_machine (api.GradientMachine) – Paddle C++ GradientMachine object
Returns: Nothing.
serialize(name, f)

Serialize the parameter named name and write it to the file object f.

Parameters:
  • name (basestring) – parameter name
  • f (file) – writable file object
Returns: Nothing.

deserialize(name, f)

Read the parameter named name back from the file object f.

Parameters:
  • name (basestring) – parameter name
  • f (file) – readable file object
Returns: Nothing.
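
A minimal sketch of how serialize() and deserialize() might be paired with ordinary file objects; the parameter name 'fc' and the file name are assumptions:

# Write one parameter to a binary file, then read it back.
with open('fc.bin', 'wb') as f:       # file name is arbitrary
    parameters.serialize('fc', f)     # 'fc' is a hypothetical parameter name

with open('fc.bin', 'rb') as f:
    parameters.deserialize('fc', f)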

static from_tar(f)

Create a Parameters object from the given file. The returned Parameters object contains only the parameters stored in that file, and it assumes those parameters match the ones defined in the network. It is typically used for inference.

Parameters: f (tar file) – the initialized model file
Returns: a Parameters object
Return type: Parameters
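
For example, a common pattern for loading a previously saved model before inference (the tar file name is an assumption):

# 'params_pass_0.tar' is an assumed file name produced by an earlier
# training run that saved its parameters to a tar file.
with open('params_pass_0.tar', 'r') as f:
    parameters = paddle.parameters.Parameters.from_tar(f)
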
init_from_tar(f)

Different from from_tar, this interface can be used to initialize a subset of the network's parameters from another saved model.

Parameters: f (tar file) – the initialized model file
Returns: Nothing.
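
A sketch of partially initializing the parameters of the current network from a saved model; the file name is an assumption:

# `out` is the network's output/cost layer; only the parameters whose
# names also appear in the tar file are overwritten, the rest keep
# their initial values.
parameters = paddle.parameters.create(out)
with open('pretrained_model.tar', 'r') as f:
    parameters.init_from_tar(f)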

Trainer

Module Trainer

class paddle.v2.trainer.SGD(cost, parameters, update_equation, extra_layers=None, is_local=True, pserver_spec=None, use_etcd=True)

Simple SGD Trainer. The SGD Trainer combines a data reader, the network topology, and the update_equation to train/test a neural network.

Parameters:
  • update_equation (paddle.v2.optimizer.Optimizer) – The optimizer object.
  • cost (paddle.v2.config_base.Layer) – The target cost that the neural network should optimize.
  • parameters (paddle.v2.parameters.Parameters) – The parameters dictionary.
  • extra_layers (paddle.v2.config_base.Layer) – Layers in the neural network graph that are not on the path of the cost layer but should still be evaluated.
  • pserver_spec – parameter server location, e.g. localhost:3000
train(reader, num_passes=1, event_handler=None, feeding=None)

Training method. Trains num_passes passes over the input data.

Parameters:
  • reader (collections.Iterable) – A reader that reads and yields data items. Usually a batched reader is used for mini-batch training.
  • num_passes – The total number of training passes.
  • event_handler ((BaseEvent) => None) – Event handler. A method that will be invoked when an event occurs.
  • feeding (dict|list) – A map from neural network input names to the array indices that the reader returns.
Returns: Nothing.
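
Putting the pieces together, a hedged end-to-end sketch of building an SGD trainer and calling train(). The toy regression network, the uci_housing dataset reader, and the feeding indices are assumptions borrowed from common PaddlePaddle v2 examples:

import paddle.v2 as paddle

paddle.init(use_gpu=False, trainer_count=1)

# Assumed toy regression network: 13-dimensional input, scalar target.
x = paddle.layer.data(name='x', type=paddle.data_type.dense_vector(13))
y = paddle.layer.data(name='y', type=paddle.data_type.dense_vector(1))
y_predict = paddle.layer.fc(input=x, size=1, act=paddle.activation.Linear())
cost = paddle.layer.square_error_cost(input=y_predict, label=y)

parameters = paddle.parameters.create(cost)
optimizer = paddle.optimizer.Momentum(momentum=0)
trainer = paddle.trainer.SGD(cost=cost,
                             parameters=parameters,
                             update_equation=optimizer)

def event_handler(event):
    # Print the cost every 100 mini-batches.
    if isinstance(event, paddle.event.EndIteration):
        if event.batch_id % 100 == 0:
            print "Pass %d, Batch %d, Cost %f" % (
                event.pass_id, event.batch_id, event.cost)

trainer.train(
    reader=paddle.batch(
        paddle.reader.shuffle(paddle.dataset.uci_housing.train(),
                              buf_size=500),
        batch_size=2),
    feeding={'x': 0, 'y': 1},
    event_handler=event_handler,
    num_passes=10)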

test(reader, feeding=None)

Testing method. Evaluates the network on the input data.

Parameters:
  • reader (collections.Iterable) – A reader that reads and yields data items.
  • feeding (dict) – A map from neural network input names to the array indices that the reader returns.
Returns: a TestResult object.
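
Continuing the sketch above, test() can be run on a held-out reader; the dataset and feeding indices are again assumptions:

result = trainer.test(
    reader=paddle.batch(paddle.dataset.uci_housing.test(), batch_size=2),
    feeding={'x': 0, 'y': 1})
print "Test cost: %f" % result.cost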

Event

Testing and training events.

The following events are defined:

  • TestResult
  • BeginIteration
  • EndIteration
  • BeginPass
  • EndPass
class paddle.v2.event.TestResult(evaluator, cost)

Result that trainer.test returns.

class paddle.v2.event.BeginPass(pass_id)

Event fired when one training pass starts.

class paddle.v2.event.EndPass(pass_id, evaluator)

Event fired when one training pass completes.

class paddle.v2.event.BeginIteration(pass_id, batch_id)

Event fired when training on one batch starts.

class paddle.v2.event.EndIteration(pass_id, batch_id, cost, evaluator)

Event fired when training on one batch completes.
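
An illustrative event handler that distinguishes the event types listed above; the attributes it reads (pass_id, batch_id, cost) follow the constructors shown, while the handler itself is only a sketch:

def event_handler(event):
    if isinstance(event, paddle.event.BeginPass):
        # Fired once before each training pass starts.
        print "Starting pass %d" % event.pass_id
    elif isinstance(event, paddle.event.EndPass):
        # Fired once after each training pass completes.
        print "Finished pass %d" % event.pass_id
    elif isinstance(event, paddle.event.BeginIteration):
        # Fired before each mini-batch is trained.
        pass
    elif isinstance(event, paddle.event.EndIteration):
        # Fired after each mini-batch; carries the batch cost.
        if event.batch_id % 100 == 0:
            print "Pass %d, Batch %d, Cost %f" % (
                event.pass_id, event.batch_id, event.cost)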

Inference

paddle.v2.infer(output_layer, parameters, input, feeding=None, field='value')

Infer a neural network from the given output layer(s) and parameters. The user should pass either a batch of input data or a reader method.

Example usage for a single output_layer:

result = paddle.infer(output_layer=prediction,
                      parameters=parameters,
                      input=SomeData)
print result

Example usage for multiple output_layers and fields:

result = paddle.infer(output_layer=[prediction1, prediction2],
                      parameters=parameters,
                      input=SomeData,
                      field=['id', 'value'])
print result
Parameters:
  • output_layer (paddle.v2.config_base.Layer or a list of paddle.v2.config_base.Layer) – the output layer(s) of the neural network to be inferred
  • parameters (paddle.v2.parameters.Parameters) – the parameters of the neural network
  • input (collections.Iterable) – the input data batch. Should be a Python iterable object in which each element is one data item.
  • feeding – the feeding dictionary mapping input names to array indices. By default it is generated from the input value.
  • field (str) – The prediction field. It should be one of value, id, or prob. value and prob return the prediction probabilities, and id returns the prediction labels. Default is value. Note that prob is only used when output_layer is beam_search or max_id.
Returns:

The prediction result. If there are multiple output_layers and fields, the return order is output_layer1.field1, output_layer2.field1, ..., output_layer1.field2, output_layer2.field2, ...

Return type:

numpy.ndarray
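
If the elements of the input are not ordered the same way as the network's data layers, a feeding map can be passed explicitly. A small hedged sketch, where prediction, parameters, the input dimension, and the layer name 'x' are assumptions:

# Each element of `test_data` is one data item; its dense vector sits
# at array index 0, which `feeding` maps to the data layer 'x'.
test_data = [([0.5] * 13,), ([0.3] * 13,)]

probs = paddle.infer(output_layer=prediction,
                     parameters=parameters,
                     input=test_data,
                     feeding={'x': 0})
print probs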