# Design Doc: Session

## Abstract

The *session* object encapsulates the environment in which the
computation graph is executed.

We will have a *local* session and a *remote* session; they offer the
same [interface](#interface). The local session encapsulates the local
runtime environment, and the remote session encapsulates the cluster
runtime environment.

The local runtime environment contains:

1. handles to the computation devices (i.e., CPU and GPU), and
1. the [scope](../scope.md) which holds all variables.

The remote runtime environment contains:

1. handles to the computation devices in a cluster (i.e., the CPUs and
   GPUs on node 0, node 1, etc.), and
1. the distributed [scope](../scope.md) in a cluster which holds all
   variables.

The user can create a remote session on Paddle Cloud and evaluate the
computation graph with it. In this way, the user can control the
remote computation resources in a cluster from a local computer.


## Background

The current design has an implicit global session on which
`paddle.eval()` is executed. The pain point is:

Since the user is not able to explicitly switch between runtime
environments such as the scope and the device contexts, the user
cannot run a topology in two independent environments.

For example, in reinforcement learning, the user may want to have a
stale model for inference and a fresh model for training, and only
replace the stale model with the fresh model periodically.

Furthermore, we have no concept that encapsulates a remote environment
that executes a computation graph.

We need the session object to address the above issues.
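
As a sketch of how explicit sessions would address this, the same
topology could be evaluated in two independent environments (this uses
the local session API proposed below; the topology and usage are only
illustrative):

```Python
# Each session owns its own scope, so the two sessions below hold two
# independent sets of variables for the same topology.
c = paddle.constant(1.0) + paddle.constant(2.0)

train_sess = paddle.session()  # e.g., the fresh model being trained
infer_sess = paddle.session()  # e.g., the stale model used for inference

train_sess.eval(c)
infer_sess.eval(c)

train_sess.close()
infer_sess.close()
```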


## Session

A session is an object that owns the runtime environment. All
computations are executed through `session.eval`.


### Interface

```
eval(
    targets,
    feed_dict=None,
)
```

Evaluates the target Operations or Variables in `targets`.

- *targets*: the evaluation targets. Can be a single Operation or
  Variable, or a list with the Operations or Variables as elements.

  The value returned by `eval()` has the same shape as the `targets`
  argument.

  The computation graph is implicitly inferred from the targets.

- *feed_dict*: a dictionary that contains the tensors which override
  the edges of the computation graph (see the sketch below).
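
For illustration, a minimal sketch of how `feed_dict` might be used
(this assumes the local session API described below; the exact feeding
semantics are an assumption):

```Python
a = paddle.constant(1.0)
b = paddle.constant(2.0)
c = a + b

sess = paddle.session()
# Evaluate c with the values defined in the graph: 1.0 + 2.0.
sess.eval(c)
# Override the edge feeding a into the sum, so c evaluates as 3.0 + 2.0.
sess.eval(c, feed_dict={a: 3.0})
sess.close()
```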

```
close()
```

Closes the session. Calling this method releases the scope.


### Create a Local Session

```
session(
    gpu_ids=None
)
```

Creates a new session. One session owns one scope, so creating
multiple sessions will create different scopes.

- *gpu_ids*: a single `int` or a list of `int` of the GPU IDs to be
  used as the computation devices. If not specified, all available GPUs
  will be used.


#### Example

```Python
a = paddle.constant(1.0)
b = paddle.constant(2.0)
c = a + b
sess = paddle.session(gpu_ids=[0, 1])
sess.eval(c)
sess.close()
```

### Create a Remote Session

```
create_cloud_job(
    name,
    num_trainer,
    mem_per_trainer,
    gpu_per_trainer,
    cpu_per_trainer,
    num_ps,
    mem_per_ps,
    cpu_per_ps,
)
```

Creates a Paddle Cloud job. Fails if a job with the same name already exists.
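
For example, the job used in the remote session example below could be
created with keyword arguments (a sketch; the resource values are only
illustrative):

```Python
job = paddle.create_cloud_job(
    name="test",
    num_trainer=3,
    mem_per_trainer="1G",
    gpu_per_trainer=1,
    cpu_per_trainer=1,
    num_ps=2,
    mem_per_ps="1G",
    cpu_per_ps=1,
)
```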

```
get_cloud_job(
    name
)
```

Gets a Paddle Cloud job.

```
remote_session(
    job
)
```

Creates a remote session that connects to the given Paddle Cloud job.

- *job*: the Paddle Cloud job.
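
A minimal sketch of attaching to an existing cloud job (this assumes a
job named "test" has been created earlier):

```Python
# Look up an existing job by name and open a remote session on it.
job = paddle.get_cloud_job("test")
sess = paddle.remote_session(job)
```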

#### Example

```Python
reader = paddle.reader.recordio("/pfs/home/peter/mnist-train-*") # data stored on Paddle Cloud
image = reader.column(0)
label = reader.column(1)
fc1 = paddle.op.fc(image, size=256, act="sigmoid")
fc2 = paddle.op.fc(fc1, size=10, act="softmax")
cost = paddle.op.cross_entropy(fc2, label)
opt = paddle.optimizer.sgd(cost)

job = paddle.create_cloud_job("test", 3, "1G", 1, 1, 2, "1G", 1)
sess = paddle.remote_session(job)
for i in range(1000):
    sess.eval(opt)
sess.close()
```