Commit d9541696 authored by A. Unique TensorFlower, committed by TensorFlower Gardener

Update generated Python Op docs.

Change: 143626185
Parent 1628abf8
@@ -983,3 +983,135 @@ save memory during initialization.
* <b>`ValueError`</b>: If ref tensor is initialized.
## Checkpoint utilities
- - -
### `tf.contrib.framework.load_checkpoint(filepattern)` {#load_checkpoint}
Returns CheckpointReader for latest checkpoint.
##### Args:
* <b>`filepattern`</b>: Directory with checkpoints file or path to checkpoint.
##### Returns:
`CheckpointReader` object.
##### Raises:
* <b>`ValueError`</b>: If `filepattern` doesn't point to a directory with a 'checkpoint' file or to a checkpoint.
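Since running `load_checkpoint` requires a real TensorFlow 1.x checkpoint on disk, the sketch below models only the argument-resolution behavior described above in plain Python. The `resolve_checkpoint` function and the in-memory `filesystem` dict are hypothetical stand-ins, not part of the API:

```python
# Hypothetical sketch of how load_checkpoint's `filepattern` is resolved:
# a directory containing a 'checkpoint' state file yields the latest
# checkpoint path, a direct checkpoint path is used as-is, and anything
# else raises ValueError. `filesystem` is a dict standing in for disk.
def resolve_checkpoint(filepattern, filesystem):
    state_file = filepattern.rstrip("/") + "/checkpoint"
    if state_file in filesystem:
        # Directory case: the 'checkpoint' file names the latest checkpoint.
        return filesystem[state_file]
    if filepattern in filesystem:
        # Direct path to a checkpoint.
        return filepattern
    raise ValueError("Couldn't find 'checkpoint' file or checkpoints at %r"
                     % filepattern)

print(resolve_checkpoint("/tmp/logs",
                         {"/tmp/logs/checkpoint": "/tmp/logs/model-100"}))
# /tmp/logs/model-100
```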
- - -
### `tf.contrib.framework.list_variables(checkpoint_dir)` {#list_variables}
Returns list of all variables in the latest checkpoint.
##### Args:
* <b>`checkpoint_dir`</b>: Directory with checkpoints file or path to checkpoint.
##### Returns:
List of tuples `(name, shape)`.
- - -
### `tf.contrib.framework.load_variable(checkpoint_dir, name)` {#load_variable}
Returns a Tensor with the contents of the given variable in the checkpoint.
##### Args:
* <b>`checkpoint_dir`</b>: Directory with checkpoints file or path to checkpoint.
* <b>`name`</b>: Name of the tensor to return.
##### Returns:
`Tensor` object.
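To illustrate the `list_variables` / `load_variable` contract without a TensorFlow 1.x checkpoint on disk, this runnable pure-Python sketch uses a plain dict of nested lists as a hypothetical stand-in for a checkpoint; the helper names are illustrative only:

```python
# Stand-in checkpoint: variable name -> value (nested lists for arrays).
def _shape(value):
    # Infer the shape of a nested list, e.g. [[1, 2], [3, 4]] -> [2, 2].
    shape = []
    while isinstance(value, list):
        shape.append(len(value))
        value = value[0]
    return shape

def list_variables(checkpoint):
    # Mirrors tf.contrib.framework.list_variables: (name, shape) tuples.
    return sorted((name, _shape(value)) for name, value in checkpoint.items())

def load_variable(checkpoint, name):
    # Mirrors tf.contrib.framework.load_variable: one variable's contents.
    return checkpoint[name]

ckpt = {"test/my_var": [[0.0, 1.0], [2.0, 3.0]], "bias": [0.5]}
print(list_variables(ckpt))         # [('bias', [1]), ('test/my_var', [2, 2])]
print(load_variable(ckpt, "bias"))  # [0.5]
```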
- - -
### `tf.contrib.framework.init_from_checkpoint(checkpoint_dir, assignment_map)` {#init_from_checkpoint}
Initializes current variables with tensors loaded from the checkpoint, using the given assignment map.
Note: this overrides the default initialization ops of the specified variables and
redefines their dtype.
##### The assignment map supports the following syntax:

* `'checkpoint_scope_name/': 'scope_name/'` - will load all variables in the
  current `scope_name` from `checkpoint_scope_name` with matching variable
  names.
* `'checkpoint_scope_name/some_other_variable': 'scope_name/variable_name'` -
  will initialize the `scope_name/variable_name` variable
  from `checkpoint_scope_name/some_other_variable`.
* `'scope_variable_name': variable` - will initialize the given `tf.Variable`
  object with the variable from the checkpoint.
* `'scope_variable_name': list(variable)` - will initialize a list of
  partitioned variables with the variable from the checkpoint.
* `'/': 'scope_name/'` - will load all variables in the current `scope_name` from
  the checkpoint's root (i.e. no scope).

Loading into partitioned variables, which are represented in the checkpoint as
`<variable>/part_<part #>`, is also supported.
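The scope-prefix rule `'checkpoint_scope_name/': 'scope_name/'` can be sketched in plain Python, runnable without TensorFlow. The `resolve_scope_mapping` helper below is hypothetical and only models the prefix-swapping semantics, including the `'/'` (checkpoint root) case:

```python
# Hypothetical sketch of the scope-prefix matching described above: every
# variable under the current scope is paired with the checkpoint name that
# has the scope prefix swapped out.
def resolve_scope_mapping(ckpt_scope, current_scope, current_variables):
    # current_variables: variable names in the current graph.
    # Returns {checkpoint_name: current_name} for variables under current_scope.
    mapping = {}
    for name in current_variables:
        if name.startswith(current_scope):
            suffix = name[len(current_scope):]
            # '/' as the checkpoint scope means the checkpoint's root (no prefix).
            ckpt_name = suffix if ckpt_scope == "/" else ckpt_scope + suffix
            mapping[ckpt_name] = name
    return mapping

print(resolve_scope_mapping("old/", "new/", ["new/w", "new/b", "other/w"]))
# {'old/w': 'new/w', 'old/b': 'new/b'}
```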
* <b>`Example`</b>:
```python
# Create variables.
with tf.variable_scope('test'):
  m = tf.get_variable('my_var', shape=[2])
with tf.variable_scope('test2'):
  var2 = tf.get_variable('my_var', shape=[2])
var3 = tf.get_variable(name="my1", shape=[100, 100],
                       partitioner=lambda shape, dtype: [5, 1])
...
# Specify which variables to initialize from the checkpoint.
init_from_checkpoint(checkpoint_dir, {
    'some_var': 'test/my_var',
    'some_scope/': 'test2/'})
...
# Or use `Variable` objects to identify what to initialize.
init_from_checkpoint(checkpoint_dir, {
    'some_scope/var2': var2,
})
# Initialize partitioned variables.
init_from_checkpoint(checkpoint_dir, {
    'some_var_from_ckpt': 'part_var',
})
# Or specify the list of `Variable` objects.
init_from_checkpoint(checkpoint_dir, {
    'some_var_from_ckpt': var3._get_variable_list(),
})
...
# Initialize variables as usual.
session.run(tf.global_variables_initializer())
```
##### Args:
* <b>`checkpoint_dir`</b>: Directory with checkpoints file or path to checkpoint.
* <b>`assignment_map`</b>: Dict, where keys are names of the variables in the
checkpoint and values are current variables or names of current variables
(in default graph).
##### Raises:
* <b>`tf.errors.OpError`</b>: If checkpoints, or tensors in checkpoints, are missing.
* <b>`ValueError`</b>: If variables are missing from the current graph.
@@ -834,10 +834,14 @@
* [`get_variables_by_suffix`](../../api_docs/python/contrib.framework.md#get_variables_by_suffix)
* [`get_variables_to_restore`](../../api_docs/python/contrib.framework.md#get_variables_to_restore)
* [`has_arg_scope`](../../api_docs/python/contrib.framework.md#has_arg_scope)
* [`init_from_checkpoint`](../../api_docs/python/contrib.framework.md#init_from_checkpoint)
* [`is_non_decreasing`](../../api_docs/python/contrib.framework.md#is_non_decreasing)
* [`is_numeric_tensor`](../../api_docs/python/contrib.framework.md#is_numeric_tensor)
* [`is_strictly_increasing`](../../api_docs/python/contrib.framework.md#is_strictly_increasing)
* [`is_tensor`](../../api_docs/python/contrib.framework.md#is_tensor)
* [`list_variables`](../../api_docs/python/contrib.framework.md#list_variables)
* [`load_checkpoint`](../../api_docs/python/contrib.framework.md#load_checkpoint)
* [`load_variable`](../../api_docs/python/contrib.framework.md#load_variable)
* [`local_variable`](../../api_docs/python/contrib.framework.md#local_variable)
* [`model_variable`](../../api_docs/python/contrib.framework.md#model_variable)
* [`reduce_sum_n`](../../api_docs/python/contrib.framework.md#reduce_sum_n)