提交 1fdc2880 编写于 作者:绝不原创的飞龙

2024-02-05 14:07:26

上级 589e5191
- en: Type Info
id: totrans-0
prefs:
- PREF_H1
type: TYPE_NORMAL
zh: 类型信息
- en: 原文:[https://pytorch.org/docs/stable/type_info.html](https://pytorch.org/docs/stable/type_info.html)
id: totrans-1
prefs:
- PREF_BQ
type: TYPE_NORMAL
zh: 原文:[https://pytorch.org/docs/stable/type_info.html](https://pytorch.org/docs/stable/type_info.html)
- en: The numerical properties of a [`torch.dtype`](tensor_attributes.html#torch.dtype
"torch.dtype") can be accessed through either the [`torch.finfo`](#torch.torch.finfo
"torch.torch.finfo") or the [`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo").
id: totrans-2
prefs: []
type: TYPE_NORMAL
zh: '[`torch.dtype`](tensor_attributes.html#torch.dtype "torch.dtype")的数字属性可以通过[`torch.finfo`](#torch.torch.finfo
"torch.torch.finfo")或[`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo")来访问。'
- en: '## torch.finfo'
id: totrans-3
prefs: []
type: TYPE_NORMAL
zh: '## torch.finfo'
- en: '[PRE0]'
id: totrans-4
prefs: []
type: TYPE_PRE
zh: '[PRE0]'
- en: A [`torch.finfo`](#torch.torch.finfo "torch.torch.finfo") is an object that
represents the numerical properties of a floating point [`torch.dtype`](tensor_attributes.html#torch.dtype
"torch.dtype"), (i.e. `torch.float32`, `torch.float64`, `torch.float16`, and `torch.bfloat16`).
This is similar to [numpy.finfo](https://docs.scipy.org/doc/numpy/reference/generated/numpy.finfo.html).
id: totrans-5
prefs: []
type: TYPE_NORMAL
zh: '[`torch.finfo`](#torch.torch.finfo "torch.torch.finfo")是一个表示浮点数[`torch.dtype`](tensor_attributes.html#torch.dtype
"torch.dtype")(即`torch.float32`、`torch.float64`、`torch.float16`和`torch.bfloat16`)的数字属性的对象。这类似于[numpy.finfo](https://docs.scipy.org/doc/numpy/reference/generated/numpy.finfo.html)。'
- en: 'A [`torch.finfo`](#torch.torch.finfo "torch.torch.finfo") provides the following
attributes:'
id: totrans-6
prefs: []
type: TYPE_NORMAL
zh: '[`torch.finfo`](#torch.torch.finfo "torch.torch.finfo")提供以下属性:'
- en: '| Name | Type | Description |'
id: totrans-7
prefs: []
type: TYPE_TB
zh: '| 名称 | 类型 | 描述 |'
- en: '| --- | --- | --- |'
id: totrans-8
prefs: []
type: TYPE_TB
zh: '| --- | --- | --- |'
- en: '| bits | int | The number of bits occupied by the type. |'
id: totrans-9
prefs: []
type: TYPE_TB
zh: '| bits | int | 类型占用的位数。|'
- en: '| eps | float | The smallest representable number such that `1.0 + eps != 1.0`.
|'
id: totrans-10
prefs: []
type: TYPE_TB
zh: '| eps | float | 可表示的最小数,使得`1.0 + eps != 1.0`。|'
- en: '| max | float | The largest representable number. |'
id: totrans-11
prefs: []
type: TYPE_TB
zh: '| max | float | 可表示的最大数。|'
- en: '| min | float | The smallest representable number (typically `-max`). |'
id: totrans-12
prefs: []
type: TYPE_TB
zh: '| min | float | 可表示的最小数(通常为`-max`)。|'
- en: '| tiny | float | The smallest positive normal number. Equivalent to `smallest_normal`.
|'
id: totrans-13
prefs: []
type: TYPE_TB
zh: '| tiny | float | 最小的正规格化数(positive normal number)。等同于`smallest_normal`。|'
- en: '| smallest_normal | float | The smallest positive normal number. See notes.
|'
id: totrans-14
prefs: []
type: TYPE_TB
zh: '| smallest_normal | float | 最小的正规格化数(positive normal number)。请参阅注释。|'
- en: '| resolution | float | The approximate decimal resolution of this type, i.e.,
`10**-precision`. |'
id: totrans-15
prefs: []
type: TYPE_TB
zh: '| resolution | float | 此类型的近似十进制分辨率,即`10**-precision`。|'
- en: Note
id: totrans-16
prefs: []
type: TYPE_NORMAL
zh: 注意
- en: The constructor of [`torch.finfo`](#torch.torch.finfo "torch.torch.finfo") can
be called without argument, in which case the class is created for the pytorch
default dtype (as returned by [`torch.get_default_dtype()`](generated/torch.get_default_dtype.html#torch.get_default_dtype
"torch.get_default_dtype")).
id: totrans-17
prefs: []
type: TYPE_NORMAL
zh: 可以在不带参数的情况下调用[`torch.finfo`](#torch.torch.finfo "torch.torch.finfo")的构造函数,此时将为 PyTorch 的默认 dtype(由[`torch.get_default_dtype()`](generated/torch.get_default_dtype.html#torch.get_default_dtype
"torch.get_default_dtype")返回)创建该对象。
- en: Note
id: totrans-18
prefs: []
type: TYPE_NORMAL
zh: 注意
- en: 'smallest_normal returns the smallest *normal* number, but there are smaller
subnormal numbers. See [https://en.wikipedia.org/wiki/Denormal_number](https://en.wikipedia.org/wiki/Denormal_number)
for more information. ## torch.iinfo'
id: totrans-19
prefs: []
type: TYPE_NORMAL
zh: smallest_normal 返回最小的*规格化*(normal)数,但存在更小的次正规(subnormal)数。有关更多信息,请参阅[https://en.wikipedia.org/wiki/Denormal_number](https://en.wikipedia.org/wiki/Denormal_number)。## torch.iinfo
- en: '[PRE1]'
id: totrans-20
prefs: []
type: TYPE_PRE
zh: '[PRE1]'
- en: A [`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo") is an object that
represents the numerical properties of an integer [`torch.dtype`](tensor_attributes.html#torch.dtype
"torch.dtype") (i.e. `torch.uint8`, `torch.int8`, `torch.int16`, `torch.int32`,
and `torch.int64`). This is similar to [numpy.iinfo](https://docs.scipy.org/doc/numpy/reference/generated/numpy.iinfo.html).
id: totrans-21
prefs: []
type: TYPE_NORMAL
zh: '[`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo")是一个表示整数[`torch.dtype`](tensor_attributes.html#torch.dtype
"torch.dtype")(即`torch.uint8`、`torch.int8`、`torch.int16`、`torch.int32`和`torch.int64`)的数字属性的对象。这类似于[numpy.iinfo](https://docs.scipy.org/doc/numpy/reference/generated/numpy.iinfo.html)。'
- en: 'A [`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo") provides the following
attributes:'
id: totrans-22
prefs: []
type: TYPE_NORMAL
zh: '[`torch.iinfo`](#torch.torch.iinfo "torch.torch.iinfo")提供以下属性:'
- en: '| Name | Type | Description |'
id: totrans-23
prefs: []
type: TYPE_TB
zh: '| 名称 | 类型 | 描述 |'
- en: '| --- | --- | --- |'
id: totrans-24
prefs: []
type: TYPE_TB
zh: '| --- | --- | --- |'
- en: '| bits | int | The number of bits occupied by the type. |'
id: totrans-25
prefs: []
type: TYPE_TB
zh: '| bits | int | 类型占用的位数。|'
- en: '| max | int | The largest representable number. |'
id: totrans-26
prefs: []
type: TYPE_TB
zh: '| max | int | 可表示的最大数。|'
- en: '| min | int | The smallest representable number. |'
id: totrans-27
prefs: []
type: TYPE_TB
zh: '| min | int | 可表示的最小数。|'
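A short sketch (not from the original page) of reading these attributes for a few integer dtypes:

```python
import torch

# Query the numerical properties of integer dtypes.
ii = torch.iinfo(torch.int8)
print(ii.bits, ii.min, ii.max)        # 8 -128 127

print(torch.iinfo(torch.uint8).max)   # 255
print(torch.iinfo(torch.int64).max)   # 9223372036854775807
```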
- en: torch.__config__
id: totrans-0
prefs:
- PREF_H1
type: TYPE_NORMAL
zh: torch.__config__
- en: 原文:[https://pytorch.org/docs/stable/config_mod.html](https://pytorch.org/docs/stable/config_mod.html)
id: totrans-1
prefs:
- PREF_BQ
type: TYPE_NORMAL
zh: 原文:[https://pytorch.org/docs/stable/config_mod.html](https://pytorch.org/docs/stable/config_mod.html)
- en: '[PRE0]'
id: totrans-2
prefs: []
type: TYPE_PRE
zh: '[PRE0]'
- en: Return a human-readable string with descriptions of the configuration of PyTorch.
id: totrans-3
prefs: []
type: TYPE_NORMAL
zh: 返回一个包含PyTorch配置描述的易读字符串。
- en: '[PRE1]'
id: totrans-4
prefs: []
type: TYPE_PRE
zh: '[PRE1]'
- en: Returns a detailed string with parallelization settings.
id: totrans-5
prefs: []
type: TYPE_NORMAL
zh: 返回包含并行化设置的详细字符串。
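A minimal sketch (not part of the original page): both functions simply return strings, so they can be printed or written to a log when debugging build and threading issues:

```python
import torch

# Build/configuration summary (compiler, CUDA, cuDNN, BLAS backend, ...).
print(torch.__config__.show())

# Threading-related settings (OpenMP, MKL, intra-/inter-op thread counts, ...).
print(torch.__config__.parallel_info())
```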
- en: torch._logging
id: totrans-0
prefs:
- PREF_H1
type: TYPE_NORMAL
zh: torch._logging
- en: 原文:[https://pytorch.org/docs/stable/logging.html](https://pytorch.org/docs/stable/logging.html)
id: totrans-1
prefs:
- PREF_BQ
type: TYPE_NORMAL
zh: 原文:[https://pytorch.org/docs/stable/logging.html](https://pytorch.org/docs/stable/logging.html)
- en: PyTorch has a configurable logging system, where different components can be
given different log level settings. For instance, one component’s log messages
can be completely disabled, while another component’s log messages can be set
to maximum verbosity.
id: totrans-2
prefs: []
type: TYPE_NORMAL
zh: PyTorch具有可配置的日志系统,其中不同的组件可以具有不同的日志级别设置。例如,一个组件的日志消息可以完全禁用,而另一个组件的日志消息可以设置为最大详细程度。
- en: Warning
id: totrans-3
prefs: []
type: TYPE_NORMAL
zh: 警告
- en: This feature is a prototype and may have compatibility breaking changes in the
future.
id: totrans-4
prefs: []
type: TYPE_NORMAL
zh: 此功能尚处于原型阶段,未来可能发生破坏兼容性的变更。
- en: Warning
id: totrans-5
prefs: []
type: TYPE_NORMAL
zh: 警告
- en: This feature has not been expanded to control the log messages of all components
in PyTorch yet.
id: totrans-6
prefs: []
type: TYPE_NORMAL
zh: 此功能尚未扩展到控制PyTorch中所有组件的日志消息。
- en: 'There are two ways to configure the logging system: through the environment
variable `TORCH_LOGS` or the python API torch._logging.set_logs.'
id: totrans-7
prefs: []
type: TYPE_NORMAL
zh: 有两种配置日志系统的方法:通过环境变量`TORCH_LOGS`或python API torch._logging.set_logs。
- en: '| [`set_logs`](generated/torch._logging.set_logs.html#torch._logging.set_logs
"torch._logging.set_logs") | Sets the log level for individual components and
toggles individual log artifact types. |'
id: totrans-8
prefs: []
type: TYPE_TB
zh: '| [`set_logs`](generated/torch._logging.set_logs.html#torch._logging.set_logs
"torch._logging.set_logs") | 设置各个组件的日志级别并开关各类日志工件(artifact)。|'
- en: 'The environment variable `TORCH_LOGS` is a comma-separated list of `[+-]<component>`
pairs, where `<component>` is a component specified below. The `+` prefix will
decrease the log level of the component, displaying more log messages, while the
`-` prefix will increase the log level of the component and display fewer log messages.
The default setting is the behavior when a component is not specified in `TORCH_LOGS`.
In addition to components, there are also artifacts. Artifacts are specific pieces
of debug info associated with a component that are either displayed or not displayed,
so prefixing an artifact with `+` or `-` has no effect. Since they are associated
with a component, enabling that component will usually also enable its artifacts,
unless an artifact was specified to be off by default; this is specified in _registrations.py
for artifacts that are so spammy they should only be displayed when explicitly enabled.
The following components and artifacts are configurable through the `TORCH_LOGS` environment
variable (see torch._logging.set_logs for the python API):'
id: totrans-9
prefs: []
type: TYPE_NORMAL
zh: 环境变量`TORCH_LOGS`是一个由`[+-]<component>`对组成的逗号分隔列表,其中`<component>`是下面指定的组件。前缀`+`会降低组件的日志级别,从而显示更多日志消息;前缀`-`会提高组件的日志级别,从而显示更少的日志消息。默认设置是组件未在`TORCH_LOGS`中指定时的行为。除组件外,还有工件(artifact)。工件是与某个组件关联的特定调试信息,要么显示、要么不显示,因此给工件加`+`或`-`前缀不起作用。由于工件与组件相关联,启用该组件通常也会启用对应的工件,除非该工件被指定为默认关闭;对于那些输出过于冗长、只应在显式启用时才显示的工件,这一点在
_registrations.py 中指定。以下组件和工件可通过`TORCH_LOGS`环境变量进行配置(python API 请参阅 torch._logging.set_logs):
- en: 'Components:'
id: totrans-10
prefs: []
type: TYPE_NORMAL
zh: 组件:
- en: '`all`'
id: totrans-11
prefs: []
type: TYPE_NORMAL
zh: '`all`'
- en: 'Special component which configures the default log level of all components.
Default: `logging.WARN`'
id: totrans-12
prefs: []
type: TYPE_NORMAL
zh: 配置所有组件的默认日志级别的特殊组件。默认值:`logging.WARN`
- en: '`dynamo`'
id: totrans-13
prefs: []
type: TYPE_NORMAL
zh: '`dynamo`'
- en: 'The log level for the TorchDynamo component. Default: `logging.WARN`'
id: totrans-14
prefs: []
type: TYPE_NORMAL
zh: TorchDynamo组件的日志级别。默认值:`logging.WARN`
- en: '`aot`'
id: totrans-15
prefs: []
type: TYPE_NORMAL
zh: '`aot`'
- en: 'The log level for the AOTAutograd component. Default: `logging.WARN`'
id: totrans-16
prefs: []
type: TYPE_NORMAL
zh: AOTAutograd组件的日志级别。默认值:`logging.WARN`
- en: '`inductor`'
id: totrans-17
prefs: []
type: TYPE_NORMAL
zh: '`inductor`'
- en: 'The log level for the TorchInductor component. Default: `logging.WARN`'
id: totrans-18
prefs: []
type: TYPE_NORMAL
zh: TorchInductor组件的日志级别。默认值:`logging.WARN`
- en: '`your.custom.module`'
id: totrans-19
prefs: []
type: TYPE_NORMAL
zh: '`your.custom.module`'
- en: 'The log level for an arbitrary unregistered module. Provide the fully qualified
name and the module will be enabled. Default: `logging.WARN`'
id: totrans-20
prefs: []
type: TYPE_NORMAL
zh: 任意未注册模块的日志级别。提供完全限定名称,模块将被启用。默认值:`logging.WARN`
- en: 'Artifacts:'
id: totrans-21
prefs: []
type: TYPE_NORMAL
zh: 工件(artifact):
- en: '`bytecode`'
id: totrans-22
prefs: []
type: TYPE_NORMAL
zh: '`bytecode`'
- en: 'Whether to emit the original and generated bytecode from TorchDynamo. Default:
`False`'
id: totrans-23
prefs: []
type: TYPE_NORMAL
zh: 是否输出 TorchDynamo 的原始字节码和生成的字节码。默认值:`False`
- en: '`aot_graphs`'
id: totrans-24
prefs: []
type: TYPE_NORMAL
zh: '`aot_graphs`'
- en: 'Whether to emit the graphs generated by AOTAutograd. Default: `False`'
id: totrans-25
prefs: []
type: TYPE_NORMAL
zh: 是否输出 AOTAutograd 生成的图。默认值:`False`
- en: '`aot_joint_graph`'
id: totrans-26
prefs: []
type: TYPE_NORMAL
zh: '`aot_joint_graph`'
- en: 'Whether to emit the joint forward-backward graph generated by AOTAutograd.
Default: `False`'
id: totrans-27
prefs: []
type: TYPE_NORMAL
zh: 是否输出 AOTAutograd 生成的前向-反向联合图。默认值:`False`
- en: '`compiled_autograd`'
id: totrans-28
prefs: []
type: TYPE_NORMAL
zh: '`compiled_autograd`'
- en: 'Whether to emit logs from compiled_autograd. Defaults: `False`'
id: totrans-29
prefs: []
type: TYPE_NORMAL
zh: 是否从compiled_autograd发出日志。默认值:`False`
- en: '`ddp_graphs`'
id: totrans-30
prefs: []
type: TYPE_NORMAL
zh: '`ddp_graphs`'
- en: 'Whether to emit graphs generated by DDPOptimizer. Default: `False`'
id: totrans-31
prefs: []
type: TYPE_NORMAL
zh: 是否输出 DDPOptimizer 生成的图。默认值:`False`
- en: '`graph`'
id: totrans-32
prefs: []
type: TYPE_NORMAL
zh: '`graph`'
- en: 'Whether to emit the graph captured by TorchDynamo in tabular format. Default:
`False`'
id: totrans-33
prefs: []
type: TYPE_NORMAL
zh: 是否以表格形式输出 TorchDynamo 捕获的图。默认值:`False`
- en: '`graph_code`'
id: totrans-34
prefs: []
type: TYPE_NORMAL
zh: '`graph_code`'
- en: 'Whether to emit the python source of the graph captured by TorchDynamo. Default:
`False`'
id: totrans-35
prefs: []
type: TYPE_NORMAL
zh: 是否输出 TorchDynamo 捕获的图的 Python 源代码。默认值:`False`
- en: '`graph_breaks`'
id: totrans-36
prefs: []
type: TYPE_NORMAL
zh: '`graph_breaks`'
- en: 'Whether to emit a message when a unique graph break is encountered during TorchDynamo
tracing. Default: `False`'
id: totrans-37
prefs: []
type: TYPE_NORMAL
zh: 是否在 TorchDynamo 跟踪期间每遇到一个新的图中断(graph break)时输出一条消息。默认值:`False`
- en: '`guards`'
id: totrans-38
prefs: []
type: TYPE_NORMAL
zh: '`guards`'
- en: 'Whether to emit the guards generated by TorchDynamo for each compiled function.
Default: `False`'
id: totrans-39
prefs: []
type: TYPE_NORMAL
zh: 是否输出 TorchDynamo 为每个已编译函数生成的 guards。默认值:`False`
- en: '`recompiles`'
id: totrans-40
prefs: []
type: TYPE_NORMAL
zh: '`recompiles`'
- en: 'Whether to emit a guard failure reason and message every time TorchDynamo recompiles
a function. Default: `False`'
id: totrans-41
prefs: []
type: TYPE_NORMAL
zh: 是否在 TorchDynamo 每次重新编译函数时输出 guard 失败的原因和消息。默认值:`False`
- en: '`output_code`'
id: totrans-42
prefs: []
type: TYPE_NORMAL
zh: '`output_code`'
- en: 'Whether to emit the TorchInductor output code. Default: `False`'
id: totrans-43
prefs: []
type: TYPE_NORMAL
zh: 是否输出 TorchInductor 生成的输出代码。默认值:`False`
- en: '`schedule`'
id: totrans-44
prefs: []
type: TYPE_NORMAL
zh: '`schedule`'
- en: 'Whether to emit the TorchInductor schedule. Default: `False`'
id: totrans-45
prefs: []
type: TYPE_NORMAL
zh: 是否输出 TorchInductor 的调度(schedule)信息。默认值:`False`
- en: 'Examples:'
id: totrans-46
prefs: []
type: TYPE_NORMAL
zh: 示例:
- en: '`TORCH_LOGS="+dynamo,aot"` will set the log level of TorchDynamo to `logging.DEBUG`
and AOT to `logging.INFO`'
id: totrans-47
prefs: []
type: TYPE_NORMAL
zh: '`TORCH_LOGS="+dynamo,aot"`将TorchDynamo的日志级别设置为`logging.DEBUG`,AOT设置为`logging.INFO`'
- en: '`TORCH_LOGS="-dynamo,+inductor"` will set the log level of TorchDynamo to `logging.ERROR`
and TorchInductor to `logging.DEBUG`'
id: totrans-48
prefs: []
type: TYPE_NORMAL
zh: '`TORCH_LOGS="-dynamo,+inductor"`将TorchDynamo的日志级别设置为`logging.ERROR`,TorchInductor设置为`logging.DEBUG`'
- en: '`TORCH_LOGS="aot_graphs"` will enable the `aot_graphs` artifact'
id: totrans-49
prefs: []
type: TYPE_NORMAL
zh: '`TORCH_LOGS="aot_graphs"` 将启用 `aot_graphs` artifact'
- en: '`TORCH_LOGS="+dynamo,schedule"` will enable set the log level of TorchDynamo
to `logging.DEBUG` and enable the `schedule` artifact'
id: totrans-50
prefs: []
type: TYPE_NORMAL
zh: '`TORCH_LOGS="+dynamo,schedule"` 将启用 TorchDynamo 的日志级别设置为 `logging.DEBUG` 并启用
`schedule` artifact'
- en: '`TORCH_LOGS="+some.random.module,schedule"` will set the log level of some.random.module
to `logging.DEBUG` and enable the `schedule` artifact'
id: totrans-51
prefs: []
type: TYPE_NORMAL
zh: '`TORCH_LOGS="+some.random.module,schedule"` 将设置 some.random.module 的日志级别为 `logging.DEBUG`
并启用 `schedule` artifact'
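As a rough sketch of the Python API route mentioned above (keyword names follow `torch._logging.set_logs`; the exact set of accepted keywords may differ between PyTorch versions), the same examples could be expressed as:

```python
import logging
import torch._logging

# TORCH_LOGS="+dynamo,aot"
torch._logging.set_logs(dynamo=logging.DEBUG, aot=logging.INFO)

# TORCH_LOGS="-dynamo,+inductor"
torch._logging.set_logs(dynamo=logging.ERROR, inductor=logging.DEBUG)

# TORCH_LOGS="aot_graphs"
torch._logging.set_logs(aot_graphs=True)

# TORCH_LOGS="+dynamo,schedule"
torch._logging.set_logs(dynamo=logging.DEBUG, schedule=True)
```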
- en: Libraries
id: totrans-0
prefs:
- PREF_H1
type: TYPE_NORMAL
zh: 库