DOC: Cover the new force argument for numpy() · pytorch/pytorch@52e954b
1 parent db3299a commit 52e954b

File tree: 1 file changed (+10 −8 lines)

docs/source/notes/numpy_compatibility.rst

@@ -12,7 +12,8 @@ NumPy Compatibility
 
 **TL;DR:**
 
-- Be explicit about obtaining and using NumPy arrays, ``.numpy()`` will always ensure valid arrays
+- Be explicit about obtaining and using NumPy arrays, ``.numpy(force=True)``
+  will always ensure valid arrays [from ``1.12``]
 - `This tutorial`_ shows a more concrete example of interoperability between
   NumPy and PyTorch
 
@@ -83,13 +84,12 @@ existing object. We can obtain a **view** or interpretation of a Tensor from
 NumPy array as well.
 
 The conversion to a NumPy array may potentially be a lossy operation (the
-computational graph), and so there are also no inbuilt operations to
-unconditionally return a NumPy array. Practically speaking, a function can be
-created locally to return a NumPy array, for say, visualizations.
+computational graph). Where the user is certain that such information is not
+present / required, e.g. to generate visualizations, there is also the
+``Tensor.numpy(force=True)`` method, which is conceptually equivalent to:
 
 .. code-block:: python
 
-    # Possibly lossy function for visualizations
     def uncond_numpy(torch_tensor):
         if torch_tensor.device != torch.device(type="cpu"):
             torch_tensor = torch_tensor.cpu()
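The code block in the hunk above is truncated in the diff; the sketch below fleshes it out into a runnable approximation of what ``Tensor.numpy(force=True)`` does. The helper name ``uncond_numpy`` comes from the diff itself; the ``detach()`` step is an assumption based on the "computational graph" remark in the prose, and real ``force=True`` handles further cases (e.g. conjugate views) not shown here.

```python
import torch

def uncond_numpy(torch_tensor):
    """Illustrative approximation of Tensor.numpy(force=True)."""
    # Dropping the computational graph is the lossy step the docs mention
    if torch_tensor.requires_grad:
        torch_tensor = torch_tensor.detach()
    # Move off any accelerator device before converting
    if torch_tensor.device != torch.device("cpu"):
        torch_tensor = torch_tensor.cpu()
    return torch_tensor.numpy()

t = torch.ones(3, requires_grad=True)
print(uncond_numpy(t))  # converts even though t carries a graph
```

A plain ``t.numpy()`` would raise here because ``t`` requires grad; the point of ``force=True`` is to make that call succeed unconditionally.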
@@ -165,9 +165,11 @@ As a concrete example, consider the following snippet:
 object is described in the `interoperability with NumPy`_ document.
 
 If it is absolutely necessary to write functions where the input objects are not
-unconditionally known to be either PyTorch tensors or NumPy arrays, it is
-possible to ensure operator functionality by using NumPy functions explicitly as
-they are more forgiving than their PyTorch equivalents.
+unconditionally known to be either PyTorch tensors or NumPy arrays, it is
+**strongly recommended** to use the ``Tensor.numpy(force=True)`` method
+explicitly. As a less clear alternative, it is also possible to ensure operator
+functionality by using NumPy functions since these will coerce tensors without
+throwing errors.
 
 .. csv-table::
    :header: Operator, NumPy Function, Description
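The coercion behaviour the hunk above relies on (NumPy functions accepting tensors "without throwing errors") can be seen in a short sketch. The helper name ``add_bias`` and the values are illustrative, not from the docs; this only holds for CPU tensors that do not require grad.

```python
import numpy as np
import torch

def add_bias(x):
    # np.add accepts either input type: ndarrays directly, and CPU
    # tensors coerced through the __array__ protocol, without raising.
    return np.add(x, 1.0)

print(add_bias(np.zeros(2)))     # ndarray input
print(add_bias(torch.zeros(2)))  # CPU tensor input, coerced
```

This is the "less clear alternative" the diff warns about: the call succeeds for both input types, but the mixed return types make the data flow harder to follow than an explicit ``Tensor.numpy(force=True)`` conversion.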
