Replace links with 2.7.0 by svekars · Pull Request #31 · pytorch/docs · GitHub

Merged · 1 commit · May 16, 2025
20 changes: 10 additions & 10 deletions 2.7/amp.html

Large diffs are not rendered by default.

26 changes: 13 additions & 13 deletions 2.7/autograd.html

Large diffs are not rendered by default.

76 changes: 38 additions & 38 deletions 2.7/backends.html

Large diffs are not rendered by default.

44 changes: 22 additions & 22 deletions 2.7/benchmark_utils.html

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions 2.7/checkpoint.html
Original file line number Diff line number Diff line change
@@ -624,7 +624,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t
</div>
<dl class="py function">
<dt class="sig sig-object py" id="torch.utils.checkpoint.checkpoint">
- <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">checkpoint</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">function</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">*args</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">use_reentrant=None</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">context_fn=&lt;function</span> <span class="pre">noop_context_fn&gt;</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">determinism_check='default'</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">debug=False</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">**kwargs</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#checkpoint"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L342"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.checkpoint" title="Permalink to this definition">¶</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">checkpoint</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">function</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">*args</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">use_reentrant=None</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">context_fn=&lt;function</span> <span class="pre">noop_context_fn&gt;</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">determinism_check='default'</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">debug=False</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">**kwargs</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#checkpoint"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L342"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.checkpoint" title="Permalink to this definition">¶</a></dt>
<dd><p>Checkpoint a model or part of the model.</p>
<p>Activation checkpointing is a technique that trades compute for memory.
Instead of keeping tensors needed for backward alive until they are used in
@@ -736,7 +736,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t

<dl class="py function">
<dt class="sig sig-object py" id="torch.utils.checkpoint.checkpoint_sequential">
- <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">checkpoint_sequential</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">functions</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">segments</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">input</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">use_reentrant</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="o"><span class="pre">**</span></span><span class="n"><span class="pre">kwargs</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#checkpoint_sequential"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L503"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.checkpoint_sequential" title="Permalink to this definition">¶</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">checkpoint_sequential</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">functions</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">segments</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">input</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">use_reentrant</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="o"><span class="pre">**</span></span><span class="n"><span class="pre">kwargs</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#checkpoint_sequential"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L503"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.checkpoint_sequential" title="Permalink to this definition">¶</a></dt>
<dd><p>Checkpoint a sequential model to save memory.</p>
<p>Sequential models execute a list of modules/functions in order
(sequentially). Therefore, we can divide such a model in various segments
@@ -786,7 +786,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t

<dl class="py function">
<dt class="sig sig-object py" id="torch.utils.checkpoint.set_checkpoint_debug_enabled">
- <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">set_checkpoint_debug_enabled</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">enabled</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#set_checkpoint_debug_enabled"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L43"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.set_checkpoint_debug_enabled" title="Permalink to this definition">¶</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">set_checkpoint_debug_enabled</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">enabled</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#set_checkpoint_debug_enabled"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L43"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.set_checkpoint_debug_enabled" title="Permalink to this definition">¶</a></dt>
<dd><p>Context manager that sets whether checkpoint should print additional debug
information when running. See the <code class="docutils literal notranslate"><span class="pre">debug</span></code> flag for
<a class="reference internal" href="#torch.utils.checkpoint.checkpoint" title="torch.utils.checkpoint.checkpoint"><code class="xref py py-func docutils literal notranslate"><span class="pre">checkpoint()</span></code></a> for more information. Note that
@@ -802,7 +802,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t

<dl class="py class">
<dt class="sig sig-object py" id="torch.utils.checkpoint.CheckpointPolicy">
- <em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">CheckpointPolicy</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">value</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#CheckpointPolicy"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L1225"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.CheckpointPolicy" title="Permalink to this definition">¶</a></dt>
+ <em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">CheckpointPolicy</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">value</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#CheckpointPolicy"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L1225"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.CheckpointPolicy" title="Permalink to this definition">¶</a></dt>
<dd><p>Enum for specifying the policy for checkpointing during backpropagation.</p>
<p>The following policies are supported:</p>
<ul class="simple">
@@ -826,7 +826,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t

<dl class="py class">
<dt class="sig sig-object py" id="torch.utils.checkpoint.SelectiveCheckpointContext">
- <em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">SelectiveCheckpointContext</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">is_recompute</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#SelectiveCheckpointContext"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L1199"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.SelectiveCheckpointContext" title="Permalink to this definition">¶</a></dt>
+ <em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">SelectiveCheckpointContext</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="o"><span class="pre">*</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">is_recompute</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#SelectiveCheckpointContext"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L1199"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.SelectiveCheckpointContext" title="Permalink to this definition">¶</a></dt>
<dd><p>Context passed to policy function during selective checkpointing.</p>
<p>This class is used to pass relevant metadata to the policy function during
selective checkpointing. The metadata includes whether the current invocation
@@ -849,7 +849,7 @@ <h1>torch.utils.checkpoint<a class="headerlink" href="#torch-utils-checkpoint" t

<dl class="py function">
<dt class="sig sig-object py" id="torch.utils.checkpoint.create_selective_checkpoint_contexts">
- <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">create_selective_checkpoint_contexts</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">policy_fn_or_list</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">allow_cache_entry_mutation</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#create_selective_checkpoint_contexts"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/utils/checkpoint.py#L1333"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.create_selective_checkpoint_contexts" title="Permalink to this definition">¶</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">torch.utils.checkpoint.</span></span><span class="sig-name descname"><span class="pre">create_selective_checkpoint_contexts</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">policy_fn_or_list</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">allow_cache_entry_mutation</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="reference internal" href="_modules/torch/utils/checkpoint.html#create_selective_checkpoint_contexts"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/utils/checkpoint.py#L1333"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch.utils.checkpoint.create_selective_checkpoint_contexts" title="Permalink to this definition">¶</a></dt>
<dd><p>Helper to avoid recomputing certain ops during activation checkpointing.</p>
<p>Use this with <cite>torch.utils.checkpoint.checkpoint</cite> to control which
operations are recomputed during the backward pass.</p>
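The checkpoint.html hunks above document `torch.utils.checkpoint.checkpoint` and `checkpoint_sequential`, which trade compute for memory by discarding activations in the forward pass and recomputing them during backward. A minimal usage sketch of the two entry points whose source links this PR retargets (the model and tensor shapes are illustrative, not taken from the PR):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint, checkpoint_sequential

# A small model segment whose intermediate activations we do not want to keep.
segment = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 32))
x = torch.randn(4, 32, requires_grad=True)

# Activations inside `segment` are freed after the forward pass and
# recomputed during backward; use_reentrant=False is the recommended mode.
out = checkpoint(segment, x, use_reentrant=False)
out.sum().backward()

# checkpoint_sequential divides a sequential model into `segments` chunks
# and checkpoints each chunk independently.
out2 = checkpoint_sequential(segment, segments=3, input=x, use_reentrant=False)
```

Both calls return the same values as running the module directly; only the memory/compute trade-off differs.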
2 changes: 1 addition & 1 deletion 2.7/cond.html
@@ -760,7 +760,7 @@ <h2>Invariants of torch.ops.higher_order.cond<a class="headerlink" href="#invari
<h3>API Reference<a class="headerlink" href="#api-reference" title="Permalink to this heading">¶</a></h3>
<dl class="py function">
<dt class="sig sig-object py" id="torch._higher_order_ops.cond.cond">
- <span class="sig-prename descclassname"><span class="pre">torch._higher_order_ops.cond.</span></span><span class="sig-name descname"><span class="pre">cond</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">pred</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">true_fn</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">false_fn</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">operands</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">()</span></span></em><span class="sig-paren">)</span><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.1/torch/_higher_order_ops/cond.py#L67"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch._higher_order_ops.cond.cond" title="Permalink to this definition">¶</a></dt>
+ <span class="sig-prename descclassname"><span class="pre">torch._higher_order_ops.cond.</span></span><span class="sig-name descname"><span class="pre">cond</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">pred</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">true_fn</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">false_fn</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">operands</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">()</span></span></em><span class="sig-paren">)</span><a class="reference external" href="https://github.com/pytorch/pytorch/blob/v2.7.0/torch/_higher_order_ops/cond.py#L67"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torch._higher_order_ops.cond.cond" title="Permalink to this definition">¶</a></dt>
<dd><p>Conditionally applies <cite>true_fn</cite> or <cite>false_fn</cite>.</p>
<div class="admonition warning">
<p class="admonition-title">Warning</p>