Commit 88ff8404 authored by Varuna Jayasiri

docs

Parent 8a942253
@@ -456,7 +456,7 @@
 <url>
 <loc>https://nn.labml.ai/transformers/knn/eval_knn.html</loc>
-<lastmod>2020-12-10T16:30:00+00:00</lastmod>
+<lastmod>2021-09-06T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
@@ -582,7 +582,7 @@
 <url>
 <loc>https://nn.labml.ai/transformers/vit/index.html</loc>
-<lastmod>2021-08-19T16:30:00+00:00</lastmod>
+<lastmod>2021-09-07T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
@@ -610,7 +610,7 @@
 <url>
 <loc>https://nn.labml.ai/transformers/switch/experiment.html</loc>
-<lastmod>2021-05-26T16:30:00+00:00</lastmod>
+<lastmod>2021-09-06T16:30:00+00:00</lastmod>
 <priority>1.00</priority>
 </url>
......
@@ -444,7 +444,7 @@ transformer and the <a href="#ClassificationHead">classification head</a>.</p>
 <ul>
 <li><code>transformer_layer</code> is a copy of a single <a href="../models.html#TransformerLayer">transformer layer</a>.
 We make copies of it to make the transformer with <code>n_layers</code>.</li>
-<li><code>n_layers</code> is the number of [transformer layers](../models.html#TransformerLayer).</li>
+<li><code>n_layers</code> is the number of <a href="../models.html#TransformerLayer">transformer layers</a>.</li>
 <li><code>patch_emb</code> is the <a href="#PatchEmbeddings">patch embeddings layer</a>.</li>
 <li><code>pos_emb</code> is the <a href="#LearnedPositionalEmbeddings">positional embeddings layer</a>.</li>
 <li><code>classification</code> is the <a href="#ClassificationHead">classification head</a>.</li>
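The parameter list in the docs hunk above describes a Vision Transformer constructor that clones one transformer layer `n_layers` times. A minimal PyTorch sketch of how those pieces fit together — the class name `VisionTransformer` and the deep-copy pattern are assumptions inferred from the description, not the repository's exact code:

```python
import copy

import torch.nn as nn


class VisionTransformer(nn.Module):
    """Sketch of the constructor described in the docs (names assumed)."""

    def __init__(self, transformer_layer: nn.Module, n_layers: int,
                 patch_emb: nn.Module, pos_emb: nn.Module,
                 classification: nn.Module):
        super().__init__()
        # Deep-copy the single layer n_layers times so each copy
        # has independent parameters.
        self.transformer_layers = nn.ModuleList(
            [copy.deepcopy(transformer_layer) for _ in range(n_layers)])
        # Patch embeddings, positional embeddings, and classification head
        # are taken as ready-made sub-modules.
        self.patch_emb = patch_emb
        self.pos_emb = pos_emb
        self.classification = classification
```

Deep copies (rather than reusing the same module object) matter here: appending the same layer instance `n_layers` times would tie the weights of every layer together.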