Commit 878b4859 authored by Varuna Jayasiri

link fix

Parent ad5f40c9
@@ -72,7 +72,7 @@
 <div class='section-link'>
     <a href='#section-0'>#</a>
 </div>
-<h1>[Fast weights transformer]((https://nn.labml.ai/transformers/fast_weights/index.html)</h1>
+<h1><a href="https://nn.labml.ai/transformers/fast_weights/index.html">Fast weights transformer</a></h1>
 <p>This is an annotated implementation of the paper
 <a href="https://arxiv.org/abs/2102.11174">Linear Transformers Are Secretly Fast Weight Memory Systems in PyTorch</a>.</p>
 <p>Here is the <a href="https://nn.labml.ai/transformers/fast_weights/index.html">annotated implementation</a>.
...
-# [Fast weights transformer]((https://nn.labml.ai/transformers/fast_weights/index.html)
+# [Fast weights transformer](https://nn.labml.ai/transformers/fast_weights/index.html)
 This is an annotated implementation of the paper
 [Linear Transformers Are Secretly Fast Weight Memory Systems in PyTorch](https://arxiv.org/abs/2102.11174).
...