- 04 Aug 2023, 1 commit
-
-
Committed by Javier
Added an example of flash and linear attention. Fixed some small bugs in one example. Adjusted all new functionality for GPU usage
-
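The flash-attention example mentioned in this commit could look roughly like the NumPy sketch below: attention computed block-by-block over keys/values with an online softmax, so the full attention matrix is never materialized. All names, the block size, and the reference function are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

def softmax_attention(q, k, v):
    # reference implementation: materializes the full (n x n) score matrix
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    return (p / p.sum(axis=-1, keepdims=True)) @ v

def flash_attention(q, k, v, block=4):
    # tiled attention with an online softmax: keys/values are processed in
    # blocks, keeping only running row maxima and row sums (illustrative sketch)
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(v, dtype=float)
    m = np.full(n, -np.inf)   # running row-wise max of the scores
    l = np.zeros(n)           # running row-wise sum of exp(scores - max)
    for start in range(0, k.shape[0], block):
        kb, vb = k[start:start + block], v[start:start + block]
        s = (q @ kb.T) * scale                     # (n, block) partial scores
        m_new = np.maximum(m, s.max(axis=-1))
        p = np.exp(s - m_new[:, None])
        correction = np.exp(m - m_new)             # rescale previous partials
        l = l * correction + p.sum(axis=-1)
        out = out * correction[:, None] + p @ vb
        m = m_new
    return out / l[:, None]
```

Both functions compute the same result; the tiled version only changes the memory access pattern, which is what makes the GPU implementations fast.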
- 03 Aug 2023, 1 commit
-
-
Committed by Javier
Tests passed. Need to increase test coverage a bit for the tabtransformer and attention_layers, and review the docs
-
- 02 Aug 2023, 1 commit
-
-
Committed by Javier
Added linear attention from the paper 'Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention'. This now needs to be turned into an encoder and offered as an optional model
-
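The linear attention from that paper replaces the softmax with a kernel feature map phi(x) = elu(x) + 1, so attention factorizes as phi(Q) (phi(K)^T V) and runs in time linear in the sequence length. A minimal NumPy sketch, assuming single-head unmasked attention (function names and the epsilon are illustrative, not the repository's API):

```python
import numpy as np

def elu_plus_one(x):
    # feature map phi(x) = elu(x) + 1 from the linear-attention paper
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v, eps=1e-6):
    # q, k: (seq_len, d_k); v: (seq_len, d_v)
    q, k = elu_plus_one(q), elu_plus_one(k)
    kv = k.T @ v                 # (d_k, d_v): sum_j phi(k_j) v_j^T
    z = q @ k.sum(axis=0)        # (seq_len,): per-row normalizer
    return (q @ kv) / (z[:, None] + eps)
```

Because `k.T @ v` is a fixed-size (d_k, d_v) summary of all keys and values, cost grows as O(n * d_k * d_v) rather than O(n^2 * d), which is the property the commit exploits when turning this into an encoder.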
- 28 Jul 2023, 3 commits