**Experiments and comparisson with `LightGBM`**: [TabularDL vs LightGBM](https://github.com/jrzaurin/tabulardl-benchmark)
**Slack**: if you want to contribute or just want to chat with us, join [slack](https://join.slack.com/t/pytorch-widedeep/shared_invite/zt-soss7stf-iXpVuLeKZz8lGTnxxtHtTw)
### Introduction
...
...
5. ``FT-Transformer``: or Feature Tokenizer Transformer. This is a relatively small
variation of the ``TabTransformer``. The variation itself was first
introduced in the ``SAINT`` paper, but the name "``FT-Transformer``" was first
used in
[Revisiting Deep Learning Models for Tabular Data](https://arxiv.org/abs/2106.11959).
When using the ``FT-Transformer`` each continuous feature is "embedded"
(i.e. going through a 1-layer MLP with or without activation function) and
then passed through the attention blocks along with the categorical features.
This is available in ``pytorch-widedeep``'s ``TabTransformer`` by setting the
parameter ``embed_continuous = True``.
6. ``SAINT``: Details on SAINT can be found in:
...
...
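The continuous-feature "embedding" described for the ``FT-Transformer`` above can be sketched conceptually as follows. This is an illustrative NumPy sketch of the idea (each continuous feature mapped to a token by its own 1-layer linear map, optionally followed by an activation), not ``pytorch-widedeep``'s actual implementation; all names (`W`, `b`, shapes) are made up for the example.

```python
import numpy as np

# Sketch of the FT-Transformer continuous "embedding": each continuous
# feature x_j gets its own 1-layer map e_j = x_j * W_j + b_j, turning a
# scalar into a d_embed-dimensional token that can enter the attention
# blocks alongside the categorical embeddings.

rng = np.random.default_rng(0)

n_samples, n_cont, d_embed = 4, 3, 8            # 4 rows, 3 continuous columns
X_cont = rng.normal(size=(n_samples, n_cont))   # raw continuous features

W = rng.normal(size=(n_cont, d_embed))          # one weight vector per feature
b = rng.normal(size=(n_cont, d_embed))          # one bias vector per feature

# (n_samples, n_cont, 1) broadcast against (n_cont, d_embed)
# -> one d_embed-dimensional token per continuous feature per row
tokens = X_cont[:, :, None] * W + b

# the "with or without activation function" option, here ReLU
tokens_relu = np.maximum(tokens, 0.0)

print(tokens.shape)  # (4, 3, 8): 3 continuous features, each now a token
```

In ``pytorch-widedeep`` this step is switched on with ``embed_continuous = True``, as described above; the sketch only shows what that flag conceptually does to the continuous columns.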
```bash
cd pytorch-widedeep
pip install -e .
```
**Important note for Mac users**: at the time of writing (June-2021) the