Commit 83ccc5c7 authored by: J jrzaurin

Added notebook on how to use custom components. Version will still be beta...

Added notebook on how to use custom components. The version will still be beta because I prefer to wait until I have tried it with more datasets before releasing V1.
Parent: 16285d54
1.0.0
\ No newline at end of file
0.4.8
\ No newline at end of file
@@ -26,7 +26,7 @@ sys.path.insert(0, PACKAGEDIR)
# -- Project information -----------------------------------------------------
project = "pytorch-widedeep"
copyright = "2020, Javier Rodriguez Zaurin"
copyright = "2021, Javier Rodriguez Zaurin"
author = "Javier Rodriguez Zaurin"
# # The full version, including alpha/beta/rc tags
......
@@ -12,3 +12,4 @@ them to address different problems
* `Binary Classification with varying parameters <https://github.com/jrzaurin/pytorch-widedeep/blob/master/examples/04_Binary_Classification_Varying_Parameters.ipynb>`__
* `Regression with Images and Text <https://github.com/jrzaurin/pytorch-widedeep/blob/master/examples/05_Regression_with_Images_and_Text.ipynb>`__
* `FineTune routines <https://github.com/jrzaurin/pytorch-widedeep/blob/master/examples/06_FineTune_and_WarmUp_Model_Components.ipynb>`__
* `Custom Components <https://github.com/jrzaurin/pytorch-widedeep/blob/master/examples/07_Custom_Components.ipynb>`__
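For orientation, here is a minimal sketch of the idea the new Custom Components notebook covers: any ``nn.Module`` can be plugged in as a component of ``WideDeep``. The class name and sizes below are made up for illustration, and the requirement that a custom tabular component expose an ``output_dim`` attribute is taken from the library docs, not from this diff.

```python
import torch
from torch import nn

from pytorch_widedeep.models import WideDeep


class MyDeepTabular(nn.Module):
    """Hypothetical custom 'deeptabular' component (illustrative only)."""

    def __init__(self, n_features: int, hidden_dim: int = 32):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(n_features, hidden_dim), nn.ReLU())
        # assumption from the docs: WideDeep reads `output_dim` to build
        # the final prediction layer on top of this component
        self.output_dim = hidden_dim

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        return self.mlp(X.float())


# the custom module takes the place of the built-in tabular models (TabMlp, etc.)
model = WideDeep(deeptabular=MyDeepTabular(n_features=10))
```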
@@ -53,7 +53,7 @@ within the faded-pink rectangle are concatenated.
Note that it is not possible to illustrate the number of possible
architectures and components available in ``pytorch-widedeep`` in one Figure.
Therefore, for more details on possible architectures (and more) please, read
this documentation, or seethe Examples folders in the repo.
this documentation, or see the Examples folders in the repo.
In math terms, and following the notation in the `paper
<https://arxiv.org/abs/1606.07792>`_, the expression for the architecture
......
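The collapsed remainder of this file presumably continues with that expression. For reference, the equation from the paper the text points to is given below; this is the paper's formulation, and the docs may adapt the notation to the library's component names.

```latex
% Wide & Deep prediction for a binary target (Cheng et al., 2016):
% \sigma is the sigmoid, \phi(x) the cross-product transformations,
% a^{(l_f)} the final activations of the deep component, b the bias.
P(Y = 1 \mid \mathbf{x}) =
  \sigma\left( \mathbf{w}_{wide}^{T} [\mathbf{x}, \phi(\mathbf{x})]
  + \mathbf{w}_{deep}^{T} a^{(l_f)} + b \right)
```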
@@ -40,3 +40,4 @@ Dependencies
* torch
* torchvision
* einops
* wrapt
\ No newline at end of file
@@ -2,10 +2,10 @@ The ``utils`` module
====================
These are a series of utilities that might be useful for a number of
preprocessing tasks. All the classes and functions discussed here are
available directly from the ``utils`` module. For example, the
``LabelEncoder`` within the ``deeptabular_utils`` submodule can be imported
as:
preprocessing tasks, even if not directly related to ``pytorch-widedeep``. All
the classes and functions discussed here are available directly from the
``utils`` module. For example, the ``LabelEncoder`` within the
``deeptabular_utils`` submodule can be imported as:
.. code-block:: python
......
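The ``code-block`` directive just above is collapsed in this rendering. A hedged sketch of the import the text describes, with a small usage example; the ``columns_to_encode`` argument name is an assumption based on the library documentation rather than on this diff:

```python
import pandas as pd

# LabelEncoder lives in utils.deeptabular_utils but is re-exported at the utils level
from pytorch_widedeep.utils import LabelEncoder

df = pd.DataFrame({"color": ["red", "blue", "red"], "size": ["S", "M", "L"]})

# `columns_to_encode` is assumed; if omitted, the docs suggest all
# non-numerical columns are encoded
encoder = LabelEncoder(columns_to_encode=["color", "size"])
df_encoded = encoder.fit_transform(df)  # categories replaced by integer codes
```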
This diff has been collapsed.
@@ -121,10 +121,8 @@ class Callback(object):
class History(Callback):
r"""Callback that records metrics to a ``history`` attribute.
This callback runs by default within :obj:`Trainer`. Callbacks are passed
as input parameters to the ``Trainer`` class. See
:class:`pytorch_widedeep.trainer.Trainer`. Documentation is included here
for completion.
This callback runs by default within :obj:`Trainer` and therefore should not
be passed to the ``Trainer``. It is included here just for completeness.
"""
def on_train_begin(self, logs: Optional[Dict] = None):
......
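In practice, the revised docstring means the training history is always recorded, so it can simply be read back from the trainer after fitting. A hedged sketch follows; ``model``, ``X_tab`` and ``target`` are placeholders for a ``WideDeep`` model and preprocessed data built as in the examples folder, and the top-level ``Trainer`` import is assumed for this version:

```python
from pytorch_widedeep import Trainer  # import path assumed

# `model`, `X_tab` and `target` are placeholders built as in the examples
trainer = Trainer(model, objective="binary")
trainer.fit(X_tab=X_tab, target=target, n_epochs=2)

# History is registered automatically, so this is populated without passing
# any callback explicitly, e.g. {'train_loss': [...], ...}
print(trainer.history)
```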
@@ -93,7 +93,7 @@ class Trainer:
- ``root_mean_squared_error``, aliases: ``rmse``
- ``root_mean_squared_log_error``, aliases: ``rmsle``
custom_loss: ``nn.Module``, Optional, default = None
custom_loss_function: ``nn.Module``, Optional, default = None
object of class ``nn.Module``. If none of the loss functions
available suits the user, it is possible to pass a custom loss
function. See for example
@@ -101,6 +101,11 @@ class Trainer:
structure of the object or the `Examples
<https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples>`_
folder in the repo.
.. note:: If ``custom_loss_function`` is not None, ``objective`` must be
'binary', 'multiclass' or 'regression', consistent with the loss
function.
optimizers: ``Optimizer`` or Dict, Optional, default = ``AdamW``
- An instance of PyTorch's ``Optimizer`` object (e.g. :obj:`torch.optim.Adam()`) or
- a dictionary where the keys are the model components (i.e.
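To make the documented behaviour concrete, a hedged sketch of passing a custom loss: any ``nn.Module`` whose ``forward`` returns a scalar loss can be handed to the ``Trainer`` via ``custom_loss_function``, provided ``objective`` is 'binary', 'multiclass' or 'regression' (otherwise the ``ValueError`` shown in the next hunk is raised). The model and data below are placeholders built as in the examples folder, and the top-level import is assumed:

```python
import torch
from torch import nn

from pytorch_widedeep import Trainer  # import path assumed


class RMSELoss(nn.Module):
    """A custom loss: any nn.Module returning a scalar tensor works here."""

    def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return torch.sqrt(nn.functional.mse_loss(input, target.view_as(input)))


# `model`, `X_tab` and `target` are placeholders for a WideDeep model and
# preprocessed data
trainer = Trainer(
    model,
    objective="regression",           # must be consistent with the custom loss
    custom_loss_function=RMSELoss(),
)
trainer.fit(X_tab=X_tab, target=target, n_epochs=1)
```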
@@ -222,7 +227,7 @@ class Trainer:
"regression",
]:
raise ValueError(
"If 'custom_loss_function' is not None, 'objective' might be 'binary' "
"If 'custom_loss_function' is not None, 'objective' must be 'binary' "
"'multiclass' or 'regression', consistent with the loss function"
)
......
__version__ = "1.0.0"
__version__ = "0.4.8"