    Keras ideal fit and compile. · 10666c59
    Committed by Thomas O'Malley
    Kept all new abstractions private for now. In a few weeks, if we're
    comfortable that these abstractions are working and stable, we should expose
    many of them publicly.
    
    Capabilities added by this CL:
    
    (1) Easy to create a custom training step via overriding Model._train_step
    (2) Easy to create custom tf.function / DistStrat logic via overriding
    Model._make_train_function
    (3) Advanced users can override Model.compile and Model.fit
    (4) Full support for dicts, nested structures, etc. with subclassed Models.
    (5) "Power user" path (tf.data inputs) only modifies data in Model._train_step,
    where this behavior is easy to override and disable. This applies even to
    Keras's assumption that data is passed in (x, y, sample_weight) format.
    
    Behavior changes:
    
    (1) "loss" passed to Callbacks is now stateful (like all other metrics in
    Callbacks). This greatly simplifies the training step logic and callback logic.
    (2) ProgbarLogger always uses steps. If the number of steps is not
    available, ProgbarLogger infers it after the first epoch.
    (3) validation_batch_size added to `fit`, rather than being inferred from
    the generator.
    (4) Model.inputs, Model.outputs, Model.input_names, and Model.output_names are
    no longer populated for subclassed Models. Instead, "pseudo" output names are
    created for subclassed Models, which are only used for metrics names and
    SavedModel's signature.
    (5) NumPy floats are cast to backend.floatx(); other inputs are left
    unchanged. (This is likely not a behavior change — the old version did
    something similar, but the logic was scattered across many places.)
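Behavior change (5) can be sketched with NumPy alone. `cast_to_floatx` is an illustrative helper (not the Keras internal), and float32 is assumed as the backend floatx, which is the Keras default:

```python
import numpy as np

# Assumed backend float dtype (Keras's default floatx is "float32").
FLOATX = "float32"

def cast_to_floatx(value):
    # Cast NumPy floating-point arrays to the backend float dtype;
    # leave integer arrays and non-array inputs unchanged.
    if isinstance(value, np.ndarray) and np.issubdtype(value.dtype, np.floating):
        return value.astype(FLOATX)
    return value
```

Centralizing this cast in one place (rather than scattering it through the input pipeline) is the cleanup the commit message alludes to.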
    
    PiperOrigin-RevId: 296090972
    Change-Id: Ia5ac833fd39085bddb016833bd338083d0dc5fc2