    First pass at a TPU loop for Transformer (#4296) · 2eeb85fe
    Committed by Taylor Robie
    * port changes from previous branch now that transformer util changes are in master
    
    fix incorrect count
    
    correct (hopefully) treatment of batch_size
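
    The batch_size treatment referenced here is not shown in the log; as background, TPUEstimator-style training distinguishes a global batch size from the per-replica batch size, and the global size must divide evenly across replicas. A minimal sketch of that constraint (names hypothetical, not from this commit):

    ```python
    def per_replica_batch_size(global_batch_size, num_replicas):
        """Split a global batch size across TPU replicas.

        TPU training shards each global batch across the replicas
        (cores), so the global size must be evenly divisible.
        """
        if global_batch_size % num_replicas != 0:
            raise ValueError(
                "Global batch size {} is not divisible by {} replicas."
                .format(global_batch_size, num_replicas))
        return global_batch_size // num_replicas

    # e.g. a global batch of 4096 on a v2-8 (8 cores) -> 512 per core
    print(per_replica_batch_size(4096, 8))
    ```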
    
    set eval_metrics to a dummy function for now
    
    add some comments
    
    start bringing metrics to transformer TPU
    
    resolve logits shape
    
    metrics are now working except for tf.py_func metrics
    
    increase batch_size for tpu, and create summary host call
    
    fix host call
    
    reduce tpu default batch size
    
    further tune batch sizes
    
    add minibatch loss to summary
    
    handle case of single_iteration_train_steps > number points in an epoch
    
    begin to incorporate hooks
    
    add sleep workarounds
    
    disable hooks altogether
    
    generalize host call function and move to newly created tpu utils module
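
    The generalized host call mentioned above is not reproduced in this log. As context, a TPU model function cannot write summaries on-device; instead it returns a `(host_fn, tensors)` pair, where each metric is reshaped to rank 1 for transfer and `host_fn` unpacks it on the CPU host to record summaries. A hedged sketch of that packing/unpacking pattern, with NumPy arrays standing in for TF tensors and the function names invented for illustration:

    ```python
    import numpy as np

    def construct_scalar_host_call(metric_dict, prefix=""):
        """Pack scalar metrics into the (host_fn, tensors) shape a
        TPU host call expects.

        Each scalar is reshaped to shape [1] so it can cross the
        device->host boundary; host_fn unpacks the values in the same
        (sorted) order. A real host_fn would write TF summaries here.
        """
        names = sorted(metric_dict)
        tensors = [np.reshape(metric_dict[name], [1]) for name in names]

        def host_fn(*args):
            # Recover scalars from the [1]-shaped transfer tensors.
            return {prefix + name: float(arg[0])
                    for name, arg in zip(names, args)}

        return host_fn, tensors

    host_fn, tensors = construct_scalar_host_call(
        {"loss": 0.25, "learning_rate": 1e-3}, prefix="training/")
    print(host_fn(*tensors))
    ```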
    
    remove all traces of params as an object
    
    switch from  to
    
    address some PR comments, and change the number of data points.
    
    minor tweaks
    
    add tpu dry run for testing, and use matmul for TPU embedding
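
    The matmul-based embedding refers to a common TPU workaround: instead of a gather, one-hot encode the ids and matrix-multiply against the embedding table, since dense matmuls map well onto the TPU's MXU. A minimal NumPy sketch of the equivalence (not the commit's actual code):

    ```python
    import numpy as np

    def matmul_embedding_lookup(embedding_table, ids):
        """Gather rows of `embedding_table` via a one-hot matmul.

        one_hot has shape [batch, vocab_size]; multiplying it by the
        [vocab_size, hidden] table selects the requested rows, giving
        the same result as a gather (embedding_table[ids]).
        """
        vocab_size = embedding_table.shape[0]
        one_hot = np.eye(vocab_size)[ids]   # [batch, vocab_size]
        return one_hot @ embedding_table    # [batch, hidden]

    table = np.arange(12, dtype=np.float32).reshape(4, 3)  # vocab=4, hidden=3
    ids = np.array([2, 0])
    print(matmul_embedding_lookup(table, ids))
    ```

    The trade-off is extra compute and memory for the one-hot matrix, which is acceptable for the comparatively small vocabularies used here.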
    
    infeed/outfeed queue issue is fixed. Sleeps are no longer necessary
    
    add some documentation.
    
    cleanup and address PR comments
    
    delint
    
    add accelerator __init__
    
    fix embedding
    
    missed PR comment
    
    address PR comments
    
    fix validator bug
    
    rewrite cloud storage validator, and add oauth dependency to requirements.txt
    
    * delint
This project manages its dependencies using pip.
requirements.txt (127 bytes)