Updates to RNNCells to allow easy storage of attention TensorArray in the state.
The main change is that RNNCells that wrap other RNNCells now override zero_state to call the wrapped cell's zero_state and then (optionally) perform some post-processing, instead of relying on the state_size property to provide all information about the state. zero_state calls also now create their ops inside their own name scope.

Change: 150413265
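The delegation pattern described above can be sketched as follows. This is a minimal illustration, not the actual TensorFlow implementation: the class names, the plain-list "tensors", and the extra attention slot are all simplified stand-ins for the real RNNCell classes and TensorArray state.

```python
class BaseCell:
    """Stand-in for a plain RNNCell with a flat state."""

    @property
    def state_size(self):
        return 4

    def zero_state(self, batch_size, dtype):
        # In TensorFlow this would build zero tensors inside the cell's own
        # name scope rather than returning Python lists.
        return [[0.0] * self.state_size for _ in range(batch_size)]


class WrapperCell:
    """Stand-in for a wrapper cell (e.g. an attention wrapper).

    Instead of deriving its zero state purely from state_size, it delegates
    to the wrapped cell's zero_state and then post-processes the result.
    """

    def __init__(self, cell):
        self._cell = cell

    def zero_state(self, batch_size, dtype):
        inner = self._cell.zero_state(batch_size, dtype)
        # Post-processing step: attach extra per-batch state (in the real
        # change, an attention TensorArray) alongside the inner state.
        return (inner, [[] for _ in range(batch_size)])


cell = WrapperCell(BaseCell())
state = cell.zero_state(2, "float32")
```

The point of the pattern is that the wrapper no longer needs to reconstruct the inner cell's state layout from `state_size`; it simply asks the inner cell for its zero state and augments it.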