tf.contrib.rnn.AttentionCellWrapper

Basic attention cell wrapper.

Inherits From: RNNCell

Implementation based on https://arxiv.org/abs/1601.06733

Args
cell: an RNNCell to which attention is added.
attn_length: integer, the size of the attention window.
attn_size: integer, the size of the attention vector. Equal to cell.output_size by default.
attn_vec_size: integer, the number of convolutional features calculated on the attention state and the size of the hidden layer built from the base cell state. Equal to attn_size by default.
input_size: integer, the size of a hidden linear layer, built from inputs and attention. Derived from the input tensor by default.
state_is_tuple: If True, accepted and returned states are n-tuples, where n = len(cells). By default (False), the states are all concatenated along the column axis.
reuse: (optional) Python boolean describing whether to reuse variables in an existing scope. If not True, and the existing scope already has the given variables, an error is raised.
Raises
TypeError: if cell is not an RNNCell.
ValueError: if cell returns a state tuple but the flag state_is_tuple is False, or if attn_length is zero or less.
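
The minimal sketch below is not part of the original reference; the cell type, placeholder shape, and sizes are illustrative assumptions. It wraps a TF 1.x BasicLSTMCell with this wrapper and unrolls it with tf.nn.dynamic_rnn:

import tensorflow as tf  # TensorFlow 1.x; tf.contrib is removed in TF 2.x

# Illustrative sizes (assumptions, not prescribed by the API).
batch_size, max_time, input_dim, num_units, attn_length = 32, 20, 50, 64, 10

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_dim])

base_cell = tf.nn.rnn_cell.BasicLSTMCell(num_units, state_is_tuple=True)
attn_cell = tf.contrib.rnn.AttentionCellWrapper(
    base_cell,
    attn_length=attn_length,  # size of the attention window
    state_is_tuple=True)      # keep states as a tuple, matching the LSTM cell

# Unroll over time; the wrapper carries the attention window in its state.
outputs, final_state = tf.nn.dynamic_rnn(attn_cell, inputs, dtype=tf.float32)

Because BasicLSTMCell returns a state tuple, state_is_tuple=True is required in this sketch; passing False would raise the ValueError described above.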
Attributes
graph: DEPRECATED FUNCTION
output_size: Integer or TensorShape: size of outputs produced by this cell.
scope_name
state_size: size(s) of state(s) used by this cell. It can be represented by an Integer, a TensorShape, or a tuple of Integers or TensorShapes.
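
A short sketch (same illustrative assumptions as above) shows how these attributes can be inspected; with state_is_tuple=True, state_size is a nested structure rather than a single concatenated width:

import tensorflow as tf  # TensorFlow 1.x

cell = tf.contrib.rnn.AttentionCellWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(64), attn_length=10, state_is_tuple=True)

# state_size nests the wrapped cell's state together with the attention state;
# output_size is a single integer.
print(cell.state_size)
print(cell.output_size)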

Methods

get_initial_state

zero_state

Return zero-filled state tensor(s).

Args
batch_size: int, float, or unit Tensor representing the batch size.
dtype: the data type to use for the state.
Returns
If state_size is an int or TensorShape, then the return value is an N-D tensor of shape [batch_size, state_size] filled with zeros.

If state_size is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of 2-D tensors with the shapes [batch_size, s] for each s in state_size.
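
As a hedged illustration with assumed sizes (not part of the reference text), zero_state on the wrapper returns zeros structured exactly like state_size:

import tensorflow as tf  # TensorFlow 1.x

cell = tf.contrib.rnn.AttentionCellWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(64), attn_length=10, state_is_tuple=True)

# The initial state mirrors cell.state_size: a nested tuple of zero-filled
# tensors, each with leading dimension batch_size.
init_state = cell.zero_state(batch_size=32, dtype=tf.float32)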

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/rnn/AttentionCellWrapper