Searched refs:attention_layer_sizes (Results 1 - 2 of 2)

  /external/tensorflow/tensorflow/contrib/seq2seq/python/kernel_tests/attention_wrapper_test.py
    122  attention_layer_sizes=[attention_layer_size],
    133  attention_layer_sizes=None,
    147  if attention_layer_sizes is None:
    150  # Compute sum of attention_layer_sizes. Use encoder_output_depth if None.
    152  for attention_layer_size in attention_layer_sizes])
    178  attention_layer_size=(attention_layer_sizes if is_multi
    179      else attention_layer_sizes[0]),
    756  attention_layer_sizes=[3, 4],
    [all...]
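The test excerpt above (lines 147-152) computes the expected attention output depth: when attention_layer_sizes is None there are no attention layers, so the depth falls back to the encoder output depth; otherwise the per-mechanism sizes are summed, with each None entry again standing in for the encoder output depth. A minimal sketch of that logic, with a hypothetical helper name (attention_depth is not a function in the TensorFlow source):

```python
def attention_depth(attention_layer_sizes, encoder_output_depth):
    """Expected attention output depth, mirroring the test's computation.

    attention_layer_sizes: list of per-mechanism layer sizes (entries may be
    None), or None if no attention layers are configured at all.
    """
    if attention_layer_sizes is None:
        # No attention layers: the attention output is the raw context
        # vector, whose depth equals the encoder output depth.
        return encoder_output_depth
    # Sum of attention_layer_sizes; use encoder_output_depth for None entries.
    return sum(size if size is not None else encoder_output_depth
               for size in attention_layer_sizes)
```

For example, with attention_layer_sizes=[3, 4] (as on line 756) the expected depth is 7 regardless of the encoder output depth, while [None, 4] with an encoder output depth of 10 gives 14.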
  /external/tensorflow/tensorflow/contrib/seq2seq/python/ops/attention_wrapper.py
    [all...]
