/external/autotest/client/tests/compilebench/ |
compilebench-0.6.tar.gz | |
/external/tensorflow/tensorflow/core/api_def/base_api/ |
api_def_ConcatenateDataset.pbtxt | 3 summary: "Creates a dataset that concatenates `input_dataset` with `another_dataset`."
|
api_def_ZipDataset.pbtxt | 3 summary: "Creates a dataset that zips together `input_datasets`."
|
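For orientation, a minimal sketch of the public tf.data wrappers behind these two ops (`Dataset.concatenate()` for ConcatenateDataset, `Dataset.zip()` for ZipDataset), assuming a TF 1.x graph-mode session:

    import tensorflow as tf  # TF 1.x

    a = tf.data.Dataset.range(3)       # 0, 1, 2
    b = tf.data.Dataset.range(3, 6)    # 3, 4, 5

    concat = a.concatenate(b)             # ConcatenateDataset: 0, 1, 2, 3, 4, 5
    zipped = tf.data.Dataset.zip((a, b))  # ZipDataset: (0, 3), (1, 4), (2, 5)

    x, y = zipped.make_one_shot_iterator().get_next()
    with tf.Session() as sess:
        print(sess.run([x, y]))  # [0, 3]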
api_def_PrefetchDataset.pbtxt |
    7 this dataset.
   10 summary: "Creates a dataset that asynchronously prefetches elements from `input_dataset`."
|
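A sketch of the same op through the public API: `prefetch()` decouples producer and consumer, so up to `buffer_size` elements are prepared in the background while the session consumes earlier ones:

    import tensorflow as tf  # TF 1.x

    # PrefetchDataset: map() output is buffered asynchronously, so
    # get_next() usually returns immediately from the buffer.
    dataset = tf.data.Dataset.range(100).map(lambda x: x * 2).prefetch(buffer_size=4)
    next_elem = dataset.make_one_shot_iterator().get_next()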
api_def_FlatMapDataset.pbtxt |
    7 `other_arguments`, to a Dataset variant that contains elements matching
   11 summary: "Creates a dataset that applies `f` to the outputs of `input_dataset`."
   14 Dataset variant, and FlatMapDataset will flatten successive results
   15 into a single Dataset.
|
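The flattening behavior described above, via the public `flat_map()` wrapper: `f` returns a Dataset per input element, and the results are chained into one Dataset:

    import tensorflow as tf  # TF 1.x

    # Each input element x becomes a two-element Dataset [x, x];
    # FlatMapDataset concatenates the results in order.
    dataset = tf.data.Dataset.range(3).flat_map(
        lambda x: tf.data.Dataset.from_tensor_slices(tf.stack([x, x])))
    # Yields: 0, 0, 1, 1, 2, 2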
api_def_MapDataset.pbtxt | 3 summary: "Creates a dataset that applies `f` to the outputs of `input_dataset`."
|
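And the sequential counterpart, through the plain `map()` wrapper:

    import tensorflow as tf  # TF 1.x

    # MapDataset: applies f to each element, one element at a time.
    squares = tf.data.Dataset.range(5).map(lambda x: x * x)  # 0, 1, 4, 9, 16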
api_def_ScanDataset.pbtxt | 3 summary: "Creates a dataset that successively reduces `f` over the elements of `input_dataset`."
|
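A sketch of the scan behavior, assuming this checkout carries the contrib-era Python wrapper `tf.contrib.data.scan` for the ScanDataset op: `f` threads a state through the elements and emits one output per input:

    import tensorflow as tf  # TF 1.x with tf.contrib.data

    def running_sum(state, x):
        new_state = state + x
        return new_state, new_state  # (next state, output element)

    dataset = tf.data.Dataset.range(5).apply(
        tf.contrib.data.scan(tf.constant(0, dtype=tf.int64), running_sum))
    # Yields the running sums: 0, 1, 3, 6, 10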
api_def_SparseTensorSliceDataset.pbtxt | 3 summary: "Creates a dataset that splits a SparseTensor into elements row-wise."
|
api_def_TensorDataset.pbtxt | 3 summary: "Creates a dataset that emits `components` as a tuple of tensors once."
|
api_def_TensorSliceDataset.pbtxt | 3 summary: "Creates a dataset that emits each dim-0 slice of `components` once."
|
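The difference between the last two ops, reached through `from_tensors()` and `from_tensor_slices()`:

    import tensorflow as tf  # TF 1.x

    components = (tf.constant([[1, 2], [3, 4]]), tf.constant([0.1, 0.2]))

    # TensorDataset: exactly one element, the whole tuple of tensors.
    once = tf.data.Dataset.from_tensors(components)

    # TensorSliceDataset: one element per dim-0 slice:
    # ([1, 2], 0.1) and then ([3, 4], 0.2).
    slices = tf.data.Dataset.from_tensor_slices(components)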
api_def_UniqueDataset.pbtxt | 3 summary: "Creates a dataset that contains the unique elements of `input_dataset`."
|
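Assuming the contrib wrapper `tf.contrib.data.unique()` is present in this checkout, the op is applied as a transformation:

    import tensorflow as tf  # TF 1.x with tf.contrib.data

    # UniqueDataset: drops elements already seen, keeping first-seen order.
    dataset = tf.data.Dataset.from_tensor_slices([1, 2, 1, 3, 2]).apply(
        tf.contrib.data.unique())  # 1, 2, 3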
api_def_DatasetToSingleElement.pbtxt |
    4 name: "dataset"
    6 A handle to a dataset that contains a single element.
   15 summary: "Outputs the single element from the given dataset."
|
api_def_ParallelInterleaveDataset.pbtxt |
    7 `other_arguments`, to a Dataset variant that contains elements matching
   11 summary: "Creates a dataset that applies `f` to the outputs of `input_dataset`."
   13 The resulting dataset is similar to the `InterleaveDataset`, with the exception
   14 that if retrieving the next value from a dataset would cause the requester to
   15 block, it will skip that input dataset. This dataset is especially useful
   19 !! WARNING !! This dataset is not deterministic!
|
api_def_InterleaveDataset.pbtxt |
    7 `other_arguments`, to a Dataset variant that contains elements matching
   11 summary: "Creates a dataset that applies `f` to the outputs of `input_dataset`."
   14 a Dataset variant, and InterleaveDataset will flatten successive
   15 results into a single Dataset. Unlike FlatMapDataset,
|
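How the interleave family differs from flat_map, sketched with the public `interleave()` wrapper; the parallel variant appears only as a comment, since its contrib wrapper name (`tf.contrib.data.parallel_interleave`) is an assumption about this checkout:

    import tensorflow as tf  # TF 1.x

    # InterleaveDataset: like flat_map, but cycles through cycle_length
    # input elements, taking block_length items from each in turn.
    dataset = tf.data.Dataset.range(3).interleave(
        lambda x: tf.data.Dataset.from_tensors(x).repeat(4),
        cycle_length=2, block_length=2)
    # Yields: 0, 0, 1, 1, 0, 0, 1, 1, 2, 2, 2, 2

    # ParallelInterleaveDataset fetches from its inputs concurrently and may
    # skip a blocked input, hence the non-determinism warning above:
    # dataset.apply(tf.contrib.data.parallel_interleave(f, cycle_length=2))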
api_def_MakeIterator.pbtxt |
    3 summary: "Makes a new iterator from the given `dataset` and stores it in `iterator`."
    6 iterator in `iterator` to the first element of `dataset`.
|
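MakeIterator is the op behind `make_initializable_iterator()`: running the iterator's initializer points it at the first element of the dataset:

    import tensorflow as tf  # TF 1.x

    dataset = tf.data.Dataset.range(3)
    iterator = dataset.make_initializable_iterator()
    next_elem = iterator.get_next()
    with tf.Session() as sess:
        sess.run(iterator.initializer)  # the MakeIterator op runs here
        print(sess.run(next_elem))      # 0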
api_def_ParallelMapDataset.pbtxt |
   10 summary: "Creates a dataset that applies `f` to the outputs of `input_dataset`."
   12 Unlike a "MapDataset", which applies `f` sequentially, this dataset invokes up
|
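The parallel variant is reached through the same `map()` wrapper, with `num_parallel_calls` selecting ParallelMapDataset over MapDataset:

    import tensorflow as tf  # TF 1.x

    # Up to 4 invocations of the map function run concurrently; element
    # order is still preserved in the output.
    dataset = tf.data.Dataset.range(1000).map(
        lambda x: x * x, num_parallel_calls=4)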
/external/tensorflow/tensorflow/docs_src/api_guides/python/ |
input_dataset.md |
    0 # Dataset Input Pipeline
    4 @{tf.data.Dataset} allows you to build complex input pipelines. See the
   10 Classes that create a dataset from input files.
   18 Static methods in `Dataset` that create new datasets.
   20 * @{tf.data.Dataset.from_generator}
   21 * @{tf.data.Dataset.from_tensor_slices}
   22 * @{tf.data.Dataset.from_tensors}
   23 * @{tf.data.Dataset.list_files}
   24 * @{tf.data.Dataset.range}
   25 * @{tf.data.Dataset.zip [all...] |
/external/tensorflow/tensorflow/contrib/data/python/ops/ |
dataset_ops.py |
   31 class Dataset(dataset_ops.Dataset):
   34 A `Dataset` can be used to represent an input pipeline as a
   39 def __init__(self, dataset):
   40 super(Dataset, self).__init__()
   41 self._dataset = dataset
   63 @deprecation.deprecated(None, "Use `tf.data.Dataset.from_tensors()`.")
   65 """Creates a `Dataset` with a single element, comprising the given tensors.
   71 A `Dataset`.
   73 return Dataset(dataset_ops.TensorDataset(tensors) [all...] |
batching.py |
   15 """Batching dataset transformations."""
   36 Like `Dataset.padded_batch()`, this transformation combines multiple
   37 consecutive elements of the dataset, which might have different
   47 # contents of a dataset.
   63 number of consecutive elements of this dataset to combine in a
   67 resulting `tf.SparseTensor`. Each element of this dataset must
   72 A `Dataset` transformation function, which can be passed to
   73 @{tf.data.Dataset.apply}.
   76 def _apply_fn(dataset):
   77 return DenseToSparseBatchDataset(dataset, batch_size, row_shape [all...] |
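A runnable sketch of the transformation these docstrings describe, assuming the contrib wrapper `tf.contrib.data.dense_to_sparse_batch` matches this checkout:

    import tensorflow as tf  # TF 1.x with tf.contrib.data

    # Rows of different lengths: [1], [2, 2], [3, 3, 3].
    dataset = tf.data.Dataset.range(1, 4).map(
        lambda x: tf.fill([tf.cast(x, tf.int32)], x))

    # Combines batch_size consecutive rows into one tf.SparseTensor whose
    # dense shape is padded to row_shape, via the Dataset.apply() hook.
    dataset = dataset.apply(
        tf.contrib.data.dense_to_sparse_batch(batch_size=2, row_shape=[3]))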
get_single_element.py |
   25 def get_single_element(dataset):
   26 """Returns the single element in `dataset` as a nested structure of tensors.
   28 This function enables you to use a @{tf.data.Dataset} in a stateless
   31 as a `Dataset`, and you want to use the transformation at serving time.
   41 dataset = (tf.data.Dataset.from_tensor_slices(input_batch)
   45 image_batch, label_batch = tf.contrib.data.get_single_element(dataset)
   49 dataset: A @{tf.data.Dataset} object containing a single element.
   53 element of `dataset` [all...] |
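Reconstructing the docstring's serving-time pattern as a self-contained example (the string-to-number map stands in for the docstring's per-record preprocessing):

    import tensorflow as tf  # TF 1.x with tf.contrib.data

    input_batch = tf.constant(["1", "2", "3"])
    dataset = (tf.data.Dataset.from_tensor_slices(input_batch)
               .map(tf.string_to_number)  # per-record preprocessing
               .batch(3))                 # one batch == the whole input

    # No iterator needed: the dataset holds exactly one element, and
    # get_single_element() returns it as plain tensors.
    numbers = tf.contrib.data.get_single_element(dataset)
    with tf.Session() as sess:
        print(sess.run(numbers))  # [1. 2. 3.]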
/external/tensorflow/tensorflow/contrib/data/ |
README.md |
   12 The `tf.contrib.data.Dataset` class has been renamed to `tf.data.Dataset`, and
   17 The arguments accepted by the `Dataset.map()` transformation have changed:
   19 * `dataset.map(..., num_threads=T)` is now `dataset.map(..., num_parallel_calls=T)`.
   20 * `dataset.map(..., output_buffer_size=B)` is now
   21 `dataset.map(...).prefetch(B)`.
   23 Some transformations have been removed from `tf.data.Dataset`, and you must
   24 instead apply them using the `Dataset.apply()` transformation. The full list of
   27 * `dataset.dense_to_sparse_batch(...)` is no [all...] |
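The two renamed map() arguments from the README, side by side (a sketch; `fn` stands for any per-element function):

    import tensorflow as tf  # TF 1.x

    fn = lambda x: x + 1
    dataset = tf.data.Dataset.range(10)

    # Old tf.contrib.data form:
    #   dataset = dataset.map(fn, num_threads=4, output_buffer_size=8)
    # New tf.data form:
    dataset = dataset.map(fn, num_parallel_calls=4).prefetch(8)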
/external/tensorflow/tensorflow/docs_src/programmers_guide/ |
datasets.md |
   15 * A `tf.data.Dataset` represents a sequence of elements, in which
   19 ways to create a dataset:
   21 * Creating a **source** (e.g. `Dataset.from_tensor_slices()`) constructs a
   22 dataset from
   25 * Applying a **transformation** (e.g. `Dataset.batch()`) constructs a dataset
   26 from one or more `tf.data.Dataset` objects.
   29 dataset. The operation returned by `Iterator.get_next()` yields the next
   30 element of a `Dataset` when executed, and typically acts as the interface
   32 "one-shot iterator", which is associated with a particular `Dataset` an [all...] |
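The source, transformation, and iterator pattern the guide outlines, end to end with a one-shot iterator:

    import tensorflow as tf  # TF 1.x

    dataset = tf.data.Dataset.from_tensor_slices([1, 2, 3])  # source
    dataset = dataset.batch(2)                               # transformation
    iterator = dataset.make_one_shot_iterator()              # one-shot iterator
    next_batch = iterator.get_next()
    with tf.Session() as sess:
        print(sess.run(next_batch))  # [1 2]
        print(sess.run(next_batch))  # [3]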
/external/protobuf/benchmarks/ |
README.md |
   15 Wrote dataset: dataset.google_message1_proto3.pb
   16 Wrote dataset: dataset.google_message1_proto2.pb
   17 Wrote dataset: dataset.google_message2.pb
|
/external/tensorflow/tensorflow/contrib/training/python/training/ |
tensor_queue_dataset_test.py |
   37 dataset = dataset_ops.Dataset.from_tensor_slices([0, 1, 2])
   38 dataset = dataset.apply(
   40 self.assertEqual((dtypes.variant, dtypes.int32), dataset.output_types)
   42 [x.as_list() for x in dataset.output_shapes])
   43 iterator = dataset.make_one_shot_iterator()
   52 dataset = dataset_ops.Dataset.from_tensor_slices([0, 1, 2])
   53 dataset = dataset.apply [all...] |
/external/tensorflow/tensorflow/core/kernels/data/ |
BUILD |
   49 name = "dataset",
   51 hdrs = ["dataset.h"],
   68 ":dataset",
   81 ":dataset",
   96 ":dataset",
  107 ":dataset",
  120 ":dataset",
  133 ":dataset",
  146 ":dataset",
  161 ":dataset", [all...] |