/test/vti/dashboard/testdata/ |
test-plan-report-data.json |
    4 "testPlanName": "vts-serving-staging-fuzz",
    9 "testPlanName": "vts-serving-staging-hal-conventional",
|
/external/tensorflow/tensorflow/docs_src/deploy/ |
index.md |
    12 * The entire document set for [TensorFlow serving](/serving), an open-source,
    13 flexible, high-performance serving system for machine-learned models
    14 designed for production environments. TensorFlow Serving provides
    16 [Source code for TensorFlow Serving](https://github.com/tensorflow/serving)
|
/external/tensorflow/tensorflow/tools/api/golden/ |
tensorflow.saved_model.tag_constants.pbtxt | 8 name: "SERVING"
|
/test/framework/ |
README.md | 3 VTS Lab is an open source test serving infrastructure that can be used
|
/external/tensorflow/tensorflow/python/saved_model/ |
tag_constants.py |
    26 # Tag for the `serving` graph.
    27 SERVING = "serve"
    28 tf_export("saved_model.tag_constants.SERVING").export_constant(
    29 __name__, "SERVING")
    45 "SERVING",
|
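The tag_constants.py hit above shows that the SERVING tag is simply the string "serve"; it is the tag set you pass when loading an exported model back into a session. A minimal TF 1.x sketch of that use, with a placeholder export path that is not taken from the source:

    import tensorflow as tf

    export_dir = "/tmp/saved_model"  # hypothetical path to an exported SavedModel

    with tf.Session(graph=tf.Graph()) as sess:
        # Load the MetaGraphDef that was exported with the serving tag;
        # tag_constants.SERVING == "serve".
        tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], export_dir)
        # Tensors can now be looked up by name in sess.graph and evaluated.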
simple_save.py |
    31 """Convenience function to build a SavedModel suitable for serving.
    33 In many common cases, saving models for serving will be as simple as:
    42 - It will be treated as a graph for inference / serving (i.e. uses the tag
    43 `tag_constants.SERVING`)
    44 - The SavedModel will load in TensorFlow Serving and supports the
    46 API](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/predict.proto).
    81 tags=[tag_constants.SERVING],
|
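The simple_save.py docstring quoted above describes a one-call export that applies tag_constants.SERVING for you. A minimal sketch under that assumption; the toy graph and export directory are placeholders, not from the source:

    import tensorflow as tf

    # Hypothetical toy model: a single dense layer standing in for a real graph.
    x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
    y = tf.layers.dense(x, 1, name="y")

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # Writes a SavedModel tagged with tag_constants.SERVING and a default
        # predict signature, ready to be loaded by TensorFlow Serving.
        tf.saved_model.simple_save(
            sess,
            "/tmp/simple_saved_model",  # hypothetical export directory
            inputs={"x": x},
            outputs={"y": y})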
signature_constants.py |
    26 # Key in the signature def map for `default` serving signatures. The default
    42 CLASSIFY_METHOD_NAME = "tensorflow/serving/classify"
    68 PREDICT_METHOD_NAME = "tensorflow/serving/predict"
    87 REGRESS_METHOD_NAME = "tensorflow/serving/regress"
|
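These method-name constants end up inside a SignatureDef. A hedged sketch of wiring the predict constant with the TF 1.x signature_def_utils helpers; the tensor arguments and key names are placeholders:

    import tensorflow as tf

    def build_predict_signature(input_tensor, output_tensor):
        # Builds a SignatureDef whose method_name is
        # "tensorflow/serving/predict" (PREDICT_METHOD_NAME).
        return tf.saved_model.signature_def_utils.build_signature_def(
            inputs={"inputs": tf.saved_model.utils.build_tensor_info(input_tensor)},
            outputs={"outputs": tf.saved_model.utils.build_tensor_info(output_tensor)},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

    # The default lookup key for a serving signature is
    # tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY.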
/external/autotest/client/site_tests/network_FirewallHolePunch/src/tcpserver/ |
index.html | 17 <p>Serving at <span class="serving-at"></span>
|
/external/tensorflow/tensorflow/contrib/session_bundle/example/ |
BUILD | 1 # Description: TensorFlow Serving session_bundle example.
|
/external/tensorflow/tensorflow/contrib/learn/python/learn/utils/ |
input_fn_utils.py |
    43 This return type is currently only supported for serving input_fn.
    50 `SparseTensor`, specifying labels for training or eval. For serving, set
    54 Typically, this is used by a serving input_fn, which expects to be fed
    60 """Build an input_fn appropriate for serving, expecting fed tf.Examples.
    65 for use at serving time, so the labels return value is always None.
    73 An input_fn suitable for use in serving.
    82 labels = None # these are not known in serving!
    88 """Build an input_fn appropriate for serving, expecting feature Tensors.
    91 This input_fn is for use at serving time, so the labels return value is always
    100 An input_fn suitable for use in serving
    [all...]
|
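The input_fn_utils.py docstrings above describe serving input functions that are fed serialized tf.Example protos and always return None for labels. A rough, simplified sketch of that shape; the feature spec is invented for illustration, and the real contrib helpers wrap the result in their own return type rather than a plain tuple:

    import tensorflow as tf

    # Hypothetical feature spec; a real model defines its own.
    feature_spec = {"x": tf.FixedLenFeature([1], tf.float32)}

    def serving_input_fn():
        # Serving-time input: serialized tf.Example protos fed via a placeholder.
        serialized = tf.placeholder(tf.string, shape=[None], name="input_example")
        features = tf.parse_example(serialized, feature_spec)
        labels = None  # labels are not known at serving time
        return features, labels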
/prebuilts/tools/linux-x86_64/kythe/studio/ |
serve_studio_kythe.sh |
    17 OUT_SERVING="${OUT}/serving"
    42 # If no serving table exists yet, create it from the graphstore.
    49 # Start the kythe webserver for the serving table.
|
/external/tensorflow/tensorflow/cc/saved_model/ |
signature_constants.h |
    21 /// Key in the signature def map for `default` serving signatures. The default
    33 static constexpr char kClassifyMethodName[] = "tensorflow/serving/classify";
    48 static constexpr char kPredictMethodName[] = "tensorflow/serving/predict";
    60 static constexpr char kRegressMethodName[] = "tensorflow/serving/regress";
|
/external/webrtc/webrtc/tools/loopback_test/ |
README | 8 ./run-server.sh (to start python serving the tests)
|
/developers/build/prebuilts/gradle/RecipeAssistant/Application/src/main/assets/ |
northern-irish-vegetable-soup.json |
    20 serving: [
    21 "Whole boiled potatoes are traditionally placed in the soup at time of serving."
|
guacamole.json | 24 serving: "",
|
/developers/samples/android/wearable/wear/RecipeAssistant/Application/src/main/assets/ |
northern-irish-vegetable-soup.json |
    20 serving: [
    21 "Whole boiled potatoes are traditionally placed in the soup at time of serving."
|
guacamole.json | 24 serving: "",
|
/external/tensorflow/tensorflow/contrib/session_bundle/ |
test_util.h |
    26 namespace serving { namespace in namespace:tensorflow
    35 } // namespace serving
|
manifest.proto |
    3 package tensorflow.serving;
    32 // are deployed for serving.
    66 // serving, analysis or debugging time. The recommended name for this signature
|
test_util.cc |
    23 namespace serving { namespace in namespace:tensorflow
    34 } // namespace serving
|
/external/autotest/server/site_tests/network_WiFi_RateControl/ |
control | 14 This test associates a DUT with several APs serving an open HT40 network. The
|
/prebuilts/tools/linux-x86_64/kythe/ |
README.md |
    33 - write_tables :: Processes a GraphStore into efficient serving tables for http_server
    81 # Process the GraphStore into serving tables
    82 SERVING=/tmp/kythe_serving
    83 rm -rf "$SERVING"
    84 /opt/kythe/tools/write_tables --graphstore $GRAPHSTORE --out "$SERVING"
    89 /opt/kythe/tools/http_server --serving_table "$SERVING" \
    92 # The sample web UI will be serving at http://localhost:9898
|
/external/tensorflow/tensorflow/docs_src/programmers_guide/ |
saved_model.md |
    238 serving or training), and optionally with hardware-specific aspects (for
    258 builder.add_meta_graph([tag_constants.SERVING])
    306 ### Loading and Serving a SavedModel in TensorFlow Serving
    308 You can easily load and serve a SavedModel with the TensorFlow Serving Model
    309 Server binary. See [instructions](https://www.tensorflow.org/serving/setup#installing_using_apt-get)
    364 To prepare a trained Estimator for serving, you must export it in the standard
    368 [APIs](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/prediction_service.proto)
    374 ### Preparing serving inputs
    377 and prepares it for use by the model. At serving time, similarly,
    [all...]
|
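The saved_model.md lines above quote builder.add_meta_graph([tag_constants.SERVING]) and discuss serving the export with TensorFlow Serving. A minimal sketch of the surrounding TF 1.x builder pattern; the export path and the trivial variable are placeholders, not from the source:

    import tensorflow as tf

    export_dir = "/tmp/export/1"  # hypothetical versioned export directory
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

    with tf.Session() as sess:
        v = tf.get_variable("v", shape=[1])  # stand-in for a real model's variables
        sess.run(tf.global_variables_initializer())
        # The first MetaGraphDef is added together with the variables; here it
        # carries the serving tag so TensorFlow Serving can load the result.
        builder.add_meta_graph_and_variables(
            sess, [tf.saved_model.tag_constants.SERVING])
    # Further tag sets can be added with builder.add_meta_graph([...]) before saving.
    builder.save()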
/developers/samples/android/content/WidgetData/Application/src/main/ |
AndroidManifest.xml |
    40 <!-- The service serving the RemoteViews to the collection widget -->
    45 <!-- The content provider serving the (fake) weather data -->
|
/development/samples/WeatherListWidget/ |
AndroidManifest.xml |
    38 <!-- The service serving the RemoteViews to the collection widget -->
    43 <!-- The content provider serving the (fake) weather data -->
|