# Using TensorRT in TensorFlow (TF-TRT)

This module provides the necessary bindings and introduces the `TRTEngineOp`
operator, which wraps a subgraph in TensorRT. This module is under active
development.

## Installing TF-TRT

TensorFlow nightly builds currently include TF-TRT by default, which means you
do not need to install TF-TRT separately. You can pull the latest TF containers
from Docker Hub or install the latest TF pip package to get access to the
latest TF-TRT.

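For example (image tags and package names current as of this writing; adjust them to your setup):

```shell
# Pull a GPU-enabled TensorFlow container, which includes TF-TRT:
docker pull tensorflow/tensorflow:latest-gpu

# Or install the nightly GPU pip package:
pip install tf-nightly-gpu
```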
If you want to use TF-TRT on the NVIDIA Jetson platform, you can find download
links for the relevant TensorFlow pip packages here:
https://docs.nvidia.com/deeplearning/dgx/index.html#installing-frameworks-for-jetson

## Installing TensorRT

In order to make use of TF-TRT, you will need a local installation of TensorRT.
Installation instructions for compatibility with TensorFlow are provided in the
[TensorFlow GPU support](https://www.tensorflow.org/install/gpu) guide.

## Examples

You can find example scripts for running inference on deep learning models in
this repository: https://github.com/tensorflow/tensorrt

We have used these examples to verify the accuracy and performance of TF-TRT.
For more information, see
[Verified Models](https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html#verified-models).

## Documentation

The [TF-TRT documentation](https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html)
gives an overview of the supported functionality, provides tutorials and
verified models, and explains best practices along with troubleshooting guides.

## Tests

TF-TRT includes both Python tests and C++ unit tests. Most of the Python tests
are located in the `test` directory and can be executed using `bazel test` or
directly with the Python command. Most of the C++ unit tests exercise the
conversion functions that convert each TF op to a number of TensorRT layers.

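For example, from a TensorFlow source checkout, the converter tests in this directory can be run either way (the Bazel target name below is assumed to match this directory's BUILD file):

```shell
# Run the Python conversion tests via Bazel:
bazel test //tensorflow/python/compiler/tensorrt:trt_convert_test

# Or invoke a test file directly with Python:
python tensorflow/python/compiler/tensorrt/trt_convert_test.py
```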
## Compilation

In order to compile the module, you need a local TensorRT installation
(`libnvinfer.so` and the respective include files). During the configuration
step, TensorRT should be enabled and its installation path should be set. If
TensorRT was installed through a package manager (deb, rpm), the configure
script should find the necessary components on the system automatically. If it
was installed from a tar package, you have to set the path to the location
where the library is installed during configuration.

```shell
bazel build --config=cuda --config=opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/
```