<html devsite>
  <head>
    <title>SurfaceTexture</title>
    <meta name="project_path" value="/_project.yaml" />
    <meta name="book_path" value="/_book.yaml" />
  </head>
  <body>
  <!--
      Copyright 2017 The Android Open Source Project

      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

          http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License.
  -->

<p>The SurfaceTexture class was introduced in Android 3.0. Just as SurfaceView
is the combination of a Surface and a View, SurfaceTexture is a rough
combination of a Surface and a GLES texture (with a few caveats).</p>

<p>When you create a SurfaceTexture, you are creating a BufferQueue for which
your app is the consumer. When a new buffer is queued by the producer, your app
is notified via callback (<code>onFrameAvailable()</code>). Your app calls
<code>updateTexImage()</code>, which releases the previously-held buffer,
acquires the new buffer from the queue, and makes some EGL calls to make the
buffer available to GLES as an external texture.</p>
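
<p>As a rough sketch (assuming an EGL context is current on the rendering
thread; the class and field names are this example's own choosing), the
consumer side looks something like this:</p>

<pre class="prettyprint">
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

public class FrameConsumer implements SurfaceTexture.OnFrameAvailableListener {
    private SurfaceTexture mSurfaceTexture;
    private int mTextureId;

    public void setUp() {
        // Generate the texture name that SurfaceTexture will populate.
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        mTextureId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureId);

        // Creating the SurfaceTexture creates the BufferQueue; this object
        // is the consumer side.
        mSurfaceTexture = new SurfaceTexture(mTextureId);
        mSurfaceTexture.setOnFrameAvailableListener(this);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // The producer queued a new buffer. Signal the thread that owns the
        // EGL context; updateTexImage() must not be called from an
        // arbitrary thread.
    }

    public void drawFrame() {
        // On the GLES thread: release the previously-held buffer and latch
        // the new one as an external texture.
        mSurfaceTexture.updateTexImage();
        // ...render using the external texture...
    }
}
</pre>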

<h2 id=ext_texture>External textures</h2>
<p>External textures (<code>GL_TEXTURE_EXTERNAL_OES</code>) are not quite the
same as textures created by GLES (<code>GL_TEXTURE_2D</code>): You have to
configure your renderer a bit differently, and there are things you can't do
with them. The key point is that you can render textured polygons directly
from the data received by your BufferQueue. gralloc supports a wide variety of
formats, so we need to guarantee the format of the data in the buffer is
something GLES can recognize. To do so, when SurfaceTexture creates the
BufferQueue, it sets the consumer usage flags to
<code>GRALLOC_USAGE_HW_TEXTURE</code>, ensuring that any buffer created by
gralloc will be usable by GLES.</p>
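
<p>For example, a fragment shader that samples the external texture must
declare the <code>GL_OES_EGL_image_external</code> extension and use a
<code>samplerExternalOES</code> sampler instead of <code>sampler2D</code>.
A minimal sketch (variable names are this example's own):</p>

<pre class="prettyprint">
// Fragment shader for sampling an external texture. The #extension line and
// the samplerExternalOES type are what distinguish this from a GL_TEXTURE_2D
// shader; a plain sampler2D cannot sample buffers latched by SurfaceTexture.
private static final String FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
        "}\n";
</pre>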

<p>Because SurfaceTexture interacts with an EGL context, you must be careful to
call its methods from the correct thread (as detailed in the class
documentation).</p>

<h2 id=time_transforms>Timestamps and transformations</h2>
<p>If you look deeper into the class documentation, you will see a couple of odd
calls. One call retrieves a timestamp, the other a transformation matrix, the
value of each having been set by the previous call to
<code>updateTexImage()</code>. It turns out that BufferQueue passes more than
just a buffer handle to the consumer. Each buffer is accompanied by a timestamp
and transformation parameters.</p>

<p>The transformation is provided for efficiency. In some cases, the source data
might be in the incorrect orientation for the consumer; but instead of rotating
the data before sending it, we can send the data in its current orientation with
a transform that corrects it. The transformation matrix can be merged with other
transformations at the point the data is used, minimizing overhead.</p>
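
<p>A sketch of retrieving and applying the matrix after
<code>updateTexImage()</code>; the program handle and the
<code>uSTMatrix</code> uniform name are this example's own assumptions:</p>

<pre class="prettyprint">
import android.opengl.GLES20;

// Fetch the transform for the buffer latched by the last updateTexImage()
// call, and hand it to a shader that multiplies it into the texture
// coordinates. No pixels are touched; the correction happens at sample time.
float[] stMatrix = new float[16];
mSurfaceTexture.getTransformMatrix(stMatrix);
int stMatrixLoc = GLES20.glGetUniformLocation(mProgram, "uSTMatrix");
GLES20.glUniformMatrix4fv(stMatrixLoc, 1, false, stMatrix, 0);
</pre>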

<p>The timestamp is useful for certain buffer sources. For example, suppose you
connect the producer interface to the output of the camera (with
<code>setPreviewTexture()</code>). To create a video, you need to set the
presentation timestamp for each frame; but you want to base that on the time
when the frame was captured, not the time when the buffer was received by your
app. The timestamp provided with the buffer is set by the camera code, resulting
in a more consistent series of timestamps.</p>
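
<p>For example, when frames are being rendered into a video encoder's input
surface (as in the case study below), the capture-time timestamp can be
forwarded with <code>EGLExt.eglPresentationTimeANDROID()</code>. The
<code>mEglDisplay</code> and <code>mEncoderSurface</code> fields are
assumptions of this sketch:</p>

<pre class="prettyprint">
import android.opengl.EGL14;
import android.opengl.EGLExt;

// Stamp the frame with the camera's capture time (in nanoseconds), not the
// time the app got around to rendering it, then submit it to the encoder.
long timestampNs = mSurfaceTexture.getTimestamp();
EGLExt.eglPresentationTimeANDROID(mEglDisplay, mEncoderSurface, timestampNs);
EGL14.eglSwapBuffers(mEglDisplay, mEncoderSurface);
</pre>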

<h2 id=surfacet>SurfaceTexture and Surface</h2>

<p>If you look closely at the API you'll see the only way for an application
to create a plain Surface is through a constructor that takes a SurfaceTexture
as the sole argument. (Prior to API 11, there was no public constructor for
Surface at all.) This might seem a bit backward if you view SurfaceTexture as a
combination of a Surface and a texture.</p>

<p>Under the hood, SurfaceTexture is called GLConsumer, which more accurately
reflects its role as the owner and consumer of a BufferQueue. When you create a
Surface from a SurfaceTexture, what you're doing is creating an object that
represents the producer side of the SurfaceTexture's BufferQueue.</p>
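
<p>In code, the producer endpoint is trivial to create
(<code>textureId</code> is assumed to be a GLES texture name generated
earlier):</p>

<pre class="prettyprint">
import android.graphics.SurfaceTexture;
import android.view.Surface;

// The SurfaceTexture owns the consumer side of the BufferQueue; the Surface
// constructed from it is the producer endpoint. Anything that can render
// into a Surface (Canvas, GLES, a camera, a video decoder) can now feed
// frames to the GLES consumer.
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
Surface producerSurface = new Surface(surfaceTexture);
</pre>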

<h2 id=continuous_capture>Case Study: Grafika's continuous capture</h2>

<p>The camera can provide a stream of frames suitable for recording as a movie.
To display it on screen, you create a SurfaceView, pass the Surface to
<code>setPreviewDisplay()</code>, and let the producer (camera) and consumer
(SurfaceFlinger) do all the work. To record the video, you create a Surface with
MediaCodec's <code>createInputSurface()</code>, pass that to the camera, and
again sit back and relax. To show and record the video at the same time, you
have to get more involved.</p>
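
<p>A sketch of setting up the record-only path with Surface input (available
since API 18); the sizes and rates are placeholder values, and
<code>mEncoder</code> is a field of this sketch's class:</p>

<pre class="prettyprint">
import java.io.IOException;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Configure an H.264 encoder that takes its input from a Surface rather
// than from byte buffers. Frames rendered into the returned Surface are
// fed straight to the encoder.
Surface prepareEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    mEncoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = mEncoder.createInputSurface();
    mEncoder.start();
    return inputSurface;
}
</pre>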

<p>The <em>continuous capture</em> activity displays video from the camera as
the video is being recorded. In this case, encoded video is written to a
circular buffer in memory that can be saved to disk at any time. It's
straightforward to implement so long as you keep track of where everything is.
</p>

<p>This flow involves three BufferQueues: one created by the app, one created by
SurfaceFlinger, and one created by mediaserver:</p>
<ul>
<li><strong>Application</strong>. The app uses a SurfaceTexture to receive
frames from Camera, converting them to an external GLES texture.</li>
<li><strong>SurfaceFlinger</strong>. The app declares a SurfaceView, which we
use to display the frames.</li>
<li><strong>MediaServer</strong>. You configure a MediaCodec encoder with an
input Surface to create the video.</li>
</ul>

<img src="images/continuous_capture_activity.png" alt="Grafika continuous
capture activity" />

<p class="img-caption"><strong>Figure 1.</strong> Grafika's continuous capture
activity. Arrows indicate data propagation from the camera, and BufferQueues
are shown in color (producers are teal, consumers are green).</p>

<p>Encoded H.264 video goes to a circular buffer in RAM in the app process, and
is written to an MP4 file on disk using the MediaMuxer class when the capture
button is pressed.</p>
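
<p>A sketch of the save path, modeled loosely on Grafika's
<code>CircularEncoderBuffer</code> (the method names here are approximations,
not a guaranteed API):</p>

<pre class="prettyprint">
import java.io.IOException;
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;

// Walk the circular buffer from its oldest frame to its newest, feeding each
// encoded chunk and its metadata to a MediaMuxer that wraps it in MP4.
void saveVideo(String outputPath, MediaFormat encodedFormat,
               CircularEncoderBuffer circBuffer) throws IOException {
    MediaMuxer muxer = new MediaMuxer(outputPath,
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    int track = muxer.addTrack(encodedFormat);
    muxer.start();

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = circBuffer.getFirstIndex();
    while (index >= 0) {  // assumed to return -1 past the newest frame
        ByteBuffer buf = circBuffer.getChunk(index, info);
        muxer.writeSampleData(track, buf, info);
        index = circBuffer.getNextIndex(index);
    }
    muxer.stop();
    muxer.release();
}
</pre>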

<p>All three of the BufferQueues are handled with a single EGL context in the
app, and the GLES operations are performed on the UI thread. Doing the
SurfaceView rendering on the UI thread is generally discouraged, but since we're
doing simple operations that are handled asynchronously by the GLES driver we
should be fine. (If the video encoder locks up and we block trying to dequeue a
buffer, the app will become unresponsive. But at that point, we're probably
failing anyway.) The handling of the encoded data (managing the circular buffer
and writing it to disk) is performed on a separate thread.</p>
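
<p>A sketch of the encoder thread's drain loop;
<code>CircularEncoderBuffer</code> again stands in for the circular buffer
described above:</p>

<pre class="prettyprint">
import java.nio.ByteBuffer;

import android.media.MediaCodec;

// Pull whatever encoded output is ready out of MediaCodec and append it to
// the circular buffer. The short timeout keeps the thread responsive.
void drainEncoder(MediaCodec encoder, CircularEncoderBuffer circBuffer) {
    final int TIMEOUT_USEC = 10000;
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (true) {
        int index = encoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break;  // no output available yet
        } else if (index >= 0) {
            ByteBuffer encodedData = encoder.getOutputBuffer(index);
            circBuffer.add(encodedData, info.flags, info.presentationTimeUs);
            encoder.releaseOutputBuffer(index, false);
        }
        // INFO_OUTPUT_FORMAT_CHANGED handling omitted for brevity.
    }
}
</pre>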

<p>The bulk of the configuration happens in the SurfaceView's <code>surfaceCreated()</code>
callback. The EGLContext is created, and EGLSurfaces are created for the
display and for the video encoder. When a new frame arrives, we tell
SurfaceTexture to acquire it and make it available as a GLES texture, then
render it with GLES commands on each EGLSurface (forwarding the transform and
timestamp from SurfaceTexture). The encoder thread pulls the encoded output
from MediaCodec and stashes it in memory.</p>
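
<p>The per-frame flow, sketched with field names of this example's own
(<code>mDisplaySurface</code> wraps the SurfaceView's surface,
<code>mEncoderSurface</code> wraps the MediaCodec input surface, and
<code>drawFrame()</code> issues the GLES draw calls):</p>

<pre class="prettyprint">
import android.opengl.EGL14;
import android.opengl.EGLExt;

// Latch the camera frame once, then render it twice: once to the screen and
// once to the video encoder, forwarding the transform and timestamp.
mSurfaceTexture.updateTexImage();
mSurfaceTexture.getTransformMatrix(mTmpMatrix);

// Draw to the display.
EGL14.eglMakeCurrent(mEglDisplay, mDisplaySurface, mDisplaySurface, mEglContext);
drawFrame(mTmpMatrix);
EGL14.eglSwapBuffers(mEglDisplay, mDisplaySurface);

// Draw the same frame to the encoder's input surface.
EGL14.eglMakeCurrent(mEglDisplay, mEncoderSurface, mEncoderSurface, mEglContext);
drawFrame(mTmpMatrix);
EGLExt.eglPresentationTimeANDROID(mEglDisplay, mEncoderSurface,
        mSurfaceTexture.getTimestamp());
EGL14.eglSwapBuffers(mEglDisplay, mEncoderSurface);
</pre>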

<h2 id=st_vid_play>Secure texture video playback</h2>
<p>Android 7.0 supports GPU post-processing of protected video content. This
allows using the GPU for complex non-linear video effects (such as warps),
mapping protected video content onto textures for use in general graphics scenes
(e.g., using OpenGL ES), and virtual reality (VR).</p>

<img src="images/graphics_secure_texture_playback.png" alt="Secure Texture Video Playback" />
<p class="img-caption"><strong>Figure 2.</strong> Secure texture video playback</p>

<p>Support is enabled using the following two extensions (a usage sketch
follows the list):</p>
<ul>
<li><strong>EGL extension</strong>
(<a href="https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt"><code>EGL_EXT_protected_content</code></a>).
Allows the creation of protected GL contexts and surfaces, which can both
operate on protected content.</li>
<li><strong>GLES extension</strong>
(<a href="https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt"><code>GL_EXT_protected_textures</code></a>).
Allows tagging textures as protected so they can be used as framebuffer texture
attachments.</li>
</ul>
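
<p>A minimal sketch of requesting a protected context, assuming the driver
reports <code>EGL_EXT_protected_content</code>; the attribute constant value
(0x32C0) comes from the extension specification:</p>

<pre class="prettyprint">
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;

// EGL_PROTECTED_CONTENT_EXT, as defined by EGL_EXT_protected_content.
private static final int EGL_PROTECTED_CONTENT_EXT = 0x32C0;

// Returns a GLES 2.0 context that may operate on protected buffers, or null
// if the extension is unavailable on this device.
EGLContext createProtectedContext(EGLDisplay display, EGLConfig config) {
    String extensions = EGL14.eglQueryString(display, EGL14.EGL_EXTENSIONS);
    if (extensions == null || !extensions.contains("EGL_EXT_protected_content")) {
        return null;
    }
    int[] attribs = {
            EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
            EGL_PROTECTED_CONTENT_EXT, EGL14.EGL_TRUE,
            EGL14.EGL_NONE
    };
    return EGL14.eglCreateContext(display, config, EGL14.EGL_NO_CONTEXT,
            attribs, 0);
}
</pre>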

<p>Android 7.0 also updates SurfaceTexture and ACodec
(<code>libstagefright.so</code>) to allow protected content to be sent even if
the window surface does not queue to the window composer (i.e., SurfaceFlinger)
and to provide a protected video surface for use within a protected context.
This is done by setting the correct protected consumer bits
(<code>GRALLOC_USAGE_PROTECTED</code>) on surfaces created in a protected
context (verified by ACodec).</p>

<p>These changes benefit app developers who can create apps that perform
enhanced video effects or apply video textures using protected content in GL
(for example, in VR), end users who can view high-value video content (such as
movies and TV shows) in a GL environment (for example, in VR), and OEMs who can
achieve higher sales due to added device functionality (for example, watching HD
movies in VR). The new EGL and GLES extensions can be used by system-on-chip
(SoC) providers and other vendors, and are currently implemented on the
Qualcomm MSM8994 SoC chipset used in the Nexus 6P.</p>

<p>Secure texture video playback sets the foundation for strong DRM
implementation in the OpenGL ES environment. Without a strong DRM implementation
such as Widevine Level 1, many content providers would not allow rendering of
their high-value content in the OpenGL ES environment, preventing important VR
use cases such as watching DRM protected content in VR.</p>

<p>AOSP includes framework code for secure texture video playback; driver
support is up to the vendor. Device implementers must implement the
<code>EGL_EXT_protected_content</code> and
<code>GL_EXT_protected_textures</code> extensions. When using your own codec
library (to replace libstagefright), note the changes in
<code>/frameworks/av/media/libstagefright/SurfaceUtils.cpp</code> that allow
buffers marked with <code>GRALLOC_USAGE_PROTECTED</code> to be sent to
ANativeWindows (even if the ANativeWindow does not queue directly to the window
composer) as long as the consumer usage bits contain
<code>GRALLOC_USAGE_PROTECTED</code>. For detailed documentation on implementing
the extensions, refer to the Khronos Registry
(<a href="https://www.khronos.org/registry/egl/extensions/EXT/EGL_EXT_protected_content.txt">EGL_EXT_protected_content</a>,
<a href="https://www.khronos.org/registry/gles/extensions/EXT/EXT_protected_textures.txt">GL_EXT_protected_textures</a>).</p>

<p>Device implementers may also need to make hardware changes to ensure that
protected memory mapped onto the GPU remains protected and unreadable by
unprotected code.</p>

  </body>
</html>