<html devsite>
<head>
  <title>TextureView</title>
  <meta name="project_path" value="/_project.yaml" />
  <meta name="book_path" value="/_book.yaml" />
</head>
<body>
<!--
Copyright 2017 The Android Open Source Project

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

<p>The TextureView class, introduced in Android 4.0, is the most complex of
the View objects discussed here, combining a View with a SurfaceTexture.</p>

<h2 id=render_gles>Rendering with GLES</h2>
<p>Recall that the SurfaceTexture is a "GL consumer", consuming buffers of graphics
data and making them available as textures. TextureView wraps a SurfaceTexture,
taking over the responsibility of responding to the callbacks and acquiring new
buffers. The arrival of new buffers causes TextureView to issue a View
invalidate request. When asked to draw, the TextureView uses the contents of
the most recently received buffer as its data source, rendering wherever and
however the View state indicates it should.</p>

<p>You can render on a TextureView with GLES just as you would on a SurfaceView:
pass the SurfaceTexture to the EGL window creation call.</p>
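<p>The snippet below is a minimal sketch of that wiring, on the assumption that
EGL has already been initialized on the rendering thread; <code>eglDisplay</code>,
<code>eglConfig</code>, and <code>eglContext</code> stand in for that setup and
are not part of the TextureView API.</p>

<pre class="prettyprint">
// Sketch: connect a GLES renderer to a TextureView. Assumes eglDisplay,
// eglConfig, and eglContext were created elsewhere on this thread
// (e.g. via EGL14.eglInitialize() and EGL14.eglChooseConfig()).
SurfaceTexture st = textureView.getSurfaceTexture();  // null until the view is ready
int[] surfaceAttribs = { EGL14.EGL_NONE };
EGLSurface eglSurface = EGL14.eglCreateWindowSurface(
        eglDisplay, eglConfig, st, surfaceAttribs, 0);
EGL14.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext);
// ... issue GLES drawing calls for one frame ...
EGL14.eglSwapBuffers(eglDisplay, eglSurface);  // queue the frame to the SurfaceTexture
</pre>

<p>The <code>eglSwapBuffers()</code> call at the end is the buffer swap
discussed below.</p>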
<p>However, doing so exposes a potential problem. In most of what we've looked
at, the BufferQueues have passed buffers between different processes. When
rendering to a TextureView with GLES, both producer and consumer are in the
same process, and they might even be handled on a single thread. Suppose we
submit several buffers in quick succession from the UI thread. The EGL buffer
swap call will need to dequeue a buffer from the BufferQueue, and it will stall
until one is available. There won't be any available until the consumer
acquires one for rendering, but that also happens on the UI thread, so we're
stuck.</p>

<p>The solution is to have BufferQueue ensure there is always a buffer
available to be dequeued, so the buffer swap never stalls. One way to guarantee
this is to have BufferQueue discard the contents of the previously-queued buffer
when a new buffer is queued, and to place restrictions on minimum buffer counts
and maximum acquired buffer counts. (If your queue has three buffers, and all
three buffers are acquired by the consumer, then there's nothing to dequeue and
the buffer swap call must hang or fail. So we need to prevent the consumer from
acquiring more than two buffers at once.) Dropping buffers is usually
undesirable, so it's only enabled in specific situations, such as when the
producer and consumer are in the same process.</p>

<h2 id=surface_or_texture>SurfaceView or TextureView?</h2>
<p>SurfaceView and TextureView fill similar roles, but have very different
implementations. To decide which is best requires an understanding of the
trade-offs.</p>

<p>Because TextureView is a proper citizen of the View hierarchy, it behaves like
any other View, and can overlap or be overlapped by other elements. You can
perform arbitrary transformations and retrieve the contents as a bitmap with
simple API calls.</p>

<p>The main strike against TextureView is the performance of the composition step.
With SurfaceView, the content is written to a separate layer that SurfaceFlinger
composites, ideally with an overlay. With TextureView, the View composition is
always performed with GLES, and updates to its contents may cause other View
elements to redraw as well (e.g. if they're positioned on top of the
TextureView). After the View rendering completes, the app UI layer must then be
composited with other layers by SurfaceFlinger, so you're effectively
compositing every visible pixel twice. For a full-screen video player, or any
other application that is effectively just UI elements layered on top of video,
SurfaceView offers much better performance.</p>

<p>As noted earlier, DRM-protected video can be presented only on an overlay plane.
Video players that support protected content must be implemented with
SurfaceView.</p>

<h2 id=grafika>Case Study: Grafika's Play Video (TextureView)</h2>

<p>Grafika includes a pair of video players, one implemented with TextureView, the
other with SurfaceView. The video decoding portion, which just sends frames
from MediaCodec to a Surface, is the same for both. The most interesting
differences between the implementations are the steps required to present the
correct aspect ratio.</p>

<p>While the SurfaceView player requires a custom implementation of FrameLayout,
resizing the TextureView's content is a simple matter of configuring a
transformation matrix with <code>TextureView#setTransform()</code>. For the
former, you're sending new window position and size values to SurfaceFlinger
through WindowManager; for the latter, you're just rendering it differently.</p>
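<p>As a concrete illustration (a simplified sketch, not Grafika's exact code),
the transform below letterboxes a video inside a TextureView;
<code>videoWidth</code> and <code>videoHeight</code> are assumed to come from
the decoder's MediaFormat.</p>

<pre class="prettyprint">
// Sketch: scale the TextureView's content to the video's aspect ratio.
private void adjustAspectRatio(TextureView textureView, int videoWidth, int videoHeight) {
    float viewWidth = textureView.getWidth();
    float viewHeight = textureView.getHeight();
    // Largest scale at which the video fits in the view without cropping.
    float scale = Math.min(viewWidth / videoWidth, viewHeight / videoHeight);

    Matrix matrix = new Matrix();
    // The content fills the view by default, so scale it down relative to
    // the view's center; the uncovered area becomes the letterbox bars.
    matrix.setScale(videoWidth * scale / viewWidth,
                    videoHeight * scale / viewHeight,
                    viewWidth / 2, viewHeight / 2);
    textureView.setTransform(matrix);
}
</pre>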
<p>Otherwise, both implementations follow the same pattern. Once the Surface has
been created, playback is enabled. When "play" is hit, a video decoding thread
is started, with the Surface as the output target. After that, the app code
doesn't have to do anything -- composition and display will be handled either
by SurfaceFlinger (for the SurfaceView) or by TextureView.</p>

<h2 id=decode>Case Study: Grafika's Double Decode</h2>

<p>This activity demonstrates manipulation of the SurfaceTexture inside a
TextureView.</p>

<p>The basic structure of this activity is a pair of TextureViews that show two
different videos playing side-by-side. To simulate the needs of a
videoconferencing app, we want to keep the MediaCodec decoders alive when the
activity is paused and resumed for an orientation change. The trick is that you
can't change the Surface that a MediaCodec decoder uses without fully
reconfiguring it, which is a fairly expensive operation; so we want to keep the
Surface alive. The Surface is just a handle to the producer interface in the
SurfaceTexture's BufferQueue, and the SurfaceTexture is managed by the
TextureView, so we also need to keep the SurfaceTexture alive. So how do we deal
with the TextureView getting torn down?</p>

<p>It just so happens that TextureView provides a <code>setSurfaceTexture()</code>
call that does exactly what we want. We obtain references to the SurfaceTextures
from the TextureViews and save them in a static field. When the activity is
shut down, we return "false" from the <code>onSurfaceTextureDestroyed()</code>
callback to prevent destruction of the SurfaceTexture. When the activity is
restarted, we stuff the old SurfaceTexture into the new TextureView. The
TextureView class takes care of creating and destroying the EGL contexts.</p>

<p>Each video decoder is driven from a separate thread. At first glance it might
seem like we need EGL contexts local to each thread; but remember the buffers
with decoded output are actually being sent from mediaserver to our
BufferQueue consumers (the SurfaceTextures). The TextureViews take care of the
rendering for us, and they execute on the UI thread.</p>

<p>Implementing this activity with SurfaceView would be a bit harder. We can't
just create a pair of SurfaceViews and direct the output to them, because the
Surfaces would be destroyed during an orientation change. Besides, that would
add two layers, and limitations on the number of available overlays strongly
motivate us to keep the number of layers to a minimum. Instead, we'd want to
create a pair of SurfaceTextures to receive the output from the video decoders,
and then perform the rendering in the app, using GLES to render two textured
quads onto the SurfaceView's Surface.</p>
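<p>A rough sketch of that alternative, assuming EGL is already set up on the
SurfaceView's Surface and that <code>drawTexturedQuad()</code> is a hypothetical
helper that draws with a GL_TEXTURE_EXTERNAL_OES shader:</p>

<pre class="prettyprint">
// Sketch only: two standalone SurfaceTextures feeding one SurfaceView.
// drawTexturedQuad() is a hypothetical helper, not a framework API.
int[] texIds = new int[2];
GLES20.glGenTextures(2, texIds, 0);
SurfaceTexture leftSt = new SurfaceTexture(texIds[0]);
SurfaceTexture rightSt = new SurfaceTexture(texIds[1]);
Surface leftSurface = new Surface(leftSt);    // output Surface for decoder #1
Surface rightSurface = new Surface(rightSt);  // output Surface for decoder #2

// Later, on the thread that owns the EGL context, for each frame:
float[] texMatrix = new float[16];
leftSt.updateTexImage();               // latch the newest decoded frame
leftSt.getTransformMatrix(texMatrix);  // per-buffer texture-coordinate transform
drawTexturedQuad(texIds[0], texMatrix, 0.0f, 0.0f, 0.5f, 1.0f);   // left half
rightSt.updateTexImage();
rightSt.getTransformMatrix(texMatrix);
drawTexturedQuad(texIds[1], texMatrix, 0.5f, 0.0f, 0.5f, 1.0f);   // right half
// eglSwapBuffers() then queues the composed frame to the SurfaceView's Surface.
</pre>

</body>
</html>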