<html devsite>
<head>
  <title>Graphics</title>
  <meta name="project_path" value="/_project.yaml" />
  <meta name="book_path" value="/_book.yaml" />
</head>
<body>
<!--
  Copyright 2017 The Android Open Source Project

  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
-->

<img style="float: right; margin: 0px 15px 15px 15px;"
  src="images/ape_fwk_hal_graphics.png" alt="Android Graphics HAL icon"/>

<p>The Android framework offers a variety of graphics rendering APIs for 2D and
3D that interact with manufacturer implementations of graphics drivers, so it
is important to understand at a high level how those APIs work. This page
introduces the graphics hardware abstraction layer (HAL) upon which those
drivers are built.</p>

<p>Application developers draw images to the screen in two ways: with Canvas or
OpenGL. See
<a href="/devices/graphics/architecture.html">System-level graphics
architecture</a> for a detailed description of Android graphics components.</p>

<p><a href="http://developer.android.com/reference/android/graphics/Canvas.html">android.graphics.Canvas</a>
is a 2D graphics API and is the most popular graphics API among developers.
Canvas operations draw all the stock and custom
<a href="http://developer.android.com/reference/android/view/View.html">android.view.View</a>s
in Android. Hardware acceleration for Canvas APIs is accomplished with a
drawing library called OpenGLRenderer that translates Canvas operations to
OpenGL operations so they can execute on the GPU.</p>

<p>Beginning in Android 4.0, hardware-accelerated Canvas is enabled by default.
Consequently, a hardware GPU that supports OpenGL ES 2.0 is mandatory for
Android 4.0 and later devices. See the
<a href="https://developer.android.com/guide/topics/graphics/hardware-accel.html">Hardware
Acceleration guide</a> for an explanation of how the hardware-accelerated
drawing path works and how its behavior differs from that of the software
drawing path.</p>

<p>In addition to Canvas, the other main way that developers render graphics is
by using OpenGL ES to render directly to a surface. Android provides OpenGL ES
interfaces in the
<a href="http://developer.android.com/reference/android/opengl/package-summary.html">android.opengl</a>
package that developers can use to call into their GL implementations with the
SDK or with native APIs provided in the
<a href="https://developer.android.com/tools/sdk/ndk/index.html">Android
NDK</a>.</p>
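<p>As a minimal SDK-side sketch of this path, the following code uses
GLSurfaceView from the android.opengl package to obtain a GL-backed surface and
clear it every frame. The activity name and clear color are arbitrary, and
error handling is omitted:</p>

<pre class="prettyprint">
import android.app.Activity;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

// Sketch: render to a surface with OpenGL ES 2.0 through the SDK.
public class ClearActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        GLSurfaceView view = new GLSurfaceView(this);
        view.setEGLContextClientVersion(2);  // request an OpenGL ES 2.0 context
        view.setRenderer(new GLSurfaceView.Renderer() {
            @Override
            public void onSurfaceCreated(GL10 gl, EGLConfig config) {
                GLES20.glClearColor(0f, 0f, 0.5f, 1f);
            }

            @Override
            public void onSurfaceChanged(GL10 gl, int width, int height) {
                GLES20.glViewport(0, 0, width, height);
            }

            @Override
            public void onDrawFrame(GL10 gl) {
                // Each frame drawn here lands in a buffer that is queued to
                // the view's surface and later composited by SurfaceFlinger.
                GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            }
        });
        setContentView(view);
    }
}
</pre>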
<p>Android implementers can test OpenGL ES functionality using the
<a href="testing.html">drawElements Quality Program</a>, also known as deqp.</p>

<h2 id="android_graphics_components">Android graphics components</h2>

<p>No matter what rendering API developers use, everything is rendered onto a
"surface." The surface represents the producer side of a buffer queue that is
often consumed by SurfaceFlinger. Every window that is created on the Android
platform is backed by a surface. All of the visible surfaces rendered are
composited onto the display by SurfaceFlinger.</p>

<p>The following diagram shows how the key components work together:</p>

<img src="images/ape_fwk_graphics.png" alt="image-rendering components">

<p class="img-caption"><strong>Figure 1.</strong> How surfaces are rendered</p>

<p>The main components are described below:</p>

<h3 id="image_stream_producers">Image Stream Producers</h3>

<p>An image stream producer can be anything that produces graphic buffers for
consumption. Examples include OpenGL ES, Canvas 2D, and mediaserver video
decoders.</p>

<h3 id="image_stream_consumers">Image Stream Consumers</h3>

<p>The most common consumer of image streams is SurfaceFlinger, the system
service that consumes the currently visible surfaces and composites them onto
the display using information provided by the Window Manager. SurfaceFlinger is
the only service that can modify the content of the display. SurfaceFlinger
uses OpenGL and the Hardware Composer to compose a group of surfaces.</p>

<p>Other OpenGL ES apps can consume image streams as well, such as the camera
app consuming a camera preview image stream. Non-GL applications can be
consumers too, for example the ImageReader class.</p>
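<p>A minimal sketch of a non-GL consumer built on the ImageReader class might
look like the following. The buffer size, pixel format, class name, and
background handler are illustrative assumptions; a real app matches them to its
producer (camera, video decoder, and so on):</p>

<pre class="prettyprint">
import android.graphics.PixelFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

// Sketch of a non-GL image stream consumer built on ImageReader.
public class ImageStreamConsumer {
    private ImageReader reader;  // keep a reference so the reader stays alive

    public Surface createConsumer(Handler backgroundHandler) {
        reader = ImageReader.newInstance(
                1280, 720, PixelFormat.RGBA_8888, /* maxImages= */ 4);

        // Consumer end: invoked whenever the producer queues a new buffer.
        reader.setOnImageAvailableListener(r -> {
            Image image = r.acquireLatestImage();
            if (image != null) {
                // ... inspect or copy the buffer contents here ...
                image.close();  // return the buffer so the producer can reuse it
            }
        }, backgroundHandler);

        // Producer end of the underlying BufferQueue: hand this Surface to
        // whatever generates the image stream.
        return reader.getSurface();
    }
}
</pre>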
<h3 id="window_manager">Window Manager</h3>

<p>The Android system service that controls a window, which is a container for
views. A window is always backed by a surface. This service oversees
lifecycles, input and focus events, screen orientation, transitions,
animations, position, transforms, z-order, and many other aspects of a window.
The Window Manager sends all of the window metadata to SurfaceFlinger so
SurfaceFlinger can use that data to composite surfaces on the display.</p>

<h3 id="hardware_composer">Hardware Composer</h3>

<p>The hardware abstraction for the display subsystem. SurfaceFlinger can
delegate certain composition work to the Hardware Composer to offload work from
OpenGL and the GPU. SurfaceFlinger acts as just another OpenGL ES client. So
when SurfaceFlinger is actively compositing one buffer or two into a third, for
instance, it is using OpenGL ES. Offloading composition to the Hardware
Composer in this way consumes less power than having the GPU conduct all of the
computation.</p>

<p>The <a href="/devices/graphics/architecture.html#hwcomposer">Hardware
Composer HAL</a> conducts the other half of the work and is the central point
for all Android graphics rendering. The Hardware Composer must support events,
one of which is VSYNC (another is hotplug for plug-and-play HDMI support).</p>

<h3 id="gralloc">Gralloc</h3>

<p>The graphics memory allocator (Gralloc) is needed to allocate memory
requested by image producers. For details, see
<a href="/devices/graphics/architecture.html#gralloc_HAL">Gralloc HAL</a>.</p>

<h2 id="data_flow">Data flow</h2>

<p>See the following diagram for a depiction of the Android graphics
pipeline:</p>

<img src="images/graphics_pipeline.png" alt="graphics data flow">

<p class="img-caption"><strong>Figure 2.</strong> Graphic data flow through
Android</p>

<p>The objects on the left are renderers producing graphics buffers, such as
the home screen, status bar, and system UI. SurfaceFlinger is the compositor
and Hardware Composer is the composer.</p>

<h3 id="bufferqueue">BufferQueue</h3>

<p>BufferQueues provide the glue between the Android graphics components. These
are a pair of queues that mediate the constant cycle of buffers from producer
to consumer. Once the producers hand off their buffers, SurfaceFlinger is
responsible for compositing everything onto the display.</p>

<p>See the following diagram for the BufferQueue communication process.</p>

<img src="images/bufferqueue.png"
  alt="BufferQueue communication process">

<p class="img-caption"><strong>Figure 3.</strong> BufferQueue communication
process</p>

<p>BufferQueue contains the logic that ties image stream producers and image
stream consumers together. Some examples of image producers are the camera
previews produced by the camera HAL or OpenGL ES games. Some examples of image
consumers are SurfaceFlinger or another app that displays an OpenGL ES stream,
such as the camera app displaying the camera viewfinder.</p>

<p>BufferQueue is a data structure that combines a buffer pool with a queue and
uses Binder IPC to pass buffers between processes. The producer interface, or
what you pass to somebody who wants to generate graphic buffers, is
IGraphicBufferProducer (part of
<a href="http://developer.android.com/reference/android/graphics/SurfaceTexture.html">SurfaceTexture</a>).
BufferQueue is often used to render to a Surface and consume with a GL
consumer, among other tasks (a brief SDK-level sketch of the producer and
consumer ends appears at the end of this section).</p>

<p>BufferQueue can operate in three different modes:</p>

<p><em>Synchronous-like mode</em> - BufferQueue by default operates in a
synchronous-like mode, in which every buffer that comes in from the producer
goes out to the consumer. No buffer is ever discarded in this mode. If the
producer is too fast and creates buffers faster than they are being drained, it
will block and wait for free buffers.</p>

<p><em>Non-blocking mode</em> - BufferQueue can also operate in a non-blocking
mode where it generates an error rather than waiting for a buffer in those
cases. No buffer is ever discarded in this mode either. This is useful for
avoiding potential deadlocks in application software that may not understand
the complex dependencies of the graphics framework.</p>

<p><em>Discard mode</em> - Finally, BufferQueue may be configured to discard
old buffers rather than generate errors or wait. For instance, if conducting GL
rendering to a texture view and drawing as quickly as possible, buffers must be
dropped.</p>

<p>To conduct most of this work, SurfaceFlinger acts as just another OpenGL ES
client. So when SurfaceFlinger is actively compositing one buffer or two into a
third, for instance, it is using OpenGL ES.</p>

<p>The Hardware Composer HAL conducts the other half of the work. This HAL acts
as the central point for all Android graphics rendering.</p>
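<p>The following SDK-level sketch shows both ends of one BufferQueue: a
SurfaceTexture acts as the GL consumer, and a Surface wrapping it is the
producer. The class name, the texName parameter (assumed to be a valid
GL_TEXTURE_EXTERNAL_OES texture created on the consumer's GL context), and the
single Canvas draw are illustrative assumptions:</p>

<pre class="prettyprint">
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.SurfaceTexture;
import android.view.Surface;
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch of the two ends of a BufferQueue as exposed through the SDK.
public class BufferQueueSketch {
    private final AtomicBoolean frameAvailable = new AtomicBoolean(false);

    public Surface createProducer(int texName) {
        // Consumer end: SurfaceTexture owns the consumer side of a BufferQueue.
        SurfaceTexture consumer = new SurfaceTexture(texName);
        consumer.setOnFrameAvailableListener(st -> frameAvailable.set(true));

        // On the GL thread's render loop, latch the newest queued buffer into
        // the external texture when one is pending:
        //   if (frameAvailable.getAndSet(false)) { consumer.updateTexImage(); }

        // Producer end: a Surface wrapping the SurfaceTexture. Anything that
        // draws into a Surface (Canvas, OpenGL ES, a video decoder) queues
        // buffers to the consumer above.
        return new Surface(consumer);
    }

    public void drawOnce(Surface producer) {
        Canvas canvas = producer.lockCanvas(null);   // dequeue a buffer
        canvas.drawColor(Color.BLUE);
        producer.unlockCanvasAndPost(canvas);        // queue it to the consumer
    }
}
</pre>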
<h3 id="synchronization_framework">Synchronization framework</h3>

<p>Since Android graphics offer no explicit parallelism, vendors have long
implemented their own implicit synchronization within their own drivers. This
is no longer required with the Android graphics synchronization framework. See
the
<a href="/devices/graphics/implement-vsync.html#explicit_synchronization">Explicit
synchronization</a> section for implementation instructions.</p>

<p>The synchronization framework explicitly describes dependencies between
different asynchronous operations in the system. The framework provides a
simple API that lets components signal when buffers are released. It also
allows synchronization primitives to be passed between drivers from the kernel
to userspace and between userspace processes themselves.</p>

<p>For example, an application may queue up work to be carried out in the GPU.
The GPU then starts drawing that image. Although the image hasn't been drawn
into memory yet, the buffer pointer can still be passed to the window
compositor along with a fence that indicates when the GPU work will be
finished. The window compositor may then start processing ahead of time and
hand off the work to the display controller. In this manner, the CPU work can
be done ahead of time. Once the GPU finishes, the display controller can
immediately display the image.</p>

<p>The synchronization framework also allows implementers to leverage
synchronization resources in their own hardware components. Finally, the
framework provides visibility into the graphics pipeline to aid in
debugging.</p>

</body>
</html>