<html devsite>
<head>
  <title>SurfaceFlinger and Hardware Composer</title>
  <meta name="project_path" value="/_project.yaml" />
  <meta name="book_path" value="/_book.yaml" />
</head>
<body>
<!--
    Copyright 2017 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->

<p>Having buffers of graphical data is wonderful, but life is even better when
you get to see them on your device's screen. That's where SurfaceFlinger and the
Hardware Composer HAL come in.</p>

<h2 id=surfaceflinger>SurfaceFlinger</h2>

<p>SurfaceFlinger's role is to accept buffers of data from multiple sources,
composite them, and send them to the display. Once upon a time this was done
with software blitting to a hardware framebuffer (e.g.
<code>/dev/graphics/fb0</code>), but those days are long gone.</p>

<p>When an app comes to the foreground, the WindowManager service asks
SurfaceFlinger for a drawing surface. SurfaceFlinger creates a layer (the
primary component of which is a BufferQueue) for which SurfaceFlinger acts as
the consumer. A Binder object for the producer side is passed through the
WindowManager to the app, which can then start sending frames directly to
SurfaceFlinger.</p>

<p class="note"><strong>Note:</strong> While this section uses SurfaceFlinger
terminology, WindowManager uses the term <em>window</em> instead of
<em>layer</em>…and uses layer to mean something else.
(It can be argued
that SurfaceFlinger should really be called LayerFlinger.)</p>

<p>Most applications have three layers on screen at any time: the status bar at
the top of the screen, the navigation bar at the bottom or side, and the
application UI. Some apps have more, some fewer (e.g. the default home app has a
separate layer for the wallpaper, while a full-screen game might hide the status
bar). Each layer can be updated independently. The status and navigation bars
are rendered by a system process, while the app layers are rendered by the app,
with no coordination between the two.</p>

<p>Device displays refresh at a certain rate, typically 60 frames per second on
phones and tablets. If the display contents are updated mid-refresh, tearing
will be visible, so it's important to update the contents only between cycles.
The system receives a signal from the display when it's safe to update the
contents. For historical reasons we'll call this the VSYNC signal.</p>

<p>The refresh rate may vary over time, e.g. some mobile devices will range from
58fps to 62fps depending on current conditions. For an HDMI-attached television,
this could theoretically dip to 24Hz or 48Hz to match a video. Because we can
update the screen only once per refresh cycle, submitting buffers for display at
200fps would be a waste of effort, as most of the frames would never be seen.
Instead of taking action whenever an app submits a buffer, SurfaceFlinger wakes
up when the display is ready for something new.</p>

<p>When the VSYNC signal arrives, SurfaceFlinger walks through its list of
layers looking for new buffers. If it finds a new one, it acquires it; if not,
it continues to use the previously-acquired buffer. SurfaceFlinger always wants
to have something to display, so it will hang on to one buffer.
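</p>

<p>This buffer-latching behavior can be sketched in plain Java. The model below
is a schematic for illustration only, with invented names rather than real AOSP
classes; it shows a layer acquiring the next pending buffer at each VSYNC, or
reusing the previously-acquired one when nothing new has arrived:</p>

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of SurfaceFlinger's per-layer behavior at VSYNC (names are
// invented; this is not AOSP code). At each VSYNC the layer acquires the
// next pending buffer if one was queued; otherwise it keeps displaying
// the previously-acquired buffer.
class Layer {
    private final Deque<String> queued = new ArrayDeque<>(); // producer side
    private String acquired;                                 // consumer side

    void queueBuffer(String buffer) {
        queued.addLast(buffer);
    }

    // Called once per VSYNC; returns the buffer to composite, or null if
    // no buffer has ever been submitted (the layer is then ignored).
    String onVsync() {
        if (!queued.isEmpty()) {
            acquired = queued.pollFirst(); // acquire the new buffer
        }
        return acquired; // otherwise reuse the previous one
    }
}
```

<p>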
If no buffers
have ever been submitted on a layer, the layer is ignored.</p>

<p>After SurfaceFlinger has collected all buffers for visible layers, it asks
the Hardware Composer how composition should be performed.</p>

<h2 id=hwc>Hardware Composer</h2>

<p>The Hardware Composer HAL (HWC) was introduced in Android 3.0 and has evolved
steadily over the years. Its primary purpose is to determine the most efficient
way to composite buffers with the available hardware. As a HAL, its
implementation is device-specific and usually done by the display hardware
OEM.</p>

<p>The value of this approach is easy to recognize when you consider <em>overlay
planes</em>, the purpose of which is to composite multiple buffers together in
the display hardware rather than the GPU. For example, consider a typical
Android phone in portrait orientation, with the status bar on top, navigation
bar at the bottom, and app content everywhere else. The contents for each layer
are in separate buffers. You could handle composition using either of the
following methods:</p>

<ul>
<li>Rendering the app content into a scratch buffer, then rendering the status
bar over it, the navigation bar on top of that, and finally passing the scratch
buffer to the display hardware.</li>
<li>Passing all three buffers to the display hardware and telling it to read
data from different buffers for different parts of the screen.</li>
</ul>

<p>The latter approach can be significantly more efficient.</p>

<p>Display processor capabilities vary significantly. The number of overlays,
whether layers can be rotated or blended, and restrictions on positioning and
overlap can be difficult to express through an API.
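</p>

<p>To make this concrete, here is a deliberately simplified sketch, in plain
Java with invented names, of a capability description for a display processor
and the check a compositor might perform when deciding whether a layer can go
on an overlay plane (layers that don't fit fall back to GLES composition):</p>

```java
// Simplified, hypothetical model of display-processor capabilities; real
// hardware has many more constraints, and none of these names are real APIs.
class DisplayCaps {
    final int overlayPlanes;   // how many buffers the hardware can scan out
    final boolean canBlend;    // can overlay contents be alpha-blended?
    final boolean canRotate;   // can overlays be rotated during scanout?

    DisplayCaps(int overlayPlanes, boolean canBlend, boolean canRotate) {
        this.overlayPlanes = overlayPlanes;
        this.canBlend = canBlend;
        this.canRotate = canRotate;
    }
}

class LayerRequest {
    final boolean needsBlending;
    final boolean needsRotation;

    LayerRequest(boolean needsBlending, boolean needsRotation) {
        this.needsBlending = needsBlending;
        this.needsRotation = needsRotation;
    }
}

class OverlayAssigner {
    // Counts how many layers can be placed on overlay planes; the rest
    // would be composited with GLES instead.
    static int assign(DisplayCaps caps, LayerRequest[] layers) {
        int used = 0;
        for (LayerRequest layer : layers) {
            boolean supported = (!layer.needsBlending || caps.canBlend)
                             && (!layer.needsRotation || caps.canRotate);
            if (supported && used < caps.overlayPlanes) {
                used++;
            }
        }
        return used;
    }
}
```

<p>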
The HWC attempts to 109 accommodate such diversity through a series of decisions:</p> 110 111 <ol> 112 <li>SurfaceFlinger provides HWC with a full list of layers and asks, "How do 113 you want to handle this?"</li> 114 <li>HWC responds by marking each layer as overlay or GLES composition.</li> 115 <li>SurfaceFlinger takes care of any GLES composition, passing the output buffer 116 to HWC, and lets HWC handle the rest.</li> 117 </ol> 118 119 <p>Since hardware vendors can custom tailor decision-making code, it's possible 120 to get the best performance out of every device.</p> 121 122 <p>Overlay planes may be less efficient than GL composition when nothing on the 123 screen is changing. This is particularly true when overlay contents have 124 transparent pixels and overlapping layers are blended together. In such cases, 125 the HWC can choose to request GLES composition for some or all layers and retain 126 the composited buffer. If SurfaceFlinger comes back asking to composite the same 127 set of buffers, the HWC can continue to show the previously-composited scratch 128 buffer. This can improve the battery life of an idle device.</p> 129 130 <p>Devices running Android 4.4 and later typically support four overlay planes. 131 Attempting to composite more layers than overlays causes the system to use GLES 132 composition for some of them, meaning the number of layers used by an app can 133 have a measurable impact on power consumption and performance.</p> 134 135 <h2 id=virtual-displays>Virtual displays</h2> 136 137 <p>SurfaceFlinger supports a primary display (i.e. what's built into your phone 138 or tablet), an external display (such as a television connected through HDMI), 139 and one or more virtual displays that make composited output available within 140 the system. 
Virtual displays can be used to record the screen or send it over a
network.</p>

<p>Virtual displays may share the same set of layers as the main display
(the layer stack) or have their own set. There is no VSYNC for a virtual
display, so the VSYNC for the primary display is used to trigger composition for
all displays.</p>

<p>In older versions of Android, virtual displays were always composited with
GLES and the Hardware Composer managed composition for the primary display only.
In Android 4.4, the Hardware Composer gained the ability to participate in
virtual display composition.</p>

<p>As you might expect, frames generated for a virtual display are written to a
BufferQueue.</p>

<h2 id=screenrecord>Case Study: screenrecord</h2>

<p>The <a href="https://android.googlesource.com/platform/frameworks/av/+/marshmallow-release/cmds/screenrecord/">screenrecord
command</a> allows you to record everything that appears on the screen as an
.mp4 file on disk. To implement this, we have to receive composited frames from
SurfaceFlinger, write them to the video encoder, and then write the encoded
video data to a file. The video codecs are managed by a separate process
(mediaserver), so we have to move large graphics buffers around the system. To
make it more challenging, we're trying to record 60fps video at full resolution.
The key to making this work efficiently is BufferQueue.</p>

<p>The MediaCodec class allows an app to provide data as raw bytes in buffers,
or through a <a href="/devices/graphics/arch-sh.html">Surface</a>. When
screenrecord requests access to a video encoder, mediaserver creates a
BufferQueue, connects itself to the consumer side, then passes the producer
side back to screenrecord as a Surface.</p>

<p>The screenrecord command then asks SurfaceFlinger to create a virtual display
that mirrors the main display (i.e.
it has all of the same layers), and directs
it to send output to the Surface that came from mediaserver. In this case,
SurfaceFlinger is the producer of buffers rather than the consumer.</p>

<p>After the configuration is complete, screenrecord waits for encoded data to
appear. As apps draw, their buffers travel to SurfaceFlinger, which composites
them into a single buffer that gets sent directly to the video encoder in
mediaserver. The full frames are never even seen by the screenrecord process.
Internally, mediaserver has its own way of moving buffers around that also
passes data by handle, minimizing overhead.</p>

<h2 id=simulate-secondary>Case Study: Simulate secondary displays</h2>

<p>The WindowManager can ask SurfaceFlinger to create a visible layer for which
SurfaceFlinger acts as the BufferQueue consumer. It's also possible to ask
SurfaceFlinger to create a virtual display, for which SurfaceFlinger acts as
the BufferQueue producer. What happens if you connect them, configuring a
virtual display that renders to a visible layer?</p>

<p>You create a closed loop, where the composited screen appears in a window.
That window is now part of the composited output, so on the next refresh
the composited image inside the window will show the window contents as well
(and then it's
<a href="https://en.wikipedia.org/wiki/Turtles_all_the_way_down">turtles all the
way down</a>). To see this in action, enable
<a href="http://developer.android.com/tools/index.html">Developer options</a> in
settings, select <strong>Simulate secondary displays</strong>, and enable a
window. For bonus points, use screenrecord to capture the act of enabling the
display, then play it back frame-by-frame.</p>

</body>
</html>