<html devsite>
  <head>
    <title>HDR Video Playback</title>
    <meta name="project_path" value="/_project.yaml" />
    <meta name="book_path" value="/_book.yaml" />
  </head>
  <body>
  <!--
      Copyright 2017 The Android Open Source Project

      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

          http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License.
  -->

<p>High dynamic range (HDR) video is the next frontier in high-quality
video decoding, bringing unmatched scene reproduction qualities. It does
so by significantly increasing the dynamic range of the luminance component
(from the current 100 cd/m<sup>2</sup> to thousands of cd/m<sup>2</sup>) and by
using a much wider color space (BT.2020). This is now a central element of the
4K UHD evolution in the TV space.</p>

<p>Android 7.0 adds initial HDR support, which includes the creation of proper
constants for the discovery and setup of HDR video pipelines. That means
defining codec types and display modes, and specifying how HDR data must be
passed to MediaCodec and supplied to HDR decoders. HDR is only supported in
tunneled video playback mode.</p>

<p>The purpose of this document is to help application developers support HDR
stream playback, and to help OEMs and SoC vendors enable the HDR features on
Android 7.0.</p>

<h2 id="technologies">Supported HDR technologies</h2>

<p>As of the Android 7.0 release, the following HDR technologies are
supported.</p>

<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Codec</th>
<td>AVC/HEVC</td>
<td>HEVC</td>
<td>VP9</td>
<td>VP9</td>
</tr>
<tr>
<th>Transfer Function</th>
<td>ST-2084</td>
<td>ST-2084</td>
<td>HLG</td>
<td>ST-2084</td>
</tr>
<tr>
<th>HDR Metadata Type</th>
<td>Dynamic</td>
<td>Static</td>
<td>None</td>
<td>Static</td>
</tr>
</tbody>
</table>

<p>In Android 7.0, <b>only HDR playback via tunneled mode is defined</b>,
but devices may add support for playback of HDR on SurfaceViews using opaque
video buffers. In other words:</p>
<ul>
<li>There is no standard Android API to check whether HDR playback is
supported using non-tunneled decoders.</li>
<li>Tunneled video decoders that advertise HDR playback capability must
support HDR playback when connected to HDR-capable displays.</li>
<li>GL composition of HDR content is not supported by the AOSP Android
7.0 release.</li>
</ul>

<h2 id="discovery">Discovery</h2>

<p>HDR playback requires an HDR-capable decoder and a connection to an
HDR-capable display. Optionally, some technologies require a specific
extractor.</p>

<h3 id="display">Display</h3>

<p>Applications shall use the new <code>Display.getHdrCapabilities</code>
API to query the HDR technologies supported by the specified display. This is
essentially the information in the EDID HDR Static Metadata Data Block as
defined in CTA-861.3:</p>

<ul>

<li><code>public Display.HdrCapabilities getHdrCapabilities()</code><br>
Returns the display's HDR capabilities.</li>

<li><code>Display.HdrCapabilities</code><br>
Encapsulates the HDR capabilities of a given display, such as which HDR
types it supports and details about the desired luminance data.</li>
</ul>

<p><b>Constants:</b></p>

<ul>
<li><code>int HDR_TYPE_DOLBY_VISION</code><br>
Dolby Vision support.</li>

<li><code>int HDR_TYPE_HDR10</code><br>
HDR10 / PQ support.</li>

<li><code>int HDR_TYPE_HLG</code><br>
Hybrid Log-Gamma support.</li>

<li><code>float INVALID_LUMINANCE</code><br>
Invalid luminance value.</li>
</ul>

<p><b>Public Methods:</b></p>

<ul>
<li><code>float getDesiredMaxAverageLuminance()</code><br>
Returns the desired content max frame-average luminance data in cd/m<sup>2</sup> for
this display.</li>

<li><code>float getDesiredMaxLuminance()</code><br>
Returns the desired content max luminance data in cd/m<sup>2</sup> for this display.</li>

<li><code>float getDesiredMinLuminance()</code><br>
Returns the desired content min luminance data in cd/m<sup>2</sup> for this display.</li>

<li><code>int[] getSupportedHdrTypes()</code><br>
Gets the supported HDR types of this display (see constants). Returns an empty
array if HDR is not supported by the display.</li>
</ul>
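
<p>The following sketch shows one way an application might combine these calls
on an Android 7.0 (API 24) or later device; the class and helper names are
illustrative, and the display is assumed to come from, for example,
<code>Activity.getWindowManager().getDefaultDisplay()</code>.</p>

<pre class="devsite-click-to-copy">
import android.util.Log;
import android.view.Display;

final class HdrDisplayCheck {
    // Minimal discovery sketch using the APIs above.
    static boolean displaySupportsHdr10(Display display) {
        Display.HdrCapabilities caps = display.getHdrCapabilities();
        Log.d("HdrDisplayCheck", "Desired max/avg/min luminance (cd/m^2): "
                + caps.getDesiredMaxLuminance() + "/"
                + caps.getDesiredMaxAverageLuminance() + "/"
                + caps.getDesiredMinLuminance());
        for (int type : caps.getSupportedHdrTypes()) {
            if (type == Display.HdrCapabilities.HDR_TYPE_HDR10) {
                return true;
            }
        }
        return false;
    }
}
</pre>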

<h3 id="decoder">Decoder</h3>

<p>Applications shall use the existing
<a href="https://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#profileLevels">
<code>CodecCapabilities.profileLevels</code></a> API to verify support for the
new HDR-capable profiles (see the discovery sketch at the end of this
section):</p>

<h4>Dolby-Vision</h4>

<p><code>MediaFormat</code> mime constant:</p>
<pre class="devsite-click-to-copy">
String MIMETYPE_VIDEO_DOLBY_VISION
</pre>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<pre class="devsite-click-to-copy">
int DolbyVisionProfileDvavPen
int DolbyVisionProfileDvavPer
int DolbyVisionProfileDvheDen
int DolbyVisionProfileDvheDer
int DolbyVisionProfileDvheDtb
int DolbyVisionProfileDvheDth
int DolbyVisionProfileDvheDtr
int DolbyVisionProfileDvheStn
</pre>

<p>Dolby Vision video layers and metadata must be concatenated into a single
buffer per frame by video applications. This is done automatically by the
Dolby-Vision capable MediaExtractor.</p>

<h4>HEVC HDR10</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<pre class="devsite-click-to-copy">
int HEVCProfileMain10HDR10
</pre>

<h4>VP9 HLG &amp; PQ</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile
constants:</p>
<pre class="devsite-click-to-copy">
int VP9Profile2HDR
int VP9Profile3HDR
</pre>

<p>If a platform supports an HDR-capable decoder, it shall also support an
HDR-capable extractor.</p>

<p>Only tunneled decoders are guaranteed to play back HDR content. Playback
by non-tunneled decoders may result in the HDR information being lost and
the content being flattened into an SDR color volume.</p>
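
<p>As a concrete example of this discovery, the following sketch looks for a
tunneled decoder that advertises the HEVCProfileMain10HDR10 profile. It uses
only the MediaCodecList/MediaCodecInfo APIs referenced above; the class and
helper names and the choice of HEVC are illustrative, and the same pattern
applies to the VP9 and Dolby Vision profiles.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaCodecInfo;
import android.media.MediaCodecInfo.CodecCapabilities;
import android.media.MediaCodecInfo.CodecProfileLevel;
import android.media.MediaCodecList;

final class HdrDecoderCheck {
    // Returns the name of a tunneled HEVC decoder advertising Main10HDR10, or null.
    static String findTunneledHdr10Decoder() {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : codecList.getCodecInfos()) {
            if (info.isEncoder()) {
                continue;
            }
            CodecCapabilities caps;
            try {
                caps = info.getCapabilitiesForType("video/hevc");
            } catch (IllegalArgumentException e) {
                continue;  // this codec does not handle HEVC
            }
            if (!caps.isFeatureSupported(CodecCapabilities.FEATURE_TunneledPlayback)) {
                continue;  // HDR playback is only guaranteed for tunneled decoders
            }
            for (CodecProfileLevel pl : caps.profileLevels) {
                if (pl.profile == CodecProfileLevel.HEVCProfileMain10HDR10) {
                    return info.getName();
                }
            }
        }
        return null;
    }
}
</pre>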

<h3 id="extractor">Extractor</h3>

<p>The following containers are supported for the various HDR technologies
on Android 7.0:</p>

<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Container</th>
<td>MP4</td>
<td>MP4</td>
<td>WebM</td>
<td>WebM</td>
</tr>
</tbody>
</table>

<p>The platform does not support discovering whether a track (of a file)
requires HDR support. Applications may parse the codec-specific data
to determine whether a track requires a specific HDR profile.</p>
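
<p>For example, an application can inspect the track format returned by
MediaExtractor for HDR-related keys. Whether these keys are populated depends
on the extractor, so this is a best-effort check and the class and helper names
are illustrative; parsing the codec-specific data remains the fallback.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaExtractor;
import android.media.MediaFormat;

final class HdrTrackCheck {
    // Best-effort check of a track format for HDR hints exposed by the extractor.
    static boolean trackLooksLikeHdr(MediaExtractor extractor, int trackIndex) {
        MediaFormat format = extractor.getTrackFormat(trackIndex);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (MediaFormat.MIMETYPE_VIDEO_DOLBY_VISION.equals(mime)) {
            return true;  // profile/level are exposed in the Dolby Vision track format
        }
        // HDR10 / VP9 PQ tracks may carry mastering metadata in the track format.
        return format.containsKey(MediaFormat.KEY_HDR_STATIC_INFO);
    }
}
</pre>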

<h3 id="summary">Summary</h3>

<p>Component requirements for each HDR technology are shown in the following
table:</p>

<div style="overflow:auto">
<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Supported HDR type (Display)</th>
<td>HDR_TYPE_DOLBY_VISION</td>
<td>HDR_TYPE_HDR10</td>
<td>HDR_TYPE_HLG</td>
<td>HDR_TYPE_HDR10</td>
</tr>
<tr>
<th>Container (Extractor)</th>
<td>MP4</td>
<td>MP4</td>
<td>WebM</td>
<td>WebM</td>
</tr>
<tr>
<th>Decoder</th>
<td>MIMETYPE_VIDEO_DOLBY_VISION</td>
<td>MIMETYPE_VIDEO_HEVC</td>
<td>MIMETYPE_VIDEO_VP9</td>
<td>MIMETYPE_VIDEO_VP9</td>
</tr>
<tr>
<th>Profile (Decoder)</th>
<td>One of the Dolby profiles</td>
<td>HEVCProfileMain10HDR10</td>
<td>VP9Profile2HDR or VP9Profile3HDR</td>
<td>VP9Profile2HDR or VP9Profile3HDR</td>
</tr>
</tbody>
</table>
</div>
<br>

<p>Notes:</p>
<ul>
<li>Dolby-Vision bitstreams are packaged in an MP4 container in a way defined
by Dolby. Applications may implement their own Dolby-capable extractors as
long as they package the access units from the corresponding layers into a
single access unit for the decoder as defined by Dolby.</li>
<li>A platform may support an HDR-capable extractor but no corresponding
HDR-capable decoder.</li>
</ul>

<h2 id="playback">Playback</h2>

<p>After an application has verified support for HDR playback, it can play
back HDR content nearly the same way as it plays back non-HDR content,
with the following caveats:</p>

<ul>
<li>For Dolby-Vision, whether a specific media file/track requires an
HDR-capable decoder is not immediately available. The application must
have this information in advance or be able to obtain it by
parsing the codec-specific data section of the MediaFormat.</li>
<li><code>CodecCapabilities.isFormatSupported</code> does not consider whether
the tunneled decoder feature is required for supporting such a profile (see
the configuration sketch following this list).</li>
</ul>
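
<p>The following sketch shows one way to request tunneled playback explicitly
when configuring the decoder. The audio session ID is assumed to come from the
application's AudioTrack, and the class and helper names are illustrative.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaCodec;
import android.media.MediaCodecInfo.CodecCapabilities;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

final class TunneledHdrPlayback {
    // Configure a decoder for tunneled HDR playback using the extractor's track format.
    static MediaCodec configureTunneledDecoder(String decoderName, MediaFormat trackFormat,
            Surface surface, int audioSessionId) throws IOException {
        trackFormat.setFeatureEnabled(CodecCapabilities.FEATURE_TunneledPlayback, true);
        trackFormat.setInteger(MediaFormat.KEY_AUDIO_SESSION_ID, audioSessionId);
        MediaCodec codec = MediaCodec.createByCodecName(decoderName);
        codec.configure(trackFormat, surface, null /* crypto */, 0 /* flags */);
        return codec;
    }
}
</pre>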

<h2 id="enablinghdr">Enabling HDR platform support</h2>

<p>SoC vendors and OEMs must do additional work to enable HDR platform
support for a device.</p>

<h3 id="platformchanges">Platform changes in Android 7.0 for HDR</h3>

<p>Here are some key changes in the platform (application/native layer)
that OEMs and SoC vendors need to be aware of.</p>

<h3 id="display-platform">Display</h3>

<h4>Hardware composition</h4>

<p>HDR-capable platforms must support blending HDR content with non-HDR
content. The exact blending characteristics and operations are not defined
by Android as of release 7.0, but the process generally follows these steps
(a simplified sketch follows the list):</p>
<ol>
<li>Determine a linear color space/volume that contains all layers to be
composited, based on the layers' color, mastering, and potential dynamic
metadata.
<br>If compositing directly to a display, this could be the linear space
that matches the display's color volume.</li>
<li>Convert all layers to the common color space.</li>
<li>Perform the blending.</li>
<li>If displaying through HDMI:
<ol style="list-style-type: lower-alpha">
<li>Determine the color, mastering, and potential dynamic metadata for the
blended scene.</li>
<li>Convert the resulting blended scene to the derived color
space/volume.</li>
</ol>
</li>
<li>If displaying directly to the display, convert the resulting blended
scene to the required display signals to produce that scene.</li>
</ol>
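
<p>The sketch below illustrates steps 1-3 for a single luminance sample only:
an SDR layer (assumed gamma 2.2 with a 100 cd/m<sup>2</sup> reference peak) and
an HDR10 layer (ST-2084/PQ, absolute luminance) are brought into a shared
linear-light space, alpha-blended, and re-encoded as PQ. Gamut conversion and
tone mapping are omitted, and all names and simplifications are illustrative
rather than platform APIs; real implementations typically live in the display
pipeline or GPU shaders.</p>

<pre class="devsite-click-to-copy">
final class HdrBlendSketch {
    private static final double PQ_M1 = 2610.0 / 16384.0;
    private static final double PQ_M2 = 2523.0 / 4096.0 * 128.0;
    private static final double PQ_C1 = 3424.0 / 4096.0;
    private static final double PQ_C2 = 2413.0 / 4096.0 * 32.0;
    private static final double PQ_C3 = 2392.0 / 4096.0 * 32.0;

    /** ST-2084 EOTF: non-linear signal in [0,1] to luminance in cd/m^2. */
    static double pqToNits(double signal) {
        double p = Math.pow(signal, 1.0 / PQ_M2);
        double num = Math.max(p - PQ_C1, 0.0);
        return 10000.0 * Math.pow(num / (PQ_C2 - PQ_C3 * p), 1.0 / PQ_M1);
    }

    /** Inverse ST-2084 EOTF: luminance in cd/m^2 back to a [0,1] PQ signal. */
    static double nitsToPq(double nits) {
        double y = Math.pow(nits / 10000.0, PQ_M1);
        return Math.pow((PQ_C1 + PQ_C2 * y) / (1.0 + PQ_C3 * y), PQ_M2);
    }

    /** Simplified SDR EOTF: gamma 2.2 signal scaled to a 100 cd/m^2 peak. */
    static double sdrToNits(double signal) {
        return 100.0 * Math.pow(signal, 2.2);
    }

    /** Steps 1-3: convert both layers to linear nits, alpha-blend, re-encode as PQ. */
    static double blendToPq(double sdrSignal, double hdrPqSignal, double hdrAlpha) {
        double blendedNits = hdrAlpha * pqToNits(hdrPqSignal)
                + (1.0 - hdrAlpha) * sdrToNits(sdrSignal);
        return nitsToPq(blendedNits);
    }
}
</pre>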

<h4>Display discovery</h4>

<p>HDR display discovery is only supported via HWC2. Device implementers must
selectively enable the HWC2 adapter that is released with Android 7.0 for this
feature to work. Therefore, platforms must add support for HWC2 or extend the
AOSP framework to allow a way to provide this information. HWC2 exposes a new
API to propagate HDR static data to the framework and the application.</p>

<h4>HDMI</h4>

<ul>
<li>A connected HDMI display advertises
its HDR capability through HDMI EDID as defined in
<a href="https://standards.cta.tech/kwspub/published_docs/CTA-861.3-Preview.pdf">
CTA-861.3</a>
section 4.2.</li>
<li>The following EOTF mapping shall be used:
<ul>
<li>ET_0 Traditional gamma - SDR luminance range: not mapped to any HDR
type</li>
<li>ET_1 Traditional gamma - HDR luminance range: not mapped to any HDR
type</li>
<li>ET_2 SMPTE ST 2084 - mapped to HDR type HDR10</li>
</ul>
</li>
<li>The signaling of Dolby Vision or HLG support over HDMI is done as defined
by the relevant bodies.</li>
<li>Note that the HWC2 API uses float desired luminance values, so the 8-bit
EDID values must be translated in a suitable fashion (see the sketch following
this list).</li>
</ul>
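
<p>One suitable translation is the luminance coding given in CTA-861.3 itself,
sketched below together with the EOTF mapping above; the class and method names
are illustrative, not platform APIs.</p>

<pre class="devsite-click-to-copy">
import android.view.Display;

final class HdrStaticMetadataBlock {
    /** CTA-861.3 Desired Content Max / Max Frame-average Luminance: 50 * 2^(cv/32) cd/m^2. */
    static float decodeMaxLuminance(int codeValue) {
        return (float) (50.0 * Math.pow(2.0, codeValue / 32.0));
    }

    /** CTA-861.3 Desired Content Min Luminance: max * (cv/255)^2 / 100 cd/m^2. */
    static float decodeMinLuminance(int codeValue, float maxLuminance) {
        double ratio = codeValue / 255.0;
        return (float) (maxLuminance * ratio * ratio / 100.0);
    }

    /** Map an EOTF code from the data block to an Android HDR type, or -1 if unmapped. */
    static int mapEotfToHdrType(int eotf) {
        switch (eotf) {
            case 2:  // ET_2, SMPTE ST 2084
                return Display.HdrCapabilities.HDR_TYPE_HDR10;
            case 0:  // ET_0, traditional gamma, SDR luminance range
            case 1:  // ET_1, traditional gamma, HDR luminance range
            default:
                return -1;  // not mapped to any HDR type
        }
    }
}
</pre>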

<h3 id="decoders">Decoders</h3>

<p>Platforms must add HDR-capable tunneled decoders and advertise their HDR
support. Generally, HDR-capable decoders must:</p>
<ul>
<li>Support tunneled decoding (<code>FEATURE_TunneledPlayback</code>).</li>
<li>Support HDR static metadata
(<code>OMX.google.android.index.describeHDRColorInfo</code>) and its
propagation to the display/hardware composition. For HLG, appropriate metadata
must be submitted to the display.</li>
<li>Support color description
(<code>OMX.google.android.index.describeColorAspects</code>) and its
propagation to the display/hardware composition.</li>
<li>Support HDR embedded metadata as defined by the relevant standard.</li>
</ul>

<h4>Dolby Vision decoder support</h4>

<p>To support Dolby Vision, platforms must add a Dolby-Vision capable
HDR OMX decoder. Given the specifics of Dolby Vision, this is normally a
wrapper decoder around one or more AVC and/or HEVC decoders as well as a
compositor. Such decoders must:</p>
<ul>
<li>Support the mime type "video/dolby-vision".</li>
<li>Advertise supported Dolby Vision profiles/levels.</li>
<li>Accept access units that contain the sub-access-units of all layers as
defined by Dolby.</li>
<li>Accept codec-specific data defined by Dolby, for example, data containing
the Dolby Vision profile/level and possibly the codec-specific data for the
internal decoders.</li>
<li>Support adaptive switching between Dolby Vision profiles/levels as
required by Dolby.</li>
</ul>

<p>When configuring the decoder, the actual Dolby profile is not communicated
to the codec. This is only done via codec-specific data after the decoder
has been started. A platform could choose to support multiple Dolby Vision
decoders: one for AVC profiles and another for HEVC profiles, to be able to
initialize the underlying codecs at configure time. If a single Dolby Vision
decoder supports both types of profiles, it must also support switching
between those dynamically in an adaptive fashion.</p>
<p>If a platform provides a Dolby-Vision capable decoder in addition to the
general HDR decoder support, it must:</p>

<ul>
<li>Provide a Dolby-Vision aware extractor, even if it does not support
HDR playback.</li>
<li>Provide a decoder that supports at least Dolby Vision profile X/level
Y.</li>
</ul>

<h4>HDR10 decoder support</h4>

<p>To support HDR10, platforms must add an HDR10-capable OMX decoder. This
is normally a tunneled HEVC decoder that also supports parsing and handling
HDMI-related metadata. Such a decoder (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/hevc".</li>
<li>Advertise HEVCMain10HDR10 profile support. HEVCMain10HDR10 profile support
also requires supporting the HEVCMain10 profile, which requires supporting
the HEVCMain profile at the same levels.</li>
<li>Support parsing the mastering metadata SEI blocks, as well as other
HDR-related info contained in the SPS.</li>
</ul>

<h4>VP9 decoder support</h4>

<p>To support VP9 HDR, platforms must add a VP9 Profile2-capable HDR OMX
decoder. This is normally a tunneled VP9 decoder that also supports handling
HDMI-related metadata. Such decoders (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/x-vnd.on2.vp9".</li>
<li>Advertise VP9Profile2HDR profile support. VP9Profile2HDR profile support
also requires supporting the VP9Profile2 profile at the same level.</li>
</ul>

<h3 id="extractors">Extractors</h3>

<h4>Dolby Vision extractor support</h4>

<p>Platforms that support Dolby Vision decoders must add Dolby extractor
(called DolbyExtractor) support for Dolby Vision content.</p>
<ul>
<li>A regular MP4 extractor can only extract the base layer from a file,
but not the enhancement or metadata layers, so a special Dolby extractor is
needed to extract the data from the file.</li>
<li>The Dolby extractor must expose one or two tracks for each Dolby video track
(group):
<ul>
<li>A Dolby Vision HDR track with the type "video/dolby-vision" for the
combined 2- or 3-layer Dolby stream. The HDR track's access-unit format, which
defines how to package the access units from the base/enhancement/metadata
layers into a single buffer to be decoded into a single HDR frame, is to be
defined by Dolby.</li>
<li>If a Dolby Vision video track contains a separate (backward compatible)
base layer (BL), the extractor must also expose this as a separate "video/avc"
or "video/hevc" track. The extractor must provide regular AVC/HEVC access
units for this track.</li>
<li>The BL track must have the same track-unique-ID ("track-ID") as the
HDR track so the app understands that these are two encodings of the same
video.</li>
<li>The application can decide which track to choose based on the platform's
capability (see the sketch following this list).</li>
</ul>
</li>
<li>The Dolby Vision profile/level must be exposed in the track format of
the HDR track.</li>
<li>If a platform provides a Dolby-Vision capable decoder, it must also provide
a Dolby-Vision aware extractor, even if it does not support HDR playback.</li>
</ul>
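
<p>From the application side, the track choice mentioned above can be made by
checking whether the platform has a decoder for the Dolby Vision track and
otherwise falling back to the backward-compatible base-layer track. The sketch
below uses only MediaExtractor and MediaCodecList APIs; the class and helper
names are illustrative.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaCodecList;
import android.media.MediaExtractor;
import android.media.MediaFormat;

final class DolbyTrackSelection {
    // Prefer the "video/dolby-vision" track when a matching decoder exists; otherwise
    // fall back to the first other video track (e.g. the AVC/HEVC base-layer track).
    static int selectVideoTrack(MediaExtractor extractor) {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        int fallback = -1;
        for (int i = 0; i &lt; extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime == null || !mime.startsWith("video/")) {
                continue;
            }
            if (MediaFormat.MIMETYPE_VIDEO_DOLBY_VISION.equals(mime)
                    &amp;&amp; codecList.findDecoderForFormat(format) != null) {
                return i;  // HDR track, and the platform has a Dolby Vision decoder
            }
            if (fallback &lt; 0) {
                fallback = i;
            }
        }
        return fallback;
    }
}
</pre>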

<h4>HDR10 and VP9 HDR extractor support</h4>

<p>There are no additional extractor requirements to support HDR10 or VP9
HLG. Platforms must extend the MP4 extractor to support VP9 PQ in MP4. HDR
static metadata must be propagated in the VP9 PQ bitstream, such that this
metadata is passed to the VP9 PQ decoder and to the display via the normal
MediaExtractor =&gt; MediaCodec pipeline.</p>
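
<p>On the application side, this pass-through amounts to handing the
extractor's track format unchanged to <code>MediaCodec.configure()</code>; a
sketch, with illustrative class and helper names, follows.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

final class HdrMetadataPassThrough {
    // The track format may carry KEY_HDR_STATIC_INFO; passing the format unchanged to
    // configure() forwards that metadata to the decoder. Metadata may also arrive
    // in-band in the bitstream access units (VP9 PQ).
    static MediaCodec startHdrDecoder(MediaExtractor extractor, int trackIndex,
            String decoderName, Surface surface) throws IOException {
        MediaFormat format = extractor.getTrackFormat(trackIndex);
        MediaCodec codec = MediaCodec.createByCodecName(decoderName);
        codec.configure(format, surface, null /* crypto */, 0 /* flags */);
        codec.start();
        return codec;
    }
}
</pre>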

<h3 id="stagefright">Stagefright extensions for Dolby Vision support</h3>

<p>Platforms must add Dolby Vision format support to Stagefright:</p>
<ul>
<li>Support for port definition query for the compressed port.</li>
<li>Support profile/level enumeration for the DV decoder.</li>
<li>Support exposing the DV profile/level for DV HDR tracks.</li>
</ul>

<h2 id="implementationnotes">Technology-specific implementation details</h2>

<h3 id="hdr10decoder">HDR10 decoder pipeline</h3>

<p><img src="/devices/tech/images/hdr10_decoder_pipeline.png"></p>

<p class="img-caption"><strong>Figure 1.</strong> HDR10 pipeline</p>

<p>HDR10 bitstreams are packaged in MP4 containers. Applications use a regular
MP4 extractor to extract the frame data and send it to the decoder.</p>

<ul>
<li><b>MPEG4 Extractor</b><br>
The MPEG4Extractor recognizes an HDR10 bitstream as a normal HEVC stream, and
the HDR track with the type "video/hevc" is extracted. The framework picks an
HEVC video decoder that supports the Main10HDR10 profile to decode that
track.</li>

<li><b>HEVC Decoder</b><br>
HDR information is carried in either the SEI or the SPS. The HEVC decoder first
receives frames that contain the HDR information. The decoder then extracts the
HDR information and notifies the application that it is decoding an HDR
video. The HDR information is bundled into the decoder output format, which is
propagated to the surface later (see the sketch following this list).</li>
</ul>
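
<p>For example, an application can observe the HDR information bundled into
the decoder output format (most useful when experimenting with non-tunneled
decoders, where output format change notifications are delivered); the class
and helper names are illustrative.</p>

<pre class="devsite-click-to-copy">
import android.media.MediaCodec;
import android.media.MediaFormat;

final class HdrOutputFormatCheck {
    // Inspect the decoder output format, e.g. after MediaCodec.INFO_OUTPUT_FORMAT_CHANGED.
    static boolean outputLooksLikeHdr10(MediaCodec codec) {
        MediaFormat out = codec.getOutputFormat();
        boolean usesSt2084 = out.containsKey(MediaFormat.KEY_COLOR_TRANSFER)
                &amp;&amp; out.getInteger(MediaFormat.KEY_COLOR_TRANSFER)
                        == MediaFormat.COLOR_TRANSFER_ST2084;
        return usesSt2084 || out.containsKey(MediaFormat.KEY_HDR_STATIC_INFO);
    }
}
</pre>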

<h4>Vendor actions</h4>
<ol>
<li>Advertise the supported HDR decoder profile and level OMX type. Example:<br>
<code>OMX_VIDEO_HEVCProfileMain10HDR10</code> (and <code>Main10</code>)</li>
<li>Implement support for the index
'<code>OMX.google.android.index.describeHDRColorInfo</code>'.</li>
<li>Implement support for the index
'<code>OMX.google.android.index.describeColorAspects</code>'.</li>
<li>Implement support for SEI parsing of mastering metadata.</li>
</ol>

<h3 id="dvdecoder">Dolby Vision decoder pipeline</h3>

<p><img src="/devices/tech/images/dolby_vision_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 2.</strong> Dolby Vision pipeline</p>

<p>Dolby bitstreams are packaged in MP4 containers as defined by
Dolby. Applications could, in theory, use a regular MP4 extractor to extract
the base layer, enhancement layer, and metadata layer independently; however,
this does not fit the current Android MediaExtractor/MediaCodec model.</p>

<ul>
<li>DolbyExtractor:
<ul>
<li>Dolby bitstreams are recognized by a DolbyExtractor, which exposes the
various layers as one or two tracks for each Dolby video track (group):
<ul>
<li>An HDR track with the type "video/dolby-vision" for the combined
2- or 3-layer Dolby stream. The HDR track's access-unit format, which defines
how to package the access units from the base/enhancement/metadata layers
into a single buffer to be decoded into a single HDR frame, is to be defined
by Dolby.</li>
<li>(Optional, only if the BL is backward compatible) A BL track that contains
only the base layer, which must be decodable by a regular MediaCodec decoder,
for example, an AVC/HEVC decoder. The extractor should provide regular AVC/HEVC
access units for this track. This BL track must have the same track-unique-ID
("track-ID") as the Dolby track so the application understands that these
are two encodings of the same video.</li>
</ul>
</li>
<li>The application can decide which track to choose based on the platform's
capability.</li>
<li>Because an HDR track has a specific HDR type, the framework picks
a Dolby video decoder to decode that track. The BL track is decoded by
a regular AVC/HEVC video decoder.</li>
</ul>
</li>

<li>DolbyDecoder:
<ul>
<li>The DolbyDecoder receives access units that contain the required access
units for all layers (EL+BL+MD or BL+MD).</li>
<li>CSD (codec-specific data, such as SPS+PPS+VPS) information for the
individual layers can be packaged into a single CSD frame, to be defined by
Dolby. Having a single CSD frame is required.</li>
</ul>
</li>
</ul>

<h4>Dolby actions</h4>
<ol>
<li>Define the packaging of access units for the various Dolby container
schemes (e.g. BL+EL+MD) for the abstract Dolby decoder (i.e. the buffer
format expected by the HDR decoder).</li>
<li>Define the packaging of CSD for the abstract Dolby decoder.</li>
</ol>

<h4>Vendor actions</h4>
<ol>
<li>Implement the Dolby extractor. This can also be done by Dolby.</li>
<li>Integrate the DolbyExtractor into the framework. The entry point is
<code>frameworks/av/media/libstagefright/MediaExtractor.cpp</code>.</li>
<li>Declare the HDR decoder profile and level OMX
type. Example: <code>OMX_VIDEO_DOLBYPROFILETYPE</code> and
<code>OMX_VIDEO_DOLBYLEVELTYPE</code>.</li>
<li>Implement support for the index
'<code>OMX.google.android.index.describeColorAspects</code>'.</li>
<li>Propagate the dynamic HDR metadata to the app and surface in each
frame. Typically this information must be packaged into the decoded frame
as defined by Dolby, because the HDMI standard does not provide a way to
pass this to the display.</li>
</ol>

<h3 id="v9decoder">VP9 decoder pipeline</h3>

<p><img src="/devices/tech/images/vp9-pq_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 3.</strong> VP9-PQ pipeline</p>

<p>VP9 bitstreams are packaged in WebM containers in a way defined by the WebM
team. Applications need to use a WebM extractor to extract HDR metadata from
the bitstream before sending frames to the decoder.</p>

<ul>
<li>WebM Extractor:
<ul>
<li>The WebM extractor extracts the HDR <a
href="http://www.webmproject.org/docs/container/#colour">metadata</a>
and frames from the <a
href="http://www.webmproject.org/docs/container/#location-of-the-colour-element-in-an-mkv-file">
container</a>.</li>
</ul>
</li>

<li>VP9 Decoder:
<ul>
<li>The decoder receives Profile2 bitstreams and decodes them as normal VP9
streams.</li>
<li>The decoder receives any HDR static metadata from the framework.</li>
<li>The decoder receives static metadata via the bitstream access units for
VP9 PQ streams.</li>
<li>The VP9 decoder must be able to propagate the HDR static/dynamic metadata
to the display.</li>
</ul>
</li>
</ul>

<h4>Vendor actions</h4>

<ol>
<li>Implement support for the index
<code>OMX.google.android.index.describeHDRColorInfo</code>.</li>
<li>Implement support for the index
<code>OMX.google.android.index.describeColorAspects</code>.</li>
<li>Propagate the HDR static metadata.</li>
</ol>

  </body>
</html>