High dynamic range (HDR) video is the next frontier in high-quality video decoding, bringing unmatched scene reproduction quality. It does so by significantly increasing the dynamic range of the luminance component (from the current 100 cd/m² to thousands of cd/m²) and by using a much wider color space (BT.2020). This is now a central element of the 4K UHD evolution in the TV space.

Android 7.0 adds initial HDR support, including constants for the discovery and setup of HDR video pipelines. That means defining codec types and display modes and specifying how HDR data must be passed to MediaCodec and supplied to HDR decoders. HDR is supported only in tunneled video playback mode.

The purpose of this document is to help application developers support HDR stream playback, and to help OEMs and SoC vendors enable the HDR features on Android 7.0.

Supported HDR technologies

As of the Android 7.0 release, the following HDR technologies are supported.

Technology          Dolby-Vision   HDR10     VP9-HLG   VP9-PQ
Codec               AVC/HEVC       HEVC      VP9       VP9
Transfer Function   ST-2084        ST-2084   HLG       ST-2084
HDR Metadata Type   Dynamic        Static    None      Static

In Android 7.0, only HDR playback via tunneled mode is defined, but devices may add support for playback of HDR on SurfaceViews using opaque video buffers.

Discovery

HDR playback requires an HDR-capable decoder and a connection to an HDR-capable display. In addition, some technologies require a specific extractor.

Display

Applications shall use the new Display.getHdrCapabilities API to query the HDR technologies supported by the specified display. This corresponds to the information in the EDID Static Metadata Data Block as defined in CTA-861.3:

Constants:

int HDR_TYPE_DOLBY_VISION
int HDR_TYPE_HDR10
int HDR_TYPE_HLG

Public Methods:

float getDesiredMaxAverageLuminance()
float getDesiredMaxLuminance()
float getDesiredMinLuminance()
int[] getSupportedHdrTypes()
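The display check can be sketched as follows. This is a self-contained example, not on-device code: the HDR type constants are mirrored locally (the values match Display.HdrCapabilities in the Android 7.0 API), and the supported-type array stands in for the result of Display.getHdrCapabilities().getSupportedHdrTypes().

```java
// Sketch: checking whether a display advertises a given HDR type.
// Constants mirror Display.HdrCapabilities so the example runs off-device.
public class HdrDisplayCheck {
    public static final int HDR_TYPE_DOLBY_VISION = 1; // mirror of platform constant
    public static final int HDR_TYPE_HDR10 = 2;        // mirror of platform constant
    public static final int HDR_TYPE_HLG = 3;          // mirror of platform constant

    /** Returns true if the display's supported-type list contains the wanted type. */
    public static boolean displaySupports(int[] supportedHdrTypes, int wantedType) {
        for (int t : supportedHdrTypes) {
            if (t == wantedType) return true;
        }
        return false;
    }
}
```

On a device, the array passed to displaySupports would come from getSupportedHdrTypes() on the display the application intends to use.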

Decoder

Applications shall use the existing CodecCapabilities.profileLevels API to verify support for the new HDR capable profiles:

Dolby-Vision

MediaFormat mime constant:

String MIMETYPE_VIDEO_DOLBY_VISION

MediaCodecInfo.CodecProfileLevel profile constants:

int DolbyVisionProfileDvavPen
int DolbyVisionProfileDvavPer
int DolbyVisionProfileDvheDen
int DolbyVisionProfileDvheDer
int DolbyVisionProfileDvheDtb
int DolbyVisionProfileDvheDth
int DolbyVisionProfileDvheDtr
int DolbyVisionProfileDvheStn

Dolby Vision video layers and metadata must be concatenated into a single buffer per frame by video applications. This is done automatically by the Dolby-Vision capable MediaExtractor.
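The per-frame packaging can be sketched as a simple concatenation. Note that the exact buffer layout (layer ordering and any framing headers) is defined by Dolby; this sketch only illustrates the single-buffer-per-frame requirement, and the class name is illustrative.

```java
// Sketch: concatenating the Dolby Vision layers for one frame into a single
// decoder input buffer. The real packaging is defined by Dolby; this only
// shows the shape of the operation.
public class DolbyFramePacker {
    /** Concatenate base layer, enhancement layer, and metadata into one buffer. */
    public static byte[] packFrame(byte[] baseLayer, byte[] enhancementLayer,
                                   byte[] metadata) {
        byte[] frame = new byte[baseLayer.length + enhancementLayer.length + metadata.length];
        int off = 0;
        System.arraycopy(baseLayer, 0, frame, off, baseLayer.length);               // BL
        off += baseLayer.length;
        System.arraycopy(enhancementLayer, 0, frame, off, enhancementLayer.length); // EL
        off += enhancementLayer.length;
        System.arraycopy(metadata, 0, frame, off, metadata.length);                 // MD
        return frame;
    }
}
```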

HEVC HDR10

MediaCodecInfo.CodecProfileLevel profile constants:

int HEVCProfileMain10HDR10

VP9 HLG & PQ

MediaCodecInfo.CodecProfileLevel profile constants:

int VP9Profile2HDR
int VP9Profile3HDR

If a platform supports an HDR-capable decoder, it shall also support an HDR-capable extractor.

Only tunneled decoders are guaranteed to play back HDR content. Playback by non-tunneled decoders may result in the HDR information being lost and the content being flattened into an SDR color volume.

Extractor

The following containers are supported for the various HDR technologies on Android 7.0:

Technology   Dolby-Vision   HDR10   VP9-HLG   VP9-PQ
Container    MP4            MP4     WebM      WebM

The platform does not support discovering whether a track of a file requires HDR support. Applications may parse the codec-specific data to determine whether a track requires a specific HDR profile.

Summary

Component requirements for each HDR technology are shown in the following table:

Technology     Supported HDR type (Display)   Container (Extractor)   Decoder                       Profile (Decoder)
Dolby-Vision   HDR_TYPE_DOLBY_VISION          MP4                     MIMETYPE_VIDEO_DOLBY_VISION   One of the Dolby Vision profiles
HDR10          HDR_TYPE_HDR10                 MP4                     MIMETYPE_VIDEO_HEVC           HEVCProfileMain10HDR10
VP9-HLG        HDR_TYPE_HLG                   WebM                    MIMETYPE_VIDEO_VP9            VP9Profile2HDR or VP9Profile3HDR
VP9-PQ         HDR_TYPE_HDR10                 WebM                    MIMETYPE_VIDEO_VP9            VP9Profile2HDR or VP9Profile3HDR
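The decoder column of this table can be expressed as a small lookup, which is convenient for sanity-checking a pipeline in code. The MIME strings match the platform's MediaFormat.MIMETYPE_VIDEO_* values; the class name is illustrative.

```java
import java.util.Map;

// Sketch: HDR technology -> required decoder MIME type, per the summary table.
// MIME strings match MediaFormat.MIMETYPE_VIDEO_DOLBY_VISION, _HEVC, and _VP9.
public class HdrTechTable {
    public static final Map<String, String> DECODER_MIME = Map.of(
            "Dolby-Vision", "video/dolby-vision",
            "HDR10",        "video/hevc",
            "VP9-HLG",      "video/x-vnd.on2.vp9",
            "VP9-PQ",       "video/x-vnd.on2.vp9");
}
```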

Playback

After an application has verified support for HDR playback, it can play back HDR content nearly the same way it plays back non-HDR content. The main caveat is that, in Android 7.0, HDR playback is defined only for tunneled mode, so the decoder must be configured for tunneled playback.
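Since HDR playback in Android 7.0 requires tunneled mode, the configuration difference amounts to a couple of extra MediaFormat entries. The sketch below models them as a plain map so it runs off-device; on a device this would be format.setFeatureEnabled(CodecCapabilities.FEATURE_TunneledPlayback, true) and format.setInteger(MediaFormat.KEY_AUDIO_SESSION_ID, audioSessionId). The "tunneled-playback-enabled" key is illustrative; "audio-session-id" mirrors MediaFormat.KEY_AUDIO_SESSION_ID.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: extra MediaFormat configuration for tunneled HDR playback, modeled
// as a plain map. Key names are illustrative stand-ins for the real calls
// noted in the lead-in above.
public class TunneledFormat {
    public static Map<String, Object> tunneledEntries(int audioSessionId) {
        Map<String, Object> entries = new HashMap<>();
        entries.put("tunneled-playback-enabled", true); // FEATURE_TunneledPlayback
        entries.put("audio-session-id", audioSessionId); // KEY_AUDIO_SESSION_ID
        return entries;
    }
}
```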

Enabling HDR platform support

SoC vendors and OEMs must do additional work to enable HDR platform support for a device.

Platform changes in Android 7.0 for HDR

Here are some key changes in the platform (application/native layer) that OEMs and SoC vendors need to be aware of.

Display

Hardware composition

HDR-capable platforms must support blending HDR content with non-HDR content. The exact blending characteristics and operations are not defined by Android as of release 7.0, but the process generally follows these steps:

  1. Determine a linear color space/volume that contains all layers to be composited, based on the layers' color, mastering, and potential dynamic metadata.
    If compositing directly to a display, this could be the linear space that matches the display's color volume.
  2. Convert all layers to the common color space.
  3. Perform the blending.
  4. If displaying through HDMI:
    1. Determine the color, mastering, and potential dynamic metadata for the blended scene.
    2. Convert the resulting blended scene to the derived color space/volume.
  5. If displaying directly to the display, convert the resulting blended scene to the required display signals to produce that scene.
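The conversion-and-blend steps above can be sketched for a single pixel channel, assuming both layers use the PQ (ST-2084) transfer function and share color primaries (a real compositor would also convert primaries, e.g. BT.709 to BT.2020, which this sketch omits). The PQ constants are from SMPTE ST 2084; linear values are normalized so 1.0 corresponds to 10,000 cd/m². Class and method names are illustrative.

```java
// Sketch of steps 1-3: decode PQ-coded samples to linear light, blend in the
// linear domain, and re-encode. Primaries conversion is omitted.
public class PqBlend {
    // SMPTE ST 2084 constants.
    static final double M1 = 2610.0 / 16384;
    static final double M2 = 2523.0 / 4096 * 128;
    static final double C1 = 3424.0 / 4096;
    static final double C2 = 2413.0 / 4096 * 32;
    static final double C3 = 2392.0 / 4096 * 32;

    /** PQ EOTF: electrical signal [0,1] -> normalized linear light [0,1]. */
    public static double pqToLinear(double e) {
        double p = Math.pow(e, 1.0 / M2);
        return Math.pow(Math.max(p - C1, 0) / (C2 - C3 * p), 1.0 / M1);
    }

    /** Inverse PQ EOTF: normalized linear light [0,1] -> electrical signal [0,1]. */
    public static double linearToPq(double y) {
        double p = Math.pow(y, M1);
        return Math.pow((C1 + C2 * p) / (1 + C3 * p), M2);
    }

    /** Alpha-blend two PQ-coded samples in linear light (steps 2-3). */
    public static double blend(double pqA, double pqB, double alpha) {
        double linear = alpha * pqToLinear(pqA) + (1 - alpha) * pqToLinear(pqB);
        return linearToPq(linear);
    }
}
```

Blending in linear light rather than on the PQ-coded values is the point of step 2: the PQ curve is strongly non-linear, so averaging coded values directly would distort the result.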

Display discovery

HDR display discovery is only supported via HWC2. Device implementers must enable the HWC2 adapter released with Android 7.0 for this feature to work. Therefore, platforms must add HWC2 support or extend the AOSP framework to provide this information another way. HWC2 exposes a new API to propagate HDR static data to the framework and the application.

Decoders

Platforms must add HDR-capable tunneled decoders and advertise their HDR support. Generally, HDR-capable decoders must support tunneled playback and must describe and propagate the stream's color aspects and HDR metadata (see the OMX indices under Technology-specific implementation details).

Dolby Vision decoder support

To support Dolby Vision, platforms must add a Dolby-Vision capable HDR OMX decoder. Given the specifics of Dolby Vision, this is normally a wrapper decoder around one or more AVC and/or HEVC decoders as well as a compositor.

When the decoder is configured, the actual Dolby profile is not communicated to the codec; it is conveyed only via codec-specific data after the decoder has been started. A platform could choose to support multiple Dolby Vision decoders, such as one for AVC profiles and another for HEVC profiles, so that the underlying codecs can be initialized at configure time. If a single Dolby Vision decoder supports both types of profiles, it must also support switching between them dynamically in an adaptive fashion.
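The base codec is recoverable from the profile name itself: the Dolby Vision profile constants encode the base layer codec in their prefix (Dvav for AVC, Dvhe for HEVC). A wrapper decoder's backend selection can therefore be sketched as below; the class and method names are illustrative, and the MIME strings match the platform's video/avc and video/hevc types.

```java
// Sketch: selecting the base decoder for a Dolby Vision wrapper from the
// profile constant's name prefix ("Dvav" = AVC-based, "Dvhe" = HEVC-based).
public class DolbyBaseCodec {
    /** Maps a Dolby Vision profile constant name to the base codec MIME type. */
    public static String baseMimeFor(String dolbyProfile) {
        if (dolbyProfile.startsWith("DolbyVisionProfileDvav")) return "video/avc";
        if (dolbyProfile.startsWith("DolbyVisionProfileDvhe")) return "video/hevc";
        throw new IllegalArgumentException("Unknown Dolby Vision profile: " + dolbyProfile);
    }
}
```

Because the profile only arrives in codec-specific data after start, a single wrapper supporting both families cannot make this selection at configure time, which is the motivation for either splitting the decoders or supporting adaptive switching as described above.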

A platform can provide a Dolby-Vision capable decoder in addition to the general HDR decoder support.

HDR10 decoder support

To support HDR10, platforms must add an HDR10-capable OMX decoder. This is normally a tunneled HEVC decoder that also supports parsing and handling HDMI-related metadata, in addition to meeting the general HDR decoder requirements.

VP9 decoder support

To support VP9 HDR, platforms must add a VP9 Profile2-capable HDR OMX decoder. This is normally a tunneled VP9 decoder that also supports handling HDMI-related metadata, in addition to meeting the general HDR decoder requirements.

Extractors

Dolby Vision extractor support

Platforms that support Dolby Vision decoders must add support for a Dolby extractor (called DolbyExtractor) for Dolby Vision content.

HDR10 and VP9 HDR extractor support

There are no additional extractor requirements to support HDR10 or VP9 HLG. Platforms must extend the MP4 extractor to support VP9 PQ in MP4. HDR static metadata must be propagated from the VP9 PQ bitstream, such that this metadata is passed to the VP9 PQ decoder and to the display via the normal MediaExtractor => MediaCodec pipeline.
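For reference, the HDR static metadata being propagated here consists of the fields of the CTA-861.3 Static Metadata Descriptor (Type 1): the SMPTE ST 2086 mastering display color volume plus the content light levels. The sketch below lists them as a plain value class; the class and field names are illustrative, not the platform's actual structure.

```java
// Sketch: fields of the CTA-861.3 Static Metadata Descriptor, Type 1.
// Names are illustrative; the platform carries this data in its own
// HDR static info structure on the MediaFormat.
public class HdrStaticMetadata {
    // SMPTE ST 2086 mastering display color volume:
    public float[] redPrimaryXY, greenPrimaryXY, bluePrimaryXY; // CIE 1931 x,y
    public float[] whitePointXY;                                // CIE 1931 x,y
    public float maxMasteringLuminance;                         // cd/m2
    public float minMasteringLuminance;                         // cd/m2
    // CTA-861.3 content light levels:
    public int maxContentLightLevel;       // MaxCLL, cd/m2
    public int maxFrameAverageLightLevel;  // MaxFALL, cd/m2
}
```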

Stagefright extensions for Dolby Vision support

Platforms must add Dolby Vision format support to Stagefright.

Technology-specific implementation details

HDR10 decoder pipeline

Figure 1. HDR10 pipeline

HDR10 bitstreams are packaged in MP4 containers. Applications use a regular MP4 extractor to extract the frame data and send it to the decoder.

Vendor actions

  1. Advertise supported HDR decoder profile and level OMX type. Example:
    OMX_VIDEO_HEVCProfileMain10HDR10 (and Main10)
  2. Implement support for index: 'OMX.google.android.index.describeHDRColorInfo'
  3. Implement support for index: 'OMX.google.android.index.describeColorAspects'
  4. Implement support for SEI parsing of mastering metadata.

Dolby Vision decoder pipeline

Figure 2. Dolby Vision pipeline

Dolby bitstreams are packaged in MP4 containers as defined by Dolby. Applications could, in theory, use a regular MP4 extractor to extract the base layer, enhancement layer, and metadata layer independently; however, this does not fit the current Android MediaExtractor/MediaCodec model.

Dolby actions

  1. Define the packaging of access units for the various Dolby container schemes (e.g. BL+EL+MD) for the abstract Dolby decoder (i.e. the buffer format expected by the HDR decoder).
  2. Define the packaging of CSD for the abstract Dolby decoder.

Vendor actions

  1. Implement Dolby extractor. This can also be done by Dolby.
  2. Integrate DolbyExtractor into the framework. The entry point is frameworks/av/media/libstagefright/MediaExtractor.cpp.
  3. Declare HDR decoder profile and level OMX types. Example: OMX_VIDEO_DOLBYPROFILETYPE and OMX_VIDEO_DOLBYLEVELTYPE.
  4. Implement support for index: 'OMX.google.android.index.describeColorAspects'
  5. Propagate the dynamic HDR metadata to the app and surface in each frame. Typically this information must be packaged into the decoded frame as defined by Dolby, because the HDMI standard does not provide a way to pass this to the display.

VP9 decoder pipeline

Figure 3. VP9-PQ pipeline

VP9 bitstreams are packaged in WebM containers in a way defined by the WebM team. Applications need to use a WebM extractor to extract HDR metadata from the bitstream before sending frames to the decoder.

Vendor actions

  1. Implement support for index: OMX.google.android.index.describeHDRColorInfo
  2. Implement support for index: OMX.google.android.index.describeColorAspects
  3. Propagate HDR static metadata