      1 page.title=HAL subsystem
      2 @jd:body
      3 
      4 <!--
      5     Copyright 2013 The Android Open Source Project
      6 
      7     Licensed under the Apache License, Version 2.0 (the "License");
      8     you may not use this file except in compliance with the License.
      9     You may obtain a copy of the License at
     10 
     11         http://www.apache.org/licenses/LICENSE-2.0
     12 
     13     Unless required by applicable law or agreed to in writing, software
     14     distributed under the License is distributed on an "AS IS" BASIS,
     15     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     16     See the License for the specific language governing permissions and
     17     limitations under the License.
     18 -->
     19 <div id="qv-wrapper">
     20   <div id="qv">
     21     <h2>In this document</h2>
     22     <ol id="auto-toc">
     23     </ol>
     24   </div>
     25 </div>
     26 
     27 <h2 id="requests">Requests</h2>
<p>The app framework issues requests for captured results to the camera subsystem. 
  One request corresponds to one set of results. A request encapsulates all 
  configuration information about the capture and processing of those results, 
  including resolution and pixel format; manual sensor, lens, 
  and flash control; 3A operating modes; RAW-to-YUV processing control; and 
  statistics generation. This allows for much more control over the results' 
  output and processing. Multiple requests can be in flight at once, and 
  submitting requests is non-blocking. Requests are always processed in 
  the order they are received.</p>
<img src="images/camera_model.png" alt="Camera request model" id="figure1" />
<p class="img-caption">
  <strong>Figure 1.</strong> Camera model
</p>
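The request model above can be sketched as a fixed-depth FIFO. This is an illustrative C sketch under the stated model, not the camera3 HAL types; every name here (capture_request_t, submit_request, and so on) is hypothetical:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical sketch of the request model: each request carries a complete
 * settings snapshot and maps to exactly one result. */
typedef struct {
    uint32_t frame_number;      /* pairs each request with its result */
    int64_t  exposure_ns;       /* example per-request setting */
    int      num_output_buffers;
} capture_request_t;

#define PIPELINE_DEPTH 4

typedef struct {
    capture_request_t slots[PIPELINE_DEPTH];
    int head;
    int count;
} request_queue_t;

/* Non-blocking submit: fails immediately instead of waiting when full. */
int submit_request(request_queue_t *q, const capture_request_t *r) {
    if (q->count == PIPELINE_DEPTH)
        return -1;
    q->slots[(q->head + q->count) % PIPELINE_DEPTH] = *r;
    q->count++;
    return 0;
}

/* Results pop strictly in submission order: one result per request. */
int next_result(request_queue_t *q, capture_request_t *out) {
    if (q->count == 0)
        return -1;
    *out = q->slots[q->head];
    q->head = (q->head + 1) % PIPELINE_DEPTH;
    q->count--;
    return 0;
}

/* Fills the pipeline, checks non-blocking rejection when full, then checks
 * FIFO result order. Returns 1 on success. */
int demo_fifo_order(void) {
    request_queue_t q = {0};
    capture_request_t r = {0, 10000000, 1};
    for (int i = 0; i < PIPELINE_DEPTH; i++) {
        r.frame_number = (uint32_t)i;
        if (submit_request(&q, &r) != 0)
            return 0;
    }
    if (submit_request(&q, &r) != -1)   /* pipeline full: fail fast */
        return 0;
    capture_request_t res;
    if (next_result(&q, &res) != 0)
        return 0;
    return res.frame_number == 0;       /* oldest request completes first */
}
```

The real pipeline depth is device-specific; a fixed depth of four is used here only to make the in-flight limit concrete.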
     41 <h2 id="hal-subsystem">The HAL and camera subsystem</h2>
<p>The camera subsystem includes implementations for components in the camera 
  pipeline, such as the 3A algorithms and processing controls. The camera HAL 
  provides interfaces for you to implement your versions of these components. To 
  maintain cross-platform compatibility between multiple device manufacturers and 
  Image Signal Processor (ISP, or camera sensor) vendors, the camera pipeline 
  model is virtual and does not directly correspond to any real ISP. However, it 
  is similar enough to real processing pipelines that you can map it to your 
  hardware efficiently. In addition, it is abstract enough to allow for multiple 
  different algorithms and orders of operation without compromising 
  quality, efficiency, or cross-device compatibility.</p>
<p>The camera pipeline also supports triggers that the app framework can initiate 
  to activate functions such as auto-focus. It also sends notifications back to the 
  app framework, notifying apps of events such as an auto-focus lock or errors.</p>
<img src="images/camera_hal.png" alt="Camera hardware abstraction layer" id="figure2" />
<p class="img-caption">
  <strong>Figure 2.</strong> Camera pipeline
</p>
<p>Note that some image processing blocks shown in the diagram above are not 
  well-defined in the initial release.</p>
<p>The camera pipeline makes the following assumptions:</p>
     62 <ul>
     63   <li>RAW Bayer output undergoes no processing inside the ISP.</li>
  <li>Statistics are generated based on the raw sensor data.</li>
     65   <li>The various processing blocks that convert raw sensor data to YUV are in an 
     66     arbitrary order.</li>
     67   <li>While multiple scale and crop units are shown, all scaler units share the 
     68     output region controls (digital zoom). However, each unit may have a different 
     69     output resolution and pixel format.</li>
     70 </ul>
<p><strong>Summary of API use</strong><br/>
  This is a brief summary of the steps for using the Android camera API. See the 
  <a href="#startup">Startup and expected operation sequence</a> section for a detailed breakdown of 
  these steps, including API calls.</p>
     75 <ol>
     76   <li>Listen for and enumerate camera devices.</li>
     77   <li>Open device and connect listeners.</li>
     78   <li>Configure outputs for target use case (such as still capture, recording, 
     79     etc.).</li>
     80   <li>Create request(s) for target use case.</li>
     81   <li>Capture/repeat requests and bursts.</li>
     82   <li>Receive result metadata and image data.</li>
     83   <li>When switching use cases, return to step 3.</li>
     84 </ol>
     85 <p><strong>HAL operation summary</strong></p>
     86 <ul>
     87   <li>Asynchronous requests for captures come from the framework.</li>
  <li>The HAL device must process requests in order and, for each request, produce 
    output result metadata and one or more output image buffers.</li>
     90   <li>First-in, first-out for requests and results, and for streams referenced by 
     91     subsequent requests. </li>
     92   <li>Timestamps must be identical for all outputs from a given request, so that the 
     93     framework can match them together if needed. </li>
     94   <li>All capture configuration and state (except for the 3A routines) is 
     95     encapsulated in the requests and results.</li>
     96 </ul>
     97 <img src="images/camera-hal-overview.png" alt="Camera HAL overview" id="figure3" />
     98 <p class="img-caption">
     99   <strong>Figure 3.</strong> Camera HAL overview
    100 </p>
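The timestamp rule above can be made concrete with a small consistency check. The types here are illustrative only (frame_output_t is not a HAL structure):

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Illustrative output record: one per buffer or metadata packet of a frame. */
typedef struct {
    uint32_t frame_number;
    int64_t  timestamp_ns;   /* start-of-exposure timestamp */
} frame_output_t;

/* Returns 1 when every output sharing outs[0]'s frame number also carries
 * outs[0]'s timestamp, which is what lets the framework match them. */
int outputs_consistent(const frame_output_t *outs, size_t n) {
    for (size_t i = 1; i < n; i++) {
        if (outs[i].frame_number == outs[0].frame_number &&
            outs[i].timestamp_ns != outs[0].timestamp_ns)
            return 0;
    }
    return 1;
}
```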
    101 <h2 id="startup">Startup and expected operation sequence</h2>
    102 <p>This section contains a detailed explanation of the steps expected when using 
    103   the camera API. Please see <a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">platform/hardware/libhardware/include/hardware/camera3.h</a> for definitions of these structures and methods.</p>
    104 <ol>
    105   <li>Framework calls camera_module_t-&gt;common.open(), which returns a 
    106     hardware_device_t structure.</li>
  <li>Framework inspects the hardware_device_t-&gt;version field and instantiates the 
    appropriate handler for that version of the camera hardware device. If 
    the version is CAMERA_DEVICE_API_VERSION_3_0, the device is cast to a 
    camera3_device_t.</li>
  <li>Framework calls camera3_device_t-&gt;ops-&gt;initialize() with the framework 
    callback function pointers. This is called only once, after 
    open() and before any other functions in the ops structure are called.</li>
    114   <li>The framework calls camera3_device_t-&gt;ops-&gt;configure_streams() with a list of 
    115     input/output streams to the HAL device.</li>
    116   <li>The framework allocates gralloc buffers and calls 
    117     camera3_device_t-&gt;ops-&gt;register_stream_buffers() for at least one of the 
    118     output streams listed in configure_streams. The same stream is registered 
    119     only once.</li>
    120   <li>The framework requests default settings for some number of use cases with 
    121     calls to camera3_device_t-&gt;ops-&gt;construct_default_request_settings(). This 
    122     may occur any time after step 3.</li>
    123   <li>The framework constructs and sends the first capture request to the HAL with 
    124     settings based on one of the sets of default settings, and with at least one 
    125     output stream that has been registered earlier by the framework. This is sent 
    126     to the HAL with camera3_device_t-&gt;ops-&gt;process_capture_request(). The HAL 
    127     must block the return of this call until it is ready for the next request to 
    128     be sent.</li>
  <li>The framework continues to submit requests, possibly calling 
    register_stream_buffers() for not-yet-registered streams and calling 
    construct_default_request_settings() to get default settings buffers for other 
    use cases.</li>
    133   <li>When the capture of a request begins (sensor starts exposing for the 
    134     capture), the HAL calls camera3_callback_ops_t-&gt;notify() with the SHUTTER 
    135     event, including the frame number and the timestamp for start of exposure. 
    136     This notify call must be made before the first call to 
    137     process_capture_result() for that frame number.</li>
    138   <li>After some pipeline delay, the HAL begins to return completed captures to 
    139     the framework with camera3_callback_ops_t-&gt;process_capture_result(). These 
    140     are returned in the same order as the requests were submitted. Multiple 
    141     requests can be in flight at once, depending on the pipeline depth of the 
    142     camera HAL device.</li>
    143   <li>After some time, the framework may stop submitting new requests, wait for 
    144     the existing captures to complete (all buffers filled, all results 
    145     returned), and then call configure_streams() again. This resets the camera 
    146     hardware and pipeline for a new set of input/output streams. Some streams 
    147     may be reused from the previous configuration; if these streams' buffers had 
    148     already been registered with the HAL, they will not be registered again. The 
    149     framework then continues from step 7, if at least one registered output 
    150     stream remains. (Otherwise, step 5 is required first.)</li>
    151   <li>Alternatively, the framework may call camera3_device_t-&gt;common-&gt;close() to 
    152     end the camera session. This may be called at any time when no other calls 
    153     from the framework are active, although the call may block until all 
    154     in-flight captures have completed (all results returned, all buffers 
    155     filled). After the close call returns, no more calls to the 
    156     camera3_callback_ops_t functions are allowed from the HAL. Once the close() 
    157     call is underway, the framework may not call any other HAL device functions.</li>
    158   <li>In case of an error or other asynchronous event, the HAL must call 
    159     camera3_callback_ops_t-&gt;notify() with the appropriate error/event message. 
    160     After returning from a fatal device-wide error notification, the HAL should 
    161     act as if close() had been called on it. However, the HAL must either cancel 
    162     or complete all outstanding captures before calling notify(), so that once 
    163     notify() is called with a fatal error, the framework will not receive 
    164     further callbacks from the device. Methods besides close() should return 
    165     -ENODEV or NULL after the notify() method returns from a fatal error 
    166     message.</li>
    167 </ol>
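The ordering rules in the sequence above can be sketched as a small state machine. The real entry points and signatures are defined in camera3.h; everything below (mock_device_t and the mock_* functions) is a standalone, hypothetical illustration of the call-order constraints only:

```c
#include <assert.h>

/* Standalone mock of the call-ordering rules only; real entry points and
 * signatures are in camera3.h. All names here are hypothetical. */
typedef enum { ST_CLOSED, ST_OPENED, ST_INITIALIZED, ST_CONFIGURED } hal_state_t;

typedef struct { hal_state_t state; } mock_device_t;

int mock_open(mock_device_t *d) {                  /* step 1 */
    if (d->state != ST_CLOSED) return -1;
    d->state = ST_OPENED;
    return 0;
}

int mock_initialize(mock_device_t *d) {            /* step 3: once, after open() */
    if (d->state != ST_OPENED) return -1;
    d->state = ST_INITIALIZED;
    return 0;
}

int mock_configure_streams(mock_device_t *d) {     /* steps 4 and 11 */
    if (d->state < ST_INITIALIZED) return -1;
    d->state = ST_CONFIGURED;
    return 0;
}

int mock_process_capture_request(mock_device_t *d) { /* step 7 onward */
    return d->state == ST_CONFIGURED ? 0 : -1;
}

/* Walks the happy path; returns 1 if each step succeeds in order and a
 * second initialize() is rejected. */
int run_happy_path(void) {
    mock_device_t d = { ST_CLOSED };
    return mock_open(&d) == 0 &&
           mock_initialize(&d) == 0 &&
           mock_configure_streams(&d) == 0 &&
           mock_process_capture_request(&d) == 0 &&
           mock_initialize(&d) == -1;
}
```

Note that the mock deliberately omits buffer registration and default settings (steps 5 and 6), which may be interleaved anywhere after their prerequisites.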
    168 <img src="images/camera-ops-flow.png" width="600" height="434" alt="Camera operations flow" id="figure4" />
    169 <p class="img-caption">
    170   <strong>Figure 4.</strong> Camera operational flow
    171 </p>
    172 <h2 id="ops-modes">Operational modes</h2>
<p>The camera v3 HAL device can implement one of two operational modes: 
  limited and full. Full support is expected from new higher-end devices. Limited 
  mode has hardware requirements roughly in line with those for a camera HAL 
  device v1 implementation, and is expected from older or inexpensive devices. 
  Full mode is a strict superset of limited mode, and the two share the same essential 
  operational flow, as documented above.</p>
    179 <p>The HAL must indicate its level of support with the 
    180   android.info.supportedHardwareLevel static metadata entry, with 0 indicating 
    181   limited mode, and 1 indicating full mode support.</p>
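Interpreting that entry can be sketched as follows; the helper name is ours, and only the two constants described above (0 = limited, 1 = full) are assumed:

```c
#include <assert.h>
#include <string.h>

/* The two values defined for android.info.supportedHardwareLevel; the helper
 * name is ours, the constants follow the text above. */
enum { HW_LEVEL_LIMITED = 0, HW_LEVEL_FULL = 1 };

const char *hardware_level_name(int level) {
    switch (level) {
    case HW_LEVEL_LIMITED: return "limited";
    case HW_LEVEL_FULL:    return "full";
    default:               return "unknown";
    }
}
```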
    182 <p>Roughly speaking, limited-mode devices do not allow for application control of 
    183   capture settings (3A control only), high-rate capture of high-resolution images, 
    184   raw sensor readout, or support for YUV output streams above maximum recording 
    185   resolution (JPEG only for large images).<br/>
    186   Here are the details of limited-mode behavior:</p>
    187 <ul>
    188   <li>Limited-mode devices do not need to implement accurate synchronization between 
    189     capture request settings and the actual image data captured. Instead, changes 
    190     to settings may take effect some time in the future, and possibly not for the 
    191     same output frame for each settings entry. Rapid changes in settings may 
    result in some settings never being used for a capture. However, captures that 
    include high-resolution output buffers (&gt; 1080p) must use the settings as 
    specified (but see below for processing rate).</li>
    195   <li>Captures in limited mode that include high-resolution (&gt; 1080p) output buffers 
    196     may block in process_capture_request() until all the output buffers have been 
    197     filled. A full-mode HAL device must process sequences of high-resolution 
    198     requests at the rate indicated in the static metadata for that pixel format. 
    199     The HAL must still call process_capture_result() to provide the output; the 
    200     framework must simply be prepared for process_capture_request() to block until 
    201     after process_capture_result() for that request completes for high-resolution 
    202     captures for limited-mode devices.</li>
    203   <li>Limited-mode devices do not need to support most of the settings/result/static 
    204     info metadata. Only the following settings are expected to be consumed or 
    205     produced by a limited-mode HAL device:
    206     <ul>
    207       <li>android.control.aeAntibandingMode (controls)</li>
    208       <li>android.control.aeExposureCompensation (controls)</li>
    209       <li>android.control.aeLock (controls)</li>
      <li>android.control.aeMode (controls) [OFF means ON_FLASH_TORCH]</li>
    212       <li>android.control.aeRegions (controls)</li>
    213       <li>android.control.aeTargetFpsRange (controls)</li>
      <li>android.control.afMode (controls) [OFF means infinity focus]</li>
    216       <li>android.control.afRegions (controls)</li>
    217       <li>android.control.awbLock (controls)</li>
      <li>android.control.awbMode (controls) [OFF not supported]</li>
    220       <li>android.control.awbRegions (controls)</li>
    221       <li>android.control.captureIntent (controls)</li>
    222       <li>android.control.effectMode (controls)</li>
      <li>android.control.mode (controls) [OFF not supported]</li>
    225       <li>android.control.sceneMode (controls)</li>
    226       <li>android.control.videoStabilizationMode (controls)</li>
    227       <li>android.control.aeAvailableAntibandingModes (static)</li>
    228       <li>android.control.aeAvailableModes (static)</li>
    229       <li>android.control.aeAvailableTargetFpsRanges (static)</li>
    230       <li>android.control.aeCompensationRange (static)</li>
    231       <li>android.control.aeCompensationStep (static)</li>
    232       <li>android.control.afAvailableModes (static)</li>
    233       <li>android.control.availableEffects (static)</li>
    234       <li>android.control.availableSceneModes (static)</li>
    235       <li>android.control.availableVideoStabilizationModes (static)</li>
    236       <li>android.control.awbAvailableModes (static)</li>
    237       <li>android.control.maxRegions (static)</li>
    238       <li>android.control.sceneModeOverrides (static)</li>
    239       <li>android.control.aeRegions (dynamic)</li>
    240       <li>android.control.aeState (dynamic)</li>
    241       <li>android.control.afMode (dynamic)</li>
    242       <li>android.control.afRegions (dynamic)</li>
    243       <li>android.control.afState (dynamic)</li>
    244       <li>android.control.awbMode (dynamic)</li>
    245       <li>android.control.awbRegions (dynamic)</li>
    246       <li>android.control.awbState (dynamic)</li>
    247       <li>android.control.mode (dynamic)</li>
    248       <li>android.flash.info.available (static)</li>
    249       <li>android.info.supportedHardwareLevel (static)</li>
    250       <li>android.jpeg.gpsCoordinates (controls)</li>
    251       <li>android.jpeg.gpsProcessingMethod (controls)</li>
    252       <li>android.jpeg.gpsTimestamp (controls)</li>
    253       <li>android.jpeg.orientation (controls)</li>
    254       <li>android.jpeg.quality (controls)</li>
    255       <li>android.jpeg.thumbnailQuality (controls)</li>
    256       <li>android.jpeg.thumbnailSize (controls)</li>
    257       <li>android.jpeg.availableThumbnailSizes (static)</li>
    258       <li>android.jpeg.maxSize (static)</li>
    259       <li>android.jpeg.gpsCoordinates (dynamic)</li>
    260       <li>android.jpeg.gpsProcessingMethod (dynamic)</li>
    261       <li>android.jpeg.gpsTimestamp (dynamic)</li>
    262       <li>android.jpeg.orientation (dynamic)</li>
    263       <li>android.jpeg.quality (dynamic)</li>
    264       <li>android.jpeg.size (dynamic)</li>
    265       <li>android.jpeg.thumbnailQuality (dynamic)</li>
    266       <li>android.jpeg.thumbnailSize (dynamic)</li>
    267       <li>android.lens.info.minimumFocusDistance (static)</li>
    268       <li>android.request.id (controls)</li>
    269       <li>android.request.id (dynamic)</li>
      <li>android.scaler.cropRegion (controls) [ignores (x,y), assumes center-zoom]</li>
      <li>android.scaler.availableFormats (static) [RAW not supported]</li>
    274       <li>android.scaler.availableJpegMinDurations (static)</li>
    275       <li>android.scaler.availableJpegSizes (static)</li>
    276       <li>android.scaler.availableMaxDigitalZoom (static)</li>
    277       <li>android.scaler.availableProcessedMinDurations (static)</li>
      <li>android.scaler.availableProcessedSizes (static) [full resolution not supported]</li>
    280       <li>android.scaler.maxDigitalZoom (static)</li>
    281       <li>android.scaler.cropRegion (dynamic)</li>
    282       <li>android.sensor.orientation (static)</li>
    283       <li>android.sensor.timestamp (dynamic)</li>
    284       <li>android.statistics.faceDetectMode (controls)</li>
    285       <li>android.statistics.info.availableFaceDetectModes (static)</li>
    286       <li>android.statistics.faceDetectMode (dynamic)</li>
    287       <li>android.statistics.faceIds (dynamic)</li>
    288       <li>android.statistics.faceLandmarks (dynamic)</li>
    289       <li>android.statistics.faceRectangles (dynamic)</li>
    290       <li>android.statistics.faceScores (dynamic)</li>
    291     </ul>
    292   </li>
    293 </ul>
    294 <h2 id="interaction">Interaction between the application capture request, 3A
    295 control, and the processing pipeline</h2>
    296 <p>Depending on the settings in the 3A control block, the camera pipeline ignores 
    297   some of the parameters in the application's capture request and uses the values 
    298   provided by the 3A control routines instead. For example, when auto-exposure is 
    299   active, the exposure time, frame duration, and sensitivity parameters of the 
    300   sensor are controlled by the platform 3A algorithm, and any app-specified values 
    301   are ignored. The values chosen for the frame by the 3A routines must be reported 
    302   in the output metadata. The following table describes the different modes of the 
    303   3A control block and the properties that are controlled by these modes. See 
    304   the <a href="https://android.googlesource.com/platform/system/media/+/master/camera/docs/docs.html">platform/system/media/camera/docs/docs.html</a> file for definitions of these properties.</p>
    305 <table>
    306   <tr>
    307     <th>Parameter</th>
    308     <th>State</th>
    309     <th>Properties controlled</th>
    310   </tr>
    311   <tr>
    312     <td>android.control.aeMode</td>
    313     <td>OFF</td>
    314     <td>None</td>
    315   </tr>
    316   <tr>
    317     <td></td>
    318     <td>ON</td>
    <td>android.sensor.exposureTime, 
      android.sensor.frameDuration, 
      android.sensor.sensitivity, 
      android.lens.aperture (if supported), 
      android.lens.filterDensity (if supported)</td>
    324   </tr>
    325   <tr>
    326     <td></td>
    327     <td>ON_AUTO_FLASH</td>
    <td>All properties from the ON state, plus android.flash.firingPower, android.flash.firingTime, and android.flash.mode</td>
    329   </tr>
    330   <tr>
    331     <td></td>
    332     <td>ON_ALWAYS_FLASH</td>
    333     <td>Same as ON_AUTO_FLASH</td>
    334   </tr>
    335   <tr>
    336     <td></td>
    337     <td>ON_AUTO_FLASH_RED_EYE</td>
    338     <td>Same as ON_AUTO_FLASH</td>
    339   </tr>
    340   <tr>
    341     <td>android.control.awbMode</td>
    342     <td>OFF</td>
    343     <td>None</td>
    344   </tr>
    345   <tr>
    346     <td></td>
    347     <td>WHITE_BALANCE_*</td>
    348     <td>android.colorCorrection.transform. Platform-specific adjustments if android.colorCorrection.mode is FAST or HIGH_QUALITY.</td>
    349   </tr>
    350   <tr>
    351     <td>android.control.afMode</td>
    352     <td>OFF</td>
    353     <td>None</td>
    354   </tr>
    355   <tr>
    356     <td></td>
    357     <td>FOCUS_MODE_*</td>
    358     <td>android.lens.focusDistance</td>
    359   </tr>
    360   <tr>
    <td>android.control.videoStabilizationMode</td>
    362     <td>OFF</td>
    363     <td>None</td>
    364   </tr>
    365   <tr>
    366     <td></td>
    367     <td>ON</td>
    368     <td>Can adjust android.scaler.cropRegion to implement video stabilization</td>
    369   </tr>
    370   <tr>
    371     <td>android.control.mode</td>
    372     <td>OFF</td>
    373     <td>AE, AWB, and AF are disabled</td>
    374   </tr>
    375   <tr>
    376     <td></td>
    377     <td>AUTO</td>
    378     <td>Individual AE, AWB, and AF settings are used</td>
    379   </tr>
    380   <tr>
    381     <td></td>
    382     <td>SCENE_MODE_*</td>
    383     <td>Can override all parameters listed above. Individual 3A controls are disabled.</td>
    384   </tr>
    385 </table>
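The AE row of the table above can be sketched as a value-resolution step: with auto-exposure active, the 3A routine's choices replace the app's and are what must be reported back in the result metadata. The struct and function are illustrative only, with field names echoing the metadata tags:

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative subset of the AE-related settings; field names echo the
 * metadata tags discussed above but the struct itself is hypothetical. */
typedef struct {
    int     ae_mode_on;        /* android.control.aeMode != OFF */
    int64_t exposure_time_ns;  /* android.sensor.exposureTime */
    int32_t sensitivity_iso;   /* android.sensor.sensitivity */
} ae_settings_t;

/* Resolves the values actually applied for a frame: with AE on, the 3A
 * routine's choices replace the app's and are what must be reported in the
 * result metadata; with AE off, the app's values pass through untouched. */
ae_settings_t resolve_ae(ae_settings_t app, ae_settings_t algo) {
    if (!app.ae_mode_on)
        return app;
    algo.ae_mode_on = 1;
    return algo;
}
```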
<p>The controls exposed for the 3A algorithm mostly map 1:1 to the old API's 
  parameters (such as exposure compensation, scene mode, or white balance mode).<br/>
  The controls in the Image Processing block in Figure 2 all 
  operate on a similar principle, and generally each block has three modes:</p>
    390 <ul>
    391   <li>OFF: This processing block is disabled. The demosaic, color correction, and 
    392     tone curve adjustment blocks cannot be disabled.</li>
    393   <li>FAST: In this mode, the processing block may not slow down the output frame 
    394     rate compared to OFF mode, but should otherwise produce the best-quality 
    395     output it can given that restriction. Typically, this would be used for 
    396     preview or video recording modes, or burst capture for still images. On some 
    397     devices, this may be equivalent to OFF mode (no processing can be done without 
    398     slowing down the frame rate), and on some devices, this may be equivalent to 
    399     HIGH_QUALITY mode (best quality still does not slow down frame rate).</li>
  <li>HIGH_QUALITY: In this mode, the processing block should produce the best 
    quality result possible, slowing down the output frame rate as needed. 
    Typically, this would be used for high-quality still capture. Some blocks 
    include a manual control which can be optionally selected instead of FAST or 
    HIGH_QUALITY. For example, the color correction block supports a color 
    transform matrix, while the tone curve adjustment supports an arbitrary global 
    tone mapping curve.</li>
    407 </ul>
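The per-block behavior above can be sketched as a mode-selection helper. Device capability is modeled here as two boolean flags, and all names are hypothetical:

```c
#include <assert.h>

/* Per-block mode selection under the text's assumptions; device capability is
 * modeled as two flags, and all names here are hypothetical. */
typedef enum { MODE_OFF, MODE_FAST, MODE_HIGH_QUALITY } block_mode_t;

block_mode_t effective_mode(block_mode_t requested,
                            int full_quality_keeps_rate,
                            int any_processing_costs_rate) {
    if (requested != MODE_FAST)
        return requested;
    if (full_quality_keeps_rate)
        return MODE_HIGH_QUALITY;  /* best quality is already rate-neutral */
    if (any_processing_costs_rate)
        return MODE_OFF;           /* nothing can run without slowing output */
    return MODE_FAST;
}
```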
    408   <p>The maximum frame rate that can be supported by a camera subsystem is a function 
    409   of many factors:</p>
    410 <ul>
    411   <li>Requested resolutions of output image streams</li>
    412   <li>Availability of binning / skipping modes on the imager</li>
    413   <li>The bandwidth of the imager interface</li>
    414   <li>The bandwidth of the various ISP processing blocks</li>
    415 </ul>
<p>Since these factors can vary greatly between different ISPs and sensors, the 
  camera HAL interface tries to abstract the bandwidth restrictions into as simple 
  a model as possible. The model presented has the following characteristics:</p>
    419 <ul>
    420   <li>The image sensor is always configured to output the smallest resolution 
    421     possible given the application's requested output stream sizes.  The smallest 
    422     resolution is defined as being at least as large as the largest requested 
    423     output stream size.</li>
    424   <li>Since any request may use any or all the currently configured output streams, 
    425     the sensor and ISP must be configured to support scaling a single capture to 
    426     all the streams at the same time. </li>
  <li>JPEG streams act like processed YUV streams in requests that do not 
    reference them; in requests that reference them directly, they act as 
    JPEG streams.</li>
  <li>The JPEG processor can run concurrently with the rest of the camera pipeline but 
    cannot process more than one capture at a time.</li>
    432 </ul>
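The first characteristic above can be sketched as a mode-selection helper. This sketch generalizes "at least as large as the largest requested output stream size" to a per-dimension maximum, and assumes the sensor's modes are listed smallest to largest; the types and function are illustrative only:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

typedef struct { uint32_t width, height; } res_t;

/* Picks the smallest sensor output mode at least as large in each dimension
 * as every requested stream. Modes are assumed sorted smallest to largest;
 * returns the mode index, or -1 if no mode fits. */
int pick_sensor_mode(const res_t *modes, size_t n_modes,
                     const res_t *streams, size_t n_streams) {
    res_t need = {0, 0};
    for (size_t i = 0; i < n_streams; i++) {
        if (streams[i].width  > need.width)  need.width  = streams[i].width;
        if (streams[i].height > need.height) need.height = streams[i].height;
    }
    for (size_t i = 0; i < n_modes; i++)
        if (modes[i].width >= need.width && modes[i].height >= need.height)
            return (int)i;
    return -1;
}
```

A single capture at the chosen mode can then be scaled down to every configured stream, per the second characteristic.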
    433