page.title=Metadata and Controls
@jd:body

<!--
    Copyright 2013 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->

<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<h2 id="metadata">Metadata support</h2>
<p>To support saving raw image files through the Android framework, substantial
metadata about the sensor's characteristics is required. This includes
information such as color spaces and lens shading functions.</p>
<p>Most of this information is a static property of the camera subsystem and can
therefore be queried before configuring any output pipelines or submitting any
requests. The new camera APIs greatly expand the information provided by the
getCameraInfo() method to make this information available to the application.</p>
<p>In addition, manual control of the camera subsystem requires feedback from the
assorted devices about their current state and about the actual parameters used in
capturing a given frame. The values of the controls (exposure time, frame
duration, and sensitivity) as actually used by the hardware must be included in
the output metadata. This is essential so that applications know when clamping or
rounding took place and can compensate for the real settings used for image
capture.</p>
<p>For example, if an application sets the frame duration to 0 in a request, the HAL
must clamp the frame duration to the real minimum frame duration for that
request, and report that clamped minimum duration in the output result metadata.</p>
<p>Similarly, if an application needs to implement a custom 3A routine (for example,
to properly meter for an HDR burst), it needs to know the settings used to capture
the latest set of results it has received in order to update the settings for
the next request. Therefore, the new camera API adds a substantial amount of
dynamic metadata to each captured frame. This includes the requested and actual
parameters used for the capture, as well as additional per-frame metadata such
as timestamps and statistics generator output.</p>
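<p>As a rough illustration of how this per-frame metadata is consumed, the following
minimal sketch uses the public android.hardware.camera2 classes to read back the
values a frame was actually captured with. It is an illustrative example rather
than part of the HAL definition; the class name is a placeholder, and the
surrounding capture session setup is assumed to exist elsewhere.</p>
<pre>
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;

/** Reads the per-frame result metadata for each completed capture. */
public class FrameMetadataLogger extends CameraCaptureSession.CaptureCallback {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
            CaptureRequest request, TotalCaptureResult result) {
        // Values as actually used by the hardware, after any clamping or rounding.
        Long exposureTimeNs  = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        Long frameDurationNs = result.get(CaptureResult.SENSOR_FRAME_DURATION);
        Integer sensitivity  = result.get(CaptureResult.SENSOR_SENSITIVITY);
        Long timestampNs     = result.get(CaptureResult.SENSOR_TIMESTAMP);

        // A custom 3A routine would compare these values against the ones it
        // requested and fold the difference into its next capture request.
    }
}
</pre>
<p>An instance of such a callback is passed to CameraCaptureSession.capture() or
setRepeatingRequest() alongside the request whose output it should observe.</p>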
<h2 id="per-setting">Per-setting control</h2>
<p>For most settings, the expectation is that they can be changed every frame
without introducing significant stutter or delay in the output frame stream.
Ideally, the output frame rate should be controlled solely by the capture
request's frame duration field and be independent of any changes to the
configuration of processing blocks. In reality, some specific controls are known
to be slow to change; these include the output resolution and output format of
the camera pipeline, as well as controls that affect physical devices, such as
the lens focus distance. The exact requirements for each control set are detailed
later.</p>
<h2 id="raw-sensor">Raw sensor data support</h2>
<p>In addition to the pixel formats supported by the old API, the new API adds a
requirement for support of raw sensor data (Bayer RAW), both for advanced camera
applications and to support raw image files.</p>
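<p>As a rough sketch only, assuming a device that advertises RAW output and using
the public android.hardware.camera2 classes, an application might set up a Bayer
RAW stream as follows; the helper class and method names are placeholders, not
part of this specification.</p>
<pre>
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.ImageReader;
import android.util.Size;

/** Creates a Bayer RAW output target for the given camera. */
public class RawOutputHelper {
    public static ImageReader createRawReader(CameraCharacteristics characteristics) {
        // Query the RAW output sizes supported by this camera device.
        StreamConfigurationMap configMap = characteristics.get(
                CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        Size[] rawSizes = configMap.getOutputSizes(ImageFormat.RAW_SENSOR);

        // RAW_SENSOR buffers are typically produced at the sensor's full
        // active-array resolution; this sketch simply takes the first
        // advertised size.
        return ImageReader.newInstance(
                rawSizes[0].getWidth(), rawSizes[0].getHeight(),
                ImageFormat.RAW_SENSOR, /*maxImages=*/ 2);
    }
}
</pre>
<p>The Surface obtained from the returned ImageReader is then added as an output of
the capture session and as a target of any request that should produce a RAW
buffer; the captured image, together with its result metadata, can be written out
as a raw image file (for example, using the framework's DngCreator class).</p>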