page.title=Sensor stack
@jd:body

<!--
    Copyright 2015 The Android Open Source Project

    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
    You may obtain a copy of the License at

        http://www.apache.org/licenses/LICENSE-2.0

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    See the License for the specific language governing permissions and
    limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<p>The figure below represents the Android sensor stack. Each component
  communicates only with the components directly above and below it, though some
  sensors can bypass the sensor hub when it is present. Control flows from the
  applications down to the sensors, and data flows from the sensors up to the
  applications.</p>
<img src="images/ape_fwk_sensors.png" alt="Layers and owners of the Android sensor stack" />
<p class="img-caption"><strong>Figure 1.</strong> Layers of the Android sensor stack and their respective owners</p>

<h2 id="sdk">SDK</h2>
<p>Applications access sensors through the <a href="http://developer.android.com/reference/android/hardware/SensorManager.html">Sensors SDK (Software Development Kit) API</a>. The SDK contains functions to list available sensors and to register to a
  sensor.</p>
<p>When registering to a sensor, the application specifies its preferred sampling
  frequency and its latency requirements.</p>
<ul>
  <li> For example, an application might register to the default accelerometer,
    requesting events at 100Hz, and allowing events to be reported with a 1-second
    latency. </li>
  <li> The application will receive events from the accelerometer at a rate of at
    least 100Hz, and possibly delayed up to 1 second (see the sketch following
    this list). </li>
</ul>
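<p>The same two parameters are also exposed to native code through the NDK. The
  following is a minimal sketch of the registration in the example above, assuming
  API level 26 or higher so that <code>ASensorEventQueue_registerSensor</code> is
  available; error handling is omitted.</p>
<pre>
#include &lt;android/looper.h&gt;
#include &lt;android/sensor.h&gt;

// Sketch: register to the default accelerometer at 100Hz (10,000us sampling
// period) with a maximum report latency of 1 second (1,000,000us).
void registerAccelerometer() {
    ASensorManager* manager = ASensorManager_getInstance();
    const ASensor* accel =
            ASensorManager_getDefaultSensor(manager, ASENSOR_TYPE_ACCELEROMETER);
    ALooper* looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    ASensorEventQueue* queue = ASensorManager_createEventQueue(
            manager, looper, /*ident=*/1, /*callback=*/nullptr, /*data=*/nullptr);
    ASensorEventQueue_registerSensor(queue, accel,
                                     /*samplingPeriodUs=*/10000,
                                     /*maxBatchReportLatencyUs=*/1000000);
}
</pre>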
<p>See the <a href="index.html#targeted_at_developers">developer documentation</a> for more information on the SDK.</p>
<h2 id="framework">Framework</h2>
<p>The framework is in charge of linking the various client applications to the <a href="hal-interface.html">HAL</a>. The HAL itself is single-client. Without this multiplexing at the
  framework level, only a single application could access each sensor at any
  given time.</p>
<ul>
  <li> When the first application registers to a sensor, the framework sends a request
    to the HAL to activate the sensor. </li>
  <li> When additional applications register to the same sensor, the framework takes
    into account the requirements of each application and sends the updated requested
    parameters to the HAL (see the sketch following this list).
    <ul>
      <li> The <a href="hal-interface.html#sampling_period_ns">sampling frequency</a> will be the maximum of the requested sampling frequencies, meaning some
        applications will receive events at a frequency higher than the one they
        requested. </li>
      <li> The <a href="hal-interface.html#max_report_latency_ns">maximum reporting latency</a> will be the minimum of the requested ones. If one application requests one
        sensor with a maximum reporting latency of 0, all applications will receive the
        events from this sensor in continuous mode even if some requested the sensor
        with a non-zero maximum reporting latency. See <a href="batching.html">Batching</a> for more details. </li>
    </ul>
  </li>
  <li> When the last application registered to a sensor unregisters from it, the
    framework sends a request to the HAL to deactivate the sensor so power is not
    consumed unnecessarily. </li>
</ul>
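<p>The merging rule can be summarized as follows: the effective sampling period is
  the minimum of the requested periods (that is, the maximum of the requested
  frequencies), and the effective maximum report latency is the minimum of the
  requested latencies. The sketch below is illustrative pseudologic only; the real
  logic lives in the framework, and the <code>Request</code> structure and
  <code>mergeRequests</code> function are hypothetical names.</p>
<pre>
#include &lt;algorithm&gt;
#include &lt;cstdint&gt;
#include &lt;vector&gt;

// Hypothetical sketch of the merging rule described above.
struct Request {
    int64_t sampling_period_ns;     // 1e9 / requested frequency
    int64_t max_report_latency_ns;  // 0 requests continuous reporting
};

Request mergeRequests(const std::vector&lt;Request&gt;&amp; requests) {
    Request merged{INT64_MAX, INT64_MAX};
    for (const Request&amp; r : requests) {
        // The highest requested frequency wins, i.e. the smallest period.
        merged.sampling_period_ns =
                std::min(merged.sampling_period_ns, r.sampling_period_ns);
        // The smallest requested latency wins; a single 0 forces continuous mode.
        merged.max_report_latency_ns =
                std::min(merged.max_report_latency_ns, r.max_report_latency_ns);
    }
    return merged;
}
</pre>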
<h3 id="impact_of_multiplexing">Impact of multiplexing</h3>
<p>This need for a multiplexing layer in the framework explains some design
  decisions.</p>
<ul>
  <li> When an application requests a specific sampling frequency, there is no
    guarantee that events won't arrive at a faster rate. If another application
    requested the same sensor at a faster rate, the first application will also
    receive the events at that faster rate (the sketch after this list shows one
    way for an application to downsample). </li>
  <li> The same lack of guarantee applies to the requested maximum reporting latency:
    applications might receive events with much less latency than they requested. </li>
  <li> Besides sampling frequency and maximum reporting latency, applications cannot
    configure sensor parameters.
    <ul>
      <li> For example, imagine a physical sensor that can function both in high
        accuracy and low power modes. </li>
      <li> Only one of those two modes can be used on an Android device, because
        otherwise, an application could request the high accuracy mode, and another one
        the low power mode; there would be no way for the framework to satisfy both
        applications. The framework must always be able to satisfy all its clients, so
        this is not an option. </li>
    </ul>
  </li>
  <li> There is no mechanism to send data down from the applications to the sensors or
    their drivers. This ensures that one application cannot modify the behavior of the
    sensors and thereby break other applications. </li>
</ul>
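<p>Because of this, an application that needs a strict rate has to drop the extra
  events on its own. The sketch below is one illustrative way to do so using the
  NDK event structure; the <code>Downsampler</code> class is a hypothetical helper,
  not part of any Android API.</p>
<pre>
#include &lt;android/sensor.h&gt;
#include &lt;cstdint&gt;

// Hypothetical helper: drop events that arrive closer together than the period
// the application actually asked for.
class Downsampler {
  public:
    explicit Downsampler(int64_t desired_period_ns)
        : desired_period_ns_(desired_period_ns) {}

    // Returns true if the event should be processed, false if it arrived too
    // soon after the previously accepted event.
    bool accept(const ASensorEvent&amp; event) {
        if (last_timestamp_ns_ != 0 &amp;&amp;
            event.timestamp - last_timestamp_ns_ &lt; desired_period_ns_) {
            return false;
        }
        last_timestamp_ns_ = event.timestamp;
        return true;
    }

  private:
    int64_t desired_period_ns_;
    int64_t last_timestamp_ns_ = 0;  // Event timestamps are nanoseconds since boot.
};
</pre>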
<h3 id="sensor_fusion">Sensor fusion</h3>
<p>The Android framework provides a default implementation for some composite
  sensors. When a <a href="sensor-types.html#gyroscope">gyroscope</a>, an <a href="sensor-types.html#accelerometer">accelerometer</a> and a <a href="sensor-types.html#magnetic_field_sensor">magnetometer</a> are present on a device, but no <a href="sensor-types.html#rotation_vector">rotation vector</a>, <a href="sensor-types.html#gravity">gravity</a> and <a href="sensor-types.html#linear_acceleration">linear acceleration</a> sensors are present, the framework implements those sensors so applications
  can still use them.</p>
<p>The default implementation does not have access to all the data that other
  implementations have access to, and it must run on the SoC, so it is neither as
  accurate nor as power-efficient as other implementations can be. As much as
  possible, device manufacturers should define their own fused sensors (rotation
  vector, gravity and linear acceleration, as well as newer composite sensors like
  the <a href="sensor-types.html#game_rotation_vector">game rotation vector</a>) rather than rely on this default implementation. Device manufacturers can
  also request sensor chip vendors to provide them with an implementation.</p>
<p>The default sensor fusion implementation is not being maintained and
  might cause devices relying on it to fail CTS.</p>
<h3 id="under_the_hood">Under the hood</h3>
<p>This section is provided as background information for those maintaining the
  Android Open Source Project (AOSP) framework code. It is not relevant for
  hardware manufacturers.</p>
<h4 id="jni">JNI</h4>
<p>The framework uses a Java Native Interface (JNI) associated with <a href="http://developer.android.com/reference/android/hardware/package-summary.html">android.hardware</a> and located in the <code>frameworks/base/core/jni/</code> directory. This code calls the
  lower level native code to obtain access to the sensor hardware.</p>
<h4 id="native_framework">Native framework</h4>
<p>The native framework is defined in <code>frameworks/native/</code> and provides a native
  equivalent to the <a href="http://developer.android.com/reference/android/hardware/package-summary.html">android.hardware</a> package. The native framework calls the Binder IPC proxies to obtain access to
  sensor-specific services.</p>
<h4 id="binder_ipc">Binder IPC</h4>
<p>The Binder IPC proxies facilitate communication over process boundaries.</p>
<h2 id="hal">HAL</h2>
<p>The Sensors Hardware Abstraction Layer (HAL) API is the interface between the
  hardware drivers and the Android framework. It consists of one HAL interface,
  <code>sensors.h</code>, and one HAL implementation we refer to as <code>sensors.cpp</code>.</p>
<p>The interface is defined by Android and AOSP contributors, and the
  implementation is provided by the manufacturer of the device.</p>
<p>The sensor HAL interface is located in <code>hardware/libhardware/include/hardware</code>.
  See <a href="{@docRoot}devices/halref/sensors_8h.html">sensors.h</a> for additional details.</p>
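<p>For orientation, the core of the interface is a set of function pointers that
  the HAL implementation fills in. The excerpt below is paraphrased and
  simplified; refer to <a href="{@docRoot}devices/halref/sensors_8h.html">sensors.h</a>
  for the authoritative definitions.</p>
<pre>
/* Simplified paraphrase of the sensors poll device defined in sensors.h. */
typedef struct sensors_poll_device_1 {
    struct hw_device_t common;  /* common.version selects the HAL version. */

    /* Activate or deactivate a sensor. */
    int (*activate)(struct sensors_poll_device_t *dev,
                    int sensor_handle, int enabled);

    /* Set the sampling period and maximum report latency of a sensor. */
    int (*batch)(struct sensors_poll_device_1 *dev, int sensor_handle,
                 int flags, int64_t sampling_period_ns,
                 int64_t max_report_latency_ns);

    /* Block until events are available, then return up to count of them. */
    int (*poll)(struct sensors_poll_device_t *dev,
                sensors_event_t *data, int count);

    /* Push the events currently in the hardware FIFO up to the framework. */
    int (*flush)(struct sensors_poll_device_1 *dev, int sensor_handle);
} sensors_poll_device_1_t;
</pre>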
<h3 id="release_cycle">Release cycle</h3>
<p>The HAL implementation specifies which version of the HAL interface it
  implements by setting <code>your_poll_device.common.version</code>. The existing HAL
  interface versions are defined in <code>sensors.h</code>, and functionality is tied to those
  versions.</p>
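<p>For example, a <code>sensors.cpp</code> that implements version 1.3 of the
  interface sets the version field when the HAL device is opened. The sketch below
  is illustrative: only <code>common.version</code> and
  <code>SENSORS_DEVICE_API_VERSION_1_3</code> come from <code>sensors.h</code>; the
  function and variable names are placeholders.</p>
<pre>
#include &lt;hardware/sensors.h&gt;

/* Illustrative open() hook of a sensors.cpp implementing HAL version 1.3. */
static int open_sensors(const struct hw_module_t *module, const char *id,
                        struct hw_device_t **device) {
    static sensors_poll_device_1_t poll_device;  /* "your_poll_device" */

    poll_device.common.tag     = HARDWARE_DEVICE_TAG;
    poll_device.common.version = SENSORS_DEVICE_API_VERSION_1_3;
    poll_device.common.module  = const_cast&lt;hw_module_t *&gt;(module);
    /* ... fill in activate, batch, poll, flush, ... */

    *device = &amp;poll_device.common;
    return 0;
}
</pre>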
<p>The Android framework currently supports versions 1.0 and 1.3, but 1.0 will
  soon no longer be supported. This documentation describes the behavior of version
  1.3, to which all devices should upgrade. For details on how to upgrade to
  1.3, see <a href="versioning.html">HAL version deprecation</a>.</p>
<h2 id="kernel_driver">Kernel driver</h2>
<p>The sensor drivers interact with the physical devices. In some cases, the HAL
  implementation and the drivers are the same software entity. In other cases,
  the hardware integrator asks the sensor chip manufacturers to provide the
  drivers, but writes the HAL implementation itself.</p>
<p>In all cases, the HAL implementation and the kernel drivers are the responsibility
  of the hardware manufacturers, and Android does not provide preferred approaches to
  write them.</p>
<h2 id="sensor_hub">Sensor hub</h2>
<p>The sensor stack of a device can optionally include a sensor hub, useful for
  performing some low-level computation at low power while the SoC is in
  suspend mode. For example, step counting or sensor fusion can be performed on
  those chips. It is also a good place to implement sensor batching, adding
  hardware FIFOs for the sensor events. See <a
href="batching.html">Batching</a> for more information.</p>
<p>How the sensor hub is implemented depends on the architecture. It is sometimes
  a separate chip, and sometimes included on the same chip as the SoC. Important
  characteristics of the sensor hub are that it contains sufficient memory
  for batching and consumes very little power, enabling implementation of the low
  power Android sensors. Some sensor hubs contain a microcontroller for generic
  computation, and hardware accelerators to enable very low power computation for
  low power sensors.</p>
<p>How the sensor hub is architected and how it communicates with the sensors
  and the SoC (I2C bus, SPI bus, etc.) is not specified by Android, but it should aim
  at minimizing overall power use.</p>
<p>One option that appears to have a significant impact on implementation
  simplicity is having two interrupt lines going from the sensor hub to the SoC:
  one for wake-up interrupts (for wake-up sensors), and the other for non-wake-up
  interrupts (for non-wake-up sensors).</p>
<h2 id="sensors">Sensors</h2>
<p>These are the physical MEMS chips making the measurements. In many cases,
  several physical sensors are present on the same chip. For example, some chips
  include an accelerometer, a gyroscope and a magnetometer. (Such chips are often
  called 9-axis chips, as each sensor provides data over 3 axes.)</p>
<p>Some of those chips also contain some logic to perform common computations such
  as motion detection, step detection and 9-axis sensor fusion.</p>
<p>Although the CDD power and accuracy requirements and recommendations target the
  Android sensor and not the physical sensors, those requirements impact the
  choice of physical sensors. For example, the accuracy requirement on the game
  rotation vector has implications for the required accuracy of the physical
  gyroscope. It is up to the device manufacturer to derive the requirements for
  physical sensors.</p>
    183