<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>

<head>
<title>OpenSL ES for Android</title>
</head>

<body>

<h1>OpenSL ES for Android</h1>

This article describes the Android native audio APIs based on the
Khronos Group OpenSL ES&#8482; 1.0.1 standard.
<p>
Unless otherwise noted,
all features are available at Android API level 9 (Android platform
version 2.3) and higher.
Some features are only available at Android API level 14 (Android
platform version 4.0) and higher; these are noted.
<p>
OpenSL ES provides a C language interface that is also callable from C++, and
exposes features similar to the audio portions of these Android APIs
callable from Java programming language code:
<ul>
<li><a href="http://developer.android.com/reference/android/media/MediaPlayer.html">
android.media.MediaPlayer</a>
<li><a href="http://developer.android.com/reference/android/media/MediaRecorder.html">
android.media.MediaRecorder</a>
</ul>
As with all of the Android Native Development Kit (NDK), the primary
purpose of OpenSL ES for Android is to facilitate the implementation
of shared libraries to be called from Java programming language code via the Java Native
Interface (JNI).  The NDK is not intended for writing pure C/C++
applications.  That said, OpenSL ES is a full-featured API, and we
expect that you will be able to accomplish most of your audio
needs using only this API, without up-calls to code running in the Dalvik VM.
<p>
(Throughout this document, "Dalvik" can be considered to also refer to "ART".)

<p>
Note: though based on OpenSL ES, the Android native audio API
is <i>not</i> a conforming implementation of any OpenSL ES 1.0.1
profile (game, music, or phone), because Android does not
implement all of the features required by any one of the profiles.
Any known cases where Android behaves differently from the specification
are described in section "Android extensions" below.

<h2>Getting started</h2>

<h3>Example code</h3>

<h4>Recommended</h4>

Supported and tested example code, usable as a model
for your own code, is located in NDK folder
<code>platforms/android-9/samples/native-audio/</code>.

<h4>Not recommended</h4>

The OpenSL ES 1.0.1 specification contains example code in the
appendices (see section "References" below for the link to this
specification).  However, the examples in Appendix B: Sample Code
and Appendix C: Use Case Sample Code use features
not supported by Android. Some examples also contain
typographical errors, or use APIs that are likely to change.
Proceed with caution in referring to these;
though the code may be helpful in understanding the full OpenSL ES
standard, it should not be used as is with Android.

<h3>Adding OpenSL ES to your application source code</h3>

OpenSL ES is a C API, but is callable from both C and C++ code.
<p>
At a minimum, add the following line to your code:
<pre>
#include &lt;SLES/OpenSLES.h&gt;
</pre>

If you use Android extensions, also include this header:
<pre>
#include &lt;SLES/OpenSLES_Android.h&gt;
</pre>
which automatically includes these headers as well (you don't need to
include them yourself; they are shown here as an aid in learning the API):
<pre>
#include &lt;SLES/OpenSLES_AndroidConfiguration.h&gt;
#include &lt;SLES/OpenSLES_AndroidMetadata.h&gt;
</pre>

<h3>Makefile</h3>

Modify your Android.mk as follows:
<pre>
LOCAL_LDLIBS += -lOpenSLES
</pre>

<h3>Audio content</h3>

There are many ways to package audio content for your
application, including:

<dl>

<dt>Resources</dt>
<dd>
By placing your audio files into the <code>res/raw/</code> folder,
they can be accessed easily by the associated APIs for
<a href="http://developer.android.com/reference/android/content/res/Resources.html">
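For context, a complete <code>Android.mk</code> for a minimal native-audio module might look like this sketch (the module and source file names are placeholders, not taken from the NDK sample):

```make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
# Hypothetical module and source names; replace with your own.
LOCAL_MODULE    := native-audio-jni
LOCAL_SRC_FILES := native-audio-jni.c
# Link against the OpenSL ES library provided by the NDK.
LOCAL_LDLIBS    += -lOpenSLES
include $(BUILD_SHARED_LIBRARY)
```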
Resources</a>.  However, there is no direct native access to resources,
so you will need to write Java programming language code to copy them out before use.
</dd>

<dt>Assets</dt>
<dd>
By placing your audio files into the <code>assets/</code> folder,
they will be directly accessible by the Android native asset manager
APIs.  See the header files <code>android/asset_manager.h</code>
and <code>android/asset_manager_jni.h</code> for more information
on these APIs.  The example code
located in NDK folder
<code>platforms/android-9/samples/native-audio/</code> uses these
native asset manager APIs in conjunction with the Android file
descriptor data locator.
</dd>

<dt>Network</dt>
<dd>
You can use the URI data locator to play audio content directly from the
network. However, be sure to read section "Security and permissions" below.
</dd>

<dt>Local filesystem</dt>
<dd>
The URI data locator supports the <code>file:</code> scheme for local files,
provided the files are accessible by the application.
Note that the Android security framework restricts file access via
the Linux user ID and group ID mechanism.
</dd>

<dt>Recorded</dt>
<dd>Your application can record audio data from the microphone input,
store this content, and then play it back later.
The example code uses this method for the "Playback" clip.
</dd>

<dt>Compiled and linked inline</dt>
<dd>
You can link your audio content directly into the shared library,
and then play it using an audio player with buffer queue data locator.  This is most
suitable for short PCM format clips.  The example code uses this
technique for the "Hello" and "Android" clips. The PCM data was
converted to hex strings using a <code>bin2c</code> tool (not supplied).
</dd>

<dt>Real-time synthesis</dt>
<dd>
Your application can synthesize PCM data on the fly and then play it
using an audio player with buffer queue data locator.  This is a
relatively advanced technique, and the details of audio synthesis
are beyond the scope of this article.
</dd>

</dl>

Finding or creating useful audio content for your application is
beyond the scope of this article, but see the "References" section
below for some suggested web search terms.
<p>
Note that it is your responsibility to ensure that you are legally
permitted to play or record content, and that there may be privacy
considerations for recording content.

<h3>Debugging</h3>

For robustness, we recommend that you examine the <code>SLresult</code>
value which is returned by most APIs. Use of <code>assert</code>
vs. more advanced error handling logic is a matter of coding style
and the particular API; see the Wikipedia article on
<a href="http://en.wikipedia.org/wiki/Assertion_(computing)">assert</a>
for more information. In the supplied example, we have used <code>assert</code>
for "impossible" conditions which would indicate a coding error, and
explicit error handling for others which are more likely to occur
in production.
<p>
Many API errors result in a log entry, in addition to the non-zero
result code. These log entries provide additional detail which can
be especially useful for the more complex APIs such as
<code>Engine::CreateAudioPlayer</code>.
<p>
Use <a href="http://developer.android.com/guide/developing/tools/adb.html">
adb logcat</a>, the
<a href="http://developer.android.com/guide/developing/eclipse-adt.html">
Eclipse ADT plugin</a> LogCat pane, or
<a href="http://developer.android.com/guide/developing/tools/ddms.html#logcat">
ddms logcat</a> to see the log.

<h2>Supported features from OpenSL ES 1.0.1</h2>

This section summarizes available features. In some
cases, there are limitations which are described in the next
sub-section.

<h3>Global entry points</h3>

Supported global entry points:
<ul>
<li><code>slCreateEngine</code>
<li><code>slQueryNumSupportedEngineInterfaces</code>
<li><code>slQuerySupportedEngineInterfaces</code>
</ul>

<h3>Objects and interfaces</h3>

The following figure indicates objects and interfaces supported by
Android's OpenSL ES implementation.  A green cell means the feature
is supported.

<p>
<img src="chart1.png" alt="Supported objects and interfaces">

<h3>Limitations</h3>

This section details limitations with respect to the supported
objects and interfaces from the previous section.

<h4>Buffer queue data locator</h4>

An audio player or recorder with buffer queue data locator supports
PCM data format only.

<h4>Device data locator</h4>

The only supported use of an I/O device data locator is when it is
specified as the data source for <code>Engine::CreateAudioRecorder</code>.
It should be initialized using these values, as shown in the example:
<pre>
SLDataLocator_IODevice loc_dev =
  {SL_DATALOCATOR_IODEVICE, SL_IODEVICE_AUDIOINPUT,
  SL_DEFAULTDEVICEID_AUDIOINPUT, NULL};
</pre>

<h4>Dynamic interface management</h4>

<code>RemoveInterface</code> and <code>ResumeInterface</code> are not supported.

<h4>Effect combinations</h4>

It is meaningless to have both environmental reverb and preset
reverb on the same output mix.
<p>
The platform may ignore effect requests if it estimates that the
CPU load would be too high.

<h4>Effect send</h4>

<code>SetSendLevel</code> supports a single send level per audio player.

<h4>Environmental reverb</h4>

Environmental reverb does not support the <code>reflectionsDelay</code>,
<code>reflectionsLevel</code>, or <code>reverbDelay</code> fields of
<code>struct SLEnvironmentalReverbSettings</code>.

<h4>MIME data format</h4>

The MIME data format can be used with URI data locator only, and only
for player (not recorder).
<p>
The Android implementation of OpenSL ES requires that <code>mimeType</code>
be initialized to either <code>NULL</code> or a valid UTF-8 string,
and that <code>containerType</code> be initialized to a valid value.
In the absence of other considerations, such as portability to other
implementations, or content format which cannot be identified by header,
we recommend that you
set the <code>mimeType</code> to <code>NULL</code> and <code>containerType</code>
to <code>SL_CONTAINERTYPE_UNSPECIFIED</code>.
<p>
Supported formats include WAV PCM, WAV alaw, WAV ulaw, MP3, Ogg
Vorbis, AAC LC, HE-AACv1 (aacPlus), HE-AACv2 (enhanced aacPlus),
AMR, and FLAC [provided these are supported by the overall platform,
and AAC formats must be located within an MP4 or ADTS container].
MIDI is not supported.
WMA is not part of the open source release, and compatibility
with Android OpenSL ES has not been verified.
<p>
The Android implementation of OpenSL ES does not support direct
playback of DRM or encrypted content; if you want to play this, you
will need to convert to cleartext in your application before playing,
and enforce any DRM restrictions in your application.

<h4>Object</h4>

<code>Resume</code>, <code>RegisterCallback</code>,
<code>AbortAsyncOperation</code>, <code>SetPriority</code>,
<code>GetPriority</code>, and <code>SetLossOfControlInterfaces</code>
are not supported.

<h4>PCM data format</h4>

The PCM data format can be used with buffer queues only. Supported PCM
playback configurations are 8-bit unsigned or 16-bit signed, mono
or stereo, little endian byte ordering, and these sample rates:
8000, 11025, 12000, 16000, 22050, 24000, 32000, 44100, or 48000 Hz.
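As a sketch of the typical startup sequence (a fragment requiring the Android NDK headers, so not runnable stand-alone; error handling abbreviated to <code>assert</code> in the style of the NDK sample):

```c
#include <SLES/OpenSLES.h>
#include <assert.h>

SLObjectItf engineObject = NULL;
SLEngineItf engineEngine = NULL;

void createEngine(void)
{
    // Create the engine object with default options.
    SLresult result = slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
    assert(SL_RESULT_SUCCESS == result);

    // Realize the engine synchronously.
    result = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
    assert(SL_RESULT_SUCCESS == result);

    // Get the engine interface, which is used to create other objects.
    result = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);
    assert(SL_RESULT_SUCCESS == result);
}
```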
For recording, the supported configurations are device-dependent;
however, 16000 Hz mono 16-bit signed is usually available.
<p>
Note that the field <code>samplesPerSec</code> is actually in
units of milliHz, despite the misleading name. To avoid accidentally
using the wrong value, you should initialize this field using one
of the symbolic constants defined for this purpose (such as
<code>SL_SAMPLINGRATE_44_1</code> etc.)
<p>
For API level 21 and above, see section "Floating-point data" below.

<h4>Playback rate</h4>

The supported playback rate range(s) and capabilities may vary depending
on the platform version and implementation, and so should be determined
at runtime by querying with <code>PlaybackRate::GetRateRange</code>
or <code>PlaybackRate::GetCapabilitiesOfRate</code>.
<p>
That said, some guidance on typical rate ranges may be useful:
In Android 2.3 a single playback rate range from 500 per mille to 2000 per mille
inclusive is typically supported, with property
<code>SL_RATEPROP_NOPITCHCORAUDIO</code>.
In Android 4.0 the same rate range is typically supported for a data source
in PCM format, and a unity rate range for other formats.

<h4>Record</h4>

The <code>SL_RECORDEVENT_HEADATLIMIT</code> and
<code>SL_RECORDEVENT_HEADMOVING</code> events are not supported.

<h4>Seek</h4>

<code>SetLoop</code> enables whole file looping. The <code>startPos</code>
parameter should be zero and the <code>endPos</code> parameter should
be <code>SL_TIME_UNKNOWN</code>.

<h4>URI data locator</h4>

The URI data locator can be used with MIME data format only, and
only for an audio player (not audio recorder). Supported schemes
are <code>http:</code> and <code>file:</code>.
A missing scheme defaults to the <code>file:</code> scheme. Other
schemes such as <code>https:</code>, <code>ftp:</code>, and
<code>content:</code> are not supported.
<code>rtsp:</code> is not verified.

<h3>Data structures</h3>

Android supports these OpenSL ES 1.0.1 data structures:
<ul>
<li>SLDataFormat_MIME
<li>SLDataFormat_PCM
<li>SLDataLocator_BufferQueue
<li>SLDataLocator_IODevice
<li>SLDataLocator_OutputMix
<li>SLDataLocator_URI
<li>SLDataSink
<li>SLDataSource
<li>SLEngineOption
<li>SLEnvironmentalReverbSettings
<li>SLInterfaceID
</ul>

<h3>Platform configuration</h3>

OpenSL ES for Android is designed for multi-threaded applications,
and is thread-safe.
<p>
OpenSL ES for Android supports a single engine per application, and
up to 32 objects. Available device memory and CPU may further
restrict the usable number of objects.
<p>
<code>slCreateEngine</code> recognizes, but ignores, these engine options:
<ul>
<li><code>SL_ENGINEOPTION_THREADSAFE</code>
<li><code>SL_ENGINEOPTION_LOSSOFCONTROL</code>
</ul>

OpenMAX AL and OpenSL ES may be used together in the same application.
In this case, there is internally a single shared engine object,
and the 32 object limit is shared between OpenMAX AL and OpenSL ES.
The application should first create both engines, then use both engines,
and finally destroy both engines.  The implementation maintains a
reference count on the shared engine, so that it is correctly destroyed
at the second destroy.

<h2>Planning for future versions of OpenSL ES</h2>

The Android native audio APIs are based on Khronos
Group OpenSL ES 1.0.1 (see section "References" below).
Khronos has released
a revised version 1.1 of the standard. The revised version
includes new features, clarifications, correction of
typographical errors, and some incompatibilities. Most of the expected
incompatibilities are relatively minor, or are in areas of OpenSL ES
not supported by Android. However, even a small change
can be significant for an application developer, so it is important
to prepare for this.
<p>
The Android team is committed to preserving future API binary
compatibility for developers to the extent feasible. It is our
intention to continue to support future binary compatibility of the
1.0.1-based API, even as we add support for later versions of the
standard. An application developed with this version should
work on future versions of the Android platform, provided that
you follow the guidelines listed in section "Planning for
binary compatibility" below.
<p>
Note that future source compatibility will <i>not</i> be a goal. That is,
if you upgrade to a newer version of the NDK, you may need to modify
your application source code to conform to the new API. We expect
that most such changes will be minor; see details below.

<h3>Planning for binary compatibility</h3>

We recommend that your application follow these guidelines,
to improve future binary compatibility:
<ul>
<li>
Use only the documented subset of Android-supported features from
OpenSL ES 1.0.1.
<li>
Do not depend on a particular result code for an unsuccessful
operation; be prepared to deal with a different result code.
<li>
Application callback handlers generally run in a restricted context,
and should be written to perform their work quickly and then return
as soon as possible. Do not do complex operations within a callback
handler. For example, within a buffer queue completion callback,
you can enqueue another buffer, but do not create an audio player.
<li>
Callback handlers should be prepared to be called more or less
frequently, to receive additional event types, and should ignore
event types that they do not recognize. Callbacks that are configured
with an event mask of enabled event types should be prepared to be
called with multiple event type bits set simultaneously.
Use "&amp;" to test for each event bit rather than a switch case.
<li>
Use prefetch status and callbacks as a general indication of progress, but do
not depend on specific hard-coded fill levels or callback sequence.
The meaning of the prefetch status fill level, and the behavior for
errors that are detected during prefetch, may change.
<li>
See section "Buffer queue behavior" below.
</ul>

<h3>Planning for source compatibility</h3>

As mentioned, source code incompatibilities are expected in the next
version of OpenSL ES from Khronos Group. Likely areas of change include:

<ul>
<li>The buffer queue interface is expected to have significant changes,
especially in the areas of <code>BufferQueue::Enqueue</code>, the parameter
list for <code>slBufferQueueCallback</code>,
and the name of field <code>SLBufferQueueState.playIndex</code>.
We recommend that your application code use Android simple buffer
queues instead, because we do not plan to change that API.
In the example code supplied with the NDK, we have used
Android simple buffer queues for playback for this reason.
(We also use Android simple buffer queues for recording and decode to PCM, but
that is because standard OpenSL ES 1.0.1 does not support record or decode to
a buffer queue data sink.)
<li>Addition of <code>const</code> to input parameters passed by reference,
and to <code>SLchar *</code> struct fields used as input values.
This should not require any changes to your code.
<li>Substitution of unsigned types for some parameters that are
currently signed.  You may need to change a parameter type from
<code>SLint32</code> to <code>SLuint32</code> or similar, or add a cast.
<li><code>Equalizer::GetPresetName</code> will copy the string to
application memory instead of returning a pointer to implementation
memory. This will be a significant change, so we recommend that you
either avoid calling this method, or isolate your use of it.
<li>Additional fields in struct types. For output parameters, these
new fields can be ignored, but for input parameters the new fields
will need to be initialized. Fortunately, these are expected to all
be in areas not supported by Android.
<li>Interface
<a href="http://en.wikipedia.org/wiki/Globally_unique_identifier">
GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID
to avoid a dependency.
<li><code>SLchar</code> will change from <code>unsigned char</code>
to <code>char</code>. This primarily affects the URI data locator
and MIME data format.
<li><code>SLDataFormat_MIME.mimeType</code> will be renamed to <code>pMimeType</code>,
and <code>SLDataLocator_URI.URI</code> will be renamed to <code>pURI</code>.
We recommend that you initialize the <code>SLDataFormat_MIME</code>
and <code>SLDataLocator_URI</code>
data structures using a brace-enclosed comma-separated list of values,
rather than by field name, to isolate your code from this change.
In the example code we have used this technique.
<li><code>SL_DATAFORMAT_PCM</code> does not permit the application
to specify the representation of the data as signed integer, unsigned
integer, or floating-point. The Android implementation assumes that
8-bit data is unsigned integer and 16-bit is signed integer.  In
addition, the field <code>samplesPerSec</code> is a misnomer, as
the actual units are milliHz. These issues are expected to be
addressed in the next OpenSL ES version, which will introduce a new
extended PCM data format that permits the application to explicitly
specify the representation, and corrects the field name.  As this
will be a new data format, and the current PCM data format will
still be available (though deprecated), it should not require any
immediate changes to your code.
</ul>

<h2>Android extensions</h2>

The API for Android extensions is defined in <code>SLES/OpenSLES_Android.h</code>
and the header files that it includes.
Consult that file for details on these extensions. Unless otherwise
noted, all interfaces are "explicit".
<p>
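For example, whole-file looping can be enabled with a call of this form (a fragment; <code>seekItf</code> is assumed to be an <code>SLSeekItf</code> already obtained from a realized audio player):

```c
// Loop the whole file: startPos zero, endPos SL_TIME_UNKNOWN.
result = (*seekItf)->SetLoop(seekItf, SL_BOOLEAN_TRUE, 0, SL_TIME_UNKNOWN);
assert(SL_RESULT_SUCCESS == result);
```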
Note that using these extensions will limit your application's
portability to other OpenSL ES implementations. If this is a concern,
we advise that you avoid using them, or isolate your use of them
with <code>#ifdef</code> etc.
    522 <p>
    523 The following figure shows which Android-specific interfaces and
    524 data locators are available for each object type.
    525 
    526 <p>
    527 <img src="chart2.png" alt="Android extensions">
    528 
    529 <h3>Android configuration interface</h3>
    530 
    531 The Android configuration interface provides a means to set
    532 platform-specific parameters for objects. Unlike other OpenSL ES
    533 1.0.1 interfaces, the Android configuration interface is available
    534 prior to object realization. This permits the object to be configured
    535 and then realized. Header file <code>SLES/OpenSLES_AndroidConfiguration.h</code>
    536 documents the available configuration keys and values:
    537 <ul>
    538 <li>stream type for audio players (default <code>SL_ANDROID_STREAM_MEDIA</code>)
    539 <li>record profile for audio recorders (default <code>SL_ANDROID_RECORDING_PRESET_GENERIC</code>)
    540 </ul>
    541 Here is an example code fragment that sets the Android audio stream type on an audio player:
    542 <pre>
    543 // CreateAudioPlayer and specify SL_IID_ANDROIDCONFIGURATION
    544 // in the required interface ID array. Do not realize player yet.
    545 // ...
    546 SLAndroidConfigurationItf playerConfig;
    547 result = (*playerObject)-&gt;GetInterface(playerObject,
    548     SL_IID_ANDROIDCONFIGURATION, &amp;playerConfig);
    549 assert(SL_RESULT_SUCCESS == result);
    550 SLint32 streamType = SL_ANDROID_STREAM_ALARM;
    551 result = (*playerConfig)-&gt;SetConfiguration(playerConfig,
    552     SL_ANDROID_KEY_STREAM_TYPE, &amp;streamType, sizeof(SLint32));
    553 assert(SL_RESULT_SUCCESS == result);
    554 // ...
    555 // Now realize the player here.
    556 </pre>
    557 Similar code can be used to configure the preset for an audio recorder.
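For instance, a fragment of the same shape (assuming <code>recorderConfig</code> was obtained from the recorder object before realization) could select the voice-recognition preset:

```c
// Set the recording preset; do this before realizing the recorder.
SLuint32 presetValue = SL_ANDROID_RECORDING_PRESET_VOICE_RECOGNITION;
result = (*recorderConfig)->SetConfiguration(recorderConfig,
    SL_ANDROID_KEY_RECORDING_PRESET, &presetValue, sizeof(SLuint32));
assert(SL_RESULT_SUCCESS == result);
```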
    558 
    559 <h3>Android effects interfaces</h3>
    560 
    561 The Android effect, effect send, and effect capabilities interfaces provide
    562 a generic mechanism for an application to query and use device-specific
    563 audio effects. A device manufacturer should document any available
    564 device-specific audio effects.
    565 <p>
    566 Portable applications should use the OpenSL ES 1.0.1 APIs
    567 for audio effects instead of the Android effect extensions.
    568 
    569 <h3>Android file descriptor data locator</h3>
    570 
    571 The Android file descriptor data locator permits the source for an
    572 audio player to be specified as an open file descriptor with read
    573 access. The data format must be MIME.
    574 <p>
    575 This is especially useful in conjunction with the native asset manager.
    576 
    577 <h3>Android simple buffer queue data locator and interface</h3>
    578 
    579 The Android simple buffer queue data locator and interface are
    580 identical to the OpenSL ES 1.0.1 buffer queue locator and interface,
    581 except that Android simple buffer queues may be used with both audio
    582 players and audio recorders, and are limited to PCM data format.
    583 [OpenSL ES 1.0.1 buffer queues are for audio players only, and are not
    584 restricted to PCM data format.]
    585 <p>
    586 For recording, the application should enqueue empty buffers. Upon
    587 notification of completion via a registered callback, the filled
    588 buffer is available for the application to read.
    589 <p>
    590 For playback there is no difference. But for future source code
    591 compatibility, we suggest that applications use Android simple
    592 buffer queues instead of OpenSL ES 1.0.1 buffer queues.
    593 
    594 <h3>Dynamic interfaces at object creation</h3>
    595 
    596 For convenience, the Android implementation of OpenSL ES 1.0.1
    597 permits dynamic interfaces to be specified at object creation time,
    598 as an alternative to adding these interfaces after object creation
    599 with <code>DynamicInterfaceManagement::AddInterface</code>.
    600 
    601 <h3>Buffer queue behavior</h3>
    602 
    603 The OpenSL ES 1.0.1 specification requires that "On transition to
    604 the <code>SL_PLAYSTATE_STOPPED</code> state the play cursor is
    605 returned to the beginning of the currently playing buffer." The
    606 Android implementation does not necessarily conform to this
    607 requirement. For Android, it is unspecified whether a transition
    608 to <code>SL_PLAYSTATE_STOPPED</code> operates as described, or
    609 leaves the play cursor unchanged.
    610 <p>
    611 We recommend that you do not rely on either behavior; after a
    612 transition to <code>SL_PLAYSTATE_STOPPED</code>, you should explicitly
    613 call <code>BufferQueue::Clear</code>. This will place the buffer
    614 queue into a known state.
    615 <p>
    616 A corollary is that it is unspecified whether buffer queue callbacks
    617 are called upon transition to <code>SL_PLAYSTATE_STOPPED</code> or by
    618 <code>BufferQueue::Clear</code>.
    619 We recommend that you do not rely on either behavior; be prepared
    620 to receive a callback in these cases, but also do not depend on
    621 receiving one.
    622 <p>
    623 It is expected that a future version of OpenSL ES will clarify these
    624 issues. However, upgrading to that version would result in source
    625 code incompatibilities (see section "Planning for source compatibility"
    626 above).
    627 
    628 <h3>Reporting of extensions</h3>
    629 
    630 <code>Engine::QueryNumSupportedExtensions</code>,
    631 <code>Engine::QuerySupportedExtension</code>,
    632 <code>Engine::IsExtensionSupported</code> report these extensions:
    633 <ul>
    634 <li><code>ANDROID_SDK_LEVEL_#</code>
    635 where # is the platform API level, 9 or higher
    636 </ul>
    637 
    638 <h3>Decode audio to PCM</h3>
    639 
    640 <b>Note:</b>
    641 For decoding an encoded stream to PCM without immediate playback,
    642 <a
    643 href="http://developer.android.com/reference/android/media/MediaCodec.html">android.media.MediaCodec</a>
    644 is recommended for new applications at API level 16 and above.  The NDK
    645 equivalent (in &lt;media/NdkMedia*.h&gt;) is recommended for new native
    646 applications at API level 21 and above.
    647 <p>
    648 
    649 Note: this feature is available at API level 14 and higher.
    650 <p>
    651 A standard audio player plays back to an audio device, and the data sink
    652 is specified as an output mix.
    653 However, as an Android extension, an audio player instead
    654 acts as a decoder if the data source is specified as a URI or Android
    655 file descriptor data locator with MIME data format, and the data sink is
    656 an Android simple buffer queue data locator with PCM data format.
    657 <p>
    658 This feature is primarily intended for games to pre-load their
    659 audio assets when changing to a new game level, similar to
    660 <code>android.media.SoundPool</code>.
    661 <p>
The application should initially enqueue a set of empty buffers to the Android simple
buffer queue; these buffers will be filled with PCM data.  The Android simple
buffer queue callback is invoked after each buffer is filled. The
callback handler should process the PCM data, re-enqueue the
now-empty buffer, and then return.  The application is responsible for
keeping track of decoded buffers; the callback parameter list does not include
sufficient information to indicate which buffer was filled or which buffer to enqueue next.
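<p>
Because the callback carries no buffer index, the application must do its own
bookkeeping.  A minimal sketch in plain C (names and sizes are illustrative, not
part of the OpenSL ES API), relying on the fact that buffers are filled in the
order they were enqueued:

```c
#include <stddef.h>

#define NB_BUFFERS 4      /* illustrative buffer count */
#define BUF_SAMPLES 1024  /* illustrative buffer size in samples */

/* The decoder fills buffers in enqueue order, so a modulo counter is
 * enough to know which buffer each callback invocation refers to. */
typedef struct {
    short buffers[NB_BUFFERS][BUF_SAMPLES];
    unsigned next;  /* index of the buffer the next callback will have filled */
} DecodeState;

/* Called from the simple buffer queue callback: returns the buffer that
 * was just filled and advances the bookkeeping for the next callback. */
static short *just_filled_buffer(DecodeState *s) {
    short *filled = s->buffers[s->next];
    s->next = (s->next + 1) % NB_BUFFERS;
    return filled;
}
```

The callback would process <code>just_filled_buffer(&state)</code> and then
re-enqueue that same buffer.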
    669 <p>
    670 The end of stream is determined implicitly by the data source.
    671 At the end of stream a <code>SL_PLAYEVENT_HEADATEND</code> event is
    672 delivered. The Android simple buffer queue callback will no longer
    673 be called after all consumed data is decoded.
    674 <p>
    675 The sink's PCM data format typically matches that of the encoded data source
    676 with respect to sample rate, channel count, and bit depth. However, the platform
    677 implementation is permitted to decode to a different sample rate, channel count, or bit depth.
    678 There is a provision to detect the actual PCM format; see section "Determining
    679 the format of decoded PCM data via metadata" below.
    680 <p>
    681 Decode to PCM supports pause and initial seek.  Volume control, effects,
    682 looping, and playback rate are not supported.
    683 <p>
    684 Depending on the platform implementation, decoding may require resources
    685 that cannot be left idle.  Therefore it is not recommended to starve the
    686 decoder by failing to provide a sufficient number of empty PCM buffers,
    687 e.g. by returning from the Android simple buffer queue callback without
    688 enqueueing another empty buffer.  The result of decoder starvation is
    689 unspecified; the implementation may choose to either drop the decoded
    690 PCM data, pause the decoding process, or in severe cases terminate
    691 the decoder.
    692 
    693 <h3>Decode streaming ADTS AAC to PCM</h3>
    694 
    695 Note: this feature is available at API level 14 and higher.
    696 <p>
    697 An audio player acts as a streaming decoder if the data source is an
    698 Android buffer queue data locator with MIME data format, and the data
    699 sink is an Android simple buffer queue data locator with PCM data format.
    700 The MIME data format should be configured as:
    701 <dl>
    702 <dt>container</dt>
    703 <dd><code>SL_CONTAINERTYPE_RAW</code>
    704 <dt>MIME type string
    705 <dd><code>"audio/vnd.android.aac-adts"</code> (macro <code>SL_ANDROID_MIME_AACADTS</code>)
    706 </dl>
    707 <p>
    708 This feature is primarily intended for streaming media applications that
    709 deal with AAC audio, but need to apply custom processing of the audio
    710 prior to playback.  Most applications that need to decode audio to PCM
    711 should use the method of the previous section "Decode audio to PCM",
    712 as it is simpler and handles more audio formats.  The technique described
    713 here is a more specialized approach, to be used only if both of these
    714 conditions apply:
    715 <ul>
<li>the compressed audio source is a stream of AAC frames wrapped in ADTS headers
<li>the application manages this stream; that is, the data is <i>not</i> located within
a network resource identified by URI or within a local file identified by file descriptor.
    719 </ul>
    720 The application should initially enqueue a set of filled buffers to the Android buffer queue.
    721 Each buffer contains one or more complete ADTS AAC frames.
    722 The Android buffer queue callback is invoked after each buffer is emptied.
    723 The callback handler should re-fill and re-enqueue the buffer, and then return.
    724 The application need not keep track of encoded buffers; the callback parameter
    725 list does include sufficient information to indicate which buffer to enqueue next.
    726 The end of stream is explicitly marked by enqueuing an EOS item.
    727 After EOS, no more enqueues are permitted.
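<p>
Since each enqueued buffer must contain whole ADTS AAC frames, an application
that slices its own stream needs the frame length from the ADTS header.  A
minimal sketch in plain C (independent of the OpenSL ES API) of extracting the
13-bit <code>aac_frame_length</code> field:

```c
#include <stddef.h>
#include <stdint.h>

/* Returns the total ADTS frame length in bytes (header + payload), or 0 if
 * fewer than 7 header bytes are available or the 12-bit ADTS syncword
 * 0xFFF is missing. */
static size_t adts_frame_length(const uint8_t *hdr, size_t avail) {
    if (avail < 7 || hdr[0] != 0xFF || (hdr[1] & 0xF0) != 0xF0)
        return 0;
    /* aac_frame_length: 13 bits spread over bytes 3, 4, and 5 */
    return ((size_t)(hdr[3] & 0x03) << 11) |
           ((size_t)hdr[4] << 3) |
           ((size_t)hdr[5] >> 5);
}
```

Advancing through the stream by this length, frame by frame, yields buffer
boundaries that always fall between complete frames.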
    728 <p>
    729 It is not recommended to starve the decoder by failing to provide full
    730 ADTS AAC buffers, e.g. by returning from the Android buffer queue callback
    731 without enqueueing another full buffer.  The result of decoder starvation
    732 is unspecified.
    733 <p>
    734 In all respects except for the data source, the streaming decode method is similar
    735 to that of the previous section:
    736 <ul>
    737 <li>initially enqueue a set of empty buffers to the Android simple buffer queue
    738 <li>the Android simple buffer queue callback is invoked after each buffer is filled with PCM data;
    739 the callback handler should process the PCM data and then re-enqueue another empty buffer
    740 <li>the <code>SL_PLAYEVENT_HEADATEND</code> event is delivered at end of stream
    741 <li>the actual PCM format should be detected using metadata rather than by making an assumption
    742 <li>the same limitations apply with respect to volume control, effects, etc.
    743 <li>starvation for lack of empty PCM buffers is not recommended
    744 </ul>
    745 <p>
    746 Despite the similarity in names, an Android buffer queue is <i>not</i>
    747 the same as an Android simple buffer queue.  The streaming decoder
    748 uses both kinds of buffer queues: an Android buffer queue for the ADTS
    749 AAC data source, and an Android simple buffer queue for the PCM data
    750 sink.  The Android simple buffer queue API is described in this document
    751 in section "Android simple buffer queue data locator and interface".
    752 The Android buffer queue API is described in the Android native media
    753 API documentation, located in <a href="../openmaxal/index.html">docs/openmaxal/index.html</a>.
    754 
    755 <h3>Determining the format of decoded PCM data via metadata</h3>
    756 
    757 The metadata extraction interface <code>SLMetadataExtractionItf</code>
    758 is a standard OpenSL ES 1.0.1 interface, not an Android extension.
    759 However, the particular metadata keys that
    760 indicate the actual format of decoded PCM data are specific to Android,
    761 and are defined in header <code>SLES/OpenSLES_AndroidMetadata.h</code>.
    762 <p>
    763 The metadata key indices are available immediately after
    764 <code>Object::Realize</code>. Yet the associated values are not
    765 available until after the first encoded data has been decoded.  A good
    766 practice is to query for the key indices in the main thread after Realize,
    767 and to read the PCM format metadata values in the Android simple
    768 buffer queue callback handler the first time it is called.
    769 <p>
    770 The OpenSL ES 1.0.1 metadata extraction interface
    771 <code>SLMetadataExtractionItf</code> is admittedly cumbersome, as it
    772 requires a multi-step process to first determine key indices and then
    773 to get the key values.  Consult the example code for snippets showing
    774 how to work with this interface.
    775 <p>
Metadata key names are stable, but the key indices are not documented
and are subject to change.  An application should not assume that indices
    778 are persistent across different execution runs, and should not assume that
    779 indices are shared for different object instances within the same run.
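<p>
Because only the key names are stable, look up each index by name rather than
hard-coding it.  A name-scanning helper in plain C (the OpenSL ES calls that
produce the key name strings are omitted; the key names shown in the test are
illustrative placeholders, not the actual Android key names):

```c
#include <string.h>

/* Scan an array of metadata key names (as obtained via
 * SLMetadataExtractionItf) for a given key; returns its index or -1. */
static int find_key_index(const char *const names[], int count,
                          const char *key) {
    for (int i = 0; i < count; ++i)
        if (strcmp(names[i], key) == 0)
            return i;
    return -1;
}
```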
    780 
    781 <h3>Floating-point data</h3>
    782 
As of API level 21, data can be supplied to an AudioPlayer in
single-precision floating-point format.
    785 <p>
    786 Example code fragment, to be used during the Engine::CreateAudioPlayer process:
    787 <pre>
    788 #include &lt;SLES/OpenSLES_Android.h&gt;
    789 ...
    790 SLAndroidDataFormat_PCM_EX pcm;
    791 pcm.formatType = SL_ANDROID_DATAFORMAT_PCM_EX;
    792 pcm.numChannels = 2;
    793 pcm.sampleRate = SL_SAMPLINGRATE_44_1;
    794 pcm.bitsPerSample = 32;
    795 pcm.containerSize = 32;
    796 pcm.channelMask = SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT;
    797 pcm.endianness = SL_BYTEORDER_LITTLEENDIAN;
    798 pcm.representation = SL_ANDROID_PCM_REPRESENTATION_FLOAT;
    799 ...
    800 SLDataSource audiosrc;
    801 audiosrc.pLocator = ...
    802 audiosrc.pFormat = &amp;pcm;
    803 </pre>
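<p>
If existing assets are 16-bit PCM, they must be converted before being
enqueued in the floating-point format above.  A minimal conversion sketch in
plain C; the scale factor 1/32768 maps the int16 range into [-1.0, 1.0):

```c
#include <stddef.h>
#include <stdint.h>

/* Convert signed 16-bit PCM samples to single-precision float. */
static void pcm16_to_float(const int16_t *in, float *out, size_t count) {
    for (size_t i = 0; i < count; ++i)
        out[i] = (float)in[i] / 32768.0f;
}
```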
    804 
    805 <h2>Programming notes</h2>
    806 
    807 These notes supplement the OpenSL ES 1.0.1 specification,
    808 available in the "References" section below.
    809 
    810 <h3>Objects and interface initialization</h3>
    811 
    812 Two aspects of the OpenSL ES programming model that may be unfamiliar
    813 to new developers are the distinction between objects and interfaces,
    814 and the initialization sequence.
    815 <p>
    816 Briefly, an OpenSL ES object is similar to the object concept
    817 in programming languages such as Java and C++, except an OpenSL ES
    818 object is <i>only</i> visible via its associated interfaces. This
    819 includes the initial interface for all objects, called
    820 <code>SLObjectItf</code>.  There is no handle for an object itself,
    821 only a handle to the <code>SLObjectItf</code> interface of the object.
    822 <p>
    823 An OpenSL ES object is first "created", which returns an
    824 <code>SLObjectItf</code>, then "realized". This is similar to the
    825 common programming pattern of first constructing an object (which
    826 should never fail other than for lack of memory or invalid parameters),
    827 and then completing initialization (which may fail due to lack of
    828 resources).  The realize step gives the implementation a
    829 logical place to allocate additional resources if needed.
    830 <p>
    831 As part of the API to create an object, an application specifies
    832 an array of desired interfaces that it plans to acquire later. Note
    833 that this array does <i>not</i> automatically acquire the interfaces;
    834 it merely indicates a future intention to acquire them.  Interfaces
    835 are distinguished as "implicit" or "explicit".  An explicit interface
    836 <i>must</i> be listed in the array if it will be acquired later.
    837 An implicit interface need not be listed in the object create array,
    838 but there is no harm in listing it there.  OpenSL ES has one more
    839 kind of interface called "dynamic", which does not need to be
    840 specified in the object create array, and can be added later after
    841 the object is created.  The Android implementation provides a
    842 convenience feature to avoid this complexity; see section "Dynamic
    843 interfaces at object creation" above.
    844 <p>
    845 After the object is created and realized, the application should
    846 acquire interfaces for each feature it needs, using
    847 <code>GetInterface</code> on the initial <code>SLObjectItf</code>.
    848 <p>
    849 Finally, the object is available for use via its interfaces, though
    850 note that some objects require further setup. In particular, an
    851 audio player with URI data source needs a bit more preparation in
    852 order to detect connection errors. See the next section
    853 "Audio player prefetch" for details.
    854 <p>
    855 After your application is done with the object, you should explicitly
    856 destroy it; see section "Destroy" below.
    857 
    858 <h3>Audio player prefetch</h3>
    859 
    860 For an audio player with URI data source, <code>Object::Realize</code> allocates resources
    861 but does not connect to the data source (i.e. "prepare") or begin
    862 pre-fetching data. These occur once the player state is set to
    863 either <code>SL_PLAYSTATE_PAUSED</code> or <code>SL_PLAYSTATE_PLAYING</code>.
    864 <p>
    865 Note that some information may still be unknown until relatively
    866 late in this sequence. In particular, initially
<code>Play::GetDuration</code> will return <code>SL_TIME_UNKNOWN</code>
    868 and <code>MuteSolo::GetChannelCount</code> will either return successfully
    869 with channel count zero
    870 or the error result <code>SL_RESULT_PRECONDITIONS_VIOLATED</code>.
    871 These APIs will return the proper values once they are known.
    872 <p>
    873 Other properties that are initially unknown include the sample rate
    874 and actual media content type based on examining the content's header
    875 (as opposed to the application-specified MIME type and container type).
    876 These too, are determined later during prepare / prefetch, but there are
    877 no APIs to retrieve them.
    878 <p>
    879 The prefetch status interface is useful for detecting when all
    880 information is available. Or, your application can poll periodically.
    881 Note that some information may <i>never</i> be known, for example,
    882 the duration of a streaming MP3.
    883 <p>
    884 The prefetch status interface is also useful for detecting errors.
    885 Register a callback and enable at least the
    886 <code>SL_PREFETCHEVENT_FILLLEVELCHANGE</code> and
    887 <code>SL_PREFETCHEVENT_STATUSCHANGE</code> events. If both of these
    888 events are delivered simultaneously, and
    889 <code>PrefetchStatus::GetFillLevel</code> reports a zero level, and
    890 <code>PrefetchStatus::GetPrefetchStatus</code> reports
    891 <code>SL_PREFETCHSTATUS_UNDERFLOW</code>, then this indicates a
    892 non-recoverable error in the data source.
    893 This includes the inability to connect to the data source because
    894 the local filename does not exist or the network URI is invalid.
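<p>
The error test described above reduces to a small decision function.  A sketch
in plain C (the OpenSL ES callback plumbing is omitted; the arguments stand for
the event flags and for the values reported by the two PrefetchStatus getters,
with the fill level in permille as returned by
<code>PrefetchStatus::GetFillLevel</code>):

```c
#include <stdbool.h>

/* A non-recoverable data source error is signaled by the combination of:
 * both prefetch events delivered together, a fill level of zero, and
 * underflow status.  Any other combination is normal buffering activity. */
static bool is_fatal_prefetch_error(bool fill_level_change_event,
                                    bool status_change_event,
                                    int fill_level_permille,
                                    bool underflow) {
    return fill_level_change_event && status_change_event &&
           fill_level_permille == 0 && underflow;
}
```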
    895 <p>
    896 The next version of OpenSL ES is expected to add more explicit
    897 support for handling errors in the data source. However, for future
    898 binary compatibility, we intend to continue to support the current
    899 method for reporting a non-recoverable error.
    900 <p>
    901 In summary, a recommended code sequence is:
    902 <ul>
    903 <li>Engine::CreateAudioPlayer
<li>Object::Realize
    905 <li>Object::GetInterface for SL_IID_PREFETCHSTATUS
    906 <li>PrefetchStatus::SetCallbackEventsMask
    907 <li>PrefetchStatus::SetFillUpdatePeriod
    908 <li>PrefetchStatus::RegisterCallback
    909 <li>Object::GetInterface for SL_IID_PLAY
    910 <li>Play::SetPlayState to SL_PLAYSTATE_PAUSED or SL_PLAYSTATE_PLAYING
    911 <li>preparation and prefetching occur here; during this time your
    912 callback will be called with periodic status updates
    913 </ul>
    914 
    915 <h3>Destroy</h3>
    916 
    917 Be sure to destroy all objects on exit from your application.  Objects
    918 should be destroyed in reverse order of their creation, as it is
    919 not safe to destroy an object that has any dependent objects.
    920 For example, destroy in this order: audio players and recorders,
    921 output mix, then finally the engine.
    922 <p>
    923 OpenSL ES does not support automatic garbage collection or
    924 <a href="http://en.wikipedia.org/wiki/Reference_counting">reference counting</a>
    925 of interfaces. After you call <code>Object::Destroy</code>, all extant
    926 interfaces derived from the associated object become <i>undefined</i>.
    927 <p>
    928 The Android OpenSL ES implementation does not detect the incorrect
    929 use of such interfaces.
    930 Continuing to use such interfaces after the object is destroyed will
    931 cause your application to crash or behave in unpredictable ways.
    932 <p>
    933 We recommend that you explicitly set both the primary object interface
    934 and all associated interfaces to NULL as part of your object
    935 destruction sequence, to prevent the accidental misuse of a stale
    936 interface handle.
    937 
    938 <h3>Stereo panning</h3>
    939 
    940 When <code>Volume::EnableStereoPosition</code> is used to enable
    941 stereo panning of a mono source, there is a 3 dB reduction in total
    942 <a href="http://en.wikipedia.org/wiki/Sound_power_level">
    943 sound power level</a>.  This is needed to permit the total sound
    944 power level to remain constant as the source is panned from one
    945 channel to the other. Therefore, don't enable stereo positioning
    946 if you don't need it.  See the Wikipedia article on
    947 <a href="http://en.wikipedia.org/wiki/Panning_(audio)">audio panning</a>
    948 for more information.
    949 
    950 <h3>Callbacks and threads</h3>
    951 
    952 Callback handlers are generally called <i>synchronously</i> with
    953 respect to the event, that is, at the moment and location where the
    954 event is detected by the implementation. But this point is
    955 <i>asynchronous</i> with respect to the application. Thus you should
    956 use a non-blocking synchronization mechanism to control access
    957 to any variables shared between the application and the callback
    958 handler. In the example code, such as for buffer queues, we have
    959 either omitted this synchronization or used blocking synchronization in
    960 the interest of simplicity. However, proper non-blocking synchronization
    961 would be critical for any production code.
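<p>
One common non-blocking mechanism for this situation is a single-producer,
single-consumer ring buffer: the callback pushes on one side, the application
thread pops on the other, and neither side ever blocks.  A minimal sketch
using C11 atomics (illustrative; the element type and size are placeholders):

```c
#include <stdatomic.h>
#include <stdint.h>

#define RING_SIZE 8  /* must be a power of two */

/* Lock-free SPSC ring: only the producer writes head, only the consumer
 * writes tail, so no mutex is needed and neither side can block. */
typedef struct {
    int16_t slots[RING_SIZE];
    atomic_uint head;  /* written by the producer (the callback) */
    atomic_uint tail;  /* written by the consumer (the app thread) */
} Ring;

static int ring_push(Ring *r, int16_t v) {
    unsigned h = atomic_load_explicit(&r->head, memory_order_relaxed);
    unsigned t = atomic_load_explicit(&r->tail, memory_order_acquire);
    if (h - t == RING_SIZE)
        return 0;  /* full: drop or count the overrun, but never wait */
    r->slots[h % RING_SIZE] = v;
    atomic_store_explicit(&r->head, h + 1, memory_order_release);
    return 1;
}

static int ring_pop(Ring *r, int16_t *out) {
    unsigned t = atomic_load_explicit(&r->tail, memory_order_relaxed);
    unsigned h = atomic_load_explicit(&r->head, memory_order_acquire);
    if (h == t)
        return 0;  /* empty */
    *out = r->slots[t % RING_SIZE];
    atomic_store_explicit(&r->tail, t + 1, memory_order_release);
    return 1;
}
```

The acquire/release pairing ensures the consumer observes a slot's contents
only after the producer has finished writing it, without any lock.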
    962 <p>
    963 Callback handlers are called from internal
    964 non-application thread(s) which are not attached to the Dalvik virtual machine and thus
    965 are ineligible to use JNI. Because these internal threads are
    966 critical to the integrity of the OpenSL ES implementation, a callback
    967 handler should also not block or perform excessive work.
    968 <p>
If your callback handler needs to use JNI, or to perform work that is
disproportionate to the callback, the handler should instead post an
event for another thread to process.  Examples of acceptable callback
    972 workload include rendering and enqueuing the next output buffer (for an
    973 AudioPlayer), processing the just-filled input buffer and enqueueing the
    974 next empty buffer (for an AudioRecorder), or simple APIs such as most
    975 of the "Get" family.  See section "Performance" below regarding the workload.
    976 <p>
    977 Note that the converse is safe: a Dalvik application thread which has
    978 entered JNI is allowed to directly call OpenSL ES APIs, including
    979 those which block. However, blocking calls are not recommended from
    980 the main thread, as they may result in the dreaded "Application Not
    981 Responding" (ANR).
    982 <p>
    983 The choice of which thread calls a callback handler is largely left up
    984 to the implementation.  The reason for this flexibility is to permit
    985 future optimizations, especially on multi-core devices.
    986 <p>
    987 The thread on which the callback handler runs is not guaranteed to have
    988 the same identity across different calls.  Therefore do not rely on the
    989 <code>pthread_t</code> returned by <code>pthread_self()</code>, or the
    990 <code>pid_t</code> returned by <code>gettid()</code>, to be consistent
    991 across calls.  Don't use the thread local storage (TLS) APIs such as
    992 <code>pthread_setspecific()</code> and <code>pthread_getspecific()</code>
    993 from a callback, for the same reason.
    994 <p>
    995 The implementation guarantees that concurrent callbacks of the same kind,
    996 for the same object, will not occur.  However, concurrent callbacks of
    997 <i>different</i> kinds for the same object are possible, on different threads.
    998 
    999 <h3>Performance</h3>
   1000 
As OpenSL ES is a native C API, non-Dalvik application threads which
call OpenSL ES have no Dalvik-related overhead such as garbage
collection pauses. Aside from one exception described below, this is
the only performance benefit of OpenSL ES. In particular, using
OpenSL ES does not guarantee lower audio latency, higher scheduling
priority, and so on, than what the platform generally provides.
   1007 On the other hand, as the Android platform and specific device
   1008 implementations continue to evolve, an OpenSL ES application can
   1009 expect to benefit from any future system performance improvements.
   1010 <p>
   1011 One such evolution is support for reduced audio output latency.
   1012 The underpinnings for reduced output latency were first included in
the Android 4.1 platform release ("Jelly Bean"), and then continued
   1014 progress occurred in the Android 4.2 platform.  These improvements
   1015 are available via OpenSL ES for device implementations that claim feature
   1016 "android.hardware.audio.low_latency". If the device doesn't claim this
   1017 feature but supports API level 9 (Android platform version 2.3) or later,
   1018 then you can still use the OpenSL ES APIs but the output latency may be higher.
   1019 The lower output latency path is used only if the application requests a
   1020 buffer count of 2 or more, and a buffer size and sample rate that are
   1021 compatible with the device's native output configuration.
   1022 These parameters are device-specific and should be obtained as follows.
   1023 <p>
   1024 Beginning with API level 17 (Android platform version 4.2), an application
   1025 can query for the native or optimal output sample rate and buffer size
   1026 for the device's primary output stream.  When combined with the feature
   1027 test just mentioned, an app can now configure itself appropriately for
   1028 lower latency output on devices that claim support.
   1029 <p>
   1030 The recommended sequence is:
   1031 <ol>
   1032 <li>Check for API level 9 or higher, to confirm use of OpenSL ES.
   1033 <li>Check for feature "android.hardware.audio.low_latency" using code such as this:
   1034 <pre>
   1035 import android.content.pm.PackageManager;
   1036 ...
   1037 PackageManager pm = getContext().getPackageManager();
   1038 boolean claimsFeature = pm.hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
   1039 </pre>
   1040 <li>Check for API level 17 or higher, to confirm use of
   1041 <code>android.media.AudioManager.getProperty()</code>.
   1042 <li>Get the native or optimal output sample rate and buffer size for this device's primary output
   1043 stream, using code such as this:
   1044 <pre>
   1045 import android.media.AudioManager;
   1046 ...
   1047 AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
   1050 </pre>
   1051 Note that <code>sampleRate</code> and <code>framesPerBuffer</code>
   1052 are <code>String</code>s.  First check for <code>null</code>
   1053 and then convert to <code>int</code> using <code>Integer.parseInt()</code>.
   1054 <li>Now use OpenSL ES to create an AudioPlayer with PCM buffer queue data locator.
   1055 </ol>
   1056 The number of lower latency audio players is limited. If your application
   1057 requires more than a few audio sources, consider mixing your audio at
   1058 application level.  Be sure to destroy your audio players when your
   1059 activity is paused, as they are a global resource shared with other apps.
   1060 <p>
   1061 To avoid audible glitches, the buffer queue callback handler must execute
   1062 within a small and predictable time window. This typically implies no
   1063 unbounded blocking on mutexes, conditions, or I/O operations. Instead consider
   1064 "try locks", locks and waits with timeouts, and non-blocking algorithms.
   1065 <p>
   1066 The computation required to render the next buffer (for AudioPlayer) or
consume the previous buffer (for AudioRecorder) should take approximately
   1068 the same amount of time for each callback.  Avoid algorithms that execute in
   1069 a non-deterministic amount of time, or are "bursty" in their computations.
   1070 A callback computation is bursty if the CPU time spent in any
   1071 given callback is significantly larger than the average.
   1072 In summary, the ideal is for the CPU execution time of the handler to have
   1073 variance near zero, and for the handler to not block for unbounded times.
   1074 <p>
   1075 Lower latency audio is for these outputs only: on-device speaker, wired
   1076 headphones, wired headset, and line out.
   1077 <p>
   1078 As of API level 21, lower latency audio input is supported on select
   1079 devices. To take advantage of this feature, first confirm that lower
   1080 latency output is available as described above. The capability for lower
   1081 latency output is a prerequisite for the lower latency input feature.
   1082 Then create an AudioRecorder with the same sample rate and buffer size
   1083 as would be used for output.
   1084 <p>
   1085 For simultaneous input and output, separate buffer queue
   1086 completion handlers are used for each side.  There is no guarantee of
   1087 the relative order of these callbacks, or the synchronization of the
   1088 audio clocks, even when both sides use the same sample rate.
   1089 Your application should buffer the data with proper buffer synchronization.
   1090 <p>
   1091 One consequence of potentially independent audio clocks is the need
   1092 for asynchronous sample rate conversion.  A simple (though not ideal
   1093 for audio quality) technique for asynchronous sample rate conversion
   1094 is to duplicate or drop samples as needed near a zero-crossing point.
   1095 More sophisticated conversions are possible.
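<p>
The drop-a-sample variant of this technique can be sketched in a few lines of
plain C (mono shown; choosing a zero crossing minimizes the audible
discontinuity):

```c
#include <stddef.h>
#include <stdint.h>

/* Shrink a mono buffer by one sample to absorb clock drift: drop the
 * sample at the first zero crossing (or the last sample if none exists).
 * Returns the number of samples written to out, always n - 1 for n > 0. */
static size_t drop_one_at_zero_crossing(const int16_t *in, size_t n,
                                        int16_t *out) {
    if (n == 0)
        return 0;
    size_t skip = n - 1;  /* fallback: drop the final sample */
    for (size_t i = 1; i < n; ++i) {
        if ((in[i - 1] <= 0 && in[i] >= 0) ||
            (in[i - 1] >= 0 && in[i] <= 0)) {
            skip = i;
            break;
        }
    }
    size_t j = 0;
    for (size_t i = 0; i < n; ++i)
        if (i != skip)
            out[j++] = in[i];
    return j;
}
```

The duplicate-a-sample case is symmetric: insert a copy of the sample at the
zero crossing instead of removing it.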
   1096 
   1097 <h3>Security and permissions</h3>
   1098 
Security in Android is enforced at the process level. Java
programming language code can't do anything more than native code, nor
can native code do anything more than Java programming language code.
The only difference between them is which APIs are available, that is,
the functionality that the platform promises to support in the future
and across different devices.
   1105 <p>
   1106 Applications using OpenSL ES must request whatever permissions they
   1107 would need for similar non-native APIs. For example, if your application
   1108 records audio, then it needs the <code>android.permission.RECORD_AUDIO</code>
   1109 permission. Applications that use audio effects need
   1110 <code>android.permission.MODIFY_AUDIO_SETTINGS</code>. Applications that play
network URI resources need <code>android.permission.INTERNET</code>.
   1112 <p>
   1113 Depending on the platform version and implementation,
   1114 media content parsers and software codecs may run within the context
   1115 of the Android application that calls OpenSL ES (hardware codecs
   1116 are abstracted, but are device-dependent). Malformed content
   1117 designed to exploit parser and codec vulnerabilities is a known attack
   1118 vector. We recommend that you play media only from trustworthy
   1119 sources, or that you partition your application such that code that
   1120 handles media from untrustworthy sources runs in a relatively
   1121 sandboxed environment.  For example you could process media from
   1122 untrustworthy sources in a separate process. Though both processes
   1123 would still run under the same UID, this separation does make an
   1124 attack more difficult.
   1125 
   1126 <h2>Platform issues</h2>
   1127 
   1128 This section describes known issues in the initial platform
   1129 release which supports these APIs.
   1130 
   1131 <h3>Dynamic interface management</h3>
   1132 
   1133 <code>DynamicInterfaceManagement::AddInterface</code> does not work.
   1134 Instead, specify the interface in the array passed to Create, as
   1135 shown in the example code for environmental reverb.
   1136 
   1137 <h2>References and resources</h2>
   1138 
   1139 Android:
   1140 <ul>
   1141 <li><a href="http://developer.android.com/resources/index.html">
   1142 Android developer resources</a>
   1143 <li><a href="http://groups.google.com/group/android-developers">
   1144 Android developers discussion group</a>
   1145 <li><a href="http://developer.android.com/sdk/ndk/index.html">Android NDK</a>
   1146 <li><a href="http://groups.google.com/group/android-ndk">
   1147 Android NDK discussion group</a> (for developers of native code, including OpenSL ES)
   1148 <li><a href="http://code.google.com/p/android/issues/">
   1149 Android open source bug database</a>
   1150 </ul>
   1151 
   1152 Khronos Group:
   1153 <ul>
   1154 <li><a href="http://www.khronos.org/opensles/">
   1155 Khronos Group OpenSL ES Overview</a>
   1156 <li><a href="http://www.khronos.org/registry/sles/">
   1157 Khronos Group OpenSL ES 1.0.1 specification</a>
   1158 <li><a href="http://www.khronos.org/message_boards/viewforum.php?f=15">
   1159 Khronos Group public message board for OpenSL ES</a>
   1160 (please limit to non-Android questions)
   1161 </ul>
   1162 For convenience, we have included a copy of the OpenSL ES 1.0.1
   1163 specification with the NDK in
   1164 <code>docs/opensles/OpenSL_ES_Specification_1.0.1.pdf</code>.
   1165 
   1166 <p>
   1167 Miscellaneous:
   1168 <ul>
   1169 <li><a href="http://en.wikipedia.org/wiki/Java_Native_Interface">JNI</a>
   1170 <li><a href="http://stackoverflow.com/search?q=android+audio">
   1171 Stack Overflow</a>
   1172 <li>web search for "interactive audio", "game audio", "sound design",
   1173 "audio programming", "audio content", "audio formats", etc.
   1174 <li><a href="http://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a>
   1175 </ul>
   1176 
   1177 </body>
   1178 </html>
   1179