      1 <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
      2 "http://www.w3.org/TR/html4/loose.dtd">
      3 <html>
      4 
      5 <head>
      6 <title>OpenSL ES for Android</title>
      7 </head>
      8 
      9 <body>
     10 
     11 <h1>OpenSL ES for Android</h1>
     12 
     13 This article describes the Android native audio APIs based on the
     14 Khronos Group OpenSL ES&#8482; 1.0.1 standard.
     15 <p>
     16 Unless otherwise noted,
     17 all features are available at Android API level 9 (Android platform
     18 version 2.3) and higher.
     19 Some features are only available at Android API level 14 (Android
     20 platform version 4.0) and higher; these are noted.
     21 <p>
     22 OpenSL ES provides a C language interface that is also callable from C++, and
     23 exposes features similar to the audio portions of these Android APIs
     24 callable from Java programming language code:
     25 <ul>
     26 <li><a href="http://developer.android.com/reference/android/media/MediaPlayer.html">
     27 android.media.MediaPlayer</a>
     28 <li><a href="http://developer.android.com/reference/android/media/MediaRecorder.html">
     29 android.media.MediaRecorder</a>
     30 </ul>
     31 
     32 As with all of the Android Native Development Kit (NDK), the primary
     33 purpose of OpenSL ES for Android is to facilitate the implementation
     34 of shared libraries to be called from Java programming language code via Java Native
Interface (JNI).  The NDK is not intended for writing pure C/C++
applications.  That said, OpenSL ES is a full-featured API, and we
expect that you will be able to accomplish most of your audio
needs using only this API, without up-calls to code running in the Dalvik VM.
     39 
     40 <p>
     41 Note: though based on OpenSL ES, the Android native audio API
     42 is <i>not</i> a conforming implementation of any OpenSL ES 1.0.1
     43 profile (game, music, or phone). This is because Android does not
     44 implement all of the features required by any one of the profiles.
     45 Any known cases where Android behaves differently than the specification
     46 are described in section "Android extensions" below.
     47 
     48 <h2>Getting started</h2>
     49 
     50 <h3>Example code</h3>
     51 
     52 <h4>Recommended</h4>
     53 
     54 Supported and tested example code, usable as a model
     55 for your own code, is located in NDK folder
     56 <code>platforms/android-9/samples/native-audio/</code>.
     57 
     58 <h4>Not recommended</h4>
     59 
     60 The OpenSL ES 1.0.1 specification contains example code in the
     61 appendices (see section "References" below for the link to this
     62 specification).  However, the examples in Appendix B: Sample Code
     63 and Appendix C: Use Case Sample Code use features
     64 not supported by Android. Some examples also contain
     65 typographical errors, or use APIs that are likely to change.
     66 Proceed with caution in referring to these;
     67 though the code may be helpful in understanding the full OpenSL ES
     68 standard, it should not be used as is with Android.
     69 
     70 <h3>Adding OpenSL ES to your application source code</h3>
     71 
     72 OpenSL ES is a C API, but is callable from both C and C++ code.
     73 <p>
     74 At a minimum, add the following line to your code:
     75 <pre>
     76 #include &lt;SLES/OpenSLES.h&gt;
     77 </pre>
     78 
     79 If you use Android extensions, also include this header:
     80 <pre>
     81 #include &lt;SLES/OpenSLES_Android.h&gt;
     82 </pre>
which automatically includes these headers as well (you don't need to
include them yourself; they are listed here as an aid in learning the API):
     85 <pre>
     86 #include &lt;SLES/OpenSLES_AndroidConfiguration.h&gt;
     87 #include &lt;SLES/OpenSLES_AndroidMetadata.h&gt;
     88 </pre>
     89 
     90 <h3>Makefile</h3>
     91 
     92 Modify your Android.mk as follows:
     93 <pre>
LOCAL_LDLIBS += -lOpenSLES
     95 </pre>
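
<p>
For context, a minimal <code>Android.mk</code> for a native-audio shared
library might look like the following. The module and source file names
are placeholders, not names required by the NDK:

```make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE    := native-audio-jni
LOCAL_SRC_FILES := native-audio-jni.c
# Link against the OpenSL ES system library
LOCAL_LDLIBS    += -lOpenSLES
include $(BUILD_SHARED_LIBRARY)
```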
     96 
     97 <h3>Audio content</h3>
     98 
     99 There are many ways to package audio content for your
    100 application, including:
    101 
    102 <dl>
    103 
    104 <dt>Resources</dt>
    105 <dd>
    106 By placing your audio files into the <code>res/raw/</code> folder,
    107 they can be accessed easily by the associated APIs for
    108 <a href="http://developer.android.com/reference/android/content/res/Resources.html">
    109 Resources</a>.  However there is no direct native access to resources,
    110 so you will need to write Java programming language code to copy them out before use.
    111 </dd>
    112 
    113 <dt>Assets</dt>
    114 <dd>
    115 By placing your audio files into the <code>assets/</code> folder,
    116 they will be directly accessible by the Android native asset manager
    117 APIs.  See the header files <code>android/asset_manager.h</code>
    118 and <code>android/asset_manager_jni.h</code> for more information
    119 on these APIs.  The example code
    120 located in NDK folder
    121 <code>platforms/android-9/samples/native-audio/</code> uses these
    122 native asset manager APIs in conjunction with the Android file
    123 descriptor data locator.
    124 </dd>
    125 
    126 <dt>Network</dt>
    127 <dd>
    128 You can use the URI data locator to play audio content directly from the
    129 network. However, be sure to read section "Security and permissions" below.
    130 </dd>
    131 
    132 <dt>Local filesystem</dt>
    133 <dd>
    134 The URI data locator supports the <code>file:</code> scheme for local files,
    135 provided the files are accessible by the application.
    136 Note that the Android security framework restricts file access via
    137 the Linux user ID and group ID mechanism.
    138 </dd>
    139 
    140 <dt>Recorded</dt>
    141 <dd>Your application can record audio data from the microphone input,
    142 store this content, and then play it back later.
    143 The example code uses this method for the "Playback" clip.
    144 </dd>
    145 
    146 <dt>Compiled and linked inline</dt>
    147 <dd>
    148 You can link your audio content directly into the shared library,
    149 and then play it using an audio player with buffer queue data locator.  This is most
    150 suitable for short PCM format clips.  The example code uses this
    151 technique for the "Hello" and "Android" clips. The PCM data was
    152 converted to hex strings using a <code>bin2c</code> tool (not supplied).
    153 </dd>
    154 
    155 <dt>Real-time synthesis</dt>
    156 <dd>
    157 Your application can synthesize PCM data on the fly and then play it
    158 using an audio player with buffer queue data locator.  This is a
    159 relatively advanced technique, and the details of audio synthesis
    160 are beyond the scope of this article.
    161 </dd>
    162 
    163 </dl>
    164 
    165 Finding or creating useful audio content for your application is
    166 beyond the scope of this article, but see the "References" section
    167 below for some suggested web search terms.
    168 <p>
    169 Note that it is your responsibility to ensure that you are legally
    170 permitted to play or record content, and that there may be privacy
    171 considerations for recording content.
    172 
    173 <h3>Debugging</h3>
    174 
    175 For robustness, we recommend that you examine the <code>SLresult</code>
    176 value which is returned by most APIs. Use of <code>assert</code>
    177 vs. more advanced error handling logic is a matter of coding style
    178 and the particular API; see the Wikipedia article on
    179 <a href="http://en.wikipedia.org/wiki/Assertion_(computing)">assert</a>
    180 for more information. In the supplied example, we have used <code>assert</code>
    181 for "impossible" conditions which would indicate a coding error, and
    182 explicit error handling for others which are more likely to occur
    183 in production.
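
<p>
The two styles can be sketched as follows. The <code>SLresult</code>
and <code>SL_RESULT_SUCCESS</code> definitions below are local
stand-ins mirroring <code>SLES/OpenSLES.h</code> so that the fragment
is self-contained; on Android you would include the real header instead:

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

typedef uint32_t SLresult;                  /* stand-in for the real typedef */
#define SL_RESULT_SUCCESS ((SLresult) 0)    /* stand-in for the real constant */

/* Style 1: assert for "impossible" failures that indicate a coding error. */
#define SL_CHECK(expr) assert(SL_RESULT_SUCCESS == (expr))

/* Style 2: explicit handling for failures that can occur in production. */
static int tryOperation(SLresult result)
{
    if (SL_RESULT_SUCCESS != result) {
        fprintf(stderr, "operation failed: result %u\n", (unsigned) result);
        return -1;  /* let the caller recover */
    }
    return 0;
}
```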
    184 <p>
    185 Many API errors result in a log entry, in addition to the non-zero
    186 result code. These log entries provide additional detail which can
    187 be especially useful for the more complex APIs such as
    188 <code>Engine::CreateAudioPlayer</code>.
    189 <p>
    190 Use <a href="http://developer.android.com/guide/developing/tools/adb.html">
    191 adb logcat</a>, the
    192 <a href="http://developer.android.com/guide/developing/eclipse-adt.html">
    193 Eclipse ADT plugin</a> LogCat pane, or
    194 <a href="http://developer.android.com/guide/developing/tools/ddms.html#logcat">
    195 ddms logcat</a> to see the log.
    196 
    197 <h2>Supported features from OpenSL ES 1.0.1</h2>
    198 
    199 This section summarizes available features. In some
    200 cases, there are limitations which are described in the next
    201 sub-section.
    202 
    203 <h3>Global entry points</h3>
    204 
    205 Supported global entry points:
    206 <ul>
    207 <li><code>slCreateEngine</code>
    208 <li><code>slQueryNumSupportedEngineInterfaces</code>
    209 <li><code>slQuerySupportedEngineInterfaces</code>
    210 </ul>
    211 
    212 <h3>Objects and interfaces</h3>
    213 
    214 The following figure indicates objects and interfaces supported by
    215 Android's OpenSL ES implementation.  A green cell means the feature
    216 is supported.
    217 
    218 <p>
    219 <img src="chart1.png" alt="Supported objects and interfaces">
    220 
    221 <h3>Limitations</h3>
    222 
    223 This section details limitations with respect to the supported
    224 objects and interfaces from the previous section.
    225 
    226 <h4>Buffer queue data locator</h4>
    227 
    228 An audio player or recorder with buffer queue data locator supports
    229 PCM data format only.
    230 
    231 <h4>Device data locator</h4>
    232 
    233 The only supported use of an I/O device data locator is when it is
    234 specified as the data source for <code>Engine::CreateAudioRecorder</code>.
    235 It should be initialized using these values, as shown in the example:
    236 <pre>
    237 SLDataLocator_IODevice loc_dev =
    238   {SL_DATALOCATOR_IODEVICE, SL_IODEVICE_AUDIOINPUT,
    239   SL_DEFAULTDEVICEID_AUDIOINPUT, NULL};
    240 </pre>
    241 
    242 <h4>Dynamic interface management</h4>
    243 
    244 <code>RemoveInterface</code> and <code>ResumeInterface</code> are not supported.
    245 
    246 <h4>Effect combinations</h4>
    247 
    248 It is meaningless to have both environmental reverb and preset
    249 reverb on the same output mix.
    250 <p>
    251 The platform may ignore effect requests if it estimates that the
    252 CPU load would be too high.
    253 
    254 <h4>Effect send</h4>
    255 
    256 <code>SetSendLevel</code> supports a single send level per audio player.
    257 
    258 <h4>Environmental reverb</h4>
    259 
    260 Environmental reverb does not support the <code>reflectionsDelay</code>,
    261 <code>reflectionsLevel</code>, or <code>reverbDelay</code> fields of
    262 <code>struct SLEnvironmentalReverbSettings</code>.
    263 
    264 <h4>MIME data format</h4>
    265 
    266 The MIME data format can be used with URI data locator only, and only
    267 for player (not recorder).
    268 <p>
    269 The Android implementation of OpenSL ES requires that <code>mimeType</code>
    270 be initialized to either <code>NULL</code> or a valid UTF-8 string,
    271 and that <code>containerType</code> be initialized to a valid value.
    272 In the absence of other considerations, such as portability to other
    273 implementations, or content format which cannot be identified by header,
    274 we recommend that you
    275 set the <code>mimeType</code> to <code>NULL</code> and <code>containerType</code>
    276 to <code>SL_CONTAINERTYPE_UNSPECIFIED</code>.
    277 <p>
Supported formats include WAV PCM, WAV alaw, WAV ulaw, MP3, Ogg
Vorbis, AAC LC, HE-AACv1 (aacPlus), HE-AACv2 (enhanced aacPlus),
AMR, and FLAC, provided these are supported by the overall platform;
AAC formats must be located within an MP4 or ADTS container.
    282 MIDI is not supported.
    283 WMA is not part of the open source release, and compatibility
    284 with Android OpenSL ES has not been verified.
    285 <p>
    286 The Android implementation of OpenSL ES does not support direct
    287 playback of DRM or encrypted content; if you want to play this, you
    288 will need to convert to cleartext in your application before playing,
    289 and enforce any DRM restrictions in your application.
    290 
    291 <h4>Object</h4>
    292 
    293 <code>Resume</code>, <code>RegisterCallback</code>,
    294 <code>AbortAsyncOperation</code>, <code>SetPriority</code>,
    295 <code>GetPriority</code>, and <code>SetLossOfControlInterfaces</code>
    296 are not supported.
    297 
    298 <h4>PCM data format</h4>
    299 
    300 The PCM data format can be used with buffer queues only. Supported PCM
    301 playback configurations are 8-bit unsigned or 16-bit signed, mono
    302 or stereo, little endian byte ordering, and these sample rates:
    303 8000, 11025, 12000, 16000, 22050, 24000, 32000, 44100, or 48000 Hz.
For recording, the supported configurations are device-dependent,
although 16000 Hz mono 16-bit signed is generally available.
    306 <p>
    307 Note that the field <code>samplesPerSec</code> is actually in
    308 units of milliHz, despite the misleading name. To avoid accidentally
using the wrong value, you should initialize this field using one
of the symbolic constants defined for this purpose, such as
<code>SL_SAMPLINGRATE_44_1</code>.
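
<p>
A small self-contained illustration of the milliHz units; the constant
value below mirrors <code>SL_SAMPLINGRATE_44_1</code> from
<code>SLES/OpenSLES.h</code>, which is the definition real code should use:

```c
#include <stdint.h>

/* Mirrors the value of SL_SAMPLINGRATE_44_1 in SLES/OpenSLES.h:
 * 44100 Hz expressed in milliHz. */
#define SAMPLINGRATE_44_1_MILLIHZ ((uint32_t) 44100000)

/* Convert a rate in Hz to the milliHz units of samplesPerSec. */
static uint32_t hzToMilliHz(uint32_t hz)
{
    return hz * 1000u;
}
```

Initializing <code>samplesPerSec</code> via the symbolic constants
avoids the common mistake of passing a plain Hz value.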
    312 
    313 <h4>Playback rate</h4>
    314 
    315 The supported playback rate range(s) and capabilities may vary depending
    316 on the platform version and implementation, and so should be determined
    317 at runtime by querying with <code>PlaybackRate::GetRateRange</code>
    318 or <code>PlaybackRate::GetCapabilitiesOfRate</code>.
    319 <p>
    320 That said, some guidance on typical rate ranges may be useful:
    321 In Android 2.3 a single playback rate range from 500 per mille to 2000 per mille
    322 inclusive is typically supported, with property
    323 <code>SL_RATEPROP_NOPITCHCORAUDIO</code>.
    324 In Android 4.0 the same rate range is typically supported for a data source
    325 in PCM format, and a unity rate range for other formats.
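
<p>
Playback rate is expressed in per mille, where 1000 means normal speed.
As a sketch, the helper below converts a speed factor to per mille and
clamps it to the typical Android 2.3 range quoted above; production
code should instead clamp to the range actually reported by
<code>PlaybackRate::GetRateRange</code>:

```c
#include <stdint.h>

/* Typical range on Android 2.3; query GetRateRange for the real bounds. */
#define TYPICAL_MIN_PERMILLE 500
#define TYPICAL_MAX_PERMILLE 2000

/* Convert a speed factor (1.0 = normal) to clamped per mille units. */
static int16_t speedToPerMille(double speed)
{
    int32_t perMille = (int32_t) (speed * 1000.0 + 0.5);
    if (perMille < TYPICAL_MIN_PERMILLE) perMille = TYPICAL_MIN_PERMILLE;
    if (perMille > TYPICAL_MAX_PERMILLE) perMille = TYPICAL_MAX_PERMILLE;
    return (int16_t) perMille;
}
```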
    326 
    327 <h4>Record</h4>
    328 
    329 The <code>SL_RECORDEVENT_HEADATLIMIT</code> and
    330 <code>SL_RECORDEVENT_HEADMOVING</code> events are not supported.
    331 
    332 <h4>Seek</h4>
    333 
    334 <code>SetLoop</code> enables whole file looping. The <code>startPos</code>
    335 parameter should be zero and the <code>endPos</code> parameter should
    336 be <code>SL_TIME_UNKNOWN</code>.
    337 
    338 <h4>URI data locator</h4>
    339 
    340 The URI data locator can be used with MIME data format only, and
    341 only for an audio player (not audio recorder). Supported schemes
    342 are <code>http:</code> and <code>file:</code>.
    343 A missing scheme defaults to the <code>file:</code> scheme. Other
    344 schemes such as <code>https:</code>, <code>ftp:</code>, and
    345 <code>content:</code> are not supported.
    346 <code>rtsp:</code> is not verified.
    347 
    348 <h3>Data structures</h3>
    349 
    350 Android supports these OpenSL ES 1.0.1 data structures:
    351 <ul>
    352 <li>SLDataFormat_MIME
    353 <li>SLDataFormat_PCM
    354 <li>SLDataLocator_BufferQueue
    355 <li>SLDataLocator_IODevice
    356 <li>SLDataLocator_OutputMix
    357 <li>SLDataLocator_URI
    358 <li>SLDataSink
    359 <li>SLDataSource
    360 <li>SLEngineOption
    361 <li>SLEnvironmentalReverbSettings
    362 <li>SLInterfaceID
    363 </ul>
    364 
    365 <h3>Platform configuration</h3>
    366 
    367 OpenSL ES for Android is designed for multi-threaded applications,
    368 and is thread-safe.
    369 <p>
    370 OpenSL ES for Android supports a single engine per application, and
    371 up to 32 objects. Available device memory and CPU may further
    372 restrict the usable number of objects.
    373 <p>
    374 <code>slCreateEngine</code> recognizes, but ignores, these engine options:
    375 <ul>
    376 <li><code>SL_ENGINEOPTION_THREADSAFE</code>
    377 <li><code>SL_ENGINEOPTION_LOSSOFCONTROL</code>
    378 </ul>
    379 
    380 OpenMAX AL and OpenSL ES may be used together in the same application.
    381 In this case, there is internally a single shared engine object,
    382 and the 32 object limit is shared between OpenMAX AL and OpenSL ES.
    383 The application should first create both engines, then use both engines,
    384 and finally destroy both engines.  The implementation maintains a
    385 reference count on the shared engine, so that it is correctly destroyed
    386 at the second destroy.
    387 
    388 <h2>Planning for future versions of OpenSL ES</h2>
    389 
    390 The Android native audio APIs are based on Khronos
    391 Group OpenSL ES 1.0.1 (see section "References" below).
    392 As of the time of this writing, Khronos has recently released
    393 a revised version 1.1 of the standard. The revised version
    394 includes new features, clarifications, correction of
    395 typographical errors, and some incompatibilities. Most of the expected
    396 incompatibilities are relatively minor, or are in areas of OpenSL ES
    397 not supported by Android. However, even a small change
    398 can be significant for an application developer, so it is important
    399 to prepare for this.
    400 <p>
    401 The Android team is committed to preserving future API binary
    402 compatibility for developers to the extent feasible. It is our
    403 intention to continue to support future binary compatibility of the
    404 1.0.1-based API, even as we add support for later versions of the
    405 standard. An application developed with this version should
    406 work on future versions of the Android platform, provided that
    407 you follow the guidelines listed in section "Planning for
    408 binary compatibility" below.
    409 <p>
    410 Note that future source compatibility will <i>not</i> be a goal. That is,
    411 if you upgrade to a newer version of the NDK, you may need to modify
    412 your application source code to conform to the new API. We expect
    413 that most such changes will be minor; see details below.
    414 
    415 <h3>Planning for binary compatibility</h3>
    416 
    417 We recommend that your application follow these guidelines,
    418 to improve future binary compatibility:
    419 <ul>
    420 <li>
    421 Use only the documented subset of Android-supported features from
    422 OpenSL ES 1.0.1.
    423 <li>
    424 Do not depend on a particular result code for an unsuccessful
    425 operation; be prepared to deal with a different result code.
    426 <li>
    427 Application callback handlers generally run in a restricted context,
    428 and should be written to perform their work quickly and then return
    429 as soon as possible. Do not do complex operations within a callback
    430 handler. For example, within a buffer queue completion callback,
    431 you can enqueue another buffer, but do not create an audio player.
    432 <li>
    433 Callback handlers should be prepared to be called more or less
    434 frequently, to receive additional event types, and should ignore
    435 event types that they do not recognize. Callbacks that are configured
    436 with an event mask of enabled event types should be prepared to be
    437 called with multiple event type bits set simultaneously.
Use "&amp;" to test for each event bit rather than a switch statement.
    439 <li>
    440 Use prefetch status and callbacks as a general indication of progress, but do
    441 not depend on specific hard-coded fill levels or callback sequence.
    442 The meaning of the prefetch status fill level, and the behavior for
    443 errors that are detected during prefetch, may change.
    444 <li>
    445 See section "Buffer queue behavior" below.
    446 </ul>
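
<p>
The guideline about testing event bits individually can be sketched as
follows. The event constants are local stand-ins that mirror the
play-event masks in <code>SLES/OpenSLES.h</code>; real code should use
the header definitions:

```c
#include <stdint.h>

#define PLAYEVENT_HEADATMARKER ((uint32_t) 0x00000001)  /* stand-in */
#define PLAYEVENT_HEADATNEWPOS ((uint32_t) 0x00000002)  /* stand-in */
#define PLAYEVENT_HEADATEND    ((uint32_t) 0x00000004)  /* stand-in */

/* Handle every recognized bit in one callback; silently ignore the rest.
 * Returns the number of events handled, for illustration. */
static int handlePlayEvents(uint32_t event)
{
    int handled = 0;
    if (event & PLAYEVENT_HEADATMARKER) ++handled;  /* reached a marker */
    if (event & PLAYEVENT_HEADATNEWPOS) ++handled;  /* position update  */
    if (event & PLAYEVENT_HEADATEND)    ++handled;  /* end of content   */
    return handled;
}
```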
    447 
    448 <h3>Planning for source compatibility</h3>
    449 
    450 As mentioned, source code incompatibilities are expected in the next
    451 version of OpenSL ES from Khronos Group. Likely areas of change include:
    452 
    453 <ul>
    454 <li>The buffer queue interface is expected to have significant changes,
    455 especially in the areas of <code>BufferQueue::Enqueue</code>, the parameter
    456 list for <code>slBufferQueueCallback</code>,
    457 and the name of field <code>SLBufferQueueState.playIndex</code>.
    458 We recommend that your application code use Android simple buffer
    459 queues instead, because we do not plan to change that API.
    460 In the example code supplied with the NDK, we have used
    461 Android simple buffer queues for playback for this reason.
    462 (We also use Android simple buffer queue for recording and decode to PCM, but
    463 that is because standard OpenSL ES 1.0.1 does not support record or decode to
    464 a buffer queue data sink.)
    465 <li>Addition of <code>const</code> to input parameters passed by reference,
    466 and to <code>SLchar *</code> struct fields used as input values.
    467 This should not require any changes to your code.
    468 <li>Substitution of unsigned types for some parameters that are
    469 currently signed.  You may need to change a parameter type from
    470 <code>SLint32</code> to <code>SLuint32</code> or similar, or add a cast.
    471 <li><code>Equalizer::GetPresetName</code> will copy the string to
    472 application memory instead of returning a pointer to implementation
    473 memory. This will be a significant change, so we recommend that you
    474 either avoid calling this method, or isolate your use of it.
    475 <li>Additional fields in struct types. For output parameters, these
    476 new fields can be ignored, but for input parameters the new fields
    477 will need to be initialized. Fortunately, these are expected to all
    478 be in areas not supported by Android.
    479 <li>Interface
    480 <a href="http://en.wikipedia.org/wiki/Globally_unique_identifier">
    481 GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID
    482 to avoid a dependency.
    483 <li><code>SLchar</code> will change from <code>unsigned char</code>
    484 to <code>char</code>. This primarily affects the URI data locator
    485 and MIME data format.
    486 <li><code>SLDataFormat_MIME.mimeType</code> will be renamed to <code>pMimeType</code>,
    487 and <code>SLDataLocator_URI.URI</code> will be renamed to <code>pURI</code>.
    488 We recommend that you initialize the <code>SLDataFormat_MIME</code>
    489 and <code>SLDataLocator_URI</code>
    490 data structures using a brace-enclosed comma-separated list of values,
    491 rather than by field name, to isolate your code from this change.
    492 In the example code we have used this technique.
    493 <li><code>SL_DATAFORMAT_PCM</code> does not permit the application
    494 to specify the representation of the data as signed integer, unsigned
    495 integer, or floating-point. The Android implementation assumes that
    496 8-bit data is unsigned integer and 16-bit is signed integer.  In
    497 addition, the field <code>samplesPerSec</code> is a misnomer, as
    498 the actual units are milliHz. These issues are expected to be
    499 addressed in the next OpenSL ES version, which will introduce a new
    500 extended PCM data format that permits the application to explicitly
    501 specify the representation, and corrects the field name.  As this
    502 will be a new data format, and the current PCM data format will
    503 still be available (though deprecated), it should not require any
    504 immediate changes to your code.
    505 </ul>
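
<p>
The brace-initialization recommendation above can be illustrated with a
stand-in struct; the layout mirrors <code>SLDataFormat_MIME</code>, but
the definitions are local so the fragment is self-contained:

```c
#include <stddef.h>
#include <stdint.h>

/* Stand-ins mirroring SLES/OpenSLES.h, for illustration only. */
typedef struct {
    uint32_t formatType;
    char    *mimeType;       /* expected to be renamed pMimeType */
    uint32_t containerType;
} MimeFormat;

#define DATAFORMAT_MIME           ((uint32_t) 0x00000001)  /* stand-in */
#define CONTAINERTYPE_UNSPECIFIED ((uint32_t) 0x00000001)  /* stand-in */

/* Positional initialization: no field names appear, so a future
 * rename of mimeType to pMimeType needs no source change here. */
static MimeFormat makeMimeFormat(void)
{
    MimeFormat format_mime =
        {DATAFORMAT_MIME, NULL, CONTAINERTYPE_UNSPECIFIED};
    return format_mime;
}
```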
    506 
    507 <h2>Android extensions</h2>
    508 
    509 The API for Android extensions is defined in <code>SLES/OpenSLES_Android.h</code>
    510 and the header files that it includes.
    511 Consult that file for details on these extensions. Unless otherwise
    512 noted, all interfaces are "explicit".
    513 <p>
    514 Note that use these extensions will limit your application's
    515 portability to other OpenSL ES implementations. If this is a concern,
    516 we advise that you avoid using them, or isolate your use of these
    517 with <code>#ifdef</code> etc.
    518 <p>
    519 The following figure shows which Android-specific interfaces and
    520 data locators are available for each object type.
    521 
    522 <p>
    523 <img src="chart2.png" alt="Android extensions">
    524 
    525 <h3>Android configuration interface</h3>
    526 
    527 The Android configuration interface provides a means to set
    528 platform-specific parameters for objects. Unlike other OpenSL ES
    529 1.0.1 interfaces, the Android configuration interface is available
    530 prior to object realization. This permits the object to be configured
    531 and then realized. Header file <code>SLES/OpenSLES_AndroidConfiguration.h</code>
    532 documents the available configuration keys and values:
    533 <ul>
    534 <li>stream type for audio players (default <code>SL_ANDROID_STREAM_MEDIA</code>)
    535 <li>record profile for audio recorders (default <code>SL_ANDROID_RECORDING_PRESET_GENERIC</code>)
    536 </ul>
    537 Here is an example code fragment that sets the Android audio stream type on an audio player:
    538 <pre>
    539 // CreateAudioPlayer and specify SL_IID_ANDROIDCONFIGURATION
    540 // in the required interface ID array. Do not realize player yet.
    541 // ...
    542 SLAndroidConfigurationItf playerConfig;
    543 result = (*playerObject)-&gt;GetInterface(playerObject,
    544     SL_IID_ANDROIDCONFIGURATION, &amp;playerConfig);
    545 assert(SL_RESULT_SUCCESS == result);
    546 SLint32 streamType = SL_ANDROID_STREAM_ALARM;
    547 result = (*playerConfig)-&gt;SetConfiguration(playerConfig,
    548     SL_ANDROID_KEY_STREAM_TYPE, &amp;streamType, sizeof(SLint32));
    549 assert(SL_RESULT_SUCCESS == result);
    550 // ...
    551 // Now realize the player here.
    552 </pre>
    553 Similar code can be used to configure the preset for an audio recorder.
    554 
    555 <h3>Android effects interfaces</h3>
    556 
    557 The Android effect, effect send, and effect capabilities interfaces provide
    558 a generic mechanism for an application to query and use device-specific
    559 audio effects. A device manufacturer should document any available
    560 device-specific audio effects.
    561 <p>
    562 Portable applications should use the OpenSL ES 1.0.1 APIs
    563 for audio effects instead of the Android effect extensions.
    564 
    565 <h3>Android file descriptor data locator</h3>
    566 
    567 The Android file descriptor data locator permits the source for an
    568 audio player to be specified as an open file descriptor with read
    569 access. The data format must be MIME.
    570 <p>
    571 This is especially useful in conjunction with the native asset manager.
    572 
    573 <h3>Android simple buffer queue data locator and interface</h3>
    574 
    575 The Android simple buffer queue data locator and interface are
    576 identical to the OpenSL ES 1.0.1 buffer queue locator and interface,
    577 except that Android simple buffer queues may be used with both audio
    578 players and audio recorders, and are limited to PCM data format.
    579 [OpenSL ES 1.0.1 buffer queues are for audio players only, and are not
    580 restricted to PCM data format.]
    581 <p>
    582 For recording, the application should enqueue empty buffers. Upon
    583 notification of completion via a registered callback, the filled
    584 buffer is available for the application to read.
    585 <p>
    586 For playback there is no difference. But for future source code
    587 compatibility, we suggest that applications use Android simple
    588 buffer queues instead of OpenSL ES 1.0.1 buffer queues.
    589 
    590 <h3>Dynamic interfaces at object creation</h3>
    591 
    592 For convenience, the Android implementation of OpenSL ES 1.0.1
    593 permits dynamic interfaces to be specified at object creation time,
    594 as an alternative to adding these interfaces after object creation
    595 with <code>DynamicInterfaceManagement::AddInterface</code>.
    596 
    597 <h3>Buffer queue behavior</h3>
    598 
    599 The OpenSL ES 1.0.1 specification requires that "On transition to
    600 the <code>SL_PLAYSTATE_STOPPED</code> state the play cursor is
    601 returned to the beginning of the currently playing buffer." The
    602 Android implementation does not necessarily conform to this
    603 requirement. For Android, it is unspecified whether a transition
    604 to <code>SL_PLAYSTATE_STOPPED</code> operates as described, or
    605 leaves the play cursor unchanged.
    606 <p>
    607 We recommend that you do not rely on either behavior; after a
    608 transition to <code>SL_PLAYSTATE_STOPPED</code>, you should explicitly
    609 call <code>BufferQueue::Clear</code>. This will place the buffer
    610 queue into a known state.
    611 <p>
    612 A corollary is that it is unspecified whether buffer queue callbacks
    613 are called upon transition to <code>SL_PLAYSTATE_STOPPED</code> or by
    614 <code>BufferQueue::Clear</code>.
    615 We recommend that you do not rely on either behavior; be prepared
    616 to receive a callback in these cases, but also do not depend on
    617 receiving one.
    618 <p>
    619 It is expected that a future version of OpenSL ES will clarify these
    620 issues. However, upgrading to that version would result in source
    621 code incompatibilities (see section "Planning for source compatibility"
    622 above).
    623 
    624 <h3>Reporting of extensions</h3>
    625 
    626 <code>Engine::QueryNumSupportedExtensions</code>,
<code>Engine::QuerySupportedExtension</code>, and
<code>Engine::IsExtensionSupported</code> report these extensions:
    629 <ul>
    630 <li><code>ANDROID_SDK_LEVEL_#</code>
    631 where # is the platform API level, 9 or higher
    632 </ul>
    633 
    634 <h3>Decode audio to PCM</h3>
    635 
    636 Note: this feature is available at API level 14 and higher.
    637 <p>
    638 A standard audio player plays back to an audio device, and the data sink
    639 is specified as an output mix.
    640 However, as an Android extension, an audio player instead
    641 acts as a decoder if the data source is specified as a URI or Android
    642 file descriptor data locator with MIME data format, and the data sink is
    643 an Android simple buffer queue data locator with PCM data format.
    644 <p>
    645 This feature is primarily intended for games to pre-load their
    646 audio assets when changing to a new game level, similar to
    647 <code>android.media.SoundPool</code>.
    648 <p>
    649 The application should initially enqueue a set of empty buffers to the Android simple
    650 buffer queue, which will be filled with PCM data.  The Android simple
    651 buffer queue callback is invoked after each buffer is filled. The
    652 callback handler should process the PCM data, re-enqueue the
    653 now-empty buffer, and then return.  The application is responsible for
    654 keeping track of decoded buffers; the callback parameter list does not include
    655 sufficient information to indicate which buffer was filled or which buffer to enqueue next.
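
<p>
One simple bookkeeping scheme, assuming buffers are filled in the order
they were enqueued, is a round-robin index maintained by the application:

```c
#include <stddef.h>

#define NUM_PCM_BUFFERS 4  /* illustrative buffer count */

static size_t nextFilled = 0;  /* index of the buffer to be filled next */

/* To be called once per (hypothetical) simple buffer queue callback:
 * returns the index of the buffer that was just filled, and advances
 * the counter in enqueue order. */
static size_t onBufferFilled(void)
{
    size_t filled = nextFilled;
    nextFilled = (nextFilled + 1) % NUM_PCM_BUFFERS;
    return filled;
}
```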
    656 <p>
The end of stream is determined implicitly by the data source.
At the end of stream a <code>SL_PLAYEVENT_HEADATEND</code> event is
delivered. The Android simple buffer queue callback is no longer
called once all of the consumed data has been decoded.
    661 <p>
    662 The sink's PCM data format typically matches that of the encoded data source
    663 with respect to sample rate, channel count, and bit depth. However, the platform
    664 implementation is permitted to decode to a different sample rate, channel count, or bit depth.
    665 There is a provision to detect the actual PCM format; see section "Determining
    666 the format of decoded PCM data via metadata" below.
    667 <p>
    668 Decode to PCM supports pause and initial seek.  Volume control, effects,
    669 looping, and playback rate are not supported.
    670 <p>
    671 Depending on the platform implementation, decoding may require resources
    672 that cannot be left idle.  Therefore it is not recommended to starve the
    673 decoder by failing to provide a sufficient number of empty PCM buffers,
    674 e.g. by returning from the Android simple buffer queue callback without
    675 enqueueing another empty buffer.  The result of decoder starvation is
    676 unspecified; the implementation may choose to either drop the decoded
    677 PCM data, pause the decoding process, or in severe cases terminate
    678 the decoder.
    679 
    680 <h3>Decode streaming ADTS AAC to PCM</h3>
    681 
    682 Note: this feature is available at API level 14 and higher.
    683 <p>
    684 An audio player acts as a streaming decoder if the data source is an
    685 Android buffer queue data locator with MIME data format, and the data
    686 sink is an Android simple buffer queue data locator with PCM data format.
    687 The MIME data format should be configured as:
    688 <dl>
<dt>container</dt>
<dd><code>SL_CONTAINERTYPE_RAW</code></dd>
<dt>MIME type string</dt>
<dd><code>"audio/vnd.android.aac-adts"</code> (macro <code>SL_ANDROID_MIME_AACADTS</code>)</dd>
    693 </dl>
    694 <p>
    695 This feature is primarily intended for streaming media applications that
    696 deal with AAC audio, but need to apply custom processing of the audio
    697 prior to playback.  Most applications that need to decode audio to PCM
    698 should use the method of the previous section "Decode audio to PCM",
    699 as it is simpler and handles more audio formats.  The technique described
    700 here is a more specialized approach, to be used only if both of these
    701 conditions apply:
    702 <ul>
<li>the compressed audio source is a stream of AAC frames wrapped in ADTS headers
<li>the application manages this stream; that is, the data is <i>not</i> located within
a network resource identified by URI or within a local file identified by file descriptor.
    706 </ul>
    707 The application should initially enqueue a set of filled buffers to the Android buffer queue.
    708 Each buffer contains one or more complete ADTS AAC frames.
    709 The Android buffer queue callback is invoked after each buffer is emptied.
    710 The callback handler should re-fill and re-enqueue the buffer, and then return.
    711 The application need not keep track of encoded buffers; the callback parameter
    712 list does include sufficient information to indicate which buffer to enqueue next.
    713 The end of stream is explicitly marked by enqueuing an EOS item.
    714 After EOS, no more enqueues are permitted.
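<p>
Because each enqueued buffer must hold complete ADTS AAC frames, the
application typically walks its compressed stream using the frame length
carried in every ADTS header.  The following sketch extracts the 13-bit
<code>aac_frame_length</code> field, which spans bytes 3 to 5 of the
fixed header; this is standard ADTS framing, not an Android-specific API:

```c
#include <stddef.h>
#include <stdint.h>

/* Return the total length in bytes of the ADTS frame starting at p
 * (header included), or 0 if p does not point at a plausible frame. */
static size_t adts_frame_length(const uint8_t *p, size_t avail)
{
    if (avail < 7)                                /* fixed header is 7 bytes */
        return 0;
    if (p[0] != 0xFF || (p[1] & 0xF0) != 0xF0)    /* 12-bit sync word */
        return 0;
    size_t len = ((size_t)(p[3] & 0x03) << 11) |  /* aac_frame_length, 13 bits */
                 ((size_t)p[4] << 3) |
                 ((size_t)p[5] >> 5);
    return len <= avail ? len : 0;
}
```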
    715 <p>
    716 It is not recommended to starve the decoder by failing to provide full
    717 ADTS AAC buffers, e.g. by returning from the Android buffer queue callback
    718 without enqueueing another full buffer.  The result of decoder starvation
    719 is unspecified.
    720 <p>
    721 In all respects except for the data source, the streaming decode method is similar
    722 to that of the previous section:
    723 <ul>
    724 <li>initially enqueue a set of empty buffers to the Android simple buffer queue
    725 <li>the Android simple buffer queue callback is invoked after each buffer is filled with PCM data;
    726 the callback handler should process the PCM data and then re-enqueue another empty buffer
    727 <li>the <code>SL_PLAYEVENT_HEADATEND</code> event is delivered at end of stream
    728 <li>the actual PCM format should be detected using metadata rather than by making an assumption
    729 <li>the same limitations apply with respect to volume control, effects, etc.
    730 <li>starvation for lack of empty PCM buffers is not recommended
    731 </ul>
    732 <p>
    733 Despite the similarity in names, an Android buffer queue is <i>not</i>
    734 the same as an Android simple buffer queue.  The streaming decoder
    735 uses both kinds of buffer queues: an Android buffer queue for the ADTS
    736 AAC data source, and an Android simple buffer queue for the PCM data
    737 sink.  The Android simple buffer queue API is described in this document
    738 in section "Android simple buffer queue data locator and interface".
    739 The Android buffer queue API is described in the Android native media
    740 API documentation, located in <a href="../openmaxal/index.html">docs/openmaxal/index.html</a>.
    741 
    742 <h3>Determining the format of decoded PCM data via metadata</h3>
    743 
    744 The metadata extraction interface <code>SLMetadataExtractionItf</code>
    745 is a standard OpenSL ES 1.0.1 interface, not an Android extension.
    746 However, the particular metadata keys that
    747 indicate the actual format of decoded PCM data are specific to Android,
    748 and are defined in header <code>SLES/OpenSLES_AndroidMetadata.h</code>.
    749 <p>
The metadata key indices are available immediately after
<code>Object::Realize</code>. However, the associated values are not
available until after the first encoded data has been decoded.  A good
practice is to query for the key indices in the main thread after Realize,
and to read the PCM format metadata values in the Android simple
buffer queue callback handler the first time it is called.
    756 <p>
    757 The OpenSL ES 1.0.1 metadata extraction interface
    758 <code>SLMetadataExtractionItf</code> is admittedly cumbersome, as it
    759 requires a multi-step process to first determine key indices and then
    760 to get the key values.  Consult the example code for snippets showing
    761 how to work with this interface.
    762 <p>
Metadata key names are stable, but the key indices are not documented
and are subject to change.  An application should not assume that
indices persist across different execution runs, nor that they are
shared between different object instances within the same run.
    767 
    768 <h2>Programming notes</h2>
    769 
    770 These notes supplement the OpenSL ES 1.0.1 specification,
    771 available in the "References" section below.
    772 
    773 <h3>Objects and interface initialization</h3>
    774 
    775 Two aspects of the OpenSL ES programming model that may be unfamiliar
    776 to new developers are the distinction between objects and interfaces,
    777 and the initialization sequence.
    778 <p>
    779 Briefly, an OpenSL ES object is similar to the object concept
    780 in programming languages such as Java and C++, except an OpenSL ES
    781 object is <i>only</i> visible via its associated interfaces. This
    782 includes the initial interface for all objects, called
    783 <code>SLObjectItf</code>.  There is no handle for an object itself,
    784 only a handle to the <code>SLObjectItf</code> interface of the object.
    785 <p>
    786 An OpenSL ES object is first "created", which returns an
    787 <code>SLObjectItf</code>, then "realized". This is similar to the
    788 common programming pattern of first constructing an object (which
    789 should never fail other than for lack of memory or invalid parameters),
    790 and then completing initialization (which may fail due to lack of
    791 resources).  The realize step gives the implementation a
    792 logical place to allocate additional resources if needed.
    793 <p>
    794 As part of the API to create an object, an application specifies
    795 an array of desired interfaces that it plans to acquire later. Note
    796 that this array does <i>not</i> automatically acquire the interfaces;
    797 it merely indicates a future intention to acquire them.  Interfaces
    798 are distinguished as "implicit" or "explicit".  An explicit interface
    799 <i>must</i> be listed in the array if it will be acquired later.
    800 An implicit interface need not be listed in the object create array,
    801 but there is no harm in listing it there.  OpenSL ES has one more
    802 kind of interface called "dynamic", which does not need to be
    803 specified in the object create array, and can be added later after
    804 the object is created.  The Android implementation provides a
    805 convenience feature to avoid this complexity; see section "Dynamic
    806 interfaces at object creation" above.
    807 <p>
    808 After the object is created and realized, the application should
    809 acquire interfaces for each feature it needs, using
    810 <code>GetInterface</code> on the initial <code>SLObjectItf</code>.
    811 <p>
    812 Finally, the object is available for use via its interfaces, though
    813 note that some objects require further setup. In particular, an
    814 audio player with URI data source needs a bit more preparation in
    815 order to detect connection errors. See the next section
    816 "Audio player prefetch" for details.
    817 <p>
    818 After your application is done with the object, you should explicitly
    819 destroy it; see section "Destroy" below.
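<p>
For the engine object, the create / realize / get-interface sequence
described above can be sketched as follows (error checking elided; this
compiles only against the NDK's <code>SLES/OpenSLES.h</code> header and
is shown for illustration):

```c
SLObjectItf engineObject;
SLEngineItf engineItf;
SLresult result;

// create the engine object; no explicit interfaces are requested here
result = slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);

// realize it; SL_BOOLEAN_FALSE requests synchronous (blocking) realization
result = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);

// only after realization may interfaces other than SLObjectItf be acquired
result = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineItf);

// ... use engineItf to create an output mix, players, etc. ...

// when completely done, destroy the object; engineItf becomes undefined
(*engineObject)->Destroy(engineObject);
```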
    820 
    821 <h3>Audio player prefetch</h3>
    822 
    823 For an audio player with URI data source, <code>Object::Realize</code> allocates resources
    824 but does not connect to the data source (i.e. "prepare") or begin
    825 pre-fetching data. These occur once the player state is set to
    826 either <code>SL_PLAYSTATE_PAUSED</code> or <code>SL_PLAYSTATE_PLAYING</code>.
    827 <p>
    828 Note that some information may still be unknown until relatively
    829 late in this sequence. In particular, initially
<code>Play::GetDuration</code> will return <code>SL_TIME_UNKNOWN</code>
    831 and <code>MuteSolo::GetChannelCount</code> will either return successfully
    832 with channel count zero
    833 or the error result <code>SL_RESULT_PRECONDITIONS_VIOLATED</code>.
    834 These APIs will return the proper values once they are known.
    835 <p>
    836 Other properties that are initially unknown include the sample rate
    837 and actual media content type based on examining the content's header
    838 (as opposed to the application-specified MIME type and container type).
These, too, are determined later during prepare / prefetch, but there are
    840 no APIs to retrieve them.
    841 <p>
    842 The prefetch status interface is useful for detecting when all
information is available. Alternatively, your application can poll periodically.
    844 Note that some information may <i>never</i> be known, for example,
    845 the duration of a streaming MP3.
    846 <p>
    847 The prefetch status interface is also useful for detecting errors.
    848 Register a callback and enable at least the
    849 <code>SL_PREFETCHEVENT_FILLLEVELCHANGE</code> and
    850 <code>SL_PREFETCHEVENT_STATUSCHANGE</code> events. If both of these
    851 events are delivered simultaneously, and
    852 <code>PrefetchStatus::GetFillLevel</code> reports a zero level, and
    853 <code>PrefetchStatus::GetPrefetchStatus</code> reports
    854 <code>SL_PREFETCHSTATUS_UNDERFLOW</code>, then this indicates a
    855 non-recoverable error in the data source.
    856 This includes the inability to connect to the data source because
    857 the local filename does not exist or the network URI is invalid.
    858 <p>
    859 The next version of OpenSL ES is expected to add more explicit
    860 support for handling errors in the data source. However, for future
    861 binary compatibility, we intend to continue to support the current
    862 method for reporting a non-recoverable error.
    863 <p>
    864 In summary, a recommended code sequence is:
    865 <ul>
    866 <li>Engine::CreateAudioPlayer
<li>Object::Realize
    868 <li>Object::GetInterface for SL_IID_PREFETCHSTATUS
    869 <li>PrefetchStatus::SetCallbackEventsMask
    870 <li>PrefetchStatus::SetFillUpdatePeriod
    871 <li>PrefetchStatus::RegisterCallback
    872 <li>Object::GetInterface for SL_IID_PLAY
    873 <li>Play::SetPlayState to SL_PLAYSTATE_PAUSED or SL_PLAYSTATE_PLAYING
    874 <li>preparation and prefetching occur here; during this time your
    875 callback will be called with periodic status updates
    876 </ul>
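<p>
The error-detection part of this sequence might look like the following
registered callback (a sketch against the NDK's SLES headers; how the
handler signals the rest of the application is left to you):

```c
static void prefetch_callback(SLPrefetchStatusItf caller, void *context, SLuint32 event)
{
    SLpermille level = 0;
    SLuint32 status = 0;
    (*caller)->GetFillLevel(caller, &level);
    (*caller)->GetPrefetchStatus(caller, &status);
    // both events delivered together, a zero fill level, and underflow
    // status indicate a non-recoverable error in the data source
    if ((event & (SL_PREFETCHEVENT_FILLLEVELCHANGE | SL_PREFETCHEVENT_STATUSCHANGE))
            == (SL_PREFETCHEVENT_FILLLEVELCHANGE | SL_PREFETCHEVENT_STATUSCHANGE)
            && level == 0 && status == SL_PREFETCHSTATUS_UNDERFLOW) {
        // notify the application thread, e.g. via a condition variable
    }
}
```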
    877 
    878 <h3>Destroy</h3>
    879 
    880 Be sure to destroy all objects on exit from your application.  Objects
    881 should be destroyed in reverse order of their creation, as it is
    882 not safe to destroy an object that has any dependent objects.
    883 For example, destroy in this order: audio players and recorders,
    884 output mix, then finally the engine.
    885 <p>
    886 OpenSL ES does not support automatic garbage collection or
    887 <a href="http://en.wikipedia.org/wiki/Reference_counting">reference counting</a>
    888 of interfaces. After you call <code>Object::Destroy</code>, all extant
    889 interfaces derived from the associated object become <i>undefined</i>.
    890 <p>
    891 The Android OpenSL ES implementation does not detect the incorrect
    892 use of such interfaces.
    893 Continuing to use such interfaces after the object is destroyed will
    894 cause your application to crash or behave in unpredictable ways.
    895 <p>
    896 We recommend that you explicitly set both the primary object interface
    897 and all associated interfaces to NULL as part of your object
    898 destruction sequence, to prevent the accidental misuse of a stale
    899 interface handle.
    900 
    901 <h3>Stereo panning</h3>
    902 
    903 When <code>Volume::EnableStereoPosition</code> is used to enable
    904 stereo panning of a mono source, there is a 3 dB reduction in total
    905 <a href="http://en.wikipedia.org/wiki/Sound_power_level">
    906 sound power level</a>.  This is needed to permit the total sound
    907 power level to remain constant as the source is panned from one
    908 channel to the other. Therefore, don't enable stereo positioning
    909 if you don't need it.  See the Wikipedia article on
    910 <a href="http://en.wikipedia.org/wiki/Panning_(audio)">audio panning</a>
    911 for more information.
    912 
    913 <h3>Callbacks and threads</h3>
    914 
    915 Callback handlers are generally called <i>synchronously</i> with
    916 respect to the event, that is, at the moment and location where the
    917 event is detected by the implementation. But this point is
    918 <i>asynchronous</i> with respect to the application. Thus you should
    919 use a mutex or other synchronization mechanism to control access
    920 to any variables shared between the application and the callback
    921 handler. In the example code, such as for buffer queues, we have
    922 omitted this synchronization in the interest of simplicity. However,
    923 proper mutual exclusion would be critical for any production code.
    924 <p>
    925 Callback handlers are called from internal
    926 non-application thread(s) which are not attached to the Dalvik virtual machine and thus
    927 are ineligible to use JNI. Because these internal threads are
    928 critical to the integrity of the OpenSL ES implementation, a callback
    929 handler should also not block or perform excessive work. Therefore,
    930 if your callback handler needs to use JNI or do anything significant
    931 (e.g. beyond an <code>Enqueue</code> or something else simple such as the "Get"
    932 family), the handler should instead post an event for another thread
    933 to process.
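<p>
One common pattern is for the callback to post a small event code to a
wait-free queue that an ordinary application thread drains before doing
the heavy work (JNI, I/O, etc.).  A minimal single-producer,
single-consumer sketch in C11 (the names and capacity are illustrative,
not an NDK API):

```c
#include <stdatomic.h>

#define RING_CAPACITY 16               /* must be a power of two */
static int ring[RING_CAPACITY];
static atomic_uint ring_head;          /* next write position (callback) */
static atomic_uint ring_tail;          /* next read position (app thread) */

/* called from the OpenSL ES callback; never blocks, returns 0 if full */
static int post_event(int event)
{
    unsigned head = atomic_load_explicit(&ring_head, memory_order_relaxed);
    unsigned tail = atomic_load_explicit(&ring_tail, memory_order_acquire);
    if (head - tail == RING_CAPACITY)
        return 0;                      /* full: drop rather than block */
    ring[head & (RING_CAPACITY - 1)] = event;
    atomic_store_explicit(&ring_head, head + 1, memory_order_release);
    return 1;
}

/* called from an application thread; returns 0 if nothing is pending */
static int take_event(int *event)
{
    unsigned tail = atomic_load_explicit(&ring_tail, memory_order_relaxed);
    unsigned head = atomic_load_explicit(&ring_head, memory_order_acquire);
    if (head == tail)
        return 0;
    *event = ring[tail & (RING_CAPACITY - 1)];
    atomic_store_explicit(&ring_tail, tail + 1, memory_order_release);
    return 1;
}
```

The application thread that calls <code>take_event</code> is an ordinary
thread, so it may attach to the Dalvik VM and use JNI freely.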
    934 <p>
    935 Note that the converse is safe: a Dalvik application thread which has
    936 entered JNI is allowed to directly call OpenSL ES APIs, including
    937 those which block. However, blocking calls are not recommended from
    938 the main thread, as they may result in the dreaded "Application Not
    939 Responding" (ANR).
    940 
    941 <h3>Performance</h3>
    942 
As OpenSL ES is a native C API, non-Dalvik application threads which
call OpenSL ES have no Dalvik-related overhead such as garbage
collection pauses. Aside from the one exception described below, this
is the only performance benefit of using OpenSL ES. In particular,
using OpenSL ES does not guarantee lower audio latency, higher
scheduling priority, etc. than what the platform generally provides.
    949 On the other hand, as the Android platform and specific device
    950 implementations continue to evolve, an OpenSL ES application can
    951 expect to benefit from any future system performance improvements.
    952 <p>
    953 One such evolution is support for reduced audio output latency.
    954 The underpinnings for reduced output latency were first included in
the Android 4.1 platform release ("Jelly Bean"), and then continued
    956 progress occurred in the Android 4.2 platform.  These improvements
    957 are available via OpenSL ES for device implementations that claim feature
    958 "android.hardware.audio.low_latency". If the device doesn't claim this
    959 feature but supports API level 9 (Android platform version 2.3) or later,
    960 then you can still use the OpenSL ES APIs but the output latency may be higher.
    961 The lower output latency path is used only if the application requests a
    962 buffer count of 2 or more, and a buffer size and sample rate that are
    963 compatible with the device's native output configuration.
    964 These parameters are device-specific and should be obtained as follows.
    965 <p>
    966 Beginning with API level 17 (Android platform version 4.2), an application
    967 can query for the native or optimal output sample rate and buffer size
    968 for the device's primary output stream.  When combined with the feature
    969 test just mentioned, an app can now configure itself appropriately for
    970 lower latency output on devices that claim support.
    971 <p>
    972 The recommended sequence is:
    973 <ol>
    974 <li>Check for API level 9 or higher, to confirm use of OpenSL ES.
    975 <li>Check for feature "android.hardware.audio.low_latency" using code such as this:
    976 <pre>
    977 import android.content.pm.PackageManager;
    978 ...
    979 PackageManager pm = getContext().getPackageManager();
    980 boolean claimsFeature = pm.hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
    981 </pre>
    982 <li>Check for API level 17 or higher, to confirm use of
    983 <code>android.media.AudioManager.getProperty()</code>.
    984 <li>Get the native or optimal output sample rate and buffer size for this device's primary output
    985 stream, using code such as this:
    986 <pre>
    987 import android.media.AudioManager;
    988 ...
    989 AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
    992 </pre>
    993 Note that <code>sampleRate</code> and <code>framesPerBuffer</code>
    994 are <code>String</code>s.  First check for <code>null</code>
    995 and then convert to <code>int</code> using <code>Integer.parseInt()</code>.
    996 <li>Now use OpenSL ES to create an AudioPlayer with PCM buffer queue data locator.
    997 </ol>
    998 The number of lower latency audio players is limited. If your application
    999 requires more than a few audio sources, consider mixing your audio at
the application level.  Be sure to destroy your audio players when your
   1001 activity is paused, as they are a global resource shared with other apps.
   1002 <p>
   1003 To avoid audible glitches, the buffer queue callback handler must execute
   1004 within a small and predictable time window. This typically implies no
   1005 blocking on mutexes, conditions, or I/O operations. Instead consider
   1006 "try locks" and non-blocking algorithms. Avoid algorithms that
   1007 execute in a non-deterministic amount of time, or are "bursty" in
   1008 their computations.
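<p>
As one illustration of a "try lock", the callback-side reader below
never waits for a mutex held by another thread: if the lock is busy it
simply keeps the parameter values from the previous callback.  This is
a generic sketch, not specific to OpenSL ES:

```c
#include <pthread.h>

/* Parameters written by a UI thread and read by the audio callback. */
typedef struct { float volume; } Params;

static pthread_mutex_t params_lock = PTHREAD_MUTEX_INITIALIZER;
static Params shared_params = { 1.0f };

/* callback side: copy the parameters only if the lock is uncontended */
static void read_params(Params *local)
{
    if (pthread_mutex_trylock(&params_lock) == 0) {
        *local = shared_params;
        pthread_mutex_unlock(&params_lock);
    }
    /* on EBUSY, *local keeps its previous contents */
}

/* UI side: a short, bounded critical section */
static void write_params(const Params *p)
{
    pthread_mutex_lock(&params_lock);
    shared_params = *p;
    pthread_mutex_unlock(&params_lock);
}
```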
   1009 <p>
   1010 Lower latency audio is for these outputs only: on-device speaker, wired
   1011 headphones, wired headset, and line out.
   1012 
   1013 <h3>Security and permissions</h3>
   1014 
Security in Android is enforced at the process level: Java programming
language code can do no more than native code, and native code can do
no more than Java programming language code. The only differences
between them are the available APIs that provide functionality
which the platform promises to support in the future and across
different devices.
   1021 <p>
   1022 Applications using OpenSL ES must request whatever permissions they
   1023 would need for similar non-native APIs. For example, if your application
   1024 records audio, then it needs the <code>android.permission.RECORD_AUDIO</code>
   1025 permission. Applications that use audio effects need
   1026 <code>android.permission.MODIFY_AUDIO_SETTINGS</code>. Applications that play
network URI resources need <code>android.permission.INTERNET</code>.
   1028 <p>
   1029 Depending on the platform version and implementation,
   1030 media content parsers and software codecs may run within the context
   1031 of the Android application that calls OpenSL ES (hardware codecs
   1032 are abstracted, but are device-dependent). Malformed content
   1033 designed to exploit parser and codec vulnerabilities is a known attack
   1034 vector. We recommend that you play media only from trustworthy
   1035 sources, or that you partition your application such that code that
   1036 handles media from untrustworthy sources runs in a relatively
   1037 sandboxed environment.  For example you could process media from
   1038 untrustworthy sources in a separate process. Though both processes
   1039 would still run under the same UID, this separation does make an
   1040 attack more difficult.
   1041 
   1042 <h2>Platform issues</h2>
   1043 
   1044 This section describes known issues in the initial platform
   1045 release which supports these APIs.
   1046 
   1047 <h3>Dynamic interface management</h3>
   1048 
   1049 <code>DynamicInterfaceManagement::AddInterface</code> does not work.
   1050 Instead, specify the interface in the array passed to Create, as
   1051 shown in the example code for environmental reverb.
   1052 
   1053 <h2>References and resources</h2>
   1054 
   1055 Android:
   1056 <ul>
   1057 <li><a href="http://developer.android.com/resources/index.html">
   1058 Android developer resources</a>
   1059 <li><a href="http://groups.google.com/group/android-developers">
   1060 Android developers discussion group</a>
   1061 <li><a href="http://developer.android.com/sdk/ndk/index.html">Android NDK</a>
   1062 <li><a href="http://groups.google.com/group/android-ndk">
   1063 Android NDK discussion group</a> (for developers of native code, including OpenSL ES)
   1064 <li><a href="http://code.google.com/p/android/issues/">
   1065 Android open source bug database</a>
   1066 </ul>
   1067 
   1068 Khronos Group:
   1069 <ul>
   1070 <li><a href="http://www.khronos.org/opensles/">
   1071 Khronos Group OpenSL ES Overview</a>
   1072 <li><a href="http://www.khronos.org/registry/sles/">
   1073 Khronos Group OpenSL ES 1.0.1 specification</a>
   1074 <li><a href="http://www.khronos.org/message_boards/viewforum.php?f=15">
   1075 Khronos Group public message board for OpenSL ES</a>
   1076 (please limit to non-Android questions)
   1077 </ul>
   1078 For convenience, we have included a copy of the OpenSL ES 1.0.1
   1079 specification with the NDK in
   1080 <code>docs/opensles/OpenSL_ES_Specification_1.0.1.pdf</code>.
   1081 
   1082 <p>
   1083 Miscellaneous:
   1084 <ul>
   1085 <li><a href="http://en.wikipedia.org/wiki/Java_Native_Interface">JNI</a>
   1086 <li><a href="http://stackoverflow.com/search?q=android+audio">
   1087 Stack Overflow</a>
   1088 <li>web search for "interactive audio", "game audio", "sound design",
   1089 "audio programming", "audio content", "audio formats", etc.
   1090 <li><a href="http://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a>
   1091 </ul>
   1092 
   1093 </body>
   1094 </html>
   1095