page.title=Android 4.3 APIs
excludeFromSuggestions=true
sdk.platform.version=4.3
sdk.platform.apiLevel=18
@jd:body


<div id="qv-wrapper">
<div id="qv">

<h2>In this document
<a href="#" onclick="hideNestedItems('#toc43',this);return false;" class="header-toggle">
<span class="more">show more</span>
<span class="less" style="display:none">show less</span></a></h2>

<ol id="toc43" class="hide-nested">
  <li><a href="#ApiLevel">Update your target API level</a></li>
  <li><a href="#Behaviors">Important Behavior Changes</a>
    <ol>
      <li><a href="#BehaviorsIntents">If your app uses implicit intents...</a></li>
      <li><a href="#BehaviorsAccounts">If your app depends on accounts...</a></li>
      <li><a href="#BehaviorsVideoView">If your app uses VideoView...</a></li>
    </ol>
  </li>
  <li><a href="#RestrictedProfiles">Restricted Profiles</a>
    <ol>
      <li><a href="#AccountsInProfile">Supporting accounts in a restricted profile</a></li>
    </ol>
  </li>
  <li><a href="#Wireless">Wireless and Connectivity</a>
    <ol>
      <li><a href="#BTLE">Bluetooth Low Energy (Smart Ready)</a></li>
      <li><a href="#WiFiScan">Wi-Fi scan-only mode</a></li>
      <li><a href="#WiFiConfig">Wi-Fi configuration</a></li>
      <li><a href="#QuickResponse">Quick response for incoming calls</a></li>
    </ol>
  </li>
  <li><a href="#Multimedia">Multimedia</a>
    <ol>
      <li><a href="#MediaExtractor">MediaExtractor and MediaCodec enhancements</a></li>
      <li><a href="#DRM">Media DRM</a></li>
      <li><a href="#EncodingSurface">Video encoding from a Surface</a></li>
      <li><a href="#MediaMuxing">Media muxing</a></li>
      <li><a href="#ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</a></li>
    </ol>
  </li>
  <li><a href="#Graphics">Graphics</a>
    <ol>
      <li><a href="#OpenGL">Support for OpenGL ES 3.0</a></li>
      <li><a href="#MipMap">Mipmapping for drawables</a></li>
    </ol>
  </li>
  <li><a href="#UI">User Interface</a>
    <ol>
      <li><a href="#ViewOverlay">View overlays</a></li>
      <li><a href="#OpticalBounds">Optical bounds layout</a></li>
      <li><a href="#AnimationRect">Animation for Rect values</a></li>
      <li><a href="#AttachFocus">Window attach and focus listener</a></li>
      <li><a href="#Overscan">TV overscan support</a></li>
      <li><a href="#Orientation">Screen orientation</a></li>
      <li><a href="#RotationAnimation">Rotation animations</a></li>
    </ol>
  </li>
  <li><a href="#UserInput">User Input</a>
    <ol>
      <li><a href="#Sensors">New sensor types</a></li>
    </ol>
  </li>
  <li><a href="#NotificationListener">Notification Listener</a></li>
  <li><a href="#Contacts">Contacts Provider</a>
    <ol>
      <li><a href="#Contactables">Query for "contactables"</a></li>
      <li><a href="#ContactsDelta">Query for contacts deltas</a></li>
    </ol>
  </li>
  <li><a href="#Localization">Localization</a>
    <ol>
      <li><a href="#BiDi">Improved support for bi-directional text</a></li>
    </ol>
  </li>
  <li><a href="#A11yService">Accessibility Services</a>
    <ol>
      <li><a href="#A11yKeyEvents">Handle key events</a></li>
      <li><a href="#A11yText">Select text and copy/paste</a></li>
      <li><a href="#A11yFeatures">Declare accessibility features</a></li>
    </ol>
  </li>
  <li><a href="#Testing">Testing and Debugging</a>
    <ol>
      <li><a href="#UiAutomation">Automated UI testing</a></li>
      <li><a href="#Systrace">Systrace events for apps</a></li>
    </ol>
  </li>
  <li><a href="#Security">Security</a>
    <ol>
      <li><a href="#KeyStore">Android key store for app-private keys</a></li>
      <li><a href="#HardwareKeyChain">Hardware credential storage</a></li>
    </ol>
  </li>
  <li><a href="#Manifest">Manifest Declarations</a>
    <ol>
      <li><a href="#ManifestFeatures">Declarable required features</a></li>
      <li><a href="#ManifestPermissions">User permissions</a></li>
    </ol>
  </li>
</ol>

<h2>See also</h2>
<ol>
  <li><a
href="{@docRoot}sdk/api_diff/18/changes.html">API
Differences Report »</a></li>
  <li><a
href="{@docRoot}tools/support-library/index.html">Support Library</a></li>
</ol>

</div>
</div>



<p>API Level: {@sdkPlatformApiLevel}</p>

<p>Android {@sdkPlatformVersion} ({@link android.os.Build.VERSION_CODES#JELLY_BEAN_MR2})
is an update to the Jelly Bean release that offers new features for users and app
developers. This document provides an introduction to the most notable new APIs.</p>

<p>As an app developer, you should download the Android {@sdkPlatformVersion} system image
and SDK platform from the <a href="{@docRoot}tools/help/sdk-manager.html">SDK Manager</a> as
soon as possible. If you don't have a device running Android {@sdkPlatformVersion} on which to
test your app, use the Android {@sdkPlatformVersion} system image to test your app on the
<a href="{@docRoot}tools/devices/emulator.html">Android emulator</a>.
Then build your apps against the Android {@sdkPlatformVersion} platform to begin using the
latest APIs.</p>


<h3 id="ApiLevel">Update your target API level</h3>

<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
you should set your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
<code>"{@sdkPlatformApiLevel}"</code>, install it on an Android {@sdkPlatformVersion} system image,
test it, then publish an update with this change.</p>

<p>You can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
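</p>

<p>For example, such a version check might look like the following sketch ({@link
android.os.Build.VERSION_CODES#JELLY_BEAN_MR2} is the constant for API level 18):</p>

<pre>
// Run the Android 4.3 code path only on devices that support it.
if (Build.VERSION.SDK_INT &gt;= Build.VERSION_CODES.JELLY_BEAN_MR2) {
    // Safe to call APIs introduced in API level 18.
} else {
    // Fall back to behavior supported by your minSdkVersion.
}
</pre>

<p>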
To learn more about maintaining backward compatibility, read <a
href="{@docRoot}training/basics/supporting-devices/platforms.html">Supporting Different
Platform Versions</a>.</p>

<p>Various APIs are also available in the Android <a
href="{@docRoot}tools/support-library/index.html">Support Library</a> that allow you to implement
new features on older versions of the platform.</p>

<p>For more information about how API levels work, read <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>


<h2 id="Behaviors">Important Behavior Changes</h2>

<p>If you have previously published an app for Android, be aware that your app might
be affected by changes in Android {@sdkPlatformVersion}.</p>


<h3 id="BehaviorsIntents">If your app uses implicit intents...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
have all the standard Android apps available. For example, a restricted profile might have the
web browser and camera app disabled. So your app should not make assumptions about which apps are
available, because if you call {@link android.app.Activity#startActivity startActivity()} without
verifying whether an app is available to handle the {@link android.content.Intent},
your app might crash in a restricted profile.</p>

<p>When using an implicit intent, you should always verify that an app is available to handle the
intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link
android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>

<pre>
Intent intent = new Intent(Intent.ACTION_SEND);
...
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivity(intent);
} else {
    Toast.makeText(context, R.string.app_not_available, Toast.LENGTH_LONG).show();
}
</pre>


<h3 id="BehaviorsAccounts">If your app depends on accounts...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users within a restricted profile environment do not have access to user accounts by default.
If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
unexpectedly when used in a restricted profile.</p>

<p>If you'd like to prevent restricted profiles from using your app entirely because your
app depends on account information that's sensitive, specify the <a
href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute in your manifest's <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application&gt;}</a>
element.</p>

<p>If you'd like to allow restricted profiles to continue using your app even though they can't
create their own accounts, then you can either disable your app features that require an account
or allow restricted profiles to access the accounts created by the primary user. For more
information, see the section below about
<a href="#AccountsInProfile">Supporting accounts in a restricted profile</a>.</p>


<h3 id="BehaviorsVideoView">If your app uses VideoView...</h3>

<p>Your video might appear smaller on Android 4.3.</p>

<p>On previous versions of Android, the {@link android.widget.VideoView} widget incorrectly
calculated the {@code "wrap_content"} value for {@link android.R.attr#layout_height} and {@link
android.R.attr#layout_width} to be the same as {@code "match_parent"}.
So while using {@code "wrap_content"} for the height or width may have previously provided
your desired video layout, doing so may result in a much smaller video on Android 4.3 and higher.
To fix the issue, replace {@code "wrap_content"} with {@code "match_parent"} and verify your video
appears as expected on Android 4.3 as well as on older versions.</p>


<h2 id="RestrictedProfiles">Restricted Profiles</h2>

<p>On Android tablets, users can now create restricted profiles based on the primary user.
When users create a restricted profile, they can enable restrictions such as which apps are
available to the profile. A new set of APIs in Android 4.3 also allows you to build fine-grained
restriction settings for the apps you develop. For example, by using the new APIs, you can
allow users to control what type of content is available within your app when running in a
restricted profile environment.</p>

<p>The UI for users to control the restrictions you've built is managed by the system's
Settings application. To make your app's restriction settings appear to the user,
you must declare the restrictions your app provides by creating a {@link
android.content.BroadcastReceiver} that receives the {@link
android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system invokes this intent
to query all apps for available restrictions, then builds the UI to allow the primary user to
manage restrictions for each restricted profile.</p>

<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
your {@link android.content.BroadcastReceiver}, you must create a {@link
android.content.RestrictionEntry} for each restriction your app provides.
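</p>

<p>A minimal receiver might be sketched like the following (the restriction key and title here
are hypothetical examples):</p>

<pre>
public class GetRestrictionsReceiver extends BroadcastReceiver {
    &#64;Override
    public void onReceive(Context context, Intent intent) {
        // One entry per restriction; the key identifies the value later.
        ArrayList&lt;RestrictionEntry&gt; entries = new ArrayList&lt;RestrictionEntry&gt;();
        RestrictionEntry allowPurchases =
                new RestrictionEntry("allow_purchases", false); // a boolean restriction
        allowPurchases.setTitle("Allow purchases");
        entries.add(allowPurchases);

        // Return the list to the system as the broadcast result.
        Bundle result = new Bundle();
        result.putParcelableArrayList(Intent.EXTRA_RESTRICTIONS_LIST, entries);
        setResultExtras(result);
    }
}
</pre>

<p>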
Each {@link android.content.RestrictionEntry} defines a restriction title, description, and
one of the following data types:</p>

<ul>
  <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
either true or false.
  <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
multiple choices that are mutually exclusive (radio button choices).
  <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
has multiple choices that are <em>not</em> mutually exclusive (checkbox choices).
</ul>

<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
{@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>

<p>The system creates the UI for your app's restrictions in the Settings app and saves each
restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
object. When the user opens your app, you can query for any current restrictions by
calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
you defined with the {@link android.content.RestrictionEntry} objects.</p>

<p>If you want to provide more specific restrictions that can't be handled by boolean, single
choice, and multi-choice values, then you can create an activity where the user can specify the
restrictions and allow users to open that activity from the restriction settings. In your
broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
in the result {@link android.os.Bundle}.
This extra must specify an {@link android.content.Intent}
indicating the {@link android.app.Activity} class to launch (use the
{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
android.content.Intent#EXTRA_RESTRICTIONS_INTENT} with the intent).
When the primary user enters your activity to set custom restrictions, your
activity must then return a result containing the restriction values in an extra using either
the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
android.content.Intent#EXTRA_RESTRICTIONS_BUNDLE} key, depending on whether you specify
{@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>


<h3 id="AccountsInProfile">Supporting accounts in a restricted profile</h3>

<p>Any accounts added to the primary user are available to a restricted profile, but the
accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
profile, you will get a failure result. Due to these restrictions, you have the following
three options:</p>

<li><strong>Allow access to the owner's accounts from a restricted profile.</strong>
<p>To get access to an account from a restricted profile, you must add the <a
href="{@docRoot}guide/topics/manifest/application-element.html#restrictedAccountType">{@code
android:restrictedAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application&gt;</a> tag:</p>
<pre>
&lt;application ...
    android:restrictedAccountType="com.example.account.type" &gt;
</pre>

<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
app access to the primary user's accounts from restricted profiles.
So you should allow this
only if the information displayed by your app does not reveal personally identifiable
information (PII) that's considered sensitive. The system settings will inform the primary
user that your app grants restricted profiles access to their accounts, so it should be clear
to the user that account access is important for your app's functionality. If possible, you
should also provide adequate restriction controls for the primary user that define how much
account access is allowed in your app.</p>
</li>


<li><strong>Disable certain functionality when unable to modify accounts.</strong>
<p>If you want to use accounts, but don't actually require them for your app's primary
functionality, you can check for account availability and disable features when not available.
You should first check if there is an existing account available. If not, then query whether
it's possible to create a new account by calling {@link
android.os.UserManager#getUserRestrictions()} and checking the {@link
android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} value in the result. If it is {@code true},
then you should disable whatever functionality of your app requires access to accounts.
For example:</p>
<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getUserRestrictions();
if (restrictions.getBoolean(UserManager.DISALLOW_MODIFY_ACCOUNTS, false)) {
   // Cannot add accounts, so disable some functionality
}
</pre>
<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
any new attributes in your manifest file.</p>
</li>

<li><strong>Disable your app when unable to access private accounts.</strong>
<p>If it's instead important that your app not be available to restricted profiles because
your app depends on sensitive personal information in an account (and because restricted profiles
currently cannot add new accounts), add
the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application&gt;</a> tag:</p>
<pre>
&lt;application ...
    android:requiredAccountType="com.example.account.type" &gt;
</pre>
<p>For example, the Gmail app uses this attribute to disable itself for restricted profiles,
because the owner's personal email should not be available to restricted profiles.</p>
</li>



<h2 id="Wireless">Wireless and Connectivity</h2>


<h3 id="BTLE">Bluetooth Low Energy (Smart Ready)</h3>

<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
peripherals such as heart rate monitors and pedometers.</p>

<p>Because Bluetooth LE is a hardware feature that is not available on all
Android-powered devices, you must declare in your manifest file a <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature&gt;}</a>
element for {@code "android.hardware.bluetooth_le"}:</p>
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" /&gt;
</pre>

<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
Bluetooth LE APIs has some differences. The most important difference is that there's now a {@link
android.bluetooth.BluetoothManager} class that you should use for some high-level operations
such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
devices, and checking the state of a device. For example, here's how you should now get the
{@link android.bluetooth.BluetoothAdapter}:</p>
<pre>
final BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
mBluetoothAdapter = bluetoothManager.getAdapter();
</pre>

<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
adapter detects a Bluetooth LE peripheral, your {@link
android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method.
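</p>

<p>A scan might be sketched like this (when and how you stop the scan is up to your app):</p>

<pre>
private BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    &#64;Override
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        // Called for each advertising LE peripheral that is detected.
    }
};

// Begin scanning for LE peripherals.
mBluetoothAdapter.startLeScan(mLeScanCallback);
</pre>

<p>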
This method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
detected device, the RSSI value for the device, and a byte array containing the device's
advertisement record.</p>

<p>If you want to scan for only specific types of peripherals, you can instead call {@link
android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link
java.util.UUID} objects that specify the GATT services your app supports.</p>

<p class="note"><strong>Note:</strong> You can only scan for Bluetooth LE devices <em>or</em>
scan for Classic Bluetooth devices using previous APIs. You cannot scan for both LE and Classic
Bluetooth devices at once.</p>

<p>To then connect to a Bluetooth LE peripheral, call {@link
android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding
{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
{@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link
android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity
state with the device and other events. It's during the {@link
android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()}
callback that you can begin communicating with the device if the method passes {@link
android.bluetooth.BluetoothProfile#STATE_CONNECTED} as the new state.</p>

<p>Accessing Bluetooth features on a device also requires that your app request certain
Bluetooth user permissions. For more information, see the <a
href="{@docRoot}guide/topics/connectivity/bluetooth-le.html">Bluetooth Low Energy</a> API guide.</p>


<h3 id="WiFiScan">Wi-Fi scan-only mode</h3>

<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
the location by scanning nearby access points.
However, users often keep Wi-Fi turned off to
conserve battery, resulting in location data that's less accurate. Android now includes a
scan-only mode that allows the device's Wi-Fi radio to scan for access points to help obtain the
location without connecting to an access point, thus greatly reducing battery usage.</p>

<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request that the
user enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
startActivity()} with the action {@link
android.net.wifi.WifiManager#ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE}.</p>


<h3 id="WiFiConfig">Wi-Fi configuration</h3>

<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
automate Wi-Fi configuration for managed devices.</p>


<h3 id="QuickResponse">Quick response for incoming calls</h3>

<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
calls with an immediate text message without needing to pick up the call or unlock the device.
Until now, these quick messages were always handled by the default Messaging app. Now any app
can declare its capability to handle these messages by creating a {@link android.app.Service}
with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>

<p>When the user responds to an incoming call with a quick response, the Phone app sends
the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
with the message the user wants to send.
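</p>

<p>One way to sketch such a service is with an {@link android.app.IntentService}, which stops
itself when the work is done (the class name is hypothetical, and the actual message delivery is
left to your messaging implementation):</p>

<pre>
public class RespondViaMessageService extends IntentService {
    public RespondViaMessageService() {
        super("RespondViaMessageService");
    }

    &#64;Override
    protected void onHandleIntent(Intent intent) {
        if (TelephonyManager.ACTION_RESPOND_VIA_MESSAGE.equals(intent.getAction())) {
            Uri recipient = intent.getData();                          // the caller
            String message = intent.getStringExtra(Intent.EXTRA_TEXT); // text to send
            // Deliver the message with your messaging stack here.
        }
    }
}
</pre>

<p>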
When your service receives the intent, it should deliver
the message and immediately stop itself (your app should not show an activity).</p>

<p>In order to receive this intent, you must declare the {@link
android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE} permission.</p>



<h2 id="Multimedia">Multimedia</h2>

<h3 id="MediaExtractor">MediaExtractor and MediaCodec enhancements</h3>

<p>Android now makes it easier for you to write your own Dynamic Adaptive
Streaming over HTTP (DASH) players in accordance with the ISO/IEC 23009-1 standard,
using existing APIs in {@link android.media.MediaCodec} and {@link
android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
and passing the individual streams to {@link android.media.MediaExtractor}.</p>

<p>If you want to use DASH with encrypted content, notice that the {@link
android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
This method returns a map of {@link java.util.UUID} objects to bytes, with the
{@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
to that scheme.</p>


<h3 id="DRM">Media DRM</h3>

<p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
management (DRM) with your media content by separating DRM concerns from media playback.
For instance, this API separation allows you to play back Widevine-encrypted content without
having to use the Widevine media format. This DRM solution also supports DASH Common Encryption
so you can use a variety of DRM schemes with your streaming content.</p>

<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
key-response messages from the server for license acquisition and provisioning. Your app is
responsible for handling the network communication with the servers; the {@link
android.media.MediaDrm} class provides only the ability to generate and process the messages.</p>

<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
for extracting and demuxing your content.</p>

<p>You must first construct {@link android.media.MediaExtractor} and
{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
{@link java.util.UUID}, typically from metadata in the content, and use it to construct an
instance of a {@link android.media.MediaDrm} object with its constructor.</p>


<h3 id="EncodingSurface">Video encoding from a Surface</h3>

<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
the media with a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
android.view.Surface} as the input to an encoder.
For instance, this allows you to encode input
from an existing video file or from frames generated in OpenGL ES.</p>

<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
android.view.Surface} upon which you can stream your media.</p>

<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
eglCreateWindowSurface()}. Then while rendering the surface, call {@link
android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
android.media.MediaCodec}.</p>

<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
to terminate encoding, and call {@link android.view.Surface#release()} on the
{@link android.view.Surface}.</p>


<h3 id="MediaMuxing">Media muxing</h3>

<p>The new {@link android.media.MediaMuxer} class enables multiplexing between one audio stream
and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
class added in Android 4.2 for de-multiplexing (demuxing) media.</p>

<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}.
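</p>

<p>As a sketch, writing encoded samples to an MP4 file might look like the following (the output
path is just an example; the {@code videoFormat}, {@code encodedBuffer}, and {@code bufferInfo}
variables are assumed to come from your encoder, and the constructor can throw an
{@code IOException} you must handle):</p>

<pre>
MediaMuxer muxer = new MediaMuxer("/sdcard/output.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrack = muxer.addTrack(videoFormat); // MediaFormat from your encoder
muxer.start();
// For each encoded sample produced by your codec:
muxer.writeSampleData(videoTrack, encodedBuffer, bufferInfo);
muxer.stop();
muxer.release();
</pre>

<p>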
Currently, MP4 is the only supported output format, and {@link android.media.MediaMuxer} currently
supports only one audio stream and/or one video stream at a time.</p>

<p>{@link android.media.MediaMuxer} is designed primarily to work with {@link
android.media.MediaCodec} so you can perform video processing through {@link
android.media.MediaCodec}, then save the output to an MP4 file through {@link
android.media.MediaMuxer}. You can also use {@link android.media.MediaMuxer} in combination
with {@link android.media.MediaExtractor} to perform media editing without the need to encode
or decode.</p>


<h3 id="ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</h3>

<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} class was added to
enable media playback controls from remote control clients such as the controls available on the
lock screen. Android 4.3 now provides the ability for such controllers to display the playback
position and controls for scrubbing the playback.
If you've enabled remote control for your
media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
scrubbing by implementing two new interfaces.</p>

<p>First, you must enable the {@link
android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
{@link android.media.RemoteControlClient#setTransportControlFlags
setTransportControlFlags()}.</p>

<p>Then implement the following two new interfaces:</p>
<dl>
  <dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
  <dd>This includes the callback {@link
android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition
onGetPlaybackPosition()}, which requests the current position
of your media when the remote control needs to update the progress in its UI.</dd>

  <dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
  <dd>This includes the callback {@link
android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate
onPlaybackPositionUpdate()}, which
tells your app the new time code for your media when the user scrubs the playback with the
remote control UI.
  <p>Once you update your playback with the new position, call {@link
android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
new playback state, position, and speed.</p>
  </dd>
</dl>

<p>With these interfaces defined, you can set them for your {@link
android.media.RemoteControlClient} by calling {@link
android.media.RemoteControlClient#setOnGetPlaybackPositionListener
setOnGetPlaybackPositionListener()} and
{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
setPlaybackPositionUpdateListener()}, respectively.</p>



<h2 id="Graphics">Graphics</h2>

<h3 id="OpenGL">Support for OpenGL ES 3.0</h3>

<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality
provided in OpenGL ES 3.0 includes:</p>
<ul>
  <li>Acceleration of advanced visual effects</li>
  <li>High quality ETC2/EAC texture compression as a standard feature</li>
  <li>A new version of the GLSL ES shading language with integer and 32-bit floating point support</li>
  <li>Advanced texture rendering</li>
  <li>Broader standardization of texture size and render-buffer formats</li>
</ul>

<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the
<a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">&lt;uses-feature&gt;</a>
tag and the {@code android:glEsVersion} attribute. For example:</p>
<pre>
&lt;manifest&gt;
    &lt;uses-feature android:glEsVersion="0x00030000" /&gt;
    ...
</manifest>
</pre>

<p>And remember to specify the OpenGL ES context by calling {@link
android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()},
passing {@code 3} as the version.</p>

<p>For more information about using OpenGL ES, including how to check the device's supported
OpenGL ES version at runtime, see the <a href="{@docRoot}guide/topics/graphics/opengl.html"
>OpenGL ES</a> API guide.</p>


<h3 id="MipMap">Mipmapping for drawables</h3>

<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a
quality image at various image scales, which can be particularly useful if you expect your
image to be scaled during an animation.</p>

<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
class: Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
supplied a mipmap source and have enabled {@link android.graphics.Bitmap#setHasMipMap
setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and
setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
android.graphics.drawable.BitmapDrawable#setMipMap setMipMap()}.
</p>



<h2 id="UI">User Interface</h2>

<h3 id="ViewOverlay">View overlays</h3>

<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
a {@link android.view.View} on which you can add visual content and which does not affect
the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
android.view.View} by calling {@link android.view.View#getOverlay}.
The overlay
always has the same size and position as its host view (the view from which it was created),
allowing you to add content that appears in front of the host view, but which cannot extend
the bounds of that host view.
</p>

<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
animations such as sliding a view outside of its container or moving items around the screen
without affecting the view hierarchy. However, because the usable area of an overlay is
restricted to the same area as its host view, if you want to animate a view moving outside
its position in the layout, you must use an overlay from a parent view that has the desired
layout bounds.</p>

<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
the object returned is a {@link android.view.ViewGroupOverlay}. The
{@link android.view.ViewGroupOverlay} class is a subclass
of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
</p>

<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay
are visual only.
They cannot receive focus or input events.</p>

<p>For example, the following code animates a view sliding to the right by placing the view
in the parent view's overlay, then performing a translation animation on that view:</p>
<pre>
View view = findViewById(R.id.view_to_remove);
ViewGroup container = (ViewGroup) view.getParent();
container.getOverlay().add(view);
ObjectAnimator anim = ObjectAnimator.ofFloat(view, "translationX", container.getRight());
anim.start();
</pre>


<h3 id="OpticalBounds">Optical bounds layout</h3>

<p>For views that contain nine-patch background images, you can now specify that they should
be aligned with neighboring views based on the "optical" bounds of the background image rather
than the "clip" bounds of the view.</p>

<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is
using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the
nine-patch images used for the button and the photo frame include padding around the edges,
they don't appear to align with each other or the text when using clip bounds.</p>

<p class="note"><strong>Note:</strong> The screenshots in figures 1 and 2 have the "Show
layout bounds" developer setting enabled.
For each view, red lines indicate the optical
bounds, blue lines indicate the clip bounds, and pink indicates margins.</p>

<script type="text/javascript">
function toggleOpticalImages(mouseover) {

  $("img.optical-img").each(function() {
    $img = $(this);
    var index = $img.attr('src').lastIndexOf("/") + 1;
    var path = $img.attr('src').substr(0,index);
    var name = $img.attr('src').substr(index);
    var splitname;
    var highres = false;
    if (name.indexOf("@2x") != -1) {
      splitname = name.split("@2x.");
      highres = true;
    } else {
      splitname = name.split(".");
    }

    var newname;
    if (mouseover) {
      if (highres) {
        newname = splitname[0] + "-normal@2x.png";
      } else {
        newname = splitname[0] + "-normal.png";
      }
    } else {
      if (highres) {
        newname = splitname[0].split("-normal")[0] + "@2x.png";
      } else {
        newname = splitname[0].split("-normal")[0] + ".png";
      }
    }

    $img.attr('src', path + newname);

  });
}
</script>

<p class="table-caption"><em>Mouse over to hide the layout bounds.</em></p>
<div style="float:left;width:296px">
  <img src="{@docRoot}images/tools/clipbounds@2x.png" width="296" alt="" class="optical-img"
    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
  <p class="img-caption"><strong>Figure 1.</strong> Layout using clip bounds (default).</p>
</div>
<div style="float:left;width:296px;margin-left:60px">
  <img src="{@docRoot}images/tools/opticalbounds@2x.png" width="296" alt="" class="optical-img"
    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
  <p class="img-caption"><strong>Figure 2.</strong> Layout using optical bounds.</p>
</div>


<p style="clear:left">To align the views based on their optical bounds, set the {@code
android:layoutMode} attribute to {@code "opticalBounds"} in one of the parent layouts.
For example:</p>

<pre>
<LinearLayout android:layoutMode="opticalBounds" ... >
</pre>


<div class="figure" style="width:155px">
  <img src="{@docRoot}images/tools/ninepatch_opticalbounds@2x.png" width="121" alt="" />
  <p class="img-caption"><strong>Figure 3.</strong> Zoomed view of the Holo button nine-patch with
optical bounds.
  </p>
</div>

<p>For this to work, the nine-patch images applied to the background of your views must specify
the optical bounds using red lines along the bottom and right side of the nine-patch file (as
shown in figure 3). The red lines indicate the region that should be subtracted from
the clip bounds, leaving the optical bounds of the image.</p>

<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
descendant views inherit the optical bounds layout mode unless you override it for a group by
setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the
optical bounds of their child views, adapting their own bounds based on the optical bounds of
the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
currently do not support optical bounds for nine-patch images applied to their own background.</p>

<p>If you create a custom view by subclassing {@link android.view.View}, {@link
android.view.ViewGroup}, or any subclasses thereof, your view will inherit these optical
bound behaviors.</p>

<p class="note"><strong>Note:</strong> All widgets supported by the Holo theme have been updated
with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
{@link android.widget.EditText}, and others.
So you can immediately benefit by setting the
{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme
({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
Theme.Holo.Light}, etc.).
</p>

<p>To specify optical bounds for your own nine-patch images with the <a
href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on
the border pixels.</p>


<h3 id="AnimationRect">Animation for Rect values</h3>

<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
android.animation.RectEvaluator}. This new class is an implementation of {@link
android.animation.TypeEvaluator} that you can pass to {@link
android.animation.ValueAnimator#setEvaluator ValueAnimator.setEvaluator()}.
</p>

<h3 id="AttachFocus">Window attach and focus listener</h3>

<p>Previously, if you wanted to listen for when your view attached to or detached from the
window, or when its focus changed, you needed to override the {@link android.view.View} class to
implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
android.view.View#onWindowFocusChanged onWindowFocusChanged()}, respectively.
</p>

<p>Now, to receive attach and detach events you can instead implement {@link
android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
And to receive focus events, you can implement {@link
android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
addOnWindowFocusChangeListener()}.
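</p>

<p>For example, here's a sketch that listens for attach and detach events on an existing
view:</p>
<pre>
View view = findViewById(R.id.my_view);
view.getViewTreeObserver().addOnWindowAttachListener(
        new ViewTreeObserver.OnWindowAttachListener() {
    public void onWindowAttached() {
        // The view's window now exists; safe to begin window-dependent work
    }
    public void onWindowDetached() {
        // The window is gone; release any window-dependent resources
    }
});
</pre>

<p>Because these listeners live on the {@link android.view.ViewTreeObserver}, you no longer
need to subclass the view to observe these events.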
</p>


<h3 id="Overscan">TV overscan support</h3>

<p>To be sure your app fills the entire screen on every television, you can now enable overscan
for your app layout. Overscan mode is determined by the {@link
android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable
with platform themes such as
{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
{@link android.R.attr#windowOverscan} style in a custom theme.</p>


<h3 id="Orientation">Screen orientation</h3>

<p>The <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code <activity>}</a>
tag's <a
href="{@docRoot}guide/topics/manifest/activity-element.html#screen">{@code screenOrientation}</a>
attribute now supports additional values to honor the user's preference for auto-rotation:</p>
<dl>
<dt>{@code "userLandscape"}</dt>
<dd>Behaves the same as {@code "sensorLandscape"}, except if the user disables auto-rotate
then it locks in the normal landscape orientation and will not flip.
</dd>

<dt>{@code "userPortrait"}</dt>
<dd>Behaves the same as {@code "sensorPortrait"}, except if the user disables auto-rotate then
it locks in the normal portrait orientation and will not flip.
</dd>

<dt>{@code "fullUser"}</dt>
<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except
if the user disables auto-rotate then it locks in the user's preferred orientation.
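<p>For example, in your manifest file (the activity name is illustrative):</p>
<pre>
<activity android:name=".PlayerActivity"
    android:screenOrientation="fullUser" >
    ...
</activity>
</pre>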
</dd>
</dl>

<p>Additionally, you can now also declare {@code "locked"} to lock your app's orientation into
the screen's current orientation.</p>


<h3 id="RotationAnimation">Rotation animations</h3>

<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
{@link android.view.WindowManager} allows you to select between one of three animations you
want to use when the system switches screen orientations. The three animations are:</p>
<ul>
<li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
<li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_JUMPCUT}</li>
<li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_ROTATE}</li>
</ul>

<p class="note"><strong>Note:</strong> These animations are available only if you've set your
activity to use "fullscreen" mode, which you can enable with themes such as {@link
android.R.style#Theme_Holo_NoActionBar_Fullscreen Theme.Holo.NoActionBar.Fullscreen}.</p>

<p>For example, here's how you can enable the "crossfade" animation:</p>
<pre>
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    WindowManager.LayoutParams params = getWindow().getAttributes();
    params.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
    getWindow().setAttributes(params);
    ...
}
</pre>


<h2 id="UserInput">User Input</h2>

<h3 id="Sensors">New sensor types</h3>
<p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to
detect the device's rotations without worrying about magnetic interference.
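</p>

<p>You register for it the same way as any other sensor; here's a sketch (where
{@code mySensorEventListener} is your {@link android.hardware.SensorEventListener}):</p>
<pre>
SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gameRotationVector =
        sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR);
if (gameRotationVector != null) {
    sensorManager.registerListener(mySensorEventListener, gameRotationVector,
            SensorManager.SENSOR_DELAY_GAME);
}
</pre>

<p>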
Unlike the {@link android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link
android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>

<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without
consideration for bias estimations. That is, the existing {@link
android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron
in the device, respectively, whereas the new "uncalibrated" versions of these sensors instead
provide the raw sensor data and offer the estimated bias values separately. These sensors allow
you to provide your own custom calibration for the sensor data by enhancing the estimated bias
with external data.</p>



<h2 id="NotificationListener">Notification Listener</h2>

<p>Android 4.3 adds a new service class, {@link
android.service.notification.NotificationListenerService}, that allows your app to receive
information about new notifications as they are posted by the system.
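</p>

<p>For example, here's a minimal listener sketch (the class name is illustrative; the service
must also be declared in your manifest with the {@link
android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE} permission and be enabled by
the user in the system settings):</p>
<pre>
public class MyNotificationListener extends NotificationListenerService {
    @Override
    public void onNotificationPosted(StatusBarNotification sbn) {
        Log.i("MyNotificationListener", "Posted: " + sbn.getPackageName());
    }

    @Override
    public void onNotificationRemoved(StatusBarNotification sbn) {
        Log.i("MyNotificationListener", "Removed: " + sbn.getPackageName());
    }
}
</pre>

<p>Both callbacks receive a {@link android.service.notification.StatusBarNotification} that
describes the posted or removed notification.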
</p>

<p>If your app currently uses the accessibility service APIs to access system notifications,
you should update your app to use these APIs instead.</p>


<h2 id="Contacts">Contacts Provider</h2>

<h3 id="Contactables">Query for "contactables"</h3>

<p>The new Contacts Provider query, {@link
android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_URI
Contactables.CONTENT_URI}, provides an efficient way to get one {@link
android.database.Cursor} that contains all email addresses and phone numbers belonging to all
contacts matching the specified query.</p>


<h3 id="ContactsDelta">Query for contacts deltas</h3>

<p>New APIs have been added to Contacts Provider that allow you to efficiently query recent
changes to the contacts data. Previously, your app could be notified when something in the
contacts data changed, but you would not know exactly what changed and would need to retrieve
all contacts, then iterate through them to discover the change.</p>

<p>To track changes to inserts and updates, you can now include the {@link
android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP} parameter
with your selection to query only the contacts that have changed since the last time you
queried the provider.</p>

<p>To track which contacts have been deleted, the new table {@link
android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been
deleted (though each deleted contact is held in this table for only a limited time). Similar to
{@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you
can use the new selection parameter, {@link
android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP}, to check which
contacts have been deleted since the last time you queried the provider.
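</p>

<p>For example, here's a sketch that fetches the contacts deleted since a timestamp your app
saved at its previous query ({@code mLastQueryTime} is illustrative):</p>
<pre>
Cursor cursor = getContentResolver().query(
        ContactsContract.DeletedContacts.CONTENT_URI,
        new String[] {
            ContactsContract.DeletedContacts.CONTACT_ID,
            ContactsContract.DeletedContacts.CONTACT_DELETED_TIMESTAMP
        },
        ContactsContract.DeletedContacts.CONTACT_DELETED_TIMESTAMP + " > ?",
        new String[] { String.valueOf(mLastQueryTime) },
        null);
</pre>

<p>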
The table also contains the constant {@link
android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS}, which specifies the
length of time (in milliseconds) that the log is kept.</p>

<p>Additionally, the Contacts Provider now broadcasts the {@link
android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user
clears the contacts storage through the system settings menu, effectively recreating the
Contacts Provider database. It's intended to signal apps that they need to drop all the contact
information they've stored and reload it with a new query.</p>

<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos
sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples</a> download.</p>


<h2 id="Localization">Localization</h2>

<h3 id="BiDi">Improved support for bi-directional text</h3>

<p>Previous versions of Android support right-to-left (RTL) languages and layout,
but sometimes don't properly handle mixed-direction text. So Android 4.3 adds the {@link
android.text.BidiFormatter} APIs that help you properly format text with opposite-direction
content without garbling any parts of it.</p>

<p>For example, when you want to create a sentence with a string variable, such as "Did you mean
15 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to
{@link java.lang.String#format String.format()}:</p>
<pre>
Resources res = getResources();
String suggestion = String.format(res.getString(R.string.did_you_mean), address);
</pre>

<p>However, if the locale is Hebrew, then the formatted string comes out like this:</p>

<p dir="rtl"> 15 Bay Street, Laurel, CA?</p>

<p>That's wrong because the "15" should be left of "Bay Street."
The solution is to use {@link
android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
unicodeWrap()} method. For example, the code above becomes:</p>
<pre>
Resources res = getResources();
BidiFormatter bidiFormatter = BidiFormatter.getInstance();
String suggestion = String.format(res.getString(R.string.did_you_mean),
        bidiFormatter.unicodeWrap(address));
</pre>

<p>
By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
first-strong directionality estimation heuristic, which can get things wrong if the first
signal for text direction does not represent the appropriate direction for the content as a whole.
If necessary, you can specify a different heuristic by passing one of the {@link
android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>

<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>



<h2 id="A11yService">Accessibility Services</h2>

<h3 id="A11yKeyEvents">Handle key events</h3>

<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
onKeyEvent()} callback method.
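</p>

<p>For example, here's a sketch of a service that consumes volume-up key presses (the service
must also request the key-event capability described in <a href="#A11yFeatures">Declare
accessibility features</a> below):</p>
<pre>
public class MyAccessibilityService extends AccessibilityService {
    @Override
    protected boolean onKeyEvent(KeyEvent event) {
        if (event.getKeyCode() == KeyEvent.KEYCODE_VOLUME_UP) {
            // Handle the key press and return true to consume the event
            return true;
        }
        return super.onKeyEvent(event);
    }
    ...
}
</pre>

<p>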
This allows your accessibility service to handle input for
key-based input devices such as a keyboard and translate those events to special actions that
previously may have been possible only with touch input or the device's directional pad.</p>


<h3 id="A11yText">Select text and copy/paste</h3>

<p>The {@link android.view.accessibility.AccessibilityNodeInfo} class now provides APIs that
allow an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and
paste text in a node.</p>

<p>To specify the selection of text to cut or copy, your accessibility service can use the new
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
with it the selection start and end position with {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}.
Alternatively, you can select text by manipulating the cursor position using the existing
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
(previously only for moving the cursor position), and adding the argument {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN}.</p>

<p>You can then cut or copy with {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT} or
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>


<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
class.</p>


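<p>For example, here's a sketch that selects the first five characters of a node's text and
copies them ({@code node} is an {@link android.view.accessibility.AccessibilityNodeInfo} that
your service has already retrieved):</p>
<pre>
Bundle arguments = new Bundle();
arguments.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_START_INT, 0);
arguments.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_END_INT, 5);
node.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, arguments);
node.performAction(AccessibilityNodeInfo.ACTION_COPY);
</pre>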
<h3 id="A11yFeatures">Declare accessibility features</h3>

<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities
in its metadata file in order to use certain accessibility features. If the capability is not
requested in the metadata file, then the feature will be a no-op. To declare your service's
accessibility capabilities, you must use XML attributes that correspond to the various
"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
class.</p>

<p>For example, if a service does not request the {@link
android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents}
capability, then it will not receive key events.</p>


<h2 id="Testing">Testing and Debugging</h2>

<h3 id="UiAutomation">Automated UI testing</h3>

<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
actions for test automation. By using the platform's {@link
android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.</p>

<p>To get an instance of {@link android.app.UiAutomation}, call {@link
android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}.
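</p>

<p>For example, here's a sketch from within an {@link android.test.InstrumentationTestCase}
(grabbing a screenshot is just one illustrative use):</p>
<pre>
UiAutomation uiAutomation = getInstrumentation().getUiAutomation();
// For example, capture whatever is currently on the screen
Bitmap screenshot = uiAutomation.takeScreenshot();
</pre>

<p>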
In order
for this to work, you must supply the {@code -w} option with the {@code instrument} command
when running your {@link android.test.InstrumentationTestCase} from <a
href="{@docRoot}tools/help/adb.html#am">{@code adb shell}</a>.</p>

<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
period for the operation, and an implementation of the {@link
android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
android.app.UiAutomation.AccessibilityEventFilter} implementation that you'll receive a call
that allows you to filter the events that you're interested in and determine the success or
failure of a given test case.</p>

<p>To observe all the events during a test, create an implementation of {@link
android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.
Your listener interface then receives a call to {@link
android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()}
each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object
that describes the event.</p>

<p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose
at a very low level to encourage the development of UI test tools such as <a
href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>.
For instance,
{@link android.app.UiAutomation} can also:</p>
<ul>
<li>Inject input events</li>
<li>Change the orientation of the screen</li>
<li>Take screenshots</li>
</ul>

<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>


<h3 id="Systrace">Systrace events for apps</h3>

<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection
endSection()},
which allow you to define blocks of code to include with the systrace report. By creating
sections of traceable code in your app, the systrace logs provide you a much more detailed
analysis of where slowdown occurs within your app.</p>

<p>For information about using the Systrace tool, read <a
href="{@docRoot}tools/debugging/systrace.html">Analyzing Display and Performance with
Systrace</a>.</p>


<h2 id="Security">Security</h2>

<h3 id="KeyStore">Android key store for app-private keys</h3>

<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
facility, called Android Key Store, which allows you to generate and save private keys that
may be seen and used by only your app. To load the Android Key Store, pass
{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
KeyStore.getInstance()}.</p>

<p>To manage your app's private credentials in the Android Key Store, generate a new key with
{@link java.security.KeyPairGenerator} with {@link android.security.KeyPairGeneratorSpec}. First
get an instance of {@link java.security.KeyPairGenerator} by calling {@link
java.security.KeyPairGenerator#getInstance getInstance()}.
Then call
{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
{@link android.security.KeyPairGeneratorSpec}, which you can get using
{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
Finally, get your {@link java.security.KeyPair} by calling {@link
java.security.KeyPairGenerator#generateKeyPair generateKeyPair()}.</p>


<h3 id="HardwareKeyChain">Hardware credential storage</h3>

<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
credentials, providing more security by making the keys unavailable for extraction. That is, once
keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for
cryptographic operations but the private key material cannot be exported. Even the OS kernel
cannot access this key material. While not all Android-powered devices support storage on
hardware, you can check at runtime if hardware-backed storage is available by calling
{@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>



<h2 id="Manifest">Manifest Declarations</h2>

<h3 id="ManifestFeatures">Declarable required features</h3>

<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code <uses-feature>}</a>
element so you can ensure that your app is installed only on devices that provide the features
your app needs.</p>

<dl>
<dt>{@link android.content.pm.PackageManager#FEATURE_APP_WIDGETS}</dt>
<dd>Declares that your app provides an app widget and should be installed only on devices that
include a Home screen or similar location where users can embed app widgets.
Example:
<pre>
<uses-feature android:name="android.software.app_widgets" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_HOME_SCREEN}</dt>
<dd>Declares that your app behaves as a Home screen replacement and should be installed only on
devices that support third-party Home screen apps.
Example:
<pre>
<uses-feature android:name="android.software.home_screen" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_INPUT_METHODS}</dt>
<dd>Declares that your app provides a custom input method (a keyboard built with {@link
android.inputmethodservice.InputMethodService}) and should be installed only on devices that
support third-party input methods.
Example:
<pre>
<uses-feature android:name="android.software.input_methods" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_BLUETOOTH_LE}</dt>
<dd>Declares that your app uses Bluetooth Low Energy APIs and should be installed only on devices
that are capable of communicating with other devices via Bluetooth Low Energy.
Example:
<pre>
<uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>
</dd>
</dl>


<h3 id="ManifestPermissions">User permissions</h3>
<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code <uses-permission>}</a>
element to declare the
permissions your app requires in order to access certain APIs.</p>

<dl>
<dt>{@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}</dt>
<dd>Required to use the new {@link android.service.notification.NotificationListenerService}
APIs.
</dd>

<dt>{@link android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE}</dt>
<dd>Required to receive the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}
intent.</dd>
</dl>




<p class="note">For a detailed view of all API changes in Android 4.3, see the
<a href="{@docRoot}sdk/api_diff/18/changes.html">API Differences Report</a>.</p>