
Lines Matching full:link

122 <p>Android {@sdkPlatformVersion} ({@link android.os.Build.VERSION_CODES#JELLY_BEAN_MR2})
177 available, because if you call {@link android.app.Activity#startActivity startActivity()} without
178 verifying whether an app is available to handle the {@link android.content.Intent},
181 <p>When using an implicit intent, you should always verify that an app is available to handle the intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>
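A minimal sketch of that check, assuming it runs inside an {@code Activity}; the {@code ACTION_SEND} intent is only an illustration:

```java
// Build an implicit intent; ACTION_SEND is just an example action.
Intent sendIntent = new Intent(Intent.ACTION_SEND);
sendIntent.setType("text/plain");
sendIntent.putExtra(Intent.EXTRA_TEXT, "Hello!");

// resolveActivity() returns null when no installed app can handle the
// intent, so this guard avoids crashing with ActivityNotFoundException.
if (sendIntent.resolveActivity(getPackageManager()) != null) {
    startActivity(sendIntent);
}
```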
199 If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
229 you must declare the restrictions your app provides by creating a {@link
230 android.content.BroadcastReceiver} that receives the {@link android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system invokes this intent to query
234 <p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
235 your {@link android.content.BroadcastReceiver}, you must create a {@link
236 android.content.RestrictionEntry} for each restriction your app provides. Each {@link
241 <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
243 <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
245 <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
249 <p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
251 {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>
254 restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
256 calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
257 This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
258 you defined with the {@link android.content.RestrictionEntry} objects.</p>
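Reading those values back might look like the following sketch; {@code "allow_purchases"} is a hypothetical key that would match one of your app's {@code TYPE_BOOLEAN} restriction entries:

```java
// Query the restrictions applied to this app for the current user profile.
UserManager um = (UserManager) getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(getPackageName());

// "allow_purchases" is a placeholder key defined by a RestrictionEntry.
boolean allowPurchases = restrictions.getBoolean("allow_purchases", true);
```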
263 broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
264 in the result {@link android.os.Bundle}. This extra must specify an {@link android.content.Intent}
265 indicating the {@link android.app.Activity} class to launch (use the
266 {@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
270 the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
272 {@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>
278 accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
279 If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
306 it's possible to create a new account by calling {@link
307 android.os.UserManager#getUserRestrictions()} and checking the {@link
344 <p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link
357 Bluetooth LE APIs has some differences. Most important is that there's now a {@link
359 such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
361 {@link android.bluetooth.BluetoothAdapter}:</p>
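Acquiring the adapter through the new manager class might look like this sketch, assuming an {@code Activity} context:

```java
// BluetoothManager is the new high-level entry point for Bluetooth LE.
BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
BluetoothAdapter bluetoothAdapter = bluetoothManager.getAdapter();
```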
368 <p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
369 startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
370 of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
371 adapter detects a Bluetooth LE peripheral, your {@link
373 {@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
374 method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
378 <p>If you want to scan for only specific types of peripherals, you can instead call {@link
379 android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link
386 <p>To then connect to a Bluetooth LE peripheral, call {@link
388 {@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
389 {@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link
391 state with the device and other events. It's during the {@link
393 callback that you can begin communicating with the device if the method passes {@link
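The scan-and-connect flow described above can be sketched as follows; the field names ({@code mBluetoothAdapter}, {@code mContext}) are illustrative members of the hosting class:

```java
// Connection-state callback: start service discovery once connected.
private BluetoothGattCallback mGattCallback = new BluetoothGattCallback() {
    @Override
    public void onConnectionStateChange(BluetoothGatt gatt, int status,
            int newState) {
        if (newState == BluetoothProfile.STATE_CONNECTED) {
            gatt.discoverServices(); // begin communicating with the device
        }
    }
};

// Scan callback: stop scanning on first hit, then connect to its GATT server.
private BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    @Override
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        mBluetoothAdapter.stopLeScan(this);
        device.connectGatt(mContext, false, mGattCallback);
    }
};

// Begin discovering Bluetooth LE peripherals.
mBluetoothAdapter.startLeScan(mLeScanCallback);
```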
410 user to enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
411 startActivity()} with the action {@link
417 <p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
426 can declare its capability to handle these messages by creating a {@link android.app.Service}
427 with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>
430 the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
431 describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
435 <p>In order to receive this intent, you must declare the {@link
445 ISO/IEC 23009-1 standard, using existing APIs in {@link android.media.MediaCodec} and {@link
448 and passing the individual streams to {@link android.media.MediaExtractor}.</p>
450 <p>If you want to use DASH with encrypted content, notice that the {@link android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
452 sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
453 {@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
454 This method returns a map of {@link java.util.UUID} objects to bytes, with the
455 {@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
461 <p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
467 <p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
469 responsible for handling the network communication with the servers; the {@link
472 <p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
473 {@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
474 including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
475 android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
478 <p>You must first construct {@link android.media.MediaExtractor} and
479 {@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
480 {@link java.util.UUID}, typically from metadata in the content, and use it to construct an
481 instance of a {@link android.media.MediaDrm} object with its constructor.</p>
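A sketch of that construction; the Widevine UUID, the init data, and the MIME type are placeholders that would normally come from your content's metadata, and checked-exception handling is condensed:

```java
// DRM-scheme UUID normally comes from content metadata; Widevine shown
// purely for illustration.
UUID schemeUuid = new UUID(0xEDEF8BA979D64ACEL, 0xA3C827DCD51D21EDL);
byte[] initData = null;          // placeholder: scheme init data from content
String mimeType = "video/mp4";   // placeholder: content MIME type

if (MediaDrm.isCryptoSchemeSupported(schemeUuid)) {
    try {
        MediaDrm drm = new MediaDrm(schemeUuid);
        byte[] sessionId = drm.openSession();
        // Opaque key-request message; your app delivers it to the
        // license server over its own network stack.
        MediaDrm.KeyRequest request = drm.getKeyRequest(
                sessionId, initData, mimeType,
                MediaDrm.KEY_TYPE_STREAMING, null);
    } catch (Exception e) {
        // Handle UnsupportedSchemeException, NotProvisionedException, etc.
    }
}
```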
486 <p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
488 the media with a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
492 <p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
493 android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
494 Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
497 <p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
498 context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
499 eglCreateWindowSurface()}. Then while rendering the surface, call {@link
500 android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
503 <p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
504 android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
505 to terminate encoding, and call {@link android.view.Surface#release()} on the
506 {@link android.view.Surface}.</p>
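Putting those steps together, a Surface-input encoder might be set up as in this sketch; the resolution and bitrate are arbitrary example values:

```java
// Describe the encoded output; COLOR_FormatSurface marks Surface input.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // must follow configure()
encoder.start();

// ... render frames into inputSurface (e.g. via eglCreateWindowSurface()
//     and eglSwapBuffers()), draining encoded output as you go ...

encoder.signalEndOfInputStream(); // terminate encoding
encoder.stop();
encoder.release();
inputSurface.release();
```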
511 <p>The new {@link android.media.MediaMuxer} class enables multiplexing between one audio stream
512 and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
515 <p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}. Currently,
516 MP4 is the only supported output format and {@link android.media.MediaMuxer} currently supports
519 <p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
520 so you can perform video processing through {@link android.media.MediaCodec} then save the
521 output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
522 android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
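The MediaCodec-to-MP4 path can be sketched like this; {@code outputPath}, {@code encoderOutputFormat}, {@code encodedBuffer}, and {@code bufferInfo} stand in for your codec's actual output and are not defined here:

```java
// Create a muxer targeting an MP4 file (outputPath is a placeholder).
MediaMuxer muxer = new MediaMuxer(outputPath,
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

// Use the MediaFormat the codec reports via INFO_OUTPUT_FORMAT_CHANGED.
int videoTrack = muxer.addTrack(encoderOutputFormat);
muxer.start();

// For each encoded buffer drained from the codec:
muxer.writeSampleData(videoTrack, encodedBuffer, bufferInfo);

muxer.stop();
muxer.release();
```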
528 <p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} was added to
532 media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
535 <p>First, you must enable the {@link
537 {@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()}.</p>
541 <dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
542 <dd>This includes the callback {@link android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition}, which requests the current position
545 <dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
546 <dd>This includes the callback {@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate onPlaybackPositionUpdate()}, which
549 <p>Once you update your playback with the new position, call {@link
555 <p>With these interfaces defined, you can set them for your {@link
556 android.media.RemoteControlClient} by calling {@link android.media.RemoteControlClient#setOnGetPlaybackPositionListener setOnGetPlaybackPositionListener()} and
557 {@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
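Wiring both listeners into an existing client might look like this sketch, where {@code mRemoteControlClient} and {@code mMediaPlayer} are assumed members of your playback service:

```java
// Report the current playback position when the remote control asks for it.
mRemoteControlClient.setOnGetPlaybackPositionListener(
        new RemoteControlClient.OnGetPlaybackPositionListener() {
    @Override
    public long onGetPlaybackPosition() {
        return mMediaPlayer.getCurrentPosition(); // position in milliseconds
    }
});

// Seek when the remote control requests a new position, then confirm it.
mRemoteControlClient.setPlaybackPositionUpdateListener(
        new RemoteControlClient.OnPlaybackPositionUpdateListener() {
    @Override
    public void onPlaybackPositionUpdate(long newPositionMs) {
        mMediaPlayer.seekTo((int) newPositionMs);
        mRemoteControlClient.setPlaybackState(
                RemoteControlClient.PLAYSTATE_PLAYING, newPositionMs, 1.0f);
    }
});
```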
576 <p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
587 <p>And remember to specify the OpenGL ES context by calling {@link android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()}, passing {@code 3} as the version.</p>
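For instance, where {@code MyRenderer} is a hypothetical {@code GLSurfaceView.Renderer} that issues {@code GLES30} calls:

```java
GLSurfaceView surfaceView = new GLSurfaceView(context);
// Request an OpenGL ES 3.0 context; confirm device support before relying on it.
surfaceView.setEGLContextClientVersion(3);
surfaceView.setRenderer(new MyRenderer()); // uses android.opengl.GLES30 methods
```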
596 <p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
597 class&mdash;Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
598 supplied a mipmap source and have enabled {@link android.graphics.Bitmap#setHasMipMap
599 setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
601 setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
611 <p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
612 a {@link android.view.View} on which you can add visual content and which does not affect
613 the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
614 android.view.View} by calling {@link android.view.View#getOverlay}. The overlay
620 <p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
627 <p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
628 can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
629 {@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
630 android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
631 the object returned is a {@link android.view.ViewGroupOverlay}. The
632 {@link android.view.ViewGroupOverlay} class is a subclass
633 of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
634 objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
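A sketch of both cases; {@code myButton}, {@code myRelativeLayout}, and {@code floatingView} are placeholder views from your layout:

```java
// Tint a button by drawing into its overlay; the layout is unaffected.
ColorDrawable tint = new ColorDrawable(0x5500FF00); // translucent green, arbitrary
tint.setBounds(0, 0, myButton.getWidth(), myButton.getHeight());
myButton.getOverlay().add(tint);

// A layout's overlay is a ViewGroupOverlay, which also accepts whole views.
myRelativeLayout.getOverlay().add(floatingView);
```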
736 <p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
740 the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
743 <p>If you create a custom view by subclassing {@link android.view.View}, {@link android.view.ViewGroup}, or any subclasses thereof, your view will inherit these optical bound behaviors.</p>
746 with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
747 {@link android.widget.EditText}, and others. So you can immediately benefit by setting the
749 ({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
762 <p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
763 android.animation.RectEvaluator}. This new class is an implementation of {@link
764 android.animation.TypeEvaluator} that you can pass to {@link
771 when its focus changed, you needed to override the {@link android.view.View} class to
772 implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
773 android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
777 <p>Now, to receive attach and detach events you can instead implement {@link
779 {@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
780 And to receive focus events, you can implement {@link
782 {@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
790 for your app layout. Overscan mode is determined by the {@link android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable with platform themes such as
791 {@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
792 {@link android.R.attr#windowOverscan} style in a custom theme.</p>
824 <p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
825 {@link android.view.WindowManager.LayoutParams} allows you to select one of three animations you
828 <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
829 <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_JUMPCUT}</li>
830 <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_ROTATE}</li>
833 <p class="note"><strong>Note:</strong> These animations are available only if you've set your activity to use "fullscreen" mode, which you can enable with themes such as {@link android.R.style#Theme_Holo_NoActionBar_Fullscreen Theme.Holo.NoActionBar.Fullscreen}.</p>
851 <p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to detect the device's rotations without worrying about magnetic interference. Unlike the {@link android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>
853 <p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
855 consideration for bias estimation. That is, the existing {@link
856 android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
867 <p>Android 4.3 adds a new service class, {@link android.service.notification.NotificationListenerService}, that allows your app to receive information about new notifications as they are posted by the system. </p>
878 <p>The new Contacts Provider query, {@link android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_URI Contactables.CONTENT_URI}, provides an efficient way to get one {@link android.database.Cursor} that contains all email addresses and phone numbers belonging to all contacts matching the specified query.</p>
885 <p>To track changes to inserts and updates, you can now include the {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP} parameter with your selection to query only the contacts that have changed since the last time you queried the provider.</p>
887 <p>To track which contacts have been deleted, the new table {@link android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been deleted (but each contact deleted is held in this table for a limited time). Similar to {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you can use the new selection parameter, {@link android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP} to check which contacts have been deleted since the last time you queried the provider. The table also contains the constant {@link android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS} containing the number of days (in milliseconds) that the log will be kept.</p>
889 <p>Additionally, the Contacts Provider now broadcasts the {@link
904 but sometimes don't properly handle mixed-direction text. So Android 4.3 adds the {@link
910 {@link java.lang.String#format String.format()}:</p>
920 <p>That's wrong because the "15" should be left of "Bay Street." The solution is to use {@link
921 android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
931 By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
934 If necessary, you can specify a different heuristic by passing one of the {@link
935 android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
936 to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>
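A sketch of the wrapped version of that pattern; the sentence and the address value are illustrative:

```java
// Wrap externally supplied text so its direction can't reorder the sentence.
BidiFormatter bidi = BidiFormatter.getInstance();
String address = "15 Bay Street"; // may be RTL script at runtime
String message = String.format("The address is %s.", bidi.unicodeWrap(address));
```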
940 Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>
948 <p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
949 key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
957 <p>The {@link android.view.accessibility.AccessibilityNodeInfo} now provides APIs that allow
958 an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste
962 action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
963 with it the selection start and end position with {@link
964 android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
967 action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
968 (previously only for moving the cursor position), and adding the argument {@link
971 <p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT},
972 {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
973 {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>
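Combining those actions might look like this sketch, where {@code node} is an {@code AccessibilityNodeInfo} your service has already retrieved:

```java
// Select the first five characters of the node's text, then copy them.
Bundle args = new Bundle();
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_START_INT, 0);
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_END_INT, 5);
node.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, args);
node.performAction(AccessibilityNodeInfo.ACTION_COPY);
```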
978 Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
989 "capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
992 <p>For example, if a service does not request the {@link android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents} capability,
1000 <p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
1001 actions for test automation. By using the platform's {@link
1002 android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
1005 <p>To get an instance of {@link android.app.UiAutomation}, call {@link
1008 when running your {@link android.test.InstrumentationTestCase} from <a
1011 <p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
1012 your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
1013 executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
1014 period for the operation, and an implementation of the {@link
1015 android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
1020 <p>To observe all the events during a test, create an implementation of {@link
1021 android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
1023 Your listener interface then receives a call to {@link
1025 each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object
1028 <p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose
1030 {@link android.app.UiAutomation} can also:</p>
1037 <p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
1038 across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>
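A sketch of the event-driven pattern inside an {@code InstrumentationTestCase}; the test name and the injected action are placeholders:

```java
// Waits up to 5 seconds for a click event produced by the Runnable's action.
public void testClickGeneratesEvent() throws TimeoutException {
    UiAutomation uiAutomation = getInstrumentation().getUiAutomation();
    uiAutomation.executeAndWaitForEvent(
            new Runnable() {
                @Override
                public void run() {
                    // Perform the action under test (e.g., inject a tap).
                }
            },
            new UiAutomation.AccessibilityEventFilter() {
                @Override
                public boolean accept(AccessibilityEvent event) {
                    return event.getEventType()
                            == AccessibilityEvent.TYPE_VIEW_CLICKED;
                }
            },
            5000 /* timeout in milliseconds */);
}
```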
1043 <p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
1044 {@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()},
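The pairing is typically written with try/finally so the section always closes; the section label is arbitrary:

```java
// Bracket a unit of work so it appears as a labeled section in Systrace.
Trace.beginSection("decodeThumbnail"); // label is illustrative
try {
    // ... the work being profiled ...
} finally {
    Trace.endSection(); // must run on the same thread as beginSection()
}
```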
1056 <p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
1059 {@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
1063 {@link java.security.KeyPairGenerator} with {@link android.security.KeyPairGeneratorSpec}. First
1064 get an instance of {@link java.security.KeyPairGenerator} by calling {@link
1066 {@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
1067 {@link android.security.KeyPairGeneratorSpec}, which you can get using
1068 {@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
1069 Finally, get your {@link java.security.KeyPair} by calling {@link
1075 <p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
1081 {@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>
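The key-generation sequence described above might be sketched as follows; the alias, subject, and one-year validity are placeholder values, and checked-exception handling is omitted for brevity:

```java
// Generate an RSA key pair inside the Android KeyStore provider.
Calendar start = Calendar.getInstance();
Calendar end = Calendar.getInstance();
end.add(Calendar.YEAR, 1); // arbitrary one-year validity window

KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
        .setAlias("myKeyAlias")                      // placeholder alias
        .setSubject(new X500Principal("CN=myKeyAlias"))
        .setSerialNumber(BigInteger.ONE)
        .setStartDate(start.getTime())
        .setEndDate(end.getTime())
        .build();

KeyPairGenerator generator =
        KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
generator.initialize(spec);
KeyPair keyPair = generator.generateKeyPair();
```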
1095 <dt>{@link android.content.pm.PackageManager#FEATURE_APP_WIDGETS}</dt>
1104 <dt>{@link android.content.pm.PackageManager#FEATURE_HOME_SCREEN}</dt>
1113 <dt>{@link android.content.pm.PackageManager#FEATURE_INPUT_METHODS}</dt>
1114 <dd>Declares that your app provides a custom input method (a keyboard built with {@link
1123 <dt>{@link android.content.pm.PackageManager#FEATURE_BLUETOOTH_LE}</dt>
1141 <dt>{@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}
1143 <dd>Required to use the new {@link android.service.notification.NotificationListenerService} APIs.
1146 <dt>{@link android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE}</dt>
1147 <dd>Required to receive the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}