Articles and samples about developing for the BlackBerry 10 Android Runtime.


    • BlackBerry Integrity Detection: Overview

      Published Feb 09 2018, 2:17 PM by jdreher

      This document discusses the BlackBerry Integrity Detection (BID) engine and how it can be used by third-party Android apps. We will start with some simple questions:


      Question 1: What is BID?
      BlackBerry Integrity Detection is a platform-level service. The goal of BID is to detect compromises that have occurred on the system since the last factory reset.


      BID maintains a database of past compromises and provides an API to allow calling functions to query it.


      Question 2: Who needs this and why?
      BID is for everyone, from BlackBerry DTEK, to second-party development partners, to third-party app developers.


      Use Case: Consider an application that wants to ensure that the system is in the right (secure) state. By right state we mean that the system has not been compromised at any security level: no attack has occurred, and the system is not affected by root exploits, jailbreaks, and so on. This helps enterprise applications, or other apps that contain secure user or business data, stay alert to the system state.


      This is very important for the security and privacy of information and transactions, and it makes the system more secure. And now the BIG question:


      Question 3: How can it be used? What kind of permissions do third-party apps require? Which APIs does the application need to call?
      The BID database is open to every application. Any application can use the content provider URIs and receive reports related to various failures. No special permissions are required from BlackBerry, and no additional SDKs or developer programs are needed to leverage the service.

      To take advantage of BID features, third-party apps only need to implement the BideListener interface, which ensures that the application will handle the events when a new report or certificate is inserted.

      BideHelperAndroid: This class provides the functionality for obtaining status reports and failure reports from the BID content provider. Third-party applications only need to use this helper API.


      Public Methods:
      1) requestStatusReport(BigInteger nonce)

      This call can be used to generate an "on-demand" BID report. This report summarizes whether there are any existing failed BID reports, and contains the nonce that the third party specified. The ability to generate this report on demand demonstrates that the BID stack is alive. Internally this function queries the content provider using the status report URI "content://com.blackberry.bide/status".
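      The nonce exists so the caller can confirm that the returned report is fresh rather than replayed. Below is a minimal sketch of generating one with java.security.SecureRandom; the 128-bit size is our choice for illustration, not a documented BID requirement.

```java
import java.math.BigInteger;
import java.security.SecureRandom;

public class NonceDemo {
    // Generate an unpredictable, non-negative nonce to pass to
    // requestStatusReport() and later to verifyStatusReport().
    static BigInteger newNonce() {
        return new BigInteger(128, new SecureRandom()); // uniform in [0, 2^128)
    }

    public static void main(String[] args) {
        System.out.println(newNonce());
    }
}
```

Using a fresh nonce per request means a replayed report (with a stale nonce) will fail verification.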

      2) verifyStatusReport(BideStatusReport report, BigInteger nonce, boolean certReqd)

      After a request completes successfully, the caller is responsible for verifying the report object before attempting to access its contents. The BID report is signed with a BID key, backed by a certificate that chains up to a BlackBerry CA. Attempting to read the report without verifying it will result in an Exception.


      3) requestAllFailureReports()

      Returns an array of BideFailureReport objects representing the failed BID reports that BID has captured since the device came out of factory state. Each report needs to be verified using the verifyFailureReport API, as explained above. Internally this function queries the content provider using the failure report URI "content://com.blackberry.bide/reports".


      4) requestBideCertificate()

      Returns the BID certificate chain, if it is available, in the form of a byte array. Internally this function queries the content provider using the URI "content://com.blackberry.bide/reports".


      If an application chooses not to use the helper APIs and would rather query the content provider directly, then there are several steps that the application needs to perform.

      a. See the BideConstants class for all of the useful string constants that the application will need. For example, to query all the failure reports:

           Cursor cursor = getContentResolver().query(BideConstants.FAILURE_REPORT_STRING, null, null, null, null);


      b. Then access the rows and columns of the result using standard database methods:


           int kbideReportIndex = cursor.getColumnIndexOrThrow(KBIDE_REPORT);

           String kbideReport = cursor.getString(kbideReportIndex);


      However, when using this approach, keep in mind that the reports are not validated; you are responsible for validating the nonce, the hashes, and the signature before attempting to access the contents of the reports.


      c. To register manually for BID broadcasts, you will need to write your own subclass of BroadcastReceiver like this:

           public class MyBroadcastReceiver extends BroadcastReceiver {

                @Override
                public void onReceive(Context context, Intent intent) {

                     if (intent.getAction().equals(BideConstants.CERTIFICATE_AVAILABLE)) {

                          //...process the event

                     } else if (intent.getAction().equals(BideConstants.REPORT_INSERTED)) {

                          //...process the event

                     }
                }
           }
      d. Add a receiver tag in the application tag of the app's AndroidManifest.xml file:
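      For example, a receiver declaration might look like the following sketch. The action strings shown are illustrative placeholders; use the actual values defined in BideConstants rather than hard-coding them.

```xml
<application>
    <!-- Receiver for BID broadcasts; the action names below are
         placeholders, so take the real strings from BideConstants. -->
    <receiver android:name=".MyBroadcastReceiver">
        <intent-filter>
            <action android:name="com.blackberry.bide.CERTIFICATE_AVAILABLE" />
            <action android:name="com.blackberry.bide.REPORT_INSERTED" />
        </intent-filter>
    </receiver>
</application>
```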





    • BlackBerry Integrity Detection: Usage

      Published Feb 09 2018, 2:17 PM by jdreher



      BlackBerry Integrity Detection (BID) is a service provided by BlackBerry to make the Android app ecosystem more secure. This service, which is exclusive to BlackBerry devices running Android, can be used by any third-party application developer to verify the security status of the device.


      This document explains how, as a third-party developer, you can easily accomplish this task.


      The BID engine provides ways to query the status of the system.


      [Diagram: the BID service and its content provider]


      There are two ways to make these queries:

      1) You can leverage the BidHelper package

      • In this case you just need to import the BidHelper package in your project.
      • You can see the example of this case in the BIDLogin sample.

      2) You can implement the functionality yourself using content providers.

      • The first thing you should do is look at the BidConstants file in the BidHelper package.
      • This file has all the useful string constants that you will need.
      • For example, to query all failure reports:

      Cursor cursor = getContentResolver().query( BidConstants.FAILURE_REPORT_STRING, null, null, null, null);


      • One important thing to keep in mind is that the reports are not validated; you are responsible for validating the nonce, the hashes, and the signature before attempting to access the contents of the reports.
      • You will need to implement a BroadcastReceiver to register manually for BID broadcasts, something like the following:

      private final class BidReceiver extends BroadcastReceiver {

           @Override
           public void onReceive(Context context, Intent intent) {

                if (intent.getAction().equals(CERTIFICATE_AVAILABLE)) {

                     // Do something.

                } else if (intent.getAction().equals(REPORT_INSERTED)) {

                     // Do something.

                }
           }
      }

      • You will need to add a receiver tag in your AndroidManifest.xml file.






    • BlackBerry Deploy fails to install an Android app with result::failure -2

      Published Feb 09 2018, 2:17 PM by jdreher


      When using the BlackBerry Command-line Tools for Android apps SDK, the BlackBerry Deploy and BlackBerry APK Packager commands both fail to load/install a repackaged Android app with an error message consisting of:


      result::failure -2$DeployException: result::failure -2





      When repackaging your Android app with the BlackBerry APK Packager tool, a warning text file (.WRN) is created to provide you with a synopsis of the API compatibility of your app against BlackBerry 10.


      The -2 error is typically the result of the minSdkVersion being set to a higher API level than is currently supported within the BlackBerry Runtime for Android apps on BlackBerry 10.




      At the time of publishing, up to API level 18 (Android 4.3, Jelly Bean MR2) is supported on BlackBerry 10.3.2 OS. 


      Rebuild and resign your APK file in your Android IDE of choice, with a minSdkVersion value that reflects Android 4.3/API 18 or lower. If your Android app supports API 18 or below, the error should no longer occur.
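      In an Eclipse/ADT-era project, the relevant setting is the <uses-sdk> element in AndroidManifest.xml (in a Gradle-based project it lives in build.gradle instead). A minimal sketch follows; the minSdkVersion of 14 is only an example, and anything at or below 18 will do.

```xml
<!-- Keep minSdkVersion (and ideally targetSdkVersion) at API 18 or
     lower for the BlackBerry Runtime on BlackBerry 10.3.2. -->
<uses-sdk
    android:minSdkVersion="14"
    android:targetSdkVersion="18" />
```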



    • How to Correct BlackBerry Priv Emulator Rendering Issues

      Published Feb 09 2018, 2:17 PM by msohm



      The original BlackBerry PRIV skin for the Android Emulator had a problem that caused rendering issues on some systems.  Here is a screenshot that shows an example of what may have been encountered.







      The pixel density specified in the skin was not compatible with all systems.




      Remove the PRIV skin already installed using the following steps.


      1. Delete the PRIV skin previously extracted by deleting the PRIV directory located here:  \platforms\android-xx\skins
      2. Start Android Studio.
      3. Open AVD Manager.
      4. Delete any existing emulators configured to use the PRIV skin by clicking on the down arrow on the right and choosing Delete.  Warning:  This will erase all data associated with that emulator.
      5. Click the "Create Virtual Device..." button.
      6. Right click on the PRIV entry in the list and choose Delete.
      7. Press the Cancel Button.
      8. Close AVD Manager.

      Download and install the updated PRIV skin using the steps in the article: BlackBerry PRIV Emulator


      Once complete, you can verify that the update has been applied by checking the DPI configuration of the emulator configured using the new skin.  It should show "hdpi" as highlighted in the screenshot below.  If it shows 560, repeat the steps above because the previous PRIV skin is still in use.





    • Developer’s Guide to the PRIV Touch Enabled Keyboard

      Published Feb 09 2018, 2:17 PM by jdreher

      This article applies to the following:


      • BlackBerry PRIV


      This article will cover topics related to the touch-enabled keyboard of the BlackBerry PRIV.  The sliding keyboard feature of the PRIV keyboard is covered in another article here: Developer’s Guide to the PRIV Sliding Keyboard


      Capacitive Keyboard Integration


      The new capacitive physical keyboard can be used like a touchpad to scroll web pages, flick predictive text onto the screen, enable the cursor, and perform fine cursor control.  The keyboard is capable of identifying up to 4 unique touch points.  We’ve designed the keyboard to act similarly to a touchpad.  This allows many applications to make use of the keyboard without requiring any code changes.


      Scrolling Using the Physical Touch Keyboard


      Without extra development, application UIs built with standard Android™ view containers such as ScrollView and HorizontalScrollView provide scrolling behavior for motion on the touch keyboard when not performing text input.


      If you have custom scrollable views, to provide consistent scrolling behavior you should ensure that your views handle mouse scroll wheel movement, whether it comes from an external mouse connected to any Android device or from an external touchpad that behaves as a pointer device.


      You should make sure that:


        • Motion on the scrollwheel moves your view in the same direction as a ScrollView for vertical movement, or as a HorizontalScrollView for horizontal movement.
        • You consider the magnitude of the events; if you move your view the same distance for any mouse scroll event without considering the magnitude, your view is likely to move too fast and not smoothly.
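      To make the magnitude point concrete, here is a plain-Java model of accumulation logic you might place behind a custom view's scroll handling. This is a sketch: on Android the input would come from the event's getAxisValue(MotionEvent.AXIS_VSCROLL), and the pixels-per-unit scale is an assumed tuning factor, not a platform constant.

```java
public class ScrollAccumulator {
    private final float pixelsPerUnit; // assumed scale factor per scroll unit
    private float remainder;           // fractional pixels carried between events

    public ScrollAccumulator(float pixelsPerUnit) {
        this.pixelsPerUnit = pixelsPerUnit;
    }

    // Convert one scroll event's vertical axis value into whole pixels to
    // scroll, carrying any fractional remainder to the next event so small
    // movements are not lost and large ones are not clamped to a fixed step.
    public int pixelsFor(float axisVScroll) {
        float total = axisVScroll * pixelsPerUnit + remainder;
        int whole = (int) total;
        remainder = total - whole;
        return whole;
    }
}
```

Scaling by the event magnitude, rather than moving a fixed distance per event, is what keeps the scrolling smooth.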


      To more fully exploit the PRIV keyboard in an application or in an input method editor (IME) for control beyond simple scrolling, it's important to understand the full range of input events that this device can provide.



      Identify the InputDevice


      The touch keypad provides an InputDevice with a source of InputDevice.SOURCE_TOUCHPAD in addition to the keyboard device that provides key events. The resolution (density) is set to match the device display, which is convenient for gesture detection. If you want more low-level information on touch devices, please consult the Android developer guidance for touch devices.


      The code below can be used to find and log the details of the device, since InputDevice.toString() provides formatted details.  A complete working sample application that demonstrates capturing touch events from the physical keyboard on PRIV can be found here:  CKBDemo  



      final int[] deviceIds = InputDevice.getDeviceIds();
      for (final int id : deviceIds) {
          final InputDevice device = InputDevice.getDevice(id);
          if (device != null && ((device.getSources() & InputDevice.SOURCE_TOUCHPAD) ==
                  InputDevice.SOURCE_TOUCHPAD)) {
              android.util.Log.i("DeviceLogger", device.toString());
          }
      }

      The output (with logcat prefixes removed) is as follows:



      Input Device 7: touch_keypad
        Descriptor: 954faadc99bb5a7c1d0537b923e0490c90b47e98
        Generation: 137
        Location: built-in
        Keyboard Type: none
        Has Vibrator: false
        Sources: 0x100008 ( touchpad )
          AXIS_X: source=0x100008 min=0.0 max=1420.0 flat=0.0 fuzz=0.0 resolution=21.0
          AXIS_Y: source=0x100008 min=0.0 max=609.0 flat=0.0 fuzz=0.0 resolution=21.0
          AXIS_PRESSURE: source=0x100008 min=0.0 max=1.0 flat=0.0 fuzz=0.0 resolution=0.0
          AXIS_SIZE: source=0x100008 min=0.0 max=1.0 flat=0.0 fuzz=0.0 resolution=0.0
          AXIS_TOUCH_MAJOR: source=0x100008 min=0.0 max=1546.3961 flat=0.0 fuzz=0.0 resolution=0.0
          AXIS_TOUCH_MINOR: source=0x100008 min=0.0 max=1546.3961 flat=0.0 fuzz=0.0 resolution=0.0
          AXIS_TOOL_MAJOR: source=0x100008 min=0.0 max=1546.3961 flat=0.0 fuzz=0.0 resolution=0.0
          AXIS_TOOL_MINOR: source=0x100008 min=0.0 max=1546.3961 flat=0.0 fuzz=0.0 resolution=0.0


      Since the InputDevice.isExternal() method is hidden, applications can’t determine that the device is internal (built-in) without using reflection or some other indirect method. Note that the output above identifies the device location, but parsing the output of InputDevice.toString() to extract the location would be brittle, relying on formatting that has no guarantee of consistency between platform versions. 
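      For completeness, the reflection pattern alluded to above looks like the sketch below, demonstrated on a stand-in class because android.view.InputDevice is not available off-device. Hidden APIs such as isExternal() can change or vanish between platform versions, so any such call should fail safely.

```java
import java.lang.reflect.Method;

public class ReflectionDemo {
    // Stand-in for a class with a hidden boolean method (on Android this
    // would be android.view.InputDevice and its hidden isExternal()).
    static class FakeInputDevice {
        public boolean isExternal() { return false; } // built-in device
    }

    // Invoke a no-argument boolean method by name, returning the fallback
    // if the method is missing or inaccessible.
    static boolean invokeBooleanOrDefault(Object target, String methodName, boolean fallback) {
        try {
            Method m = target.getClass().getMethod(methodName);
            return (Boolean) m.invoke(target);
        } catch (ReflectiveOperationException e) {
            return fallback; // hidden API absent or changed: fail safe
        }
    }

    public static void main(String[] args) {
        System.out.println(invokeBooleanOrDefault(new FakeInputDevice(), "isExternal", true));
    }
}
```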


      However, it may not be necessary to distinguish the internal device from external touchpad devices; you may wish to provide similar behavior for external touchpads connected to any Android device. In this case, you should consider the density of each MotionRange to provide some consistency of experience for gesture detection.  Devices with the same source type are actually relatively rare; external touchpads frequently provide a source with the class SOURCE_CLASS_POINTER.


      Events originating from the touch keyboard are MotionEvents with a source of SOURCE_TOUCHPAD and a device ID corresponding to the InputDevice described above. PRIV supports up to 4 distinct touches on its physical keyboard, represented by separate pointer IDs. The touch event sequences for the physical keyboard are similar to those of a touch screen device, except that the source of these touches is not directly associated with a display.


      Event Flow


      Beyond the InputDispatcher, a physical keyboard touch event follows this basic flow:


      • If there's an active IME, the IME can handle a keyboard touch event by overriding onGenericMotionEvent(MotionEvent). The default implementation:
        • Returns true for events from the built-in touchpad if the IME is active, causing the event to be consumed with no action. This prevents the events from causing scrolling while typing, which could provide a negative experience.
        • Returns false if the IME is not active.


      If implementing an IME which overrides onGenericMotionEvent(MotionEvent) to handle other events, the parent implementation should be called for any events not handled, in order to preserve this behavior. 


      If the BlackBerry® Keyboard is the active IME, it provides a range of keyboard gestures using the touch keypad.


      Most existing Android applications and views don’t handle events with a source of SOURCE_TOUCHPAD. To allow the events to provide scrolling behavior in the widest possible range of applications, a fallback is provided: motion on the touch keypad, which is not handled by the app, is transformed into a stream of new MotionEvents that are the equivalent of scrolling on a mouse scrollwheel; these synthetic events are then dispatched to the application.


      It's very important to provide consistency in event handling. If you're consuming some touchpad events to recognize gestures, you should typically consume all of them, whether a gesture is recognized or not, while your application is in a clearly defined state. Allowing only a subset of events to be handled by another component or to be converted to mouse scroll events will likely provide inconsistent behavior.


      Mouse Scrolling Events



      The synthetic events injected into the application when touchpad events are not handled are not a direct 1:1 conversion of each touchpad event. Instead, events are injected for small amounts of movement. The events have the following key characteristics:


        • The source is InputDevice.SOURCE_MOUSE, which is a form of a pointer device (see SOURCE_CLASS_POINTER).
        • Since there is no actual on-screen pointer, the "pointer" coordinates are also synthesized. You can retrieve these for a particular MotionEvent using getX() and getY(). The Y value is set to the middle of the display vertically, while the X value is set differently based on display orientation:
          • When the device is held in portrait orientation, the X value matches the X-axis value of the touch down point on the touch keypad at the start of the motion. This takes advantage of the touch keypad matching the display in width and resolution.
          • When the device is held in landscape orientation, the X value is set to the middle of the display horizontally.
        • The magnitude of the motion in the vertical and horizontal directions can be retrieved using getAxisValue(MotionEvent.AXIS_VSCROLL) and getAxisValue(MotionEvent.AXIS_HSCROLL).

      Pointer events are dispatched directly to the view under the pointer coordinates. This is not very significant in many apps, but if the layout contains multiple scrollable views, the events act on the view containing the coordinates synthesized as described above.
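      The coordinate rules above can be restated as a small pure-Java function. This is a model of the described behavior for clarity, not an Android API.

```java
public class SyntheticPointer {
    // Compute the synthesized X coordinate for a mouse-scroll event, per the
    // rules above: portrait uses the keypad touch-down X (keypad and display
    // share width and resolution); landscape uses the display's horizontal middle.
    static float syntheticX(boolean portrait, float touchDownX, float displayWidth) {
        return portrait ? touchDownX : displayWidth / 2f;
    }

    // The Y coordinate is always the vertical middle of the display.
    static float syntheticY(float displayHeight) {
        return displayHeight / 2f;
    }
}
```

These synthesized coordinates determine which scrollable view receives the events when several are on screen.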


      Using the Touchpad Events


      As noted above, you can often get scrolling behavior in your app for free due to the fallback conversion to mouse scroll events. However, you can provide a richer input experience by handling the initial touchpad events directly, which enables:

        • Tap (and double-tap) gestures.
        • Touch-and-hold gestures.
        • Making use of touch positions within the touch keypad.
        • Multi-touch gestures (for example, pinching to zoom or differentiating scrolling using single or multiple fingers.)

      Basic horizontal and vertical fling (swipe) gestures are still possible using the mouse scroll events, and might be used for inertial scrolling or paging, but if you want to differentiate where on the surface the fling occurs you may need more information.  In addition, if you're implementing an IME, the events are only available as touchpad events, since the fallback conversion to mouse scroll events occurs further along the pipeline.


      Using the Android GestureDetector


      Conveniently, the GestureDetector class works with touch events from a touchpad source, just as it does with events from the touch screen. The internal thresholds used by GestureDetector are configured based on the screen density, but since the resolution of the touch keypad is designed to match that of the built-in display, a sequence of events generated by motion with a particular speed and distance on the touchpad should cause the same callbacks to be invoked as the same motion on the touch screen.


      Basic usage of the GestureDetector is demonstrated in the Activity code snippet below. In this instance, a very basic gesture listener pays attention only to fling and scroll events and only logs a few details. Note that the onGenericMotionEvent() method always returns true for touchpad events to ensure that other components don't deal with a subset of them, and defers to the parent implementation for all other event types. In a practical listener:


        • Fling gestures are often filtered with more strict thresholds appropriate to the particular purpose.
        • Listener callbacks that return a boolean often return true only if an action was really triggered. GestureDetector.onTouchEvent() returns false if no callback was triggered for an event or if the callback returned false, and the event may sometimes be passed on to apply further tests, including potentially feeding it to a different gesture detector. Regardless, it's still important to be consistent with the return values of onGenericMotionEvent(). If your component is in a state in which it is potentially recognizing gestures, it should almost certainly return true for all events, whether or not any particular event completed a gesture.


      public class TouchpadDemoActivity extends AppCompatActivity {
          private static final String TAG = "DemoActivity";

          // Other class member variables go here
          private GestureDetector mDetector;
          private DemoGestureListener mListener;

          @Override
          protected void onCreate(Bundle savedInstanceState) {
              super.onCreate(savedInstanceState);
              // other onCreate work goes here
              mListener = new DemoGestureListener();
              mDetector = new GestureDetector(this, mListener);
          }

          // Other parts of Activity implementation go here

          @Override
          public boolean onGenericMotionEvent(MotionEvent event) {
              if ((event.getSource() & InputDevice.SOURCE_TOUCHPAD) == InputDevice.SOURCE_TOUCHPAD) {
                  mDetector.onTouchEvent(event); // return code ignored
                  return true; // Always consume event to provide consistency
              }
              return super.onGenericMotionEvent(event); // Make sure other events are handled properly.
          }

          private static final class DemoGestureListener extends GestureDetector.SimpleOnGestureListener {
              @Override
              public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
                  final float distanceX = e2.getX() - e1.getX();
                  final float distanceY = e2.getY() - e1.getY();
                  Log.i(TAG, "onFling distanceX=" + distanceX + ", distanceY=" + distanceY
                          + ", velocityX=" + velocityX + ", velocityY=" + velocityY);
                  return true;
              }

              @Override
              public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
                  Log.i(TAG, "onScroll distanceX=" + distanceX + ", distanceY=" + distanceY
                          + ", current pointer count=" + e2.getPointerCount());
                  return true;
              }
          }
      }

      GestureDetector provides onScroll() callbacks with more than one pointer ID (more than one finger touching), but the other callbacks won't work with multiple touches. For multi-touch gesture support, you could explore the use of ScaleGestureDetector, which can be used in conjunction with GestureDetector. You may find it useful to explore the Android developer documentation on gesture handling.


      If you already use GestureDetector for gestures on the touch screen, you may find it convenient to use the same listener for the same gestures on the touch keypad. If you do, keep in mind that GestureDetector does not consider the device ID, so if you use the same GestureDetector instance as well as the same listener, you should be careful about mixing touch event sequences from the two different input devices. If your shared listener maintains state between gestures, you should also consider whether that state should be treated as stale when the device ID changes. Otherwise you might encounter odd gesture behavior, since gestures can be recognized inappropriately based on changes in coordinates between events from different surfaces. A couple of approaches to mitigate this include:


        • Use a different GestureDetector
        • Synthesize a MotionEvent with an action of ACTION_CANCEL when the device ID changes between real events, and pass that to the GestureDetector ahead of the event with the new device ID. The code fragment below demonstrates this approach, with the same test applied for events arriving from either onGenericMotionEvent() or onTouchEvent().


      static final MotionEvent CANCEL_EVENT = MotionEvent.obtain(0, 0, MotionEvent.ACTION_CANCEL, 0, 0, 0, 0, 0, 0, 0, 0, 0);

      int mLastTouchDeviceId;

      /**
       * Utility method to wrap passing a touch event to the gesture detector, with
       * a cancel event injected if the device ID has changed from the last event.
       * This should be called consistently for any event source using the same
       * detector.
       * @param event The motion event being handled.
       * @return the return code from the gesture detector.
       */
      private boolean checkTouchEventForGesture(final MotionEvent event) {
          final int id = event.getDeviceId();
          // Note that this does inject a cancel on the very first event
          // from any device, which is harmless.
          if (mLastTouchDeviceId != id) {
              mDetector.onTouchEvent(CANCEL_EVENT);
          }
          mLastTouchDeviceId = id;
          return mDetector.onTouchEvent(event);
      }

      @Override
      public boolean onGenericMotionEvent(MotionEvent event) {
          if ((event.getSource() & InputDevice.SOURCE_TOUCHPAD) == InputDevice.SOURCE_TOUCHPAD) {
              checkTouchEventForGesture(event); // return code ignored
              return true; // Always consume event to provide consistency
          }
          return super.onGenericMotionEvent(event); // Make sure other events are handled properly.
      }

      @Override
      public boolean onTouchEvent(MotionEvent event) {
          checkTouchEventForGesture(event); // return code ignored
          return true;
      }




    • Developer’s Guide to the PRIV Sliding Keyboard

      Published Feb 09 2018, 2:17 PM by jdreher

      This article applies to the following:


      • BlackBerry PRIV


      This article will cover topics related to the sliding keyboard of the BlackBerry PRIV.  The touch-enabled feature of the PRIV keyboard is covered in another article here: Developer’s Guide to the PRIV Touch Enabled Keyboard



      BlackBerry PRIV offers multiple methods to interact with the device. For typing, you have the option to use either the touch screen keyboard or the slide-out, physical keyboard. The physical keyboard is also touch sensitive, allowing you to use it for scrolling and gestures. These features have been designed to integrate seamlessly with existing applications, meaning applications shouldn’t require any modifications to take advantage of these features. However, knowing the state of the physical keyboard can be important for Input Method Editors (IME), such as custom Android™ virtual keyboards. If you do wish to make direct use of these features in your application – such as taking action when the keyboard is slid open or shut – you can make use of existing Android APIs.


      Detecting if the Physical Keyboard is Open or Closed


      The keyboard state can be derived by examining the Configuration of your activity or service (including an input method service), similar to how you would detect an external keyboard.  Configuration.keyboard shows the type of keyboard attached.  Configuration.keyboardHidden indicates whether any keyboard (soft or hard) is available.  Since a soft keyboard is available even when not currently shown, this should always return KEYBOARDHIDDEN_NO on PRIV.


      Configuration.hardKeyboardHidden indicates whether a physical keyboard is available. It is defined to allow three possible values, but on PRIV it has either the value HARDKEYBOARDHIDDEN_NO, indicating that the slider is open or that an external keyboard is connected, or HARDKEYBOARDHIDDEN_YES, indicating that the slider is closed. Note that if an external keyboard is connected to a slider device, this field will not tell you whether the slider is open or closed, because it will have the value HARDKEYBOARDHIDDEN_NO.  The following chart summarizes these values on PRIV.
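      That mapping can be restated as a tiny helper. The constant values below are copied from android.content.res.Configuration (HARDKEYBOARDHIDDEN_NO is 1, HARDKEYBOARDHIDDEN_YES is 2) so that the sketch runs off-device.

```java
public class SliderState {
    // Values copied from android.content.res.Configuration.
    static final int HARDKEYBOARDHIDDEN_NO = 1;   // slider open, or external keyboard attached
    static final int HARDKEYBOARDHIDDEN_YES = 2;  // slider closed

    // On PRIV only NO and YES are reported; treat anything else as unavailable.
    static boolean isPhysicalKeyboardAvailable(int hardKeyboardHidden) {
        return hardKeyboardHidden == HARDKEYBOARDHIDDEN_NO;
    }
}
```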





                                                       Physical Keyboard Open    Physical Keyboard Closed

      Values for Configuration.keyboard                KEYBOARD_QWERTY           KEYBOARD_QWERTY

      Values for Configuration.keyboardHidden          KEYBOARDHIDDEN_NO         KEYBOARDHIDDEN_NO

      Values for Configuration.hardKeyboardHidden      HARDKEYBOARDHIDDEN_NO     HARDKEYBOARDHIDDEN_YES

      Below is some sample code that reads these configuration values.  A complete working sample application that reports the keyboard status can be found here:  SliderDemo



      Configuration conf = getResources().getConfiguration();
      String keyboardValue;
      switch (conf.keyboard) {
          case Configuration.KEYBOARD_NOKEYS: keyboardValue = "KEYBOARD_NOKEYS"; break;
          case Configuration.KEYBOARD_12KEY: keyboardValue = "KEYBOARD_12KEY"; break;
          case Configuration.KEYBOARD_QWERTY: keyboardValue = "KEYBOARD_QWERTY"; break;
          default: keyboardValue = "Unknown";
      }
      String hardKeyboardHiddenValue = (conf.hardKeyboardHidden == Configuration.HARDKEYBOARDHIDDEN_NO)
              ? "HARDKEYBOARDHIDDEN_NO" : "HARDKEYBOARDHIDDEN_YES";


      Physical Keyboard vs External Keyboard


      You may have noticed that the configuration values used here are the same that are used to detect when an external Bluetooth® or USB keyboard is attached to an Android device.  There are some similarities and differences to how the device behaves when the keyboard is slid open or closed compared to when an external Bluetooth or USB keyboard is attached. 


      What’s the Same


      When an external Bluetooth or USB keyboard is connected to PRIV, the configuration values are the same as the first column in the chart above, regardless of the position of the sliding keyboard.  This allows support for external keyboards no matter the position of the sliding keyboard.


      What’s Different


      Usually, when an external Bluetooth or USB keyboard is connected to an Android device, the current activity is restarted (or, if "keyboardHidden|keyboard" is added to the android:configChanges attribute, only the configuration is updated).  Since connecting an external keyboard is infrequent, users aren't much impacted.  However, users may frequently open and close the physical keyboard.  If the activity were restarted each time the keyboard is opened or closed, it would have a negative impact on both system resources and the user experience.  Imagine a user losing text they've entered, or their scroll position, every time they opened or closed the keyboard.  For this reason, we don't restart the activity in this circumstance and instead only update the configuration, which triggers the onConfigurationChanged() method.  Note that you must add "keyboardHidden|keyboard" to the android:configChanges attribute in your application's manifest file in order for onConfigurationChanged() to be fired for these keyboard events.  If "keyboardHidden|keyboard" is not added, neither onRestart() nor onConfigurationChanged() is fired for these keyboard events.  The Configuration retrieved through getResources().getConfiguration() will be correct regardless of whether "keyboardHidden|keyboard" is used.  This is summarized in the table below.




      Priv Keyboard Slid Open or Closed

        • With keyboardHidden|keyboard: onConfigurationChanged fired: Yes; onRestart fired: No
        • Without keyboardHidden|keyboard: onConfigurationChanged fired: No; onRestart fired: No

      External Keyboard Connected or Disconnected

        • With keyboardHidden|keyboard: onConfigurationChanged fired: Yes; onRestart fired: No
        • Without keyboardHidden|keyboard: onConfigurationChanged fired: No; onRestart fired: Yes

      Building a Physical Keyboard-Friendly Application


      If your app uses text input, you should aim for the best possible text input experience for users, whether they're using a physical keyboard or a soft, touch screen keyboard. Part of providing a positive user experience includes being consistent with user expectations of text input using the same input method. A few key principles will help you (and the IME developer) provide a consistent experience.


      Help the IME Know When Text is Actually Being Edited


      Most fields derived from TextView/EditText implement the recommendations below.  But if you're implementing a custom text field that isn't a descendant of TextView, you should ensure you follow these guidelines:


        • Wherever possible, always request soft input when a text field is in focus, whether a physical keyboard is present or not. Let the IME decide whether it's appropriate to show any input view.
        • Don't hide soft input for a text field if you expect text input to continue.
        • Don't start an input connection when you don't have a text field in focus. Finish the connection when a text field loses focus.
        • Make sure that your view's implementation of onCheckIsTextEditor() returns true where appropriate. This helps the input method manager service to decide whether soft input should be shown for views that haven't specified a soft input mode.



      Be Careful About Consuming Key Events at the Pre-IME Stage


      This is most important when a text field is in focus with the IME active. If some events are consumed early and the IME sees only a subset of them, it may lead to inconsistent behaviour from the IME.  If you handle a KeyEvent before the IME, but don't indicate that it's consumed, you could end up with two actions for the same KeyEvent.


      Remember also that you should usually aim to consume all KeyEvents in a sequence for a particular key, from initial key press to the subsequent key release, or none of them. The IME should also consume all or none of the events in a sequence, but if it doesn't receive all of them it may still take action on the ones it does receive.
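      As an illustrative sketch of the all-or-none rule (this is a hypothetical helper, not an Android API), you can remember which key codes you consumed on the down event and consume the matching up event if and only if you consumed the down:

      ```java
      import java.util.HashSet;
      import java.util.Set;

      // Hypothetical helper: ensures that if we consume a key's DOWN event,
      // we also consume its matching UP event, and vice versa.
      final class KeySequenceConsumer {
          private final Set<Integer> consumedDownKeys = new HashSet<>();

          // Called from the view's onKeyDown(); 'wantToConsume' is the
          // handler's own decision about this key.
          boolean onKeyDown(int keyCode, boolean wantToConsume) {
              if (wantToConsume) {
                  consumedDownKeys.add(keyCode);
                  return true;
              }
              return false;
          }

          // Called from the view's onKeyUp(); consumes the UP if and only if
          // the matching DOWN was consumed.
          boolean onKeyUp(int keyCode) {
              return consumedDownKeys.remove(keyCode);
          }
      }
      ```

      The same bookkeeping applies to repeated key events between the initial press and the release.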


      Consider Initial Focus


      With an all-touch device, users are accustomed to tapping in a text field to show the soft input method. This will typically move focus if it's not already in the field, as well as triggering a request to show soft input.


      With a physical keyboard, the user may expect to start typing right away. If the initial focus is not in the right text field, typing may have no effect. If the focus is in the text field, but soft input has not been requested, the IME may not be able to provide normal behaviour for physical key input, since it doesn't have sufficient information to determine that text editing is really happening.


      Consider Movement Keys and Focus Changes


      Many built-in reduced mobile keyboards don't have arrow keys, but some do, and most external keyboards do. As newer Android tablets enter the market with more complete keyboard accessories intended for everyday use, full keyboards are going to be used more often.


      Down, up, left and right arrow keys map to the Android DPAD keycodes, specifically:

        • KEYCODE_DPAD_UP
        • KEYCODE_DPAD_DOWN
        • KEYCODE_DPAD_LEFT
        • KEYCODE_DPAD_RIGHT

      For most standard text views, key events with these key codes move the cursor one position in the corresponding direction. If the cursor is in the top line but not at the start of the field, a further UP key typically moves the cursor to the start of the field; similarly, a DOWN key from a position in the last line that's not at the end of the field causes the cursor to jump to the end. If the events have a shift metastate (usually due to holding the shift key), they typically cause the selection to be extended or reduced.


      If the cursor is already at the start of the field, moving UP (without shift) typically moves focus to the previous focusable field, while moving DOWN from the last position in the field typically moves focus to the next field. Horizontal movement past the start or end of the field can be a little less predictable, and whether it causes a change of focus can depend on the length of the contents of the current and other fields.


      In addition to movement key events coming from physical keyboards, some IMEs can generate movement key events, potentially using virtual arrow keys on a soft input view, or converting other input events to movement keys. The keyboard on PRIV can provide cursor movement based on finger movement on the touch-enabled keyboard, for example.


      Movement keys originating from the IME can provide some additional complications. These keys are often injected as KeyEvents on the input connection, and thus can only be sent on a valid input connection. If you allow movement out of a text field to an adjacent non-text field, you may allow the user to navigate out of the field using the IME, but they may have to tap back in the field to move focus back and the IME is also likely to be hidden as a consequence of the loss of focus in the text field. Note that some IMEs may provide horizontal movement using the InputConnection.setSelection() method, which doesn’t move between fields. However, it is very difficult for an IME to provide equivalent behavior for vertical movement with no knowledge of line wrapping.


      As an application developer, you should consider which focus transitions make sense. Views that shouldn't ever take focus may need to be marked as such, which can be done using the android:focusable attribute in a layout XML.
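      For example, a purely decorative view can be excluded from focus traversal in the layout XML (the view ID here is illustrative):

      ```xml
      <!-- Illustrative layout fragment: this view never takes focus. -->
      <ImageView
          android:id="@+id/decorative_image"
          android:layout_width="wrap_content"
          android:layout_height="wrap_content"
          android:focusable="false" />
      ```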


      If you have a text view in your layout that should never lose focus as a result of movement keys, you can ensure this behaviour by applying an appropriate movement method to the view. In the following code fragment, the ArrowKeyMovementMethod is extended to limit movement to the down direction only by consuming movement events in all other directions, whether or not they change the cursor position in the field. A similar override for the down() method would ensure that no movement out of the field is possible with arrow keys. Within the fragment XML, the EditText is defined with only basic attributes.



      public static class BasicFragment extends Fragment {
          @Override
          public View onCreateView(LayoutInflater inflater, ViewGroup container,
                  Bundle savedInstanceState) {
              View rootView = inflater.inflate(R.layout.fragment_my_basic_layout,
                      container, false);
              // The view ID below is a placeholder; use the ID assigned to the
              // EditText in your layout XML.
              final TextView myTextBox = (TextView) rootView.findViewById(R.id.my_text_box);
              myTextBox.setMovementMethod(new FocusMoveDownOnlyArrowKeyMethod());
              return rootView;
          }
      }

      private static class FocusMoveDownOnlyArrowKeyMethod extends ArrowKeyMovementMethod {
          @Override
          protected boolean left(TextView widget, Spannable buffer) {
              super.left(widget, buffer); // ignore return code
              // The parent implementation returns false if the movement could not
              // move the cursor due to being at the start or end of the field.
              // Returning true here means that the movement will go no further,
              // and will not cause a change of focus.
              return true;
          }

          @Override
          protected boolean right(TextView widget, Spannable buffer) {
              super.right(widget, buffer); // ignore return code
              return true;
          }

          @Override
          protected boolean up(TextView widget, Spannable buffer) {
              super.up(widget, buffer); // ignore return code
              return true;
          }
      }


      Handle the Enter Key as an Editor Action


      If your text field defines an editor action for the EditorInfo applied to its InputConnection, you should almost always handle an enter key as if the IME had invoked the action. In most cases where an editor action is defined, the user is likely to expect the action to take effect when pressing an enter key.


      This behaviour will be inherited by views derived from TextView, but if you've got a custom view that is not a descendant of TextView, or if you've overridden onKeyDown() without deferring to the parent implementation for this key, you may need to implement this behaviour yourself.  A good implementation example to follow is the onKeyDown() method of TextView itself, which you can view in the TextView source.
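      The decision logic can be sketched in a self-contained form. The constant values below are copied from android.view.inputmethod.EditorInfo so the snippet stands alone; the helper class itself is hypothetical:

      ```java
      // Hypothetical helper showing how an Enter press maps to an editor action.
      final class EditorActionHelper {
          // Constant values mirror android.view.inputmethod.EditorInfo.
          static final int IME_MASK_ACTION = 0x000000ff;
          static final int IME_FLAG_NO_ENTER_ACTION = 0x40000000;
          static final int IME_ACTION_NONE = 0x00000001;
          static final int IME_ACTION_SEARCH = 0x00000003;

          // Returns the action an Enter key press should trigger, or
          // IME_ACTION_NONE if the field opted out via IME_FLAG_NO_ENTER_ACTION.
          static int actionForEnter(int imeOptions) {
              if ((imeOptions & IME_FLAG_NO_ENTER_ACTION) != 0) {
                  return IME_ACTION_NONE;
              }
              return imeOptions & IME_MASK_ACTION;
          }
      }
      ```

      In a real view you would call this from onKeyDown() for KeyEvent.KEYCODE_ENTER and invoke onEditorAction() with the result.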


      Many IMEs do not consume physical key events, and some may not consume enter key events even if they consume other key events.



      Important Information for IME Developers


      Our discussion so far has covered what a typical Android application developer would need to know about the PRIV keyboard.  The next section focuses on developing your own IME or virtual keyboard.


      Deciding Which Keyboard to Use


      Android also allows users to specify that they always wish to show the on-screen keyboard, even if a physical keyboard is available.  This toggle is at the top of the input method chooser dialog. This setting changes the Configuration presented to activities and services to match the appearance of no physical keyboard being available: Configuration.keyboard is set to KEYBOARD_NOKEYS, and Configuration.hardKeyboardHidden is set to HARDKEYBOARDHIDDEN_YES, regardless of whether the slider is open or closed, or whether an external keyboard is attached. In this mode, key presses on a physical keyboard still follow the same input pipeline. In the image below, the toggle is turned on.





      Typically an application developer should not use the presence of a physical keyboard to control whether or not soft input is requested for text input. The active Input Method Editor (IME) may provide enhanced handling of physical key input, but may rely on having soft input requested to determine if it's appropriate to consume key events. To provide the most consistent input experience, it is better to always request soft input for text editing and let the IME determine whether or not it is appropriate to provide any on-screen view based on the current configuration.



      Distinguishing the Built-in Keyboard


      The interaction with a built-in keyboard on a mobile device is different in significant ways from that with a typical external keyboard.



      • The built-in keyboard almost always has fewer and smaller keys than a typical external keyboard. This may require some on-screen affordances to provide access to characters that could be entered with a full external keyboard.
      • The built-in keyboard is usually very close to the display, and thus it may be practical to combine interaction with UI elements on the touch screen with keyboard input. In contrast, an external keyboard may be far enough away to make interaction with the touch screen awkward.


      Given these differences, it may be desirable for an IME to handle keys from a built-in keyboard differently from those from an external keyboard.


      The InputDevice API does not provide a method to distinguish a built-in from an external keyboard in general. However, as documented in the Keyboard Configuration section of the keyboard devices document, a built-in keyboard is always assigned a device ID of 0, in order to maintain compatibility with the deprecated KeyCharacterMap.BUILT_IN_KEYBOARD field. Although the intent of that deprecation appears to be to discourage app developers from making assumptions about the available input devices, the clear device ID is still very useful for IMEs which need to make a distinction between built-in and external keyboards.  For any particular KeyEvent, you can determine the ID of the source device using KeyEvent.getDeviceId().


      An application developer may not need to distinguish between internal and external devices, but it may still be a consideration if you're trying to combine on-screen controls and key presses. It may also be important to consider orientation for the built-in keyboard differently than for an external one.  The keyboard on a portrait slider is difficult to use for text input with the device in a landscape orientation.


      Distinguishing Multiple Devices


      It may be that you want to determine which or how many keyboards are present in advance of receiving any key events. For example, as an IME, you may wish to provide no UI if an external keyboard is available, regardless of the presence of a built-in keyboard.  You can use the InputDevice API to get information about currently present devices. The following code block shows an example in which alphabetic keyboards are identified.



      final int[] deviceIds = InputDevice.getDeviceIds();
      for (final int id : deviceIds) {
          final InputDevice device = InputDevice.getDevice(id);
          if (device != null
                  && device.getKeyboardType() == InputDevice.KEYBOARD_TYPE_ALPHABETIC) {
              if (id == 0) {
                  // Do something with the built-in alphabetic keyboard.
              } else {
                  // Do something with an external keyboard.
              }
          }
      }

      Note that the InputDevice API doesn't tell you whether a built-in keyboard is hidden or not. When only one keyboard device is present, you can correlate with the information provided in the Configuration, but if there is an internal keyboard as well as an external one, you won't be able to tell the slider position from the Configuration.



      Android Key Event Flow


      The flow of key events is described in the Android developer documentation for keyboard devices; in particular, refer to the Keyboard Operation section. Typically, as an Input Method Editor (IME) or app developer you won't need to understand the details as far as the InputDispatcher, but you should be aware of the subsequent stages through which a key event passes.  In particular, note that for the application with the focused window, a key event can be consumed before the IME has an opportunity to see it, through views overriding View.dispatchKeyEventPreIme(KeyEvent) or View.onKeyPreIme(int, KeyEvent). This might be done to prevent the IME from hiding on a back key press if the app needs to do something different, although any key event could be handled in this way.


      After the pre-IME dispatch, if the event was not consumed, the active IME has an opportunity to handle the events. Many IMEs don’t handle physical key events at all, leaving them for the app to process, although the default implementations of InputMethodService.onKeyDown(int, KeyEvent) and InputMethodService.onKeyUp(int, KeyEvent) together handle back key events in order to dismiss soft input if showing.


      If the IME has not handled a key event, it propagates again through the view hierarchy to the focused view where it may be handled. Typically for text views, this is done through a key listener converting key events into edits on the associated editor; in many cases this will be an instance of QwertyKeyListener for standard text views.


      Why might an IME consume key events?


      If physical key events can be handled by an application's views, and provide text input in standard text views without the app developer having to do anything extra, you might wonder why an IME developer should ever bother intercepting them. Simply put, there are opportunities to provide a better typing experience, particularly for a small built-in keyboard.


      Auto-Correction, Auto-Completion and Suggestions


      You may want to provide auto-correction and auto-completion when the user types a space or punctuation key. This may be less important for a full-sized PC keyboard than a small keyboard on a mobile device, but on a small keyboard where accuracy may be reduced such functionality can significantly enhance typing efficiency.


      In addition, you may wish to provide suggestions for the next word or alternatives to auto-correction, and possibly have keyboard actions select from those suggestions.


      The IME could react to changes in the text without handling the keys, using InputConnection APIs to track changes. Without processing the keys, however, it can be tricky to determine that a change in the text came directly from typing on the keyboard, and thus to be sure that it's appropriate to apply auto-correction.


      Enhanced Handling of Key Long Press and Meta Keys


      The standard Android behaviour for client views derived from TextView (or which at least use QwertyKeyListener) brings up a symbol picker dialog for a long press on keys with associated accents for keyboards of the type typical on a mobile device. This results in a few issues:


        • Accent lists are hard coded into the framework and may not be complete for the current input language.
        • Using an on-screen dialog in the middle of the screen for some characters can be very disruptive to typing flow, and providing a way to access accents without the user having to move their thumbs far off the keyboard can be more efficient.
        • Not all input clients provide the behaviour provided by QwertyKeyListener. WebViews, for example, typically provide repeated characters for key long-press actions. It may cause frustration for users if they can't rely on the keyboard behaving the same way in different text fields.


      Other behaviour on long-press actions may also be desirable, for example providing an easy way to capitalize a single letter.

      The default key listeners provide some handling of meta keys (i.e., shift and alt), including:


        • A "sticky" mode for which a single press and release of a meta key followed by a single press and release of character key will act as if the meta key had been held for the second key press.
        • A locked mode, where two presses and releases of a meta key cause subsequent character key presses to act as if the meta key was held, until the same meta key is pressed again.
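      The sticky and locked behaviour described above can be modelled as a small state machine. This is an illustrative sketch (a hypothetical class, not the framework's MetaKeyKeyListener):

      ```java
      // Illustrative model of sticky/locked meta key (e.g. shift) behaviour.
      final class StickyMetaKey {
          enum State { OFF, STICKY, LOCKED }

          private State state = State.OFF;

          // A single press-and-release of the meta key cycles
          // OFF -> STICKY -> LOCKED -> OFF.
          void onMetaTap() {
              switch (state) {
                  case OFF:    state = State.STICKY; break;
                  case STICKY: state = State.LOCKED; break;
                  case LOCKED: state = State.OFF;    break;
              }
          }

          // Called when a character key is pressed; returns whether the meta
          // state applies to it. Sticky state is consumed by one character key;
          // locked state persists until the meta key is tapped again.
          boolean appliesToNextCharacter() {
              boolean applies = (state != State.OFF);
              if (state == State.STICKY) {
                  state = State.OFF;
              }
              return applies;
          }
      }
      ```

      An IME handling raw key events could use a tracker like this to decide how to commit each character, and to drive an on-screen indicator of the current meta state.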


      As with the long-press handling, not all clients provide this processing of metastate, so the behavior can seem inconsistent. There's also no standard mechanism to provide any visual clues of locked or sticky state, so the user may not know what effect is applied to the next key stroke.


      If the IME handles raw key events, it is able to overcome all of the above issues and provide a consistent experience across different input clients. It can also provide special functions on key combinations or on specific special keys, such as showing an on-screen symbol keyboard when the sym key is pressed.


      The Challenge of Consistency


      Although an IME can provide an enhanced user experience for text input by consuming raw key events and using the InputConnection APIs to commit text, it should not consume key events when the user is not entering text. Doing so could break applications that expect to handle raw key events directly, such as for shortcuts or game controls. Showing any input or candidates view outside of text input could also be disruptive.


      So how can an IME know that there’s really an active input connection? In general, if the client has requested that soft input be shown, the IME should be able to assume that text input is occurring. It is more problematic if the app does not request soft input at all, or if it is hidden while a text field has focus.


      You’ll typically find that your IME's onStartInput() method is called every time the user switches to a new app, that getCurrentInputStarted() always returns true, and that getCurrentInputConnection() always returns a non-null InputConnection. These are thus insufficient tests for an active connection.


      For most apps, the IME can actually work out indirectly that an input connection was started by the client, using a technique like the following.  This relies on behavior that’s not documented in Android APIs, but it seems to work most of the time.



      final InputBinding binding = getCurrentInputBinding();
      final InputConnection connection = getCurrentInputConnection();
      final boolean isThisReallyAnActiveConnection  = connection != null
              && binding != null
              && binding.getConnection() != connection;


      Unfortunately, this isn’t always reliable. A small number of applications, notably some of the most popular Android browsers, will always have an active input connection, regardless of whether a text field is in focus. A new connection may be started every time the user moves focus to a text field, but that connection may not be finished when focus moves out of the text field.


      Solving this problem in the IME alone is very tricky, but you may be able to use some history. For example, if the input view was ever shown for the current input connection and subsequently hidden (particularly if you know it was hidden with the back key), it may be appropriate for the IME to show itself on a subsequent key press.

    • BlackBerry PRIV Emulator

      Published Feb 09 2018, 2:17 PM by jdreher

      This article applies to the following:


      • BlackBerry PRIV
      • Android Emulator





      You can add a skin to the Android™ Virtual Device (AVD) Manager in the Android Studio SDK so that you can emulate the look and feel of a BlackBerry PRIV. This allows you to simulate the hardware keys on the PRIV, including the power, volume up, and volume down keys, as well as the physical keyboard.


      Before you begin, make sure you have installed the most recent version of Android Studio and the Android SDK that matches the version of Android on PRIV (as of this writing, Android 5.1.1, API 22). Once you have that installed, download the PRIV skin files available via the download link at the bottom of this knowledge base article.


      Once you have downloaded the PRIV skin files, use the following steps to create an Android Virtual Device.


      Place the skin for PRIV in the SDK


      1. Download the archive, and extract the PRIV folder to \platforms\android-xx\skins, where xx is the platform version.

      Set up the emulator in Android Studio - AVD Manager


      1. On the Android Studio toolbar, click AVD Manager.
      2. Click + Create Virtual Device > Import Hardware Profiles.
      3. Import \platforms\android-xx\skins\PRIV\device.xml, where xx is the platform version.
      4. Confirm that the new device "PRIV" appears in the devices list. If it does not appear, press the Refresh button to refresh the list.  It might be necessary to close Select Hardware (cancel), and click Create Virtual Device again in order to refresh the list.
      5. Select PRIV and then click Next.



      6. Select x86_64 (assuming you are using a 64-bit operating system) and then click Next.



      7. Click Show Advanced Settings to show all settings. In the Android Virtual Device (AVD) window, configure the following emulator settings to your preferred values:
          • Android Virtual Device (AVD) Name
          • Startup scale and orientation
          • Camera orientation
          • Network characteristics


      8. Ensure Show Advanced Settings is selected, and set the following emulator preferences:
        • Memory and storage limits (we suggest increasing RAM to 3 GB and VM heap to 128 MB)
        • For skin, select PRIV




      9. Click Finish.
      10. Launch the emulator.




    • Reporting a false-positive result by BlackBerry Guardian on device app checks

      Published Feb 09 2018, 2:17 PM by

      What is Guardian on Device?


      BlackBerry Guardian is a program that combines automated and manual analysis with Trend Micro’s Mobile App Reputation Service to comprehensively vet apps in our storefront. Starting this fall with BlackBerry Passport devices, BlackBerry Guardian will perform automated checks of all Android apps that customers install on their BlackBerry smartphones when it is enabled by the user. These checks apply to apps installed from any source, including Amazon Appstore. If a suspicious app is detected, the user will have the choice to proceed or cancel the installation.


      BlackBerry Guardian focuses on flagging both potential malware and potential risks to users’ privacy.


      How does it work?


      BlackBerry Guardian, when enabled on the device, works by comparing the app to be installed to a list of known issues. This happens when the user attempts to install the app. If it finds a match, it notifies the user that there is a concern, but gives the user a choice whether or not to install the app. Device users can also choose whether or not to enable BlackBerry Guardian by changing the setting located at Settings > App Manager > Installing Apps > Inspect Apps Before Installing.


      However, no system is perfect, and if there are errors in identification or technical errors, an app could be flagged when it shouldn’t be. If your users inform you that your app was flagged, BlackBerry has developed a process to whitelist your app, after a more thorough analysis, for the BlackBerry Guardian on device program.


      What should vendors do if their app is flagged?


      App vendors with an Android app that has been flagged can submit their app to our manual vetting program. Simply locate the form, provide the requested details, and submit it.

      What can vendors expect next?


      Once the form is submitted, BlackBerry will attempt to acquire and manually review the app. We will try to reply to all inquiries within a week. If the review reveals that the initial automated assessment was a false positive, the flag will be removed from the app and users who download it will no longer see a warning at installation.




      If you have any questions about the program, please sign in to leave a comment below.

    • Retaining the Original Package Name for Upgrades to AIR Applications using AIR Captive Runtime

      Published Feb 09 2018, 2:17 PM by msohm



      This article applies to the following:

      • Adobe® AIR® Applications created for BlackBerry® 10 
      • Adobe AIR Applications using the AIR Captive Runtime for Android™ 


      Updates to existing BlackBerry 10 applications must use the same signing key and the same package name. Otherwise, the update can be rejected by BlackBerry World, or the application will appear as a second icon on the user's device and the updated application will not have access to the previous application's saved data. Details on that scenario can be found in the article:  Error "File bundle (your_name) has been rejected." When Uploading BAR File to BlackBerry® World™


      By default, when you export an AIR application from Adobe® Flash Builder® that uses the AIR Captive Runtime, "air." is prefixed to the application's package name.  This does not occur when you export BlackBerry applications.  This means that if your BlackBerry application specified a package name of com.mypackage.myapp, the exported BAR file would use com.mypackage.myapp as the package name, but the exported APK would use air.com.mypackage.myapp.  This default configuration will break upgrades for users of existing applications if you were to repackage that APK file as a BAR file and submit it to BlackBerry World.


      Removing the air. Prefix From Your Package Name


      There are two sets of instructions below: one for using Adobe AIR SDK 3.8 or higher in Flash Builder, and another for using an Adobe AIR SDK lower than version 3.8.  If you wish to update the Adobe AIR SDK in Flash Builder, refer to this Adobe help article: Flash Builder Help - Update the AIR SDK for ActionScript Projects | Flash Builder 4.7.  Note that this is not required; you can retain your existing Flash Builder development environment.



      Using Flash Builder with Adobe AIR SDK 3.8 or higher


      1. Create an environment variable called AIR_NOANDROIDFLAIR and set its value to true.  This is case sensitive: the variable name must be all upper case and true must be lower case.
      2. Restart Flash Builder.
      3. Export a Release Build using the AIR Captive Runtime for Android.
        1. Right click on your Project in Flash Builder and click Export.
        2. Select Release Build, then click Next.
        3. Select only Google Android as the Target Platform, then click Next.
        4. Select "Export Application with Captive Runtime", then click Next.
      4. Package the exported APK file as a BAR file.
      5. Open the BAR file using an archive utility such as WinZip and verify the Package-name in the \META-INF\ file does not contain the air. prefix.  If it does, your installation of Flash Builder is using an AIR SDK lower than 3.8.  You will need to perform the steps below.
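      For step 1 above, the environment variable can be set from a macOS/Linux shell before launching Flash Builder as shown below (on Windows, use setx AIR_NOANDROIDFLAIR true and restart):

      ```shell
      # Set the flag so ADT does not prefix "air." to the Android package name.
      # The value is case sensitive: it must be exactly "true".
      export AIR_NOANDROIDFLAIR=true

      # Verify the value.
      echo "$AIR_NOANDROIDFLAIR"
      ```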


      Using Flash Builder with Adobe AIR SDK lower Than 3.8


      The steps below can be used by developers who wish to continue using a version of Adobe AIR SDK lower than 3.8 in Flash Builder. You will need to download the Adobe AIR SDK version 3.8 or higher to package your application.  Compilation of your .swf files can be done using your existing SDK.


      1. Download the latest Adobe AIR SDK and unzip it to a new folder on your computer.
      2. Create an environment variable called AIR_NOANDROIDFLAIR and set its value to true.  This is case sensitive: the variable name must be all upper case and true must be lower case.
      3. Right click on your project in Flash Builder and click Export.
      4. Select Release Build, then click Next.
      5. Select only Google Android as the Target Platform.
      6. In the Export section, select Keep bin-release-temp folder.
      7. Click Next.
      8. Click Cancel.  This should result in a new bin-release-temp folder in your project.  It should contain your -app.xml, bar-descriptor.xml and .swf files for your application.
      9. Copy files from that folder to a new folder on your computer.
      10. Open a command prompt and navigate to the folder created in step 9.
      11. Run Command:
        java.exe -jar "PATH_YOU_UNZIPPED_AIR_SDK_FROM_STEP_1\lib\adt.jar" -package -target apk-captive-runtime -storetype pkcs12 -keystore "PATH_TO_YOUR\AndroidCert.p12" -storepass yourAndroidCertPassword YourApp.apk YourApp-app.xml YourApp.swf bar-descriptor.xml

         Update the parameters in the command above to match your paths, password, and application name.

      12. Package the exported APK file as a BAR file.
      13. Open the BAR file using an archive utility such as WinZip and verify the Package-name in the \META-INF\ file does not contain the air. prefix.


      Note:  Before packaging an upgrade to an existing AIR application for submission to Google Play™ or the Apple App Store, remove the AIR_NOANDROIDFLAIR environment variable or set it to false to ensure the air. prefix is present.

    • Barcode scanner using ZXING for BlackBerry 10 and BlackBerry PlayBook

      Published Feb 09 2018, 2:17 PM by abhi007tyagi

      Barcode Scanner using ZXING library for BlackBerry® 10 and BlackBerry® PlayBook™ using the Android™ Runtime.
