Dealing with emulator issues and Android UI test failures on CI

Let's dig deeper into emulator and UI test-related issues for Android! This is the last article of our Android testing series by Richard Bogdan, Senior Software Engineer.

In the previous article, I introduced how to set up testing on the CI and what you can do to make your life easier by extracting the required information to debug tests when something goes wrong. Now I will dig deeper into the latter topic and focus on emulator and UI test-related issues.

Mitigation strategies

When I claim that instrumented tests (especially UI tests) are flaky, I'm pretty sure everyone can recall a time when it happened to them. If not, you are probably really lucky, but you have at least heard of someone who experienced it. Sadly, there is no silver bullet against flakiness, but as with everything, there are quite a few things we can do to prevent most of it. So here are the three groups of strategies:

  1. Preventive
  2. Reactive
  3. Supportive

Let’s see each.

Preventive test flakiness mitigation

As the name suggests, these strategies involve everything you can do before launching your tests to prevent flaky behavior from happening. This sounds rather vague, so let me give more examples and details.

Page Object Pattern

Are you familiar with the term “coding patterns”? Do you know and use some of them? If you answered yes to both questions, then I guess you won’t be surprised to hear that testing also has its patterns, and the Page Object Pattern is one of them.

This pattern comes in handy when you are writing UI tests, for example with Espresso or UiAutomator. Originally, it was used by web UI testers, which is where the name “Page Object” comes from. Although the name suggests you can create Page Objects only for pages or screens, in fact they can be used for smaller items as well (e.g. fragments or individual views). The goal of the Page Object Pattern is to introduce a decoupling layer between test cases and the code required to perform an action on, or access an element of, the UI. This layer is the Page Object. For the visual thinkers, here is an overview:

As you can see:

  • You can run multiple test classes on a single device
  • A given test class can use one or more UI components (e.g. Activity, Fragment)
  • Each UI component will have one page object
  • A test case can use one or more page objects

Putting this in practice:

Let’s imagine you have UI tests for a simple TODO application, which lists TODO items that you can add, edit, and delete. From a testing perspective, you don’t care about the resource names of the individual buttons and views, because you are most likely testing behaviours instead. If you put all of that lookup code in the page object, test cases contain less boilerplate and become shorter. It also helps when behaviour in the UI changes: say you previously had a delete button on each item, and that changes to swipe-to-delete. Without page objects, you would need to update every test case; if the logic lives in the page object, you update it in one place. To stick with this example, there should be a method like “deleteItem” in the page object. So to summarise, using the page object pattern gives you the following advantages:

  • Avoid code duplication, as all accessors, actions, etc. live in a single place
  • More readable code
  • You have to change the code in a single place in case of behaviour change
  • You can combine it with other patterns/techniques, for example it becomes more compact if you combine it with fluent API

Now, example time:

@Test
public void fragmentStateTest() {
   final IndexActivityScreen indexActivityScreen = 
          new IndexActivityScreen(uiDevice);

   final MainActivityScreen mainActivityScreen = 
               indexActivityScreen.launchUiTests();
   mainActivityScreen
           .showParentFragment()
           .showChildFragment();
}

Note: as you see, I use the word “screen” in the naming instead of “page”. The reason is that “screen” is more descriptive for mobile user interfaces, while “page” is more natural in web or desktop terms.

The above example shows a simple UiAutomator test case, where we do the following:

  1. Start the application with the IndexActivity
  2. Launch MainActivity from IndexActivityScreen with the launchUiTests() method
  3. MainActivity displays a fragment (ParentFragment)
  4. ParentFragment displays another fragment (ChildFragment)

Thanks to the page object pattern, the code for this is quite short and readable. As you can see, I combined the page object pattern with a fluent API, to make it more concise.

Looks nice, but how will this help in preventing test flakiness?

You have probably experienced flaky UI tests in the past because someone on your team forgot to add a wait for a given view element to appear. It happens from time to time, because we are humans and we make mistakes and forget things. The good thing is that with the page object pattern, you write the waiting logic once, in your screen object, and reuse it across the different test cases. The fewer times you have to add the waiting logic, the less chance there is to forget it. Also, readable code helps to avoid human errors, right?

You might argue that the page object pattern is not the only thing that would solve or mitigate the issues mentioned above, and you are right. My point here is that it is a great choice for mitigating these issues, because it solves them while bringing its other benefits along.
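To see the decoupling in isolation, here is a minimal, framework-free sketch of the pattern in plain Java. The Driver class is a hypothetical stand-in for UiDevice or Espresso, and TodoListScreen, addItem, and deleteItem are made-up names for the TODO example above, not part of any real testing library:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for UiDevice/Espresso: records low-level UI actions.
class Driver {
    final List<String> actions = new ArrayList<>();

    void perform(final String action) {
        actions.add(action);
    }
}

// Page object for the TODO list screen.
class TodoListScreen {
    private final Driver driver;

    TodoListScreen(final Driver driver) {
        this.driver = driver;
    }

    TodoListScreen addItem(final String title) {
        driver.perform("add:" + title);
        return this; // fluent API: return this so calls can be chained
    }

    // Test cases only say WHAT to do; HOW (tap a button vs. swipe) lives here,
    // so a behaviour change is a one-line edit in this class.
    TodoListScreen deleteItem(final String title) {
        driver.perform("swipe-to-delete:" + title);
        return this;
    }
}

public class PageObjectDemo {
    public static void main(final String[] args) {
        final Driver driver = new Driver();
        new TodoListScreen(driver)
                .addItem("buy milk")
                .deleteItem("buy milk");
        System.out.println(driver.actions);
    }
}
```

If delete later changes back to a button, only the body of deleteItem changes; every chained test case stays as it is.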

I will provide an example for UiAutomator. This is how my screen class looks for the IndexActivity:

/**
* Screen class for {@link IndexActivity}.
*/
public class IndexActivityScreen extends BaseScreen {

   private static final BySelector uiTestsButton = By.res(id + "btn_ui_tests");
   private static final BySelector networkTestsButton = By.res(id + "btn_network_tests");

   /**
    * Constructor for class.
    *
    * @param uiDevice the UiDevice that launches the test cases.
    */
   public IndexActivityScreen(@NonNull final UiDevice uiDevice) {
       super(uiDevice);
   }

   /**
    * Clicks on the UI tests button and launches the {@link MainActivity}.
    *
    * @return the created {@link MainActivityScreen}.
    */
   @NonNull
   public MainActivityScreen launchUiTests() {
       click(uiTestsButton);
       return new MainActivityScreen(uiDevice);
   }

   /**
    * Clicks on the Network tests button and launches the {@link NetworkActivity}.
    *
    * @return the created {@link NetworkActivityScreen}.
    */
   @NonNull
   public NetworkActivityScreen launchNetworkTests() {
       click(networkTestsButton);
       return new NetworkActivityScreen(uiDevice);
   }

   @Override
   public void waitTillLoad() {
       uiDevice.wait(Until.findObject(uiTestsButton), DEFAULT_TIMEOUT);
       uiDevice.wait(Until.findObject(networkTestsButton), DEFAULT_TIMEOUT);
   }
}

As you see, we have:

  1. The UI elements we interact with as a member variable
  2. The available interactions as member methods
  3. And a method named waitTillLoad(), which makes sure all the required elements are shown on the screen before we start interacting with them

And just to have the full picture, have a look at the parent class for this IndexActivityScreen class:

public abstract class BaseScreen {

   /**
    * The given UiDevice that runs the tests.
    */
   @NonNull
   protected final UiDevice uiDevice;

   /**
    * The id prefix to find resources by id.
    */
   @NonNull
   protected static final String id = InstrumentationRegistry.getInstrumentation()
                                                             .getTargetContext()
                                                             .getPackageName() 
                                                             + ":id/";

   /**
    * The default value for timeouts, in milliseconds.
    */
   protected static final long DEFAULT_TIMEOUT = 10000;

   /**
    * The number of times the test will try to click on a given BySelector before failing the test.
    */
   private static final int numberOfClickAttempts = 3;

   /**
    * Constructor for class.
    *
    * @param uiDevice the UiDevice that launches the test cases.
    */
   public BaseScreen(@NonNull final UiDevice uiDevice) {
       this.uiDevice = uiDevice;
       waitTillLoad();
   }

   /**
    * Searches for the given {@link BySelector} on the active UI.
    *
    * @param by the given BySelector to search for.
    * @return the {@link UiObject2} if found, {@code null} otherwise.
    */
   @Nullable
   public UiObject2 find(@NonNull final BySelector by) {
       return uiDevice.wait(Until.findObject(by), DEFAULT_TIMEOUT);
   }

   /**
    * Performs a click action on the given {@link BySelector}. Attempts to find the given BySelector 3 times. This
    * is needed for cases when the UI is not ready.
    *
    * @param by the given BySelector.
    */
   public void click(@NonNull final BySelector by) {
       for (int i = 0; i < numberOfClickAttempts; i++) {
           final UiObject2 uiObject2 = find(by);
           if (uiObject2 != null) {
               uiObject2.click();
               break;
           } else {
               Log.i(BaseUiTest.UI_TEST_TAG, String.format("Could not find selector with name %s, retrying %s",
                       by.toString(), i));
           }
       }
   }


   /**
    * Waits till the given screen is loaded.
    */
   public abstract void waitTillLoad();
}

Things to see from the above code:

  • interactions like click/find have to be written only once
  • the constructor of the parent class calls the waitTillLoad() method, so every subclass does the same by calling super(uiDevice) from its constructor
  • waitTillLoad() is abstract, so every non-abstract subclass has to define it, making it less likely that you will forget it
  • timeouts and retries are unified, defined in one place

Takeaway

Write your tests in a readable, compact, and expressive manner; it will save you a lot of headaches. The page object pattern is a good candidate for this.

Reactive test flakiness mitigation

Continuing the discussion on UI test flakiness mitigation: reactive strategies involve all the things that you can do while running your tests to stop flaky behaviour as it happens.

Dealing with System events

System events can happen during your test runs. One of them is when an application is not responding (ANR for short) and you get a dialog about it. I see these quite often when I fire up an emulator with API level 30.

Has it ever happened that your UI test failed because Android threw a system dialog? If the answer is yes, I bet you know the feeling.

Even if your app is super fast and responsive, such dialogs can still appear on CI, because as we learned in previous posts, the performance of CI machines is limited. Obviously, you have to do something with those dialogs.

If there are no other requirements, in most cases you can simply close them by clicking either the “Wait” or the “Close app” button. It is a bit safer to go with the first one because, as I said, the non-responding app can be your own.

So what would be the solution? Create a watcher that will watch for these dialogs during your test runs. Sounds reasonable. I have good and bad news. Let's start with the bad: there is no such thing in Espresso. The good news is that there is a watcher in UiAutomator, and this will help you even if you have Espresso tests.

Here is an example of how to do it. As I said, we will need a watcher; UiAutomator provides UiWatcher for this. The steps to achieve this:

  1. Create a method for registering the watcher
private static final String anrText = "isn't responding";

private static void registerANRWatcher() {
   uiDevice.registerWatcher("ANR", () -> {
       final UiObject anrDialog = uiDevice.findObject(new UiSelector()
               .packageName("android")
               .textContains(anrText));

       return checkForAnrDialogToClose(anrDialog);
   });
}

The dialog will show something like “<application> isn’t responding”, so we search for that text. To be sure we are not picking up the same text from a different application, we also filter the package to “android”.

  2. Check if there is an ANR dialog
private static boolean checkForAnrDialogToClose(@NonNull final UiObject anrDialog) {
   return anrDialog.exists() && closeAnrWithWait(anrDialog);
}

Simple step, not much to explain.

  3. Click on the “Wait” button when it appears
private static boolean closeAnrWithWait(@NonNull final UiObject anrDialog) {
   Log.i(UI_TEST_TAG, "ANR dialog detected!");
   try {
       uiDevice.findObject(new UiSelector().text("Wait").className("android.widget.Button").packageName(
               "android")).click();
       final String anrDialogText = anrDialog.getText();
       final String appName = anrDialogText.substring(0, anrDialogText.length() - anrText.length());
       Log.i(UI_TEST_TAG, String.format("Application \"%s\" is not responding!", appName));
   } catch (final UiObjectNotFoundException e) {
       Log.i(UI_TEST_TAG, "Detected ANR, but window disappeared!");
   }
   Log.i(UI_TEST_TAG, "ANR dialog closed: pressed on wait!");
   return true;
}

As you see, if there is an ANR dialog, we click on the “Wait” button to make it disappear. You can do additional things here; for example, I log which application had the ANR, which can be interesting information when I check the logs.
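One small detail: the plain substring call in the step above keeps the separating space in front of “isn’t responding”. Here is a tiny pure-Java sketch of the same app-name extraction (the class and method names are my own, for illustration) that trims it away:

```java
public class AnrDialogText {

    static final String ANR_TEXT = "isn't responding";

    // Recovers the application name from e.g. "MyApp isn't responding".
    // trim() drops the separating space that the plain substring keeps.
    static String appName(final String dialogText) {
        return dialogText
                .substring(0, dialogText.length() - ANR_TEXT.length())
                .trim();
    }
}
```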

  4. Register the watcher
@BeforeClass
public static void setUpBeforeClass() {
   uiDevice = UiDevice.getInstance(getInstrumentation());
   registerANRWatcher();
}

BeforeClass-annotated methods are a perfect spot for it.

This will save you a lot of headaches. And as promised, here is what you should do when you have Espresso tests. I am certain there are other approaches as well, but nothing prevents you from using this very same code: you just have to add a dependency on UiAutomator and include this code in your tests. A good approach is to create an abstract parent test class that contains it, and have your test classes extend that class. This way you have no code duplication, the actual test classes do not have to contain this code, and you are saved from those nasty ANR dialogs. Based on this, you can write your own watchers for other cases as well, hurray!

Takeaway

Close those ANR dialogs and other unnecessary stuff during your test runs with UiWatchers.

Supportive test flakiness mitigation

The last group of UI test flakiness mitigation: supportive strategies involve all the things that you can do to help reduce the flakiness of your tests without preventing the flaky behaviour itself.

Better hardware

One trivial thing you can do about tests that are flaky because of timeouts is to buy or rent faster hardware, which you may or may not be able or willing to do. Of course this helps, but as I said, it does not prevent flakiness, because the amount of performance that tests can require has no upper limit (imagine having the latest, fastest machine and launching 10 or more emulators simultaneously; I would not place high bets on avoiding performance issues).

Screenshotting UI test events

In some cases it is a big help to have a screenshot of the device at the moment a given UI test case fails, to better understand what caused the failure. For example, taking screenshots helped me discover that ANR dialogs sometimes appear during my UI test runs, causing intermittent test failures. And of course, this comes in handy for non-intermittent failures too.

For taking screenshots, you have to create a TestWatcher and implement the actions for the events you care about. For the complete set of events that a TestWatcher offers, please check its documentation. I will show you how to take a screenshot when a test starts and when it fails:

  1. (Optional): create a test rule for getting the name of the currently active test case.
@Rule
public TestName testName = new TestName();

This comes in handy when creating the screenshot files, making it easier to match screenshots with test cases later. An easy way to do this is to use the TestName rule.

  2. Create your TestWatcher
public class TestDataCollectionRule extends TestWatcher {

   @Override
   protected void starting(@NonNull final Description description) {
       super.starting(description);
       Log.i(UI_TEST_TAG, "Test started: " + testName.getMethodName());
       takeScreenShot(TestEvent.START);
   }

   @Override
   protected void succeeded(@NonNull final Description description) {
       super.succeeded(description);
       Log.i(UI_TEST_TAG, "Test success: " + testName.getMethodName());
   }

   @Override
   protected void failed(@NonNull final Throwable e, @NonNull final Description description) {
       super.failed(e, description);
       Log.i(UI_TEST_TAG, "Test failed: " + testName.getMethodName());
       takeScreenShot(TestEvent.FAIL);
   }
}

Pro tip: some logs in the logcat output will also be helpful. The only thing that requires explanation is the TestEvent enum. It is an inner enum class that I created to indicate which test event triggered taking the screenshot. You can use a different class for this purpose if you do not want to create your own.

  3. Add a method for creating the screenshots
protected void takeScreenShot(@NonNull final TestEvent testEvent) {
   // Pick either of the two capture methods below.
   final Bitmap screenshotBitmap = getScreenShotBitmapWithUiAutomation();
   final String screenShotFileName = getTestReportFileBaseName(testEvent);

   storeScreenshot(screenshotBitmap, screenShotFileName);
}

private Bitmap getScreenShotBitmapWithUiAutomation() {
   return getInstrumentation().getUiAutomation().takeScreenshot();
}

private Bitmap getScreenShotBitmapWithScreenshotApi() {
   return Screenshot.capture().getBitmap();
}

As you see, you can do it with the Screenshot API or with the UiAutomation.takeScreenshot() method. I leave this choice to the reader.

  4. Add a method for creating a good name for the screenshot
private String getTestReportFileBaseName(@NonNull final TestEvent testEvent) {
   // HH is the 24-hour clock; hh would fold afternoon runs into morning timestamps.
   final SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss");
   final String timeStamp = simpleDateFormat.format(Calendar.getInstance().getTime());
   return timeStamp + "_" + testName.getMethodName() + "_" + testEvent;
}

What makes a good name is rather subjective; in my example I concatenate the timestamp with the name of the test case and the given test event (e.g. 20210520_120000_myUiTest_FAIL).
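If you want to unit test the naming scheme itself, a variant that takes the Date as a parameter (instead of reading the clock inside the method) is trivially verifiable. The class and enum below are hypothetical stand-ins, not code from the rule above:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class ReportNameDemo {

    // Hypothetical mirror of the inner TestEvent enum mentioned earlier.
    enum TestEvent { START, FAIL }

    // Same naming scheme, with the timestamp injected so tests can pin it down.
    static String baseName(final Date when, final String testMethod, final TestEvent event) {
        final String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(when);
        return timeStamp + "_" + testMethod + "_" + event;
    }
}
```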

  5. The last, and probably biggest, step is to save it to the device
private ContentResolver getContentResolver() {
   return InstrumentationRegistry.getInstrumentation()
                                 .getTargetContext()
                                 .getApplicationContext()
                                 .getContentResolver();
}

private void storeScreenshot(@NonNull final Bitmap screenShotBitmap, @NonNull final String screenShotFileName) {
   final ContentResolver contentResolver = getContentResolver();
   final String UiTestScreenShotsDirName = "UiTestScreenShots";

   try {
       if (android.os.Build.VERSION.SDK_INT >= 29) {
           storeWithMediaStore(new ContentValues(), contentResolver, screenShotFileName,
                   UiTestScreenShotsDirName, screenShotBitmap);
       } else {
           storeWithFileOutputStream(new ContentValues(), contentResolver, screenShotFileName,
                   UiTestScreenShotsDirName, screenShotBitmap);
       }
       Log.i(UI_TEST_TAG, "Created screenshot " + screenShotFileName);
   } catch (final IOException e) {
       Log.e(UI_TEST_TAG, "Failed to take screenshot!", e);
       e.printStackTrace();
   }
}

As you can see, I added some logs here too, and I use the good old ContentResolver in the process of storing the screenshots. Due to the storage changes introduced in API level 29, we need a different approach for newer devices. Here are my examples:

@RequiresApi(Build.VERSION_CODES.Q)
private void storeWithMediaStore(@NonNull final ContentValues contentValues,
                                @NonNull final ContentResolver contentResolver,
                                @NonNull final String screenshotFileName,
                                @NonNull final String screenshotLocation,
                                @NonNull final Bitmap screenshotBitmap) throws IOException {
   applyBaseScreenshotContentValues(contentValues);
   contentValues.put(MediaStore.MediaColumns.DISPLAY_NAME, screenshotFileName + ".jpeg");
   contentValues.put(MediaStore.Images.Media.RELATIVE_PATH,
           Environment.DIRECTORY_PICTURES + "/" + screenshotLocation);

   final Uri uri = contentResolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, contentValues);
   if (uri != null) {
       try (final OutputStream outputStream = contentResolver.openOutputStream(uri)) {
           saveScreenshotToStream(screenshotBitmap, outputStream);
       }
       contentResolver.update(uri, contentValues, null, null);
   }
}

@RequiresPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE)
private void storeWithFileOutputStream(@NonNull final ContentValues contentValues,
                                      @NonNull final ContentResolver contentResolver,
                                      @NonNull final String screenshotFileName,
                                      @NonNull final String screenshotLocation,
                                      @NonNull final Bitmap screenshotBitmap) throws IOException {
   final File picturesDir = new File("/sdcard/Pictures/" + screenshotLocation);
   // Create the parent directory, not the file itself (calling mkdirs() on the
   // file would create a directory with the file's name).
   picturesDir.mkdirs();
   final File screenshotFile = new File(picturesDir, screenshotFileName + ".jpg");
   if (screenshotFile.exists()) {
       screenshotFile.delete();
   }

   try (final FileOutputStream outputStream = new FileOutputStream(screenshotFile)) {
       saveScreenshotToStream(screenshotBitmap, outputStream);
   }

   applyBaseScreenshotContentValues(contentValues);
   contentResolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, contentValues);
}

private void saveScreenshotToStream(@NonNull final Bitmap screenshotBitmap,
                                   @NonNull final OutputStream outputStream) {
   screenshotBitmap.compress(Bitmap.CompressFormat.JPEG, SCREENSHOT_COMPRESSION, outputStream);
}

I will not go into deep detail, as that is a different topic and my article would never end, but maybe I will update this article with a link if I decide to write about data storage on Android in the future. The only thing you need to know is that this stores the screenshots under the /sdcard/Pictures/UiTestScreenShots/ directory.
Just do not forget to grant the write permission to your app below API level 29. The easiest way is to use GrantPermissionRule.

@Rule
public GrantPermissionRule mRuntimePermissionRule = GrantPermissionRule.grant(
       Manifest.permission.WRITE_EXTERNAL_STORAGE);


Takeaway

Create screenshots on different test events; they can help you discover the issue that caused the failure.

Dump view hierarchy

Creating screenshots is extremely helpful in some cases, but we would not be developers if we did not want more code-level details. There can be similar-looking views, especially during transitions, so resource names and IDs can mean the world when we debug a test failure. As the title says, we can also dump the view hierarchy to a file. Here is how you can do it.

  1. Create your TestWatcher (see Screenshotting UI test events)
  2. Create a method for dumping the view hierarchy and storing it on the device
private void dumpWindowHierarchy(@NonNull final TestEvent testEvent) {
   try {
       final String hierarchyReportFileName = getTestReportFileBaseName(testEvent) + "_hierarchy";
       if (android.os.Build.VERSION.SDK_INT >= 29) {
           uiDevice.dumpWindowHierarchy(getOutputStreamForViewHierarchyFile(hierarchyReportFileName));
       } else {
           uiDevice.dumpWindowHierarchy(createViewHierarchyFile(hierarchyReportFileName));
       }
   } catch (final IOException e) {
       Log.e(UI_TEST_TAG, "Failed to dump view hierarchy.", e);
       throw new IllegalStateException(e);
   }
}

Please note, for Espresso you can use TreeIterables to create the view hierarchy. As with taking screenshots, we have to use a different approach for storing the files from API level 29.

@RequiresApi(Build.VERSION_CODES.Q)
private OutputStream getOutputStreamForViewHierarchyFile(@NonNull final String hierarchyReportFileName)
       throws FileNotFoundException {
   final ContentValues values = new ContentValues();
   values.put(MediaStore.Downloads.DISPLAY_NAME, hierarchyReportFileName);
   values.put(MediaStore.Downloads.RELATIVE_PATH, "Download/UiTestHierarchy");
   values.put(MediaStore.Downloads.MIME_TYPE, "text/xml");
   values.put(MediaStore.Downloads.IS_PENDING, 1);

   final ContentResolver resolver = getContentResolver();
   final Uri contentUri = MediaStore.Downloads.getContentUri(MediaStore.VOLUME_EXTERNAL_PRIMARY);
   final Uri itemUri = resolver.insert(contentUri, values);

   if (itemUri != null) {
       values.clear();
       values.put(MediaStore.Downloads.IS_PENDING, 0);
       resolver.update(itemUri, values, null, null);
       return resolver.openOutputStream(itemUri);
   }
   return null;
}

@NonNull
private File createViewHierarchyFile(@NonNull final String hierarchyReportFileName) {
   final File hierarchyDumpDir = new File("/sdcard/Download/UiTestHierarchy");
   // Create the directory, then the file inside it (calling mkdirs() on the
   // file itself would create a directory named after the file).
   hierarchyDumpDir.mkdirs();
   final File hierarchyDump = new File(hierarchyDumpDir, hierarchyReportFileName + ".xml");
   if (hierarchyDump.exists()) {
       hierarchyDump.delete();
   }

   return hierarchyDump;
}
  3. As seen in the code above, these methods store the view hierarchy files under the /sdcard/Download/UiTestHierarchy/ folder. Please note that some helper functions were already introduced in the Screenshotting UI test events section; you can find them there.

Takeaway

You can create view hierarchy dumps in different stages of your UI tests, which can help you in debugging test failures.

Pulling the saved data from the device

When you work on your local machine, you can easily view the collected test data using the Device File Explorer in Android Studio. Just search for the given file and open it with a double click, or download it to your machine by right-clicking it and choosing “Save As”.

It is trickier on the CI, because the build machine might not keep the data stored on the given virtual device; this is the case on Bitrise, for example. We have to pull those files from the device before the build finishes and upload them somewhere. Luckily, with a simple Script step we can do this easily; just add the following to your bitrise.yml at a point where the collected data is ready:

- script@1:
   title: save_collected_test_data
   is_always_run: true
   description: |-
     Pulls from the virtual device and moves it to the build reports dir the following things:
     1. screenshots of UI tests
     2. view hierarchy xmls of UI tests
   inputs:
     - content: |-
         #!/usr/bin/env bash
         # fail if any commands fails
         set -e
         # debug log
         set -x

         echo "Listing files in /sdcard/Pictures/UiTestScreenShots/"
         adb shell ls /sdcard/Pictures/UiTestScreenShots/ || true

         echo

         echo "Listing files in /sdcard/Download/UiTestHierarchy/"
         adb shell ls /sdcard/Download/UiTestHierarchy/ || true

         echo "Pulling test data files"
         mkdir -p /bitrise/src/trace-test-application/build/reports/screenshots/
         adb pull /sdcard/Pictures/UiTestScreenShots/ /bitrise/src/trace-test-application/build/reports/screenshots/
         mkdir -p /bitrise/src/trace-test-application/build/reports/viewhierarchy/
         adb pull /sdcard/Download/UiTestHierarchy/ /bitrise/src/trace-test-application/build/reports/viewhierarchy/

As you see, I print the list of files for debugging purposes before pulling them into the build/reports directory of the given application. Now we just have to upload the files somewhere we can find them; as shown previously in the Testing on CI article, you can upload them with the Deploy to Bitrise.io step.

Takeaway

Pull the collected data with the adb pull command and store it somewhere accessible.

Check your device’s health

Sometimes a virtual device launch fails. Devices can even die during your build run, leaving you clueless about why your test run ended in failure. A simple and helpful trick is to check the health of your devices with the adb devices command. For example, if a device is reported as “offline”, you know that it was not responding or not connected at that time. Please see the documentation for details.
In Bitrise, just add a Script step to do the device health check.

- script@1:
   inputs:
     - content: |-
         #!/usr/bin/env bash
         # fail if any commands fails
         set -e
         # debug log
         set -x

         adb devices
         sdkmanager --list
   title: Log emulators and installed SDK packages

Note: I also prefer listing the installed sdkmanager packages; it helps when a strange issue happens that turns out to be a known issue on Google’s side.
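If you want to act on the device state instead of just logging it, the output of adb devices is easy to parse. Here is a small, plain-Java sketch (class and method names are my own, and the output format is the standard two-column adb devices listing) that flags anything not in the healthy “device” state:

```java
import java.util.List;
import java.util.stream.Collectors;

public class AdbHealth {

    // Parses `adb devices` output and returns a description of every device
    // that is not in the healthy "device" state (e.g. "offline", "unauthorized").
    static List<String> unhealthyDevices(final String adbOutput) {
        return adbOutput.lines()
                .skip(1)                                    // "List of devices attached"
                .filter(line -> !line.isBlank())
                .map(line -> line.split("\\s+"))
                .filter(parts -> parts.length > 1 && !"device".equals(parts[1]))
                .map(parts -> parts[0] + " is " + parts[1])
                .collect(Collectors.toList());
    }
}
```

A CI script could feed this the captured adb devices output and fail the build early when the returned list is not empty.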

Summary

You can do things before, during, and after running your tests to help mitigate flakiness issues. I hope you liked my article; let me know your thoughts and questions.
