Speeding up Android builds by 'fanning out' tests on Bitrise

Build your Android app faster than ever by fanning out - and parallelizing - tests on Bitrise. An example workflow and the necessary documentation to make it work for you.

📝 UPDATE:

This blog post describes a legacy solution. To run builds in parallel, learn more about Build Pipelines.

Speed and quality can go hand in hand, but sometimes you need a bit of tinkering to get both. Try this Bitrise setup to split tests up, run them in parallel and speed up your Android builds.

TLDR

Just want to try it out for yourself? Find the code and assets used to make this post here:

Android Test App

bitrise.yml

Fan Out Calculator (How many "fan outs" are likely optimal for your use case?)

Fan Out for iOS

Before we get started

  • This article assumes you have a Bitrise account. If not, register here (the standard free and Developer plans have a single concurrency, so create an organization if you'd like to try out our Org Standard plan with two concurrencies for free)
  • To run workflows in parallel, you'll also need multiple concurrencies. If you'd like to upgrade your plan to have more of those (or want a trial, demo or anything to help explain to your CTO why this is useful) reach out to the team here
  • This is "advanced Bitrise": By following the example, you should be able to achieve (significantly) faster build times for specific cases, but the results will vary (they might be worse, the same or even better than mine) and you'll end up with a more complex workflow. If you're willing and able to add some complexity to increase performance, though, read on:

Speed Up Your Builds!

Use the gradle cache!

Caching your Gradle dependencies on Bitrise ensures they are available as quickly as possible. When using our build steps, setting up caching for Gradle dependencies only requires adding the Cache:Push and Cache:Pull steps to your workflow and setting the cache level of the build step to deps_only.
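A minimal bitrise.yml sketch of this setup (the step versions, workflow name, and the exact spelling of the cache level value are illustrative; check the docs of the build step version you use):

```yaml
workflows:
  ui_test:
    steps:
    - cache-pull@2: {}            # restore the Gradle cache saved by the previous build
    - android-build@1:
        inputs:
        - variant: debug
        - cache_level: only_deps  # dependency-only caching (value name may differ by step version)
    - cache-push@2: {}            # save the (possibly updated) cache for the next build
```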

In addition to gradle caching, use static dependency versions

When you use static versions, you prevent Gradle from checking for updates online during builds. Here's how to do this in the build.gradle file:

Use classpath "com.google.gms:google-services:3.2.0" instead of classpath "com.google.gms:google-services:3.+"

Only include the necessary resources!

When you are building an Android application, bundling every resource into the generated APK is simply overkill. Since (in most cases) you only run your tests with a single language and a single screen density, you can leave all other resources out of the generated APK. To achieve this, create a new flavor for your application that includes only the necessary resources, decreasing both your compilation time and the APK size.


android {
    productFlavors {
        ui {
            ...
            resConfigs ("en", "xxhdpi")
        }
    }
}

Don’t use legacy multidex

Legacy multidex is used when minSdkVersion is set to 20 or lower. Simply create another variant with a higher minSdkVersion and use it only for your tests.


android {
    productFlavors {
        dev {
            ...
            minSdkVersion 21
        }

        prod {...}
    }
}


Disable apk split

As development builds don't require multiple APKs, this option can simply be disabled.

E.g.:


if (project.hasProperty('devBuild')) {
    // Prevent multi apk generation on development
    splits.abi.enable = false
    splits.density.enable = false
}

Then add -PdevBuild to the Additional Gradle Arguments input of the step.

Remove Crashlytics from UI testing builds

In most cases you won’t need to store your crash logs in a different tool when running your UI tests, as you get the results in the test reports as well.


android {
    buildTypes {
        debug {
            ext.enableCrashlytics = false
        }
    }
}

If you have a modularized project, use the org.gradle.parallel=true option: this allows Gradle to build your different modules in parallel.
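This is a one-line change in gradle.properties:

```properties
# Let Gradle build independent modules in parallel
org.gradle.parallel=true
```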

Build Your Tests For Success!

Whether we like it or not, tests fail! Sometimes they fail due to code changes, but sometimes they fail due to a previous step or case failing. This makes it much harder to identify which steps/cases are really failing and which are working as expected.

The first step is to meaningfully increase your testing time (yes, you read that right: increase)

Organise your test cases

Structuring your UI tests around features and user stories provides perfect ground for parallelization later on, and adding setup steps to each set makes the tests less prone to errors in other test sets. Test sets focused on a specific user journey are much easier to understand, and easy to run one by one.

Prepare for a different environment

Tests run in a CI environment use different network settings and connections, and the devices have different performance characteristics as well. Running your tests on devices with a high-speed internet connection can produce shorter wait times and fast loading of UI elements, while testing on an older phone or an emulator can slow down builds. Always check that the element you plan to interact with is visible, and give the device enough time to display the desired screen completely.

Run Parallel Test Builds!

You can start parallel builds and wait for them to complete using the Bitrise Start Build & Bitrise Wait for Build steps. This allows you to run more emulators at once.

This example shows how to:

  1. Trigger X test workflows from a Primary workflow and wait for them to complete
  2. Gather the test results from all X fan out workflows via the Bitrise API using NodeJS
  3. Generate a single Test Report for all X test results using bash
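The trigger-and-wait part of the setup can be sketched in bitrise.yml roughly like this (the step IDs build-router-start/build-router-wait correspond to the Start Build and Wait for Build steps; the workflow names, step versions, and the access-token env var are assumptions for illustration):

```yaml
workflows:
  ui_test_fan_out_ftl:
    steps:
    - build-router-start@0:
        inputs:
        - workflows: |-
            test_1_ftl
            test_2_ftl
            test_3_ftl
        - access_token: $BITRISE_PERSONAL_ACCESS_TOKEN
    - build-router-wait@0:
        inputs:
        - access_token: $BITRISE_PERSONAL_ACCESS_TOKEN
```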

Workflows Explained

Every workflow and every script below can be found in the bitrise.yml linked at the top of the article. Feel free to start experimenting with your own setup based on the provided sample! 😎

Primary Workflow

The ui_test_fan_out_ftl workflow triggers the fan out builds using the Bitrise Start Build & Bitrise Wait for Build steps.
To avoid rebuilding the app multiple times, we can leverage the Cache Push & Cache Pull steps, or other steps like the S3 File Uploader step.


Fan Out Workflows

The test_x_ftl workflows triggered by the primary build run in parallel on the APKs created in the ui_test_fan_out_ftl build. Each of these workflows can target a different test runner class or test package. The script step in the workflow generates the zip archive that is later exposed by the Deploy to Bitrise.io step and is also available on the Apps & Artifacts tab of the build.

The script that we used:


set -euxo pipefail
echo "-- compress --"
cd "${BITRISE_DEPLOY_DIR}/"
ls
exec zip -qr "${BITRISE_DEPLOY_DIR}/TestResults1.zip" "$BITRISE_TEST_RESULT_DIR"



No Fan Out Workflow

The ui_test workflow runs the same number of tests serially for performance comparison. The script steps (Test 1, etc.) contain a call to adb to start the instrumentation tests.
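A sketch of what such a script step can look like (the package, test class, and runner names are placeholders for your own app; the command is echoed rather than executed here, so you can see the shape of the call without a device attached):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholders - substitute your own application and test package names:
PKG="io.bitrise.sample.android"
RUNNER="androidx.test.runner.AndroidJUnitRunner"

# Echo the adb invocation for a single test class; drop the leading `echo`
# to actually run it against a connected device or emulator.
run_test_class() {
  local test_class="$1"
  echo adb shell am instrument -w \
    -e class "${PKG}.test.${test_class}" \
    "${PKG}.test/${RUNNER}"
}

run_test_class "LoginTest"
```

Each fan out workflow would call this with a different test class or package, which is what makes the test sets independent enough to parallelize.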


Collecting build artifacts

Using the Bitrise API, we can get the build artifacts back into the primary build for processing. We can do this easily in a script step, but for convenience we created a custom step to do it for you.
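A sketch of the script-step approach for one fan out build (the v0.1 artifacts endpoint of the Bitrise API is assumed; the slugs and the token env var name are placeholders you would supply from your own builds):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Build the Bitrise API v0.1 artifacts endpoint URL for an app/build pair.
api_url() {
  local app_slug="$1" build_slug="$2"
  echo "https://api.bitrise.io/v0.1/apps/${app_slug}/builds/${build_slug}/artifacts"
}

# Hypothetical slugs - replace with the slugs of your fan out builds:
api_url "my-app-slug" "fan-out-build-slug"

# The actual request would then look like:
#   curl -H "Authorization: ${BITRISE_API_TOKEN}" "$(api_url "$APP_SLUG" "$BUILD_SLUG")"
```

Repeating this for each fan out build slug gives the primary build every TestResultsX.zip to merge into a single report.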

Custom Bitrise step to automatically retrieve all artifacts created by the fan out builds. (Work in progress)

Deadlocks

Running your fan out builds requires a certain number of concurrencies. If other builds are taking up the concurrencies, you can run into a deadlock: the primary build waits for all the started builds to finish while the fan out builds are stuck in the queue. A possible solution:

  • At the start of the primary workflow check if it is taking the last concurrency using the Bitrise API
  • If it is taking the last concurrency then self abort the build.
  • Reschedule it for a retry in 5 mins
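The guard above can be sketched as a script step at the top of the primary workflow (MAX_CONCURRENCY and the build-count query are placeholders you would wire up to the Bitrise API and your plan; here the count is a fixed number for illustration):

```shell
#!/usr/bin/env bash
set -euo pipefail

MAX_CONCURRENCY=2   # hypothetical: concurrencies available on your plan

# Placeholder: in a real step, query the Bitrise API for builds in progress
# and count them; we return a fixed value here so the sketch runs standalone.
running_builds() {
  echo 2
}

if [ "$(running_builds)" -ge "$MAX_CONCURRENCY" ]; then
  echo "This build would take the last concurrency - aborting, retry in 5 mins"
  # Here you would abort this build via the API and schedule the retry.
fi
```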

Results

Although this is a trivial example with very few tests, it shows how you can apply fan out builds to your own workflows.

No Fan Out, 7 Test Runs: 17.1 mins

Here you can see that the tests happened one after another and the total build time was 17.1 mins in this simple example. Depending on how many tests you perform your results may vary.


Fan Out with 7 fan outs: 10.5 mins

Here you can see that the tests happened in parallel and the total build time was 10.5 mins in this simple example. Depending on how many tests you perform your results may vary.


What else can you use Fan Out builds for?

  • UI Testing
  • Unit Testing
  • Code Coverage
  • Linting
  • Building App Variants
  • Play Store Uploads
  • and more…

Any sequential action that can be split up to run in parallel can be spread over workflows, tremendously cutting down the time spent waiting for a build to finish. Have you tried the fan out setup and want to share your experiences?

Find us on Twitter, drop by our LinkedIn or Facebook pages, or simply start a topic on Reddit - We're always happy to jump in on the conversation.

Happy building! 🚀

Get Started for free

Start building now, choose a plan later.

Sign Up
