
Automating e2e native mobile testing into a continuous workflow

Create a stable environment that requires less maintenance


Most developers can’t imagine working without source control and continuous integration systems. However, one area where enterprises still struggle is integrating an automated test solution for native mobile applications into a continuous integration workflow. This article details how to embed automated end-to-end (e2e) tests for native mobile devices into a continuous integration environment for stronger stability and maintainability.

Setting up a solid foundation  

Plan to run e2e tests in conditions close to your live system. Use a staging environment (or something very similar to production) with as few mocked services as possible to ensure the best coverage and the most accurate tests. Treat e2e tests as regression testing that eliminates manual testing effort.

To implement tests effectively, have the following elements in place: 

  • A dedicated machine with multiple mobile devices connected to run tests on (at least one per platform), or a device farm with access to your application
  • A dedicated and stable test environment 
  • A dedicated person to maintain tests, as developers may change text, button labels, etc.

Before deciding on a test framework, it’s important to answer the following questions: 

  • What platforms do you want to support? (e.g., Android, iOS) 
  • What type of applications do you need to test? (e.g., Native, Hybrid, Web) 
  • Will the applications under test be executed on simulators or physical devices? 
  • Will you use device farms? 
  • What scripting language is suitable for you to write test cases?
  • Do you want to use Gherkin language to specify test cases? 
  • Are you allowed to use open source applications? 
  • What test result formats can the framework output?
  • Is the development of the framework still active? 
  • What tools are available to create test cases? (e.g., Visual Studio plugin) 

Cross-reference the answers to the above questions against the available testing options, then select the framework that best fits your organization. Next, build a proof of concept: implement the framework for your application with a simple test case, including any supporting activities you will need to carry out (e.g., generating reports, debugging failed test cases). Once satisfied with the results, continue to the next step.

Building out test automation  

Maintainability is critical for building confidence among the team. Teams whose test suites lack maintainability grow frustrated and uninspired by the slow pace. Three key components for avoiding a maintenance burden are to:

  1. Implement a good approach to test data management. 
  2. Write thoughtful test scripts. 
  3. Create well-chosen test cases to automate. 

Now let’s dig more deeply into each component.

Test data management 

Managing test data is an important step in test automation. I recommend using a stable environment and cleaning up any data created or modified after each test run to keep the environment consistent and predictable. Alternatively, you can opt to pollute the database with unique records, e.g., user = “test_user” + getTimestamp().
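For illustration, here is what that timestamped-user approach might look like in Ruby; the helper and the user attributes are hypothetical:

    # Hypothetical helper: build a unique, disposable test user for this run
    # so its records never collide with data left over from earlier runs.
    def build_test_user
      { username: "test_user_#{Time.now.to_i}", password: 'secret123' }
    end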

The trade-off is that you collect garbage after each run, which may degrade system performance over time, and some tests may fail when run more than once per day. For example, if a test asserts the average transfer amount per day and a specific test step fails, the leftover data can destabilize the next run. Some test cases simply cannot be automated in an unpredictable environment.

The point is this: make a deliberate, well-informed decision about your test data management solution.

Best practices 

  • Building an in-memory database and discarding it afterward is an easy and fast approach.
  • Consider having a dedicated database for test automation, and use backups to restore it to a fixed state before each test suite execution (see the sketch after this list).
  • Dedicate the test automation database to this purpose only, and keep it clean with the minimum amount of data required for tests.
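To make the second practice concrete, here is a minimal sketch of a restore-before-suite hook. Calabash tests are plain Cucumber under the hood, so a support hook works; the PostgreSQL host, database name, and backup path are all assumptions to be replaced with your own restore command:

    # features/support/db_reset.rb
    # Restores the dedicated test-automation database from a known backup
    # once per suite, before the first scenario runs.
    $db_restored = false

    Before do
      next if $db_restored
      ok = system('pg_restore', '--clean',
                  '--host', 'db.staging.local',      # assumed host
                  '--dbname', 'test_automation',     # assumed database name
                  '/backups/test_automation.dump')   # assumed backup path
      raise 'Database restore failed, aborting test run' unless ok
      $db_restored = true
    end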

Test scripts 

Writing test scripts may seem as easy as pressing a record button, executing the steps, and clicking stop, or writing, “Given I click the "Login" button.” This sounds easy in theory; however, these tests don’t always work well, or at least not without fine-tuning the execution scripts, since recorded steps tend to break as soon as the UI changes.
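One way to reduce that fine-tuning is to back the Gherkin step with a small custom step definition that waits for the element before interacting with it. A sketch using the Calabash query API; the marked: label must match your app’s accessibility identifiers:

    # features/step_definitions/common_steps.rb
    # Reusable step: wait for a button by its accessibility label, then tap it.
    Given(/^I click the "([^"]*)" button$/) do |label|
      wait_for_element_exists("* marked:'#{label}'", timeout: 10)
      touch("* marked:'#{label}'")
    end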

Test cases to automate 

If possible, start e2e test automation at the beginning of a project. Then, automate test cases for newly created features and build up the test suite while developing the project. This is the ideal situation as developers are able to build the application with testability in focus upfront, not as an afterthought.  

Often, developers need to automate test cases for an already developed project. To start, automate the most important user stories. To rank and prioritize stories, score each one by how heavily the feature is used and how many bugs have been reported against it (a sketch follows below). Alternatively, automate test cases around reported bugs, and build test cases around the areas where bugs are introduced. This approach helps catch bugs earlier next time and ensures the same bug is not reintroduced.
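A hypothetical scoring sketch for that ranking; the formula, weights, and figures below are illustrative, not taken from real project data:

    # Rank user stories for automation: heavily used, bug-prone features first.
    stories = [
      { name: 'Log in',         usage: 0.95, bugs: 7 },
      { name: 'Money transfer', usage: 0.60, bugs: 12 },
      { name: 'Edit profile',   usage: 0.20, bugs: 2 }
    ]

    # Illustrative score: usage frequency weighted by reported bug count.
    ranked = stories.sort_by { |s| -(s[:usage] * (1 + s[:bugs])) }
    ranked.each_with_index { |s, i| puts "#{i + 1}. #{s[:name]}" }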

If using Jenkins, there are already quite a few guides to follow. In general, the procedure should look like this (a sketch follows the list):

  • Prepare the environment. 
    • Restore or deploy the database. 
    • Install the framework executables and dependencies. 
    • Fetch the app to use as a release candidate. 
    • Perform any other framework-specific activities. 
    • Wake up and unlock the mobile device. 
  • Execute the test steps. 
    • Publish the results. 
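A rough sketch of that procedure as Rake tasks, which a Jenkins job could invoke with rake e2e; every path, URL, and command here is an assumption to adapt to your setup:

    # Rakefile -- sketch of the CI procedure above.
    task :prepare do
      sh 'pg_restore --clean --dbname=test_automation /backups/test_automation.dump' # restore the database
      sh 'bundle install'                                         # install framework gems and dependencies
      sh 'curl -o app-rc.apk https://ci.example.com/artifacts/app-rc.apk' # hypothetical release-candidate URL
      sh 'adb shell input keyevent KEYCODE_WAKEUP'                # wake the connected Android device
      sh 'adb shell input keyevent 82'                            # common trick to dismiss a swipe lock screen
    end

    task run_tests: :prepare do
      # calabash-android resigns the APK and executes the Cucumber features.
      sh 'bundle exec calabash-android run app-rc.apk'
    end

    task e2e: :run_tests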

There are extra plugins for parsing Cucumber reports. (I have run into problems with Jenkins’ Content-Security-Policy settings not allowing the HTML report to render.)

How to automate e2e tests with continuous integration  

Devbridge used TeamCity for most Android and iOS builds. In these instances, we integrated mobile e2e test automation into TeamCity as well. For more control over the testing process, we incorporated a local Mac Mini as a TeamCity build agent and connected both Android and iOS mobile devices to it. This enabled us to write cross-platform automated test scripts on the same hardware and in a local environment, either by connecting remotely or by plugging the Mac Mini into a monitor.

Additionally, we configured Calabash to output results in both JUnit and HTML formats. TeamCity consumes the JUnit XML to display results under each build, while the HTML report serves those handling test coverage. The HTML report also proved useful for debugging, as it stored screenshots and stack traces for all failed test cases.
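Calabash passes formatter options straight through to Cucumber, so the dual output can be requested on the command line (extending the run_tests sketch above; the paths are assumptions):

    # Emit JUnit XML (consumed by TeamCity) and an HTML report (for people).
    sh 'bundle exec calabash-android run app-rc.apk ' \
       '--format junit --out reports/junit ' \
       '--format html --out reports/report.html'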

We then configured the Calabash build so that its artifact and snapshot dependencies pointed to the appropriate application build. This build chain let us use the same binaries both for automated tests on physical devices and for manual testing. In our case, running the same feature files on both platforms was not feasible because the two applications were not written with an identical workflow. However, we could reuse most of the step definitions across platforms.

For our database, we mapped a test user to a specific database created specifically for test automation purposes. We restored the whole database to a predefined state before each run on both devices. These steps gave us confidence in the data quality of each run. If we needed additional data in a backed-up database state, we simply restored the backup, added the required data, and made a new backup, which was then used for subsequent restores. This workflow saved us a lot of future maintenance as we expanded and created new features, which could involve sensitive data.
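That backup-refresh cycle reduces to three steps; again, the database name, paths, and seeding script here are hypothetical:

    # Sketch: refresh the canonical backup after seeding data for a new feature.
    task :refresh_backup do
      sh 'pg_restore --clean --dbname=test_automation /backups/current.dump'    # start from the known state
      sh 'ruby scripts/seed_new_feature_data.rb'                                # hypothetical seeding script
      sh 'pg_dump --format=custom --file=/backups/current.dump test_automation' # becomes the new baseline
    end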

To reduce data concurrency issues, we configured the e2e tests to run on only one build agent. This also kept us from restoring the database while another test run was in progress.

We then compiled a list of user stories and ranked them using the scoring approach described earlier, deriving each story’s value from how heavily the feature was used (or how many bug reports it had), and started the implementation. The list was updated with newly developed features and with additional bugs reported by users.

All source code was stored in Bitbucket, and new features were pushed directly to it. Before each test run, TeamCity pulled the latest Calabash baseline and executed all current feature files. This allowed multiple test engineers to work on the project simultaneously and ensured that up-to-date test cases ran in TeamCity without manual intervention.

The workflow we ended up with is as follows: 

  • Android or iOS build configuration produced the application as a build artifact. 
  • Upon a successful application build, the artifacts were consumed by the Calabash configuration, which triggered test execution on the local Mac Mini.
  • The execution script restored the backup onto the database dedicated to test automation.
  • All tests were executed on physical devices. 
  • The Calabash configuration produced reports and stored them as TeamCity artifacts for each run. 
  • If any tests failed, the responsible person was informed by email. 

Getting results 

By following the above guidelines, we managed to incorporate automated e2e tests into a continuous integration environment and obtain test results for every new application build. Overall results were positive. Consequently, we opted to use the setup for smoke tests and regression testing as well. While the framework still had some issues that introduced stability problems, I am hopeful that these will be fixed.

Start developing your application with testability in mind, and keep unit, integration, and e2e test results in a single place that is accessible and visible to all team members. Otherwise, you will run into issues like the ones we did: we automated an application that was developed without considering testability and added e2e automated tests late in the product’s lifecycle, so test automation was more complex than it needed to be. Nevertheless, it helped uncover some architectural flaws to be addressed.