Testing Blog
Android UI Automated Testing
Friday, March 20, 2015
by Mona El Mahdy
Overview
This post reviews four strategies for Android UI testing with the goal of creating UI tests that are fast, reliable, and easy to debug.
Before we begin, let’s not forget an important rule: whatever can be unit tested should be unit tested. Robolectric and Gradle’s unit test support are great options for unit testing on Android. UI tests, on the other hand, verify that your application returns the correct UI output in response to a sequence of user actions on a device. Espresso is a great framework for running UI actions and verifications in the same process. For more details on the Espresso and UI Automator tools, please see the Android test support libraries.
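As a quick illustration, a minimal Espresso test might look like the sketch below; LoginActivity and the view IDs are hypothetical placeholders, not something from this post.

```java
// Minimal Espresso UI test sketch. LoginActivity and the R.id.* view IDs are
// hypothetical placeholders for your own app's screens.
import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.action.ViewActions.typeText;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.ViewMatchers.isDisplayed;
import static android.support.test.espresso.matcher.ViewMatchers.withId;

@RunWith(AndroidJUnit4.class)
public class LoginScreenTest {

  @Rule
  public ActivityTestRule<LoginActivity> activityRule =
      new ActivityTestRule<>(LoginActivity.class);

  @Test
  public void signInShowsWelcomeMessage() {
    // Type a user name, tap sign-in, and verify the welcome view appears.
    onView(withId(R.id.username)).perform(typeText("testuser"));
    onView(withId(R.id.sign_in_button)).perform(click());
    onView(withId(R.id.welcome_message)).check(matches(isDisplayed()));
  }
}
```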
The Google+ team has gone through many iterations of UI testing. Below we discuss the lessons learned from each strategy. Stay tuned for more posts with further details and code samples.
Strategy 1: Using an End-To-End Test as a UI Test
Let’s start with some definitions. A UI test ensures that your application returns the correct UI output in response to a sequence of user actions on a device. An end-to-end (E2E) test brings up the full system of your app, including all backend servers and the client app. E2E tests guarantee that data is sent to the client app and that the entire system functions correctly.
Usually, the application UI needs data from backend servers in order to be functional, so UI tests need to simulate that data, but not necessarily the backend servers themselves. UI tests are often confused with E2E tests because an E2E test closely resembles a manual test scenario. However, debugging and stabilizing E2E tests is very difficult due to many variables such as network flakiness, authentication against real servers, and the size of your system.
When you use UI tests as E2E tests, you face the following problems:
Very large and slow tests.
High flakiness rate due to timeouts and memory issues.
Hard to debug/investigate failures.
Authentication issues (e.g., authenticating from automated tests is very tricky).
Let’s see how these problems can be fixed using the following strategies.
Strategy 2: Hermetic UI Testing using Fake Servers
In this strategy, you avoid network calls and external dependencies, but you need to provide your application with data that drives the UI. Update your application to communicate to a local server rather than external one, and create a fake local server that provides data to your application. You then need a mechanism to generate the data needed by your application. This can be done using various approaches depending on your system design. One approach is to record server responses and replay them in your fake server.
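As one possible way to stand up such a fake local server, the sketch below uses OkHttp’s MockWebServer; the post does not prescribe a specific library, and the endpoint path, the canned JSON, and the assumption that the app accepts a configurable base URL are all illustrative.

```java
// Sketch of a local fake server for hermetic UI tests, built on OkHttp's
// MockWebServer (one possible implementation; the post doesn't name one).
// The canned JSON body and the "/api/" path are made-up examples.
import okhttp3.mockwebserver.MockResponse;
import okhttp3.mockwebserver.MockWebServer;

public class FakeBackend {
  private final MockWebServer server = new MockWebServer();

  /** Starts the fake server and returns the base URL the app should use. */
  public String start() throws Exception {
    // Replay a recorded (or hand-written) response instead of hitting the
    // real backend.
    server.enqueue(new MockResponse()
        .setResponseCode(200)
        .setBody("{\"posts\": [{\"id\": 1, \"text\": \"hello\"}]}"));
    server.start();
    return server.url("/api/").toString();
  }

  public void shutdown() throws Exception {
    server.shutdown();
  }
}
```

A UI test would start this fake in its setup, point the app under test at the returned URL (assuming the app exposes such a hook), and shut it down in teardown.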
Once you have hermetic UI tests talking to a local fake server, you should also have hermetic tests for the server itself. This way you split your E2E test into a server-side test, a client-side test, and an integration test that verifies the server and client are in sync (for more details on integration tests, see the Backend Testing section of “How the Google+ Team Tests Mobile Apps” below).
Now the client test flow looks like this: the UI test drives the app under test, and the app talks to the local fake server instead of the real backend.
While this approach drastically reduces the test size and flakiness rate, you still need to maintain a separate fake server as well as your test. Debugging is still not easy as you have two moving parts: the test and the local server. While test stability will be largely improved by this approach, the local server will cause some flakes.
Let’s see how this could be improved...
Strategy 3: Dependency Injection Design for Apps
To remove the additional dependency of a fake server running on Android, you should use dependency injection in your application for swapping real module implementations with fake ones. One example is Dagger, or you can create your own dependency injection mechanism if needed.
This will improve the testability of your app for both unit testing and UI testing, providing your tests with the ability to mock dependencies. In instrumentation testing, the test APK and the app under test are loaded in the same process, so the test code has runtime access to the app code. Not only that, but you can also use classpath override (the test classpath takes priority over that of the app under test) to override a certain class and inject test fakes there. For example, to make your test hermetic, your app should support injection of the networking implementation. During testing, the test injects a fake networking implementation into your app, and this fake implementation provides seeded data instead of communicating with backend servers.
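As a minimal sketch of such an injection seam, in the spirit of “create your own dependency injection mechanism if needed”: all names here (AppInjector, NetworkClient, setNetworkClientForTest) are hypothetical, not from this post.

```java
// Hand-rolled injection seam for the networking implementation. All names
// here are hypothetical; a DI framework like Dagger could play the same role.
public class AppInjector {

  /** The seam the UI code depends on instead of a concrete network stack. */
  public interface NetworkClient {
    String fetchPosts();
  }

  // Defaults to the production implementation; tests swap in a fake before
  // launching the activity under test.
  private static NetworkClient networkClient = new NetworkClient() {
    @Override
    public String fetchPosts() {
      // The real implementation would issue an HTTP request here.
      throw new UnsupportedOperationException("wire up the real HTTP client");
    }
  };

  public static NetworkClient getNetworkClient() {
    return networkClient;
  }

  /** Test-only hook: the instrumentation test injects a fake implementation. */
  public static void setNetworkClientForTest(NetworkClient fake) {
    networkClient = fake;
  }
}
```

An instrumentation test can then call AppInjector.setNetworkClientForTest(...) with a fake that returns seeded data before it launches the activity under test.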
Strategy 4: Building Apps into Smaller Libraries
If you want to scale your app into many modules and views, and plan to add more features while maintaining stable and fast builds/tests, then you should build your app out of small components/libraries. Each library should have its own UI resources and manage its own dependencies. This strategy not only enables mocking dependencies of your libraries for hermetic testing, but also serves as an experimentation platform for various components of your application.
Once you have small components with dependency injection support, you can build a test app for each component.
The test apps bring up the actual UI of your libraries, provide any fake data needed, and mock out dependencies. Espresso tests then run against these test apps. This enables testing of smaller libraries in isolation.
For example, let’s consider building smaller libraries for login and settings of your app.
The settings component test now exercises only the settings library, running it against its test app and fakes.
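As a rough code-level sketch of such a component test (SettingsTestActivity, FakeSettingsStore, and the view IDs are hypothetical placeholders):

```java
// Component-level Espresso test against a small settings test app.
// SettingsTestActivity, FakeSettingsStore, and R.id.notifications_toggle are
// hypothetical placeholders.
import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.matcher.ViewMatchers.withId;
import static org.junit.Assert.assertTrue;

@RunWith(AndroidJUnit4.class)
public class SettingsComponentTest {

  // launchActivity = false so fakes can be wired in before launch.
  @Rule
  public ActivityTestRule<SettingsTestActivity> activityRule =
      new ActivityTestRule<>(SettingsTestActivity.class, false, false);

  @Test
  public void togglingNotificationsUpdatesFakeStore() {
    FakeSettingsStore fakeStore = new FakeSettingsStore();
    SettingsTestActivity.setSettingsStoreForTest(fakeStore);
    activityRule.launchActivity(null);

    onView(withId(R.id.notifications_toggle)).perform(click());

    assertTrue(fakeStore.isNotificationsEnabled());
  }
}
```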
Conclusion
UI testing can be very challenging for rich apps on Android. Here are some UI testing lessons learned on the Google+ team:
Don’t write E2E tests instead of UI tests. Instead, write unit tests and integration tests alongside your UI tests.
Hermetic tests are the way to go.
Use dependency injection while designing your app.
Build your application out of small libraries/modules, and test each one in isolation. You can then have a few integration tests to verify that the integration between components is correct.
Componentized UI tests have proven to be much faster than E2E tests and 99%+ stable. Fast and stable tests have proven to drastically improve developer productivity.
How the Google+ Team Tests Mobile Apps
Friday, August 30, 2013
by Eduardo Bravo Ortiz
“Mobile first” is the motto these days for many companies. However, being able to test a mobile app in a meaningful way is very challenging. On the Google+ team we have had our share of trial and error that has led us to successful strategies for testing mobile applications on both iOS and Android.
General
Understand the platform. Testing on Android is not the same as testing on iOS. The testing tools and frameworks available for each platform are significantly different (e.g., Android uses Java while iOS uses Objective-C, UI layouts are built differently on each platform, and the UI testing frameworks also work very differently).
Stabilize your test suite and test environments. Flaky tests are worse than having no tests, because a flaky test pollutes your build health and decreases the credibility of your suite.
Break down testing into manageable pieces. There are too many complex pieces when testing on mobile (e.g., emulator/device state, actions triggered by the OS).
Provide a hermetic test environment for your tests. Mobile UI tests are flaky by nature; don’t add more flakiness to them by having external dependencies.
Unit tests are the backbone of your mobile test strategy. Try to separate the app code logic from the UI as much as possible. This separation will make unit tests more granular and faster.
Android Testing
Unit Tests
Separating UI code from code logic is especially hard in Android. For example, an Activity is expected to act as a controller and a view at the same time; make sure you keep this in mind when writing unit tests. Another useful recommendation is to decouple unit tests from the Android emulator; this removes the need to build and install an APK, and your tests will run much faster. Robolectric is a perfect tool for this: it stubs out the implementation of the Android platform while the tests run.
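As a minimal sketch of what such a Robolectric test might look like (MainActivity, R.id.title, and the expected text are hypothetical):

```java
// Minimal Robolectric unit test sketch: runs on the local JVM, no emulator
// or APK install needed. MainActivity and R.id.title are hypothetical.
import android.widget.TextView;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.Robolectric;
import org.robolectric.RobolectricTestRunner;

import static org.junit.Assert.assertEquals;

@RunWith(RobolectricTestRunner.class)
public class MainActivityTest {

  @Test
  public void titleIsSetOnCreate() {
    // Builds and drives the activity lifecycle on the local JVM.
    MainActivity activity =
        Robolectric.buildActivity(MainActivity.class).create().get();
    TextView title = (TextView) activity.findViewById(R.id.title);
    assertEquals("Welcome", title.getText().toString());
  }
}
```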
Hermetic UI Tests
A hermetic UI test typically runs without network calls or external dependencies. Once the tests can run in a hermetic environment, a white-box testing framework like Espresso can simulate user actions on the UI; it is tightly coupled to the app code. Espresso also synchronizes your test actions with events on the UI thread, reducing flakiness. More information on Espresso is coming in a future Google Testing Blog article.
Diagram: Non-Hermetic Flow vs. Hermetic Flow
Monkey Tests
Monkey tests look for crashes and ANRs (Application Not Responding errors) by stressing your Android application. They fire pseudo-random events like clicks or gestures at the app under test. Monkey test results are reproducible to a certain extent: timing and latency are not completely under your control and can cause a test failure, but re-running the same monkey test against the same configuration will often reproduce these failures. If you run monkey tests daily against different SDKs, they are very effective at catching bugs early in the development cycle of a new release.
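For reference, Android’s monkey tool is invoked from the command line; the package name, seed, throttle, and event count below are arbitrary examples.

```
# Send 5000 pseudo-random events to com.example.app. Fixing the seed (-s)
# keeps the event sequence reproducible, though timing and latency can still
# vary between runs.
adb shell monkey -p com.example.app -s 42 --throttle 100 -v 5000
```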
iOS Testing
Unit Tests
Unit test frameworks like OCUnit, which comes bundled with Xcode, and GTMSenTestcase are both good choices.
Hermetic UI Tests
KIF has proven to be a powerful solution for writing Objective-C UI tests. It runs in-process, which allows tests to be more tightly coupled with the app under test, making the tests inherently more stable. KIF allows iOS developers to write tests using the same language as their application.
Following the same paradigm as Android UI tests, you want Objective-C tests to be hermetic. A good approach is to mock the server with pre-canned responses. Since KIF tests run in-process, responses can be built programmatically, making tests easier to maintain and more stable.
Monkey Tests
Unlike Android, iOS has no native tool for writing monkey tests; however, this type of test still adds value on iOS (e.g., we found 16 crashes in one of our recent Google+ releases). The Google+ team developed its own custom monkey testing framework, but there are also many third-party options available.
Backend Testing
A mobile testing strategy is not complete without testing the integration between server backends and mobile clients. This is especially true when the release cycles of the mobile clients and backends are very different. A replay test strategy can be very effective at preventing backends from breaking mobile clients. The theory behind this strategy is to simulate mobile clients by having a set of golden request and response files that are known to be correct. The replay test suite should then send golden requests to the backend server and assert that the response returned by the server matches the expected golden response. Since client/server responses are often not completely deterministic, you will need to utilize a diffing tool that can ignore expected differences.
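A minimal sketch of such a replay test is shown below, assuming a hermetic backend is already running locally and that golden request/response files exist on disk; the URL, file paths, and the crude normalize() step are illustrative assumptions (a real suite would use a proper structured differ).

```java
// Sketch of a golden request/response replay test. The backend URL, golden
// file paths, and the normalization rules are illustrative assumptions.
import static org.junit.Assert.assertEquals;

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.junit.Test;

public class MobileClientReplayTest {

  private static final String BACKEND_URL = "http://localhost:8080/api/stream";

  @Test
  public void goldenRequestProducesGoldenResponse() throws IOException {
    String goldenRequest = read("testdata/stream_request.golden.json");
    String goldenResponse = read("testdata/stream_response.golden.json");

    // Send the recorded client request to the backend under test.
    String actualResponse = post(BACKEND_URL, goldenRequest);

    // A real replay suite would use a structured differ that ignores fields
    // expected to vary; here we just strip them with a crude normalization.
    assertEquals(normalize(goldenResponse), normalize(actualResponse));
  }

  private static String read(String path) throws IOException {
    return new String(Files.readAllBytes(Paths.get(path)), StandardCharsets.UTF_8);
  }

  private static String post(String url, String body) throws IOException {
    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    conn.setRequestMethod("POST");
    conn.setDoOutput(true);
    conn.getOutputStream().write(body.getBytes(StandardCharsets.UTF_8));
    try (InputStream in = conn.getInputStream()) {
      return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
  }

  private static String normalize(String json) {
    // Drop volatile fields such as timestamps before comparing.
    return json.replaceAll("\"timestamp\":\\s*\"[^\"]*\"", "\"timestamp\": \"<ignored>\"");
  }
}
```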
To make this strategy successful, you need a way to seed a repeatable data set on the backend and to make all dependencies that are not relevant to your backend hermetic. Using in-memory servers with fake data, or replaying RPCs to external dependencies, are good ways of achieving repeatable data sets and hermetic environments. The Google+ mobile backend uses Guice for dependency injection, which allows us to easily swap out dependencies with fake implementations during testing and to seed data fixtures.
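A minimal sketch of that kind of swap with Guice is shown below; ProdBackendModule, UserStore, and InMemoryUserStore are hypothetical names, while Modules.override is standard Guice API.

```java
// Sketch of swapping a real backend dependency for a fake with Guice module
// overrides. ProdBackendModule, UserStore, and InMemoryUserStore are
// hypothetical names used only for illustration.
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.util.Modules;

public class TestInjectorFactory {

  /** Test module that rebinds an external dependency to an in-memory fake. */
  static class FakeDependenciesModule extends AbstractModule {
    @Override
    protected void configure() {
      bind(UserStore.class).to(InMemoryUserStore.class);
    }
  }

  public static Injector createForReplayTests() {
    // Production bindings, with anything not relevant to the backend under
    // test overridden by fakes seeded with fixture data.
    return Guice.createInjector(
        Modules.override(new ProdBackendModule()).with(new FakeDependenciesModule()));
  }
}
```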
Diagram: Normal flow vs Replay Tests flow
Conclusion
Mobile app testing can be very challenging, but building a comprehensive test strategy that understands the nature of different platforms and tools is the key to success. Providing a reliable and hermetic test environment is as important as the tests you write.
Finally, make sure you prioritize your automation efforts according to your team needs. This is how we prioritize on the Google+ team:
Unit tests: These should be your first priority on either Android or iOS. They run fast and are less flaky than any other type of test.
Backend tests: Make sure your backend doesn’t break your mobile clients. Breakages are very likely to happen when the release cycle of mobile clients and backends are different.
UI tests: These are slower and flakier by nature, and they take more time to write and maintain. Make sure you provide coverage for at least the critical paths of your app.
Monkey tests: This is the final step to complete your mobile automation strategy.
Happy mobile testing from the Google+ team.