hudl 1 tablet by Tesco

hudl 1

Launched on September 30, 2013, the hudl was Tesco’s first ever tablet. It sold over 500,000 devices, received largely positive reviews and won several awards, despite competing with big brands such as Samsung and the Google Nexus.

Whilst on the hudl team, my main task was to set up an automation framework, review the manual test cases and mark down which of them could be automated, then eventually automate those tests. Given that I had never done automation for a mobile device or used UiAutomator before, not to mention the thousands of manual test cases that needed automating, this was going to be a huge challenge.

Unfortunately, one of the disadvantages of UiAutomator is the lack of reports it generates: it only shows the status of a test run in the command terminal, which isn’t very user friendly, and the output isn’t stored anywhere. So, with the help of a talented colleague of mine, we created a custom HTML report that pulled the information from the terminal output and turned it into a clear graphical report showing exactly what the test was doing, step by step. Anyone could look at these reports and instantly see at what point a test had failed, and could even view a screenshot of the screen at the time of failure.

sample_report
The custom test report we created, showing all the steps taken during the test.
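To give a flavour of the idea (this is a rough Python sketch, not our actual code, and the `STEP: … PASS/FAIL` log format shown is invented for illustration), the report generator boiled down to parsing the terminal output into step/status pairs and rendering them as HTML:

```python
import html

def parse_terminal_log(log_text):
    """Parse lines like 'STEP: <description> ... PASS' into (step, status)
    pairs. The log format here is made up purely for illustration."""
    steps = []
    for line in log_text.splitlines():
        line = line.strip()
        if line.startswith("STEP:"):
            desc, _, status = line[len("STEP:"):].rpartition("...")
            steps.append((desc.strip(), status.strip()))
    return steps

def render_html_report(test_name, steps):
    """Render the parsed steps as a simple HTML table, colouring failures red."""
    rows = []
    for desc, status in steps:
        colour = "red" if status == "FAIL" else "green"
        rows.append(
            f'<tr><td>{html.escape(desc)}</td>'
            f'<td style="color:{colour}">{html.escape(status)}</td></tr>'
        )
    return (
        f"<html><body><h1>{html.escape(test_name)}</h1>"
        "<table>" + "".join(rows) + "</table></body></html>"
    )

log = """\
STEP: Launch the Settings app ... PASS
STEP: Open the Wi-Fi menu ... PASS
STEP: Verify network list is shown ... FAIL
"""
report = render_html_report("wifi_settings_test", parse_terminal_log(log))
```

The real report also embedded the failure screenshot, but the parse-then-render shape above is the core of it.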

Whilst exploring ways to improve the report and increase our test coverage, we decided to implement a tool to compare screenshots. UiAutomator is unable to check a screen’s appearance, so this tool let us catch visual problems as well. During the automated tests, multiple screenshots were taken and named appropriately; the tool then compared them against “master” screenshots matched by name. If a screenshot matched its master, the test passed; if not, the test raised a warning so someone could manually inspect the image to see what had gone wrong. The cool part of this tool was that we could set a threshold on how different the screenshots were allowed to be. For example, the clock on the device was always changing and would never match the time in our master screenshots, so we set a threshold that tolerated the time difference but would still fail if, say, some text was missing from the screen.

Screenshot comparison
Example of the screenshot comparison message displayed when the images did not match, along with a link to view the images.
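The thresholded comparison can be sketched like this (a minimal Python illustration of the idea only; the real tool worked on PNG files from the device, whereas here the "screenshots" are 2D lists of pixel values and the function name is hypothetical):

```python
def compare_screenshots(candidate, master, threshold=0.01):
    """Return 'pass' if the fraction of differing pixels is within the
    threshold, otherwise 'warn' so a human can inspect the images."""
    if len(candidate) != len(master) or any(
        len(r1) != len(r2) for r1, r2 in zip(candidate, master)
    ):
        return "warn"  # images of different sizes can never match
    total = sum(len(row) for row in master)
    differing = sum(
        p1 != p2
        for r1, r2 in zip(candidate, master)
        for p1, p2 in zip(r1, r2)
    )
    return "pass" if differing / total <= threshold else "warn"

# A tiny 2x4 "screenshot" where only the clock digits (1 pixel of 8) changed:
master = [[0, 0, 0, 0], [0, 0, 0, 1]]
candidate = [[0, 0, 0, 0], [0, 0, 0, 2]]
print(compare_screenshots(candidate, master, threshold=0.2))  # prints "pass"
```

Tightening the threshold makes the same small clock difference trip the warning, which is exactly the knob we tuned per screen.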

My colleague also managed to set up Jenkins so that we could start tests, view all the reports, flash the devices with the correct builds, side-load any apps where necessary, and so on. Jenkins and the UiAutomator tests worked very closely together: anything we could not achieve through UiAutomator itself, we achieved with the help of the Jenkins setup.

It is truly a work of art!
