
UI Automation Gotchas


Having recently had the opportunity to work on and refactor a few UI automation projects, I have come across many gotchas that creep into an automation codebase, resulting in not-so-beautiful, hard-to-maintain, random-failure-throwing specs. Random failures are the worst thing to get from an automation test suite, because they defeat the purpose of Continuous Integration: the team stops responding to test failures, shrugging that there is always some random test that fails. So if a test fails randomly, analyse the reason and fix it ASAP, giving due consideration to the solution you put in place so that the spec will not fail again for the wrong reasons.
  
To aid the folks writing UI automation specs, I list below the common gotchas you should be aware of so that you do not end up making the same mistakes. Here goes the list:

1. Loving to sleep

     Liberal use of the sleep() method in automation test code is, unfortunately, a very common thing. You must avoid sleep(). It should enter the codebase only upon team consensus (developers included), and even then only as a last resort. Here again, I would recommend posting the situation to public tech forums to see what the community adopts as a practice and going with that approach; you are likely to get help from the community on the right way forward. Bear in mind that every sleep() drags out the feedback time of the automation suite by that much. YOU MUST AVOID SLEEPING!

Here is one sample scenario where I found sleep() used without any thought.

Scenario:
As a user
I should be able to add notes to the table
so that the information is recorded for future reference.

Code: 
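The page object and method names below are my own invented stand-ins, not the project's actual code; a tiny fake page simulates the asynchronous save so the snippet runs on its own. The shape of the gotcha, though, is exactly what I found:

```ruby
# Fake page object standing in for the real Capybara-backed one
# (all names here are hypothetical). The save completes on a
# background thread, like an AJAX call would.
class NotesPage
  def initialize
    @notes = []
  end

  def add_note(text)
    Thread.new do
      sleep(rand * 0.3)   # save latency is unpredictable
      @notes << text
    end
  end

  def has_note?(text)
    @notes.include?(text)
  end
end

page = NotesPage.new
page.add_note("Follow up with the customer")

sleep 2  # the gotcha: a fixed guess, usually far too long, sometimes too short

raise "note was not saved" unless page.has_note?("Follow up with the customer")
```

Most runs waste nearly the whole two seconds; a slow run blows past the guess and the assertion fails, which is precisely the randomness described above.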

Why is sleep bad in the above context? Because there is no certainty about how long a successful save of a new note takes. Typically the save succeeds within milliseconds, and the rest of the sleep is wasted waiting. On occasion, though, the save takes longer than the anticipated sleep time, in which case the test fails because of early assertions. The sleep() can instead be replaced with Capybara's waiting behaviour (the wait_until method in older versions; the matchers themselves wait and retry in current versions). What if you are using a framework other than Capybara? This post is not a recipe, so go Google or Stack Overflow for a solution ;)
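If your framework has no built-in waiting, the idea is easy to sketch yourself. The helper below is a hypothetical stand-in, not any framework's real API: poll the condition instead of sleeping a fixed, guessed amount.

```ruby
require "timeout"

# Poll a condition until it holds or a deadline passes.
def wait_until(timeout: 5, interval: 0.1)
  deadline = Time.now + timeout
  loop do
    return true if yield
    raise Timeout::Error, "condition not met within #{timeout}s" if Time.now > deadline
    sleep interval
  end
end

# Simulated AJAX save finishing after ~0.3 seconds:
saved = false
Thread.new { sleep 0.3; saved = true }

wait_until { saved }   # returns as soon as the save lands, not seconds later
```

The spec now waits only as long as the save actually takes, and fails loudly with a timeout when something is genuinely wrong.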


2. Asserting page messages for grammar and exact wording

     Let's get it right: automation tests are written to test the functionality's correctness. Writing tests to ensure that messages remain word-for-word as expected is a needless stretch that results in test brittleness. And come on, if the messaging changes, so be it. For your good customers' sake, spend your valuable time protecting the application's functional correctness.


3. Forgetting to create your browser profile with download preferences set to automatically download specific file extensions to a custom path

     The default behavior of most browsers is to prompt the user to open or save a file when it is downloaded. With this default in place, automation has a hard time responding to the prompt. You could leave the prompt unattended and let the spec carry on with other test cases, a practice I have seen a team follow out of sheer laziness and a lack of interest or curiosity to resolve it. Sad! Take pride in good work more than in appreciation from others. Love what you do…please!

Every browser uses a profile by default. In your automation bootstrapper file, you can set the download preferences on that profile so that files with specific extensions are downloaded automatically to a custom location.

Find below a sample code for setting the Firefox preference to automatically download the files of csv or pdf type:
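A sketch of the relevant preferences (the download directory is an assumption; the keys are standard Firefox preference names). Building them as a plain hash keeps the snippet self-contained; with selenium-webdriver you would copy each pair onto a Selenium::WebDriver::Firefox::Profile before registering the driver:

```ruby
# Firefox preferences that make csv and pdf downloads land silently
# in a custom directory instead of raising the open/save prompt.
def firefox_download_prefs(download_dir)
  {
    "browser.download.folderList" => 2,           # 2 = use a custom directory
    "browser.download.dir" => download_dir,       # where the files should land
    "browser.helperApps.neverAsk.saveToDisk" =>   # MIME types to save silently
      "text/csv,application/pdf"
  }
end

prefs = firefox_download_prefs("/tmp/automation_downloads")

# With selenium-webdriver, roughly:
#   profile = Selenium::WebDriver::Firefox::Profile.new
#   prefs.each { |key, value| profile[key] = value }
```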


4. Not asserting the presence of the right elements after performing an AJAX action

     Common sense is an uncommon thing. Consider the scenario below:

As a user
I should be able to add notes to the table
so that the information is recorded for future reference.

Every new note is added seamlessly by an AJAX call. The Page object in the automation test for this scenario looked as below:
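A sketch of what that page object looked like (the class, method and element names are my invented stand-ins; the fake session lets the snippet run without a browser):

```ruby
# Fake Capybara-style session: one note already exists, so its
# "Edit note" link is on the page before any new save finishes.
class FakeSession
  def fill_in(_field, with:); end
  def click_button(_label); end

  def has_link?(name)
    name == "Edit note"   # link from the FIRST row is already present
  end
end

class NotesPage
  def initialize(session)
    @session = session
  end

  def add_note(text)
    @session.fill_in "note", with: text
    @session.click_button "Save"
    # The flaw: waits for a link that an EARLIER row already provides,
    # so this returns true even while the new save is still in flight.
    @session.has_link?("Edit note")
  end
end

page = NotesPage.new(FakeSession.new)
result = page.add_note("second note")   # true, yet the note may not be saved
```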
What do you think the problem is with the above code? It should be perfectly fine for the first row added to the table. But for the second row it isn't, because has_link?("Edit note") returns true as soon as it sees the link belonging to the first row, whether or not the new save has finished. This results in test instability. Instead of looking for the link, the QA should check for the existence of the note text that was just added. Again, take this with a pinch of salt: if the new row is added to an in-place-editor-style table, you might have to think about what best fits that situation.
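The more stable check waits on something unique to the note just added, its own text (again, all names are invented stand-ins; the fake session simulates the save landing):

```ruby
# Fake session whose content appears only once the "save" lands.
class FakeSession
  def initialize
    @content = []
  end

  def fill_in(_field, with:)
    @pending = with
  end

  def click_button(_label)
    @content << @pending   # the save completes
  end

  def has_content?(text)
    @content.include?(text)
  end
end

class NotesPage
  def initialize(session)
    @session = session
  end

  def add_note(text)
    @session.fill_in "note", with: text
    @session.click_button "Save"
    # Wait on the new note's OWN text: no previously added row can
    # satisfy this, so it is true only once this save has landed.
    @session.has_content?(text)
  end
end
```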


5. Not employing page object pattern

     Employing the Page Object pattern is fundamental to UI automation. If you are not doing it, start doing it, for better maintainability and readability. PERIOD.


6. Littering page object with assertions

     I have found, in project after project, Page objects with verify_XXX methods containing assertions. This is bad: the Page object ends up violating the design principle of Separation of Concerns (SoC). Asserting is the job of the test/spec class. A Page object can have methods that return the state of the page or perform the actions a user performs on the page, and nothing more.

     However, the common reasoning I hear for having assertions in the Page object is reusability of those assertions across tests or specs. But SoC cannot be compromised in an attempt to follow the DRY principle, at least not in this case. Having said that, I'll show with an example how the trade-off between DRY and SoC can be balanced well.

     Now, assume you have code in the Page object as below:
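The kind of code I mean looks like this (names invented; the raise stands in for whatever assertion mechanism the spec framework provides, and the fake session keeps the snippet self-contained):

```ruby
# Minimal stand-in for a Capybara session (invented for illustration).
fake = Object.new
def fake.has_content?(text)
  text == "saved note"
end

class NotesPage
  def initialize(session)
    @session = session
  end

  # The smell: the page object both reads the page AND passes judgement.
  # The assertion lives outside the spec class, breaking SoC.
  def verify_note_present(text)
    raise "expected note #{text.inspect} on the page" unless @session.has_content?(text)
  end
end

NotesPage.new(fake).verify_note_present("saved note")   # passes silently
```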

     We can re-write the above code as below, and we would do well to write it this way ;)
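A rewrite in that spirit: the page object only reports state, each spec owns its assertion, and reuse (DRY) survives because every spec calls the same has_note? method. With RSpec you even get expect(notes_page).to have_note(...) for free, since have_xxx matchers delegate to has_xxx? predicates. Names are still my stand-ins:

```ruby
class NotesPage
  def initialize(session)
    @session = session
  end

  # State only: no judgement, no assertion; just answer the question.
  def has_note?(text)
    @session.has_content?(text)
  end
end

# In the spec (RSpec-flavoured):
#   expect(notes_page).to have_note("Follow up with the customer")

# Self-contained check with a fake session:
fake = Object.new
def fake.has_content?(text)
  text == "Follow up with the customer"
end

notes_page = NotesPage.new(fake)
```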


7. Narrowed down xpath

     The reasons I hear for narrowly pinned-down xpaths are two: the recording tool spat them out, and they speed up finding elements. This is a classic case where test stability and maintainability deserve more importance than speed or performance.

     So the next time you write one, or come across one, make sure it is anchored on something stable rather than on the exact page layout.
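To make the difference concrete, here is a recorder-style absolute path against a locator anchored on a stable attribute, evaluated with Ruby's stdlib REXML (the markup and ids are invented for illustration):

```ruby
require "rexml/document"

page_v1 = REXML::Document.new(<<~XML)
  <html><body>
    <div><table id="notes"><tr><td><a>Edit note</a></td></tr></table></div>
  </body></html>
XML

# After a redesign wraps the table in one more div:
page_v2 = REXML::Document.new(<<~XML)
  <html><body>
    <div><div><table id="notes"><tr><td><a>Edit note</a></td></tr></table></div></div>
  </body></html>
XML

brittle   = "/html/body/div[1]/table[1]/tr[1]/td[1]/a[1]"  # recorder output
resilient = "//table[@id='notes']//a"                      # anchored on the id

REXML::XPath.first(page_v1, brittle)     # finds the link
REXML::XPath.first(page_v1, resilient)   # finds the link
REXML::XPath.first(page_v2, brittle)     # nil: one layout change killed it
REXML::XPath.first(page_v2, resilient)   # still finds the link
```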

 
8. Using xpath over css

     You should prefer css over xpath for better automation test stability.
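For a feel of the difference, here is the same link addressed both ways (the selectors are invented for illustration). In Capybara, css is the default selector type, so find(css_locator) just works:

```ruby
# Two locators for the same "Edit note" link:
xpath_locator = "//table[@id='notes']//a[contains(@class, 'edit-note')]"
css_locator   = "table#notes a.edit-note"   # shorter and easier to read
```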


9. Using css over id

     You should prefer id over css for better automation test performance and stability.
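Put together, my order of preference for locating the same save button (all names invented for illustration):

```ruby
# Most stable and fastest first:
by_id    = "#save-note"                              # a unique id
by_css   = "form.note-form input[type='submit']"     # coupled to classes/structure
by_xpath = "//form[@class='note-form']//input[@type='submit']"  # last resort
```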


10. Not exploiting the automation framework and tools that are put in use

     There are occasions when you feel stuck writing an automation test case for a particular scenario. At such times, do not drop the test case from automation altogether. Instead, explore ways to exploit your automation testing framework. Google, Stack Overflow, etc.: there are many life savers out there. Do spend time looking for solutions until you find one, or until you learn that automating the scenario on hand is genuinely not possible.


11. Code littered with print statements

     If you see one, delete it. If you add one, don't forget to delete it. Print statements are not functionality; you add them for debugging purposes. Once you are done with them, DO DELETE THEM.


12. Page object littered with dead code

     By dead code, I mean code that is never used. On many occasions I have seen method definitions in a Page object that are never referred to anywhere, in that file or any other.

     When you come across such code, what should your reaction be? Simply DELETE. No joke intended!


13. Code littered with comments and commented out code

     Commented-out code is, well, dead code. And by now you know what to do with dead code: mercilessly delete it. It is such a common phenomenon to find a codebase littered with commented-out code. I hate to see it, and I instantly delete it without a second thought.

     Comments describing a method or a few lines of code are a smell in themselves. Whenever you find yourself writing them, or come across them, see how you can refactor the code so that they are no longer needed.



14. Violating DRY principles in code

     This is small but irritating stuff. Be lazy, very very lazy……about repeating stuff. I have seen automation codebases littered with instances of duplication, and it is a REAL PAIN TO THE EYES! As an example, if you are using the string "random text" on multiple lines of a spec, it is SHIT! Do not do this. Assign the literal to a local variable and refer to it through that variable in all the required places.
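A tiny runnable sketch of the fix (the lambdas are throwaway stand-ins for real page-object calls):

```ruby
notes = []
add_note       = ->(text) { notes << text }
note_displayed = ->(text) { notes.include?(text) }

# Bad: the literal "random text" repeated at every call site, free to
# drift apart. Good: name it once, refer to the name everywhere.
note_text = "random text"

add_note.call(note_text)
raise "note missing" unless note_displayed.call(note_text)
```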

This is very basic stuff, and it really pains me to see this kind of violation in codebases.


15. Not making use of the language strengths

     You might ask: how does it matter if I don't use the niceties of the programming language, and how can not using them be an automation gotcha? Automation specs are code, and code should be…well, readable and maintainable; using the language's niceties aids both. No exception to this at all. So, be curious. Have an urge for intellectual betterment, and you'll learn new stuff and code better.