So you want to achieve web test automation... now what?

Daniel Garay
Sep 10, 2018
Here, I discuss the hurdles, pitfalls, and successes I encountered on my journey to build a new web automation test infrastructure here at Parasoft, and how we migrated our existing manual tests.

You’re excited, thrilled, optimistic, and maybe even a little nervous about a new opportunity to achieve web test automation, bestowed on you by your manager. But it suddenly hits you: where the heck do I start? Do I just start writing tests? What automation tool will I use? Should I set up some kind of infrastructure? Do I just start writing tests locally on my machine and then port the environment over to some staging environment? What hurdles should I consider before I move forward? So many tasks to consider!

Before you take two steps forward, let’s just take a step back and consider what exactly we want to accomplish.

Building a new web automation test infrastructure

Test automation is not a new concept in our industry. There are numerous resources out there that discuss its pros and cons, and many different approaches to achieving a successful test automation infrastructure. Hopefully, when it’s all said and done, you can use my experience to streamline your own process a bit more efficiently.

First, let me walk you through a scenario I went through when I was given the responsibility of a new team and getting the team’s web automation test infrastructure up and running. The end goal was defined, but it was completely up to me to decide what path I would take to get there.

Defining milestones

The first thing I did was get together with all of the stakeholders involved and define what my milestones would be.

I came up with the following milestones:

  • Do research
  • Define the scope/coverage of the tests
  • Create and maintain automation tests and continue collaboration with team members
  • Publish results

So let's break that down a little.

Do research

Like any other big task, you always want to do your due diligence and research all of the tools necessary to get the job done. What are some of the items we had to consider, you ask? First, there was the question of which tools we'd use and which scripting language we'd be writing in. Is it scalable? How is the maintenance, and is it something that can fit into the team's existing ecosystem? What would the learning curve be for those who would maintain the automated tests? Does it integrate with the existing development team's infrastructure? And what are we going to do about reporting? We had to consider the team's familiarity with existing tools within the company, and who would maintain the tests, both short term and long term.

After considering many factors, we decided to use Parasoft SOAtest for web automation testing and Parasoft DTP for reporting. SOAtest addressed the majority of our questions: it was easy to use and didn’t require prior knowledge of any programming language. Every company, every team, and even every individual will have a different set of questions to answer before moving forward, but the main point is to get as many of your questions answered up front rather than later, to reduce the bottlenecks you may encounter ahead.

Define the scope/coverage of the tests

Next up: what should you define as the scope of tests to automate? Don’t be that person who tries to automate everything. These are web functional tests, so you have to focus on the high-traffic areas or most commonly used parts of the application's web interface to get the most value out of your automated tests.

For me, since the application under test (AUT) was new to me, I had to work with both developers and existing QA to understand the current test cases and manual smoke test procedure. Their existing manual test cases were at a higher level (for exploratory testing), so the QA engineer couldn’t just point me to existing test cases for automation. It was a constant collaboration in every sprint, and at times even in our daily stand-up, to make sure we had the coverage we wanted to automate. Once the scope was defined, we then prioritized the coverage areas so I knew exactly what to work on first. This is a good rule of thumb: even if you know the application, you should always work in collaboration with the existing team when defining the scope.

Create and maintain automation tests and continue collaboration with team members

With the infrastructure set up, and both scope and priorities defined, I could finally begin creating the automated tests. Finally! I got to write my first set of automated tests.

For this project, I first used the browser playback feature to get a good understanding of Parasoft SOAtest, then easily moved on to creating my own tests and editing existing browser playback tests. I’m not embarrassed to say that my first few tests were not implemented in an ideal manner. But that’s how we all learn, right? By trial and error.

My initial tests were very dependent on the environment and could only be executed in a specific sequence. There was no set-up or tear-down as part of my tests, which obviously made them harder for other team members to maintain and troubleshoot. We started using the tool's built-in capabilities to set up and tear down tests, reuse existing tests (sharing a test as a subset of another), and parameterize them so they would be portable across different environments. It was easy to integrate REST API tests within our automated web functional tests, which made our lives a heck of a lot easier when populating any prerequisite data. A single set of tests could be executed against different browsers seamlessly. Occasionally, we ran into a browser-specific issue, e.g., being unable to perform a Click action because an element was not visible. But the tool's powerful built-in support for different wait conditions, its capability to execute arbitrary JavaScript, its rich documentation, and an active user forum came to our rescue.
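SOAtest exposes set-up/tear-down, parameterization, and wait conditions through its UI rather than code, but the underlying ideas carry over to any framework. As a rough, generic illustration (not SOAtest itself; all names here are hypothetical), a polling wait helper and an environment-parameterized set-up/tear-down wrapper might look like this in Python:

```python
import os
import time
from contextlib import contextmanager


def wait_until(condition, timeout=10.0, poll_interval=0.25):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    This mirrors the "wait for element visible" style of condition that
    browser-testing tools provide out of the box, instead of a fixed sleep.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError("condition not met within %.1fs" % timeout)


@contextmanager
def test_environment(base_url=None):
    """Set-up/tear-down wrapper, parameterized by environment.

    The base URL falls back to an environment variable (hypothetical name)
    so the same test can run unchanged against dev, staging, or production.
    """
    url = base_url or os.environ.get("AUT_BASE_URL", "http://localhost:8080")
    state = {"base_url": url, "created_records": []}
    # Set-up: e.g. seed prerequisite data via a REST API call here.
    try:
        yield state
    finally:
        # Tear-down: remove anything the test created, regardless of outcome.
        state["created_records"].clear()
```

A test would then wrap its body in `with test_environment() as env:` and call `wait_until` before interacting with slow-loading elements, which is exactly the discipline that made our SOAtest scenarios portable and order-independent.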

Publish results

The last goal I had identified was the reporting aspect of the test results. And here, it’s all about visibility. This wasn’t some secret formula I’d concocted and wanted to keep to myself. On the contrary, I wanted everyone to be aware of the results so that the whole team was responsible for maintaining the tests.

I set up the test results to be reported into the Parasoft DTP reporting platform. I was able to easily create a dashboard with multiple gadgets to display the test results on the big TV screen in our development department. This way, there was no hiding from the truth.
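SOAtest publishes to DTP through its own reporting pipeline, but the general idea, writing results in a machine-readable format that a dashboard can consume, can be sketched generically. Here is a minimal Python example (illustrative only, not the SOAtest/DTP mechanism) that renders results as JUnit-style XML, a format many CI dashboards accept:

```python
import xml.etree.ElementTree as ET


def results_to_junit_xml(suite_name, results):
    """Render a list of (test_name, passed, message) tuples as JUnit-style XML.

    Each failing test gets a <failure> child element carrying its message,
    so a dashboard can show pass/fail counts and drill into failures.
    """
    failures = sum(1 for _, passed, _ in results if not passed)
    suite = ET.Element("testsuite", name=suite_name,
                       tests=str(len(results)), failures=str(failures))
    for name, passed, message in results:
        case = ET.SubElement(suite, "testcase", name=name)
        if not passed:
            ET.SubElement(case, "failure", message=message)
    return ET.tostring(suite, encoding="unicode")
```

Whatever format your tooling uses, the point is the same as with our DTP dashboard: results should land somewhere the whole team sees them automatically, not in a file only the test author opens.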

The only way we would benefit from this down the line was if we kept the test results at 100% passing. Otherwise, it would just be noise that no one cares about. Before I even started, I had established with development that this would be a team goal to maintain, not a one-person job. They all agreed, and now, when I walk into the office every morning, I can easily look up and see where we stand with the test results from the previous run. It’s music to my eyes.

Final thoughts

Getting everything completed was never meant to be a one-person job, nor did I want to try to complete it on my own. It took a lot of collaboration and support from the team, including management. One thing I learned is that you must stay on top of the tests: keep them maintained and passing at 100%. Remember, your automated tests are like a living organism -- they have to be looked after on a daily basis, so do not hesitate to optimize your tests.

Do your research before diving into the project and you’ll be able to address some of the bottlenecks ahead of time. All in all, this was a great learning experience for me, and I look forward to getting thrown into the fire pit of another team, to rinse and repeat the same procedures I just finished accomplishing.

