Getting up to speed quickly with automation tests
I'm starting a new QA engineering internship and I've been tasked with automation tests. I was going through the documentation for Selenium and TestNG and it seems straightforward and easy to pick up.
The only thing I find difficult is becoming familiar with the application being tested. There are so many different methods and classes making up an application that, unless you wrote it yourself from scratch, it's hard to know which methods are available, which ones to use, and so on. At least, that's been my experience.
But I've always noticed experienced software engineers coming in, spending a few days, and then being able to develop new features, automation tests, etc.
So, I was wondering if there are any tips on how to get acquainted with the ins and outs of an application quickly.
I asked one of the experienced Software Test Engineers and he said it's helpful to know how the application works manually first. But even then, it seems to me it would take a while to learn the application and then figure out which methods/classes relate to a specific action within it - which, again, could take quite a bit of time.
Anyone have some helpful input on this?
There's no one true way to do this, but there are some common things you can do to get an overview of the system you're working with.
My approach is to start with a series of questions (in no particular order):
- Who - who is the application intended for? Knowing who the users are tells you a lot about how they're likely to want to use the software and what parts of it they're going to use most.
- What - what does the software do? What's its main function?
- When - when and how often does it do what it should do? The tests you'll want for software that needs to be available all the time are different from what you'd do for something likely to get occasional use.
- Why - why is the software needed? This does overlap somewhat with what it does, but the difference here is that there's a reason the application exists, and that reason is going to guide your strategy.
- Where - where is the software used? This doesn't just cover whether it's business or consumer software; it also covers things like whether it's web-based or desktop-based, whether it will run over a LAN or WAN or be self-contained, and so forth. The environment - software, hardware, and human - tells you a lot about the conditions the application will need to handle, which in turn tells you where to focus your testing.
After I get the overview (the 10,000-foot view), the next thing I do is explore the application with frequent reference to the user documentation, use cases, user stories, and requirements. At this point I'm not looking at classes, methods, or actions; I'm looking at how a typical user would interact with the system. If you're lucky, the application is self-documenting and you can figure most of it out from the user interface. If you're not (which is often the case with business-to-business and specialized software), you have to dig deeper. The goal here is to find the areas of the system that will give the best bang for the buck when automated.
Once I've got a handle on that, it's time to start exploring the system with the automation tool. In the case of Selenium and TestNG, if you're working with a web application, that means looking at the generated page source for elements to hook into, as well as making some framework decisions: whether you're going to string a lot of granular tests together to form a pathway through the system, how you're going to handle dependencies between modules, and how you're going to balance test code duplication against the need to keep tests independent (this doesn't have to be an issue with Selenium or TestNG, but with some of the big box tools it's very easy for it to become a major problem).
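To make that exploration step concrete, here's a minimal sketch of the kind of throwaway code I mean. The URL is a hypothetical stand-in for whatever application you're testing; the point is just to dump the rendered source and take an inventory of candidate locators (ids and names) before you commit to a framework design.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

import java.util.List;

public class PageExploration {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            // Hypothetical URL - substitute the application under test.
            driver.get("https://example.com/login");

            // Dump the rendered source to see what you actually have to hook into.
            System.out.println(driver.getPageSource());

            // Inventory candidate hooks: stable ids and names make the best locators.
            List<WebElement> inputs = driver.findElements(By.tagName("input"));
            for (WebElement input : inputs) {
                System.out.printf("tag=input id=%s name=%s type=%s%n",
                        input.getAttribute("id"),
                        input.getAttribute("name"),
                        input.getAttribute("type"));
            }
        } finally {
            driver.quit();
        }
    }
}
```

If the ids turn out to be auto-generated and unstable, that's worth knowing now - it changes what your locator strategy (and maybe your framework) needs to look like.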
Then I start actually building coded automation. Usually the first thing I'll automate is logging in for the simple reason that just about everything I've tested requires the user to be logged in to do anything. From there, I'll start building test suites that cover the most common pathways through the application: log in and sell stuff, log in and run report X, log in and... whatever. At this point (anything from 1-2 days after I first encounter the software to a year or more later) I don't expect to be getting everything or even close to it. The goal is to get something which I and others can add to later.
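A minimal sketch of that login-first structure, assuming a hypothetical login page with `username`, `password`, and `login-button` element ids (all placeholders - use whatever the real page exposes):

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class LoginPathwayTest {
    private WebDriver driver;

    // Log in before every test, since nearly every pathway starts there.
    @BeforeMethod
    public void logIn() {
        driver = new ChromeDriver();
        driver.get("https://example.com/login");                    // hypothetical URL
        driver.findElement(By.id("username")).sendKeys("testuser"); // hypothetical locator
        driver.findElement(By.id("password")).sendKeys("testpass"); // hypothetical locator
        driver.findElement(By.id("login-button")).click();          // hypothetical locator
    }

    // One common pathway: log in and run report X.
    @Test
    public void runReportX() {
        driver.findElement(By.linkText("Reports")).click();         // hypothetical link text
        driver.findElement(By.linkText("Report X")).click();
        Assert.assertTrue(driver.getPageSource().contains("Report X"),
                "Expected the report page to load after login");
    }

    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
```

Once the login step works, each new pathway is just another `@Test` method - which is exactly what makes it something you and others can keep adding to later.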
The only other thing I'd suggest is looking at why you're automating. The usual reason is regression testing, but there are other uses for automation, including building helper scripts to take care of tedious manual tasks, harnesses to run a common set of actions so you can work from there, and performing the same task many times over. I've built quick and dirty automation to do things like:
- run and exit an application hundreds of times to try to track down an intermittent on-shutdown error;
- create thousands of products in order to test a receipt overflow;
- create thousands of transactions to test a payment processor's hard limit on the number of transactions it could process in a batch;
- and slightly less dirty scripts for things like archiving log files daily so I didn't have to wait half an hour for a multi-GB log file to open.
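To give a flavor of how quick and dirty these can be, here's a sketch of the run-and-exit kind of helper in plain Java. The application path and the flag that tells the app to start up and exit on its own are hypothetical stand-ins; the point is just a loop that launches the process repeatedly and flags any run that doesn't exit cleanly.

```java
import java.io.IOException;

public class ShutdownSoakRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical path and flag - substitute the real application under test.
        String appPath = "C:/apps/underTest/app.exe";

        for (int run = 1; run <= 500; run++) {
            // The flag asking the app to start and then shut itself down is hypothetical.
            Process process = new ProcessBuilder(appPath, "--exit-after-start").start();

            // Wait for the process to finish and record anything other than a clean exit.
            int exitCode = process.waitFor();
            if (exitCode != 0) {
                System.out.printf("Run %d exited with code %d%n", run, exitCode);
            }
        }
    }
}
```

Nothing about it is pretty, and that's fine - it exists to reproduce one intermittent failure, and it gets thrown away once the bug is found.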