This week we return to our series of revisiting articles previously published in the eTestware blog, with a look back at a piece on how best to proceed when you have a mixed QA team. To read the original article, click here, or stay on this page for an abridged version…
At the start of September 2017, when the world was a vastly different place, the experts at eTestware – part of theICEway ecosystem that is also home to ICE Technology Services and CRIBB Cyber Security – decided to focus on a situation faced by many companies.
The article was aimed at those who had made the decision to implement an outsourced QA testing services team, who had a plan for their test strategy, and who knew they needed to document decisions made and ensure that this documentation was easily accessible for review.
Our experienced testers suggested that the test strategy should determine the testing types, schedule and scope required, outlining each in turn with possible examples.
Identify whether the outsourced QA team will focus on functional, regression, integration, or performance testing, then decide whether they will create manual tests or automated test cases. You might also leverage them as additional resources for automated test maintenance, or ask them to develop code-based test frameworks or unit tests.
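To illustrate the manual-versus-automated decision, here is a minimal sketch of what an automated functional test case might look like in Python; the `validate_login` function and its behaviour are hypothetical stand-ins for the application under test:

```python
# A hypothetical function standing in for the application code under test.
def validate_login(username: str, password: str) -> bool:
    """Return True only when both credentials are non-empty."""
    return bool(username) and bool(password)


# Automated functional checks, in the style an outsourced QA team
# might maintain alongside (or instead of) manual test scripts.
def test_login_accepts_valid_credentials():
    assert validate_login("alice", "s3cret") is True


def test_login_rejects_missing_password():
    assert validate_login("alice", "") is False


test_login_accepts_valid_credentials()
test_login_rejects_missing_password()
```

Once checks like these exist, maintaining them over time is itself a task that can be handed to the outsourced team.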
The article went on to point out several other options to explore:
- Use the outsourced QA team for functional beta testing across a larger number and range of different user types
- Look at both mobile and web-based applications
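One lightweight way to cover a larger range of user types across both mobile and web is to fan a single functional check out over every combination. The sketch below does this with plain Python loops; the user roles, platform names, and `load_dashboard` helper are all hypothetical:

```python
# Hypothetical stand-in for the feature under beta test.
def load_dashboard(role: str, platform: str) -> str:
    return f"dashboard for {role} on {platform}"


def test_dashboard_for_all_users() -> int:
    """Run the same functional check for every user-type / platform pairing."""
    checked = 0
    for role in ("guest", "member", "admin"):
        for platform in ("web", "mobile"):
            assert role in load_dashboard(role, platform)
            checked += 1
    return checked


combinations = test_dashboard_for_all_users()  # 3 roles x 2 platforms = 6
```

A test runner with built-in parameterisation (such as pytest's `parametrize` marker) would achieve the same fan-out with less code.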
Once the types of testing have been agreed upon, decide how to divide the testing between teams and resources. Next, assign people to specific projects and QA tasks, ensuring that if you are using a combination of internal and outsourced QA teams, each understands what is expected of it.
Good organisation leads to higher productivity.
Once the testing team tasks are assigned and the testing types defined, the next step is to develop the test execution schedule; eTestware's recommendation was to create a specific schedule for each defined testing type.
The article went on to suggest that the schedules include timings for each test execution, along with start and end dates and any additional information to help the team track their planned test execution progress accurately.
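A schedule of that shape can be as simple as structured data. The sketch below models it with Python dataclasses; the dates, testing types, and notes are chosen purely for illustration:

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class TestSchedule:
    """One schedule entry per defined testing type, with tracking fields."""
    testing_type: str
    start: date
    end: date
    notes: str = ""

    def duration_days(self) -> int:
        # Inclusive day count, handy for tracking execution progress.
        return (self.end - self.start).days + 1


# Illustrative schedule: one entry per defined testing type.
schedule = [
    TestSchedule("functional", date(2017, 9, 4), date(2017, 9, 15)),
    TestSchedule("regression", date(2017, 9, 18), date(2017, 9, 22)),
    TestSchedule("performance", date(2017, 9, 25), date(2017, 9, 27),
                 notes="run against the staging environment"),
]

total_days = sum(entry.duration_days() for entry in schedule)
```

Keeping the schedule in a structured form like this makes it trivial to report progress per testing type rather than as one undifferentiated block of work.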
Our experts then pointed out that the depth of testing typically corresponds to the length of the testing schedule: when the scope of each testing effort is known, it is easier to plan for and find defects in the code.
Finally, the piece provided some ‘top tips’:
- Make sure that the QA testers test consistently
- Make sure that they define the expected scope in order to increase consistency
- Before the release, time permitting, add in a creative test session so that the QA team can try to find defects hiding in uncommon places within the application
- Ensure that the outsourced QA testing team is as productive as possible by clearly defining test type, schedule, and scope
Well-planned testing improves team morale and productivity, and ensures that as many defects as possible are discovered before code release.
Finally, a quick recap of the testing types mentioned above:
- Functional testing – a quality assurance process that uses the specifications of the software components being tested to create its test cases
- Regression testing – a type of software testing carried out to confirm that a recent change to the program or code has not had an adverse effect on existing features
- Integration testing – individual units are combined and tested as a group in order to expose faults in the interaction between the integrated units
- Performance testing – used to determine the speed, responsiveness and stability of a network, computer, or software program / device under a workload
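To make the last of those definitions concrete, a minimal performance check can time an operation under a small workload and fail if it exceeds a budget. In the sketch below, the `build_report` function, the workload size, and the one-second budget are all hypothetical:

```python
import time


# Hypothetical operation standing in for the system under load.
def build_report(rows: int) -> int:
    return sum(range(rows))


def check_performance(workload: int, budget_seconds: float) -> float:
    """Time one run of the operation and fail if it exceeds the budget."""
    start = time.perf_counter()
    build_report(workload)
    elapsed = time.perf_counter() - start
    assert elapsed < budget_seconds, (
        f"took {elapsed:.3f}s, budget was {budget_seconds}s"
    )
    return elapsed


elapsed = check_performance(workload=100_000, budget_seconds=1.0)
```

Real performance testing would measure many runs under realistic concurrency, but even a single timed assertion like this can catch a gross slowdown before release.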