Like all areas of software development, the one traditionally known as Quality Assurance (QA) has evolved over time. We’ve charted previously how we’ve redefined QA, and we’ve continued to adjust our approach and process to support this change. Here’s an update.
These changes can be especially tricky at an agency, with its unique business model. Projects vary widely in scale, scope, and the technologies involved. While challenging, this environment also presents an opportunity to try different methods in parallel and learn very quickly what works and what doesn’t. We’ve been able to create a framework that strikes the right balance between process and progress, allowing us to set clear expectations without sacrificing flexibility: the four Es.
Our QA is embedded, empowered, engaged, and ever evolving:
QA is not a separate silo but is part of a cross-functional project team. QA is involved in the project from the beginning, and the whole team works together on user stories using the same tracking tools. The Director of the QA team works closely with the executive management team to identify technology and staffing needs in relation to project pipelines.
QA is empowered to support projects and add value in whatever way the situation requires, always. Examples include: design reviews, requirements assessments, browser and device support, process, tools, risk assessments, and helping to determine “Definition of Ready” and “Definition of Done.”
An empowered QA team is free to focus on prevention: just as in healthcare, using QA proactively produces healthier products in the long run. We always look to avoid problems before they arise: “You cannot inspect quality into a product. The quality is there or it isn’t by the time it’s inspected.” (Harold F. Dodge, via W. Edwards Deming).
QA sits with the project team whenever possible, allowing for increased conversation and real-time problem solving. The QA team attends and contributes to all relevant planning meetings and sprint ceremonies and also works directly with clients on quality and testing processes.
An important concept in continual engagement is “Shift Left,” which gained wide attention around the 2013 State of QA Conference at SXSW: QA teams think about quality and testing at all stages of a project, not just once code is written.
Members of our QA teams are always learning: as individuals, as project team members, and as representatives of a skilled discipline within the organization. Our process and approach to testing evolve to keep up with advances in technology and the changing needs of clients. What works for one client or project might differ radically from another. Flexibility is key.
Project Tasks and Roles.
To help set expectations both internally and for our clients, we’ve identified three roles for any given QA team. Given the ever-evolving nature of how we work, these descriptions may vary depending on the needs of the project and the specific skills of team members.
QA Lead: “Provides a single point of contact for all QA activities, creates the initial testing strategy for the project, creates and maintains the conditions necessary for testing, provides work estimates, creates all client-facing documentation as it relates to quality and testing, communicates risk, represents quality and testing interests during sprint ceremonies, mentors all project team members on testing, performs testing tasks as needed.”
The QA Lead may or may not be part of a scrum team. It may be considered more of a leadership role that supports the scrum team. The difference lies in whether the QA Lead has been fully allocated and embedded into the project team.
What this means in practice is that the QA Lead will partner with the project manager and the technical lead prior to the start of the project and work with them to review client-provided documentation and determine the testing strategy. She may also assist with setting up the project in Jira and creating a workflow and scrum board.
On larger teams, the QA Lead may attend sprint ceremonies and other meetings to represent the tester and quality perspective, freeing other QA team members to test or otherwise assist the team, and may help delegate tasks to balance the workload. A QA Lead may also take on the QA Analyst role and perform testing as needed.
QA Analyst: “Executes exploratory tests on user stories, communicates risk, provides regression scenarios for automation, mentors non-QA project team members on testing.”
The core of a QA Analyst’s contribution to a project team is exploratory testing skill. While we may document some of our test structure in the form of Gherkin scripts, sessions, or Jira subtasks, the majority of the testing performed is unscripted and undocumented.
This unscripted style suits the Agile approach that most of our projects now follow: increasingly, the focus and value lie in quality software over exhaustive documentation.
QA Engineer: “Works with System Engineers to set up Continuous Integration environment for automated regression and performance tests. Writes code for automated testing. Schedules and monitors test runs. Acts on failures. Performs performance tests and provides reports as needed. Solves testing problems with code as needed, mentors team members on testing and checking.”
This role can also be filled by developers, instead of or in addition to dedicated QA Engineers.
For this role, the distinction between testing and checking is an important one. Both James Bach and Michael Bolton have done extensive work in describing this distinction.
Testing is the process of evaluating a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation, and inference.
Checking is the process of making evaluations by applying algorithmic decision rules to specific observations of a product.
A QA Engineer is skilled in both processes and, even more importantly, understands when to use testing or checking. It’s necessary to separate what a machine can do well from what a human can do well and to use the advantages of both. Checking is an evaluation activity within testing that can, in principle, be fully automated. Tools such as Selenium, for example, are used to check, not test.
At a very high level, requirements such as business rules benefit from automated checks, but evaluating look-and-feel and finding edge cases require testing. Testing always involves human interaction, which may be supported by tools.
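To make the distinction concrete, here is a minimal sketch of a “check” on a business rule: an algorithmic decision rule applied to a specific observation. The discount rule and all names are hypothetical, invented for illustration only.

```python
def bulk_discount(quantity: int, unit_price: float) -> float:
    """Hypothetical business rule: 10% off orders of 100 or more units."""
    total = quantity * unit_price
    return total * 0.9 if quantity >= 100 else total

# Checks: yes/no evaluations a machine can run unattended in CI.
assert bulk_discount(10, 2.0) == 20.0    # below threshold: no discount
assert bulk_discount(100, 2.0) == 180.0  # at threshold: 10% off
```

By contrast, deciding whether the discounted price is displayed clearly in the UI, or probing odd quantities a curious analyst dreams up, is testing and stays with a human.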
Before a project begins:
- Review documentation with an eye toward testing process alignment.
- Work with the client, analytics, and project team on defining browser and device test matrix.
- Does the project require accessibility compliance? If so, design, develop and test according to WebAIM’s WCAG 2.0 Checklist and work with the client to determine which level of compliance (A, AA or AAA) is necessary.
- Develop specific benchmarks and a testing plan prior to the start of the project when stakeholders need additional reports.
- Employ security best practices on all projects.
- Review the timeline and technical stack to inform automation feasibility and, when possible, selection of the framework. It may not make sense to perform automated checks on all projects. When needed, we use a variety of project-specific tooling. For browser testing, this typically includes some combination of a general purpose programming language, Selenium WebDriver, Gherkin, and a test runner.
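One way to capture the browser and device test matrix from the steps above is as machine-readable data, so a test runner can parametrize automated checks over it. This is a sketch in Python with illustrative browser, platform, and function names, not a prescription for any particular project.

```python
from itertools import product

# Illustrative matrix entries; a real matrix comes from client and
# analytics data gathered before the project begins.
BROWSERS = ["chrome", "firefox", "safari"]
PLATFORMS = ["windows", "macos", "ios"]

# Not every pairing is valid: for example, Safari is not supported on Windows.
UNSUPPORTED = {("safari", "windows")}

def supported_combinations():
    """Return every supported (browser, platform) pair in the matrix."""
    return [pair for pair in product(BROWSERS, PLATFORMS)
            if pair not in UNSUPPORTED]
```

A runner such as pytest could then parametrize Selenium WebDriver sessions over `supported_combinations()`, so coverage decisions live in one place.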
During the project:
- Work with the Technical Architect to set up project-specific processes that can include Jira workflow, Jenkins workflow set up, user story templates, and test formatting.
- Work with the project team to establish a “Definition of Ready” and a “Definition of Done.”
- Review designs with product owner, UX and visual designers and tech lead with an eye towards identifying possible implementation problems based on experience.
- Attend backlog grooming and sprint planning ceremonies to become acquainted with requirements and to assist the team with estimates. Testing time (e.g. unit, acceptance, exploratory, automated, and regression) is factored into all estimates. This allows for a more accurate sprint velocity over time and cuts down on instances in which testing only occurs on the last day of a sprint.
- Keep an eye on the overall project process to see what works and what doesn’t. If something is of high risk or just isn’t adding value, adopt a “see something, say something” mentality and work with the team on mitigation strategies.
- Whenever possible, the QA team sits with the cross-functional teams. This allows for spontaneous conversation and lets QA pair easily with a developer or other team member. If a defect is found, it is preferable to discuss it and, when possible, fix it on the fly rather than take the time to write it up. Real-time pairing may not always be possible, as it’s important to allow other team members ample time and focus to work, but the point is that defects are only documented when necessary.
- While QA is the subject matter expert in terms of testing, it is expected that all members of the team will assist with the effort. This allows the team and the automation suite to find the “low-hanging fruit” defects, such as cross-browser compatibility issues, and frees up QA to find the more difficult defects using exploratory testing techniques.
- On some projects, QA may write test scenarios for each user story using Gherkin syntax and the team may take a Behavior Driven Development (BDD) approach to the project. In support of BDD, the product owner, technical lead, developer, and QA review test scenarios in order to ensure that requirements are understood. When the developer knows what tests will be performed, those scenarios are more likely to work as documented once the code is written. This cuts down on the “mutual misunderstanding of requirement” type defects. Many automation frameworks use Gherkin syntax as an input, which allows these scenarios to be added to a continuous integration environment easily and assists with regression checks. This enables someone with testing expertise who does not have a coding background to write tests that more code-savvy team members can implement easily. Not every Gherkin scenario will be automated, and the project team should determine which tests require automation. QA will test the remaining scenarios and complete exploratory testing.
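A BDD-style scenario of the kind described above might look like the following. The feature, steps, and email address are hypothetical, chosen only to show the Given/When/Then shape that both humans and automation frameworks can consume.

```gherkin
# Hypothetical scenario; names and values are illustrative only.
Feature: Newsletter signup
  Scenario: Visitor subscribes with a valid email address
    Given a visitor on the homepage
    When they submit "pat@example.com" in the signup form
    Then a confirmation message is displayed
    And the address is added to the mailing list
```

Because the scenario is plain text, the product owner can review it for intent while a QA Engineer wires its steps into the automation suite.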