
Hidden Superpowers of QA Analysts


In software development, quality assurance (QA) plays a pivotal role in ensuring the reliability and functionality of your product. However, a key mistake a development team can make is isolating QA analysts until the testing phase.

Let’s explore the critical practices that can transform a good QA analyst into an exceptional one, ultimately enhancing the quality of the application and preventing costly issues down the road.

1. Invite QA analysts to project meetings

The most critical practice to enhance your QA analyst’s day-to-day work is to include them in all project meetings: every one-off, every creative meeting, every strategy meeting, every project-related meeting should have QA represented in some form. While initially this may seem cumbersome or disruptive, its impact on how test cases and guidelines are constructed and reviewed can’t be overstated.

During meetings, team members regularly propose innovative ideas or introduce requirements that never get documented. Your QA analyst can ensure those new requirements are documented and tested, safeguarding the integrity of the application.

For example, consider a situation in which an isolated save feature—where one field in a form includes a save button that records only that field—was suggested in a meeting. That feature would deviate from the usual requirement that any save request should validate all fields, a practice that helps discover any invalid associations between fields. Involving a QA analyst who comes prepared with a reference list of all documented requirements would help your team identify this discrepancy, saving hours of development work that would be rejected later due to a fault in save integrity.

It costs time across the entire development team when QAs aren’t informed of changes in development. Usually when developers and product owners meet and propose innovative ideas, no individual is the designated scribe. While each person is likely recording notes for themselves, no one is capturing the conversation in a form that reflects requirements and expected behaviors.

A QA analyst sitting in a meeting should be writing up (or at least drafting) requirements as they develop—and then immediately sharing them with the product owner to review verbally while the behavior is being discussed. This can cut the time it takes to design QA tests by more than 50% because it eliminates a separate process in which a QA writes test cases that reflect new requirements and then product owners review them. Instead, that two-step process can happen during the requirement discussion.

A less obvious benefit arises when a QA analyst proposes a test case to the developer and product owner during a meeting, sparking a conversation about requirements connected to the latest feature. An idea that might have been logged as a bug or enhancement during the testing phase instead becomes a dutifully documented requirement.

2. Invite QA to participate in code reviews

It’s vital to keep your QA team informed by including the lead tester or QA manager in all pull requests (i.e., when the team wants to merge a feature branch of code into the main working branch of the application) and code reviews (i.e., when a senior developer analyzes another developer’s code before merging it into the main working branch).

Many code changes impact both business and technical requirements, so each change should be treated as a potential risk to the integrity of the application. Though the sheer volume of communications hitting a QA analyst’s inbox may seem intimidating, this level of information sharing is crucial for the efficiency of development. It helps the analyst spot inconsistencies in decision trees and requirements before a branch is merged into the development branch, preventing rollbacks, rewrites, and frustration. No developer wants to discover late in the testing phase that a feature doesn't meet the specified business or technical requirements.

This is the process we use in the ongoing development of our team’s in-house application—CODA, a web-based content management tool, development environment, and administration system. CODA is designed to be robust and configurable. Configurable applications present an enormous scope for testing; when new features are implemented, one feature could influence or be influenced by 10 different pages with up to 30 different configurable actions or settings. Involving our QA team—before new code is pushed into a lower environment—can save hours of testing with as little as 30 minutes of code review.

Here’s an example: Imagine one drop-down menu influences 10 tables in the database because of associations with the object being updated. If the menu isn’t added to one of those tables for any reason (it was forgotten, saved before the update, etc.), then once the change hits QA, tens or hundreds of test cases could be completed before a glitch surfaces on the front end, which could take two or three hours. When we do a code review with an informed QA who knows the database and audits the relevant code, we’re likely to discover the bug in a quarter of the time.

3. Invest in your QA analysts’ knowledge acquisition

Keep investing in the professional growth of your QA analysts. Encouraging their knowledge of the software development life cycle, programming languages, development frameworks, and the industry overall enhances their ability to evaluate adherence to requirements. This knowledge is especially valuable in white-box testing, an application testing method in which the tester has in-depth knowledge of all aspects of the application, including design documents and coding. A QA analyst who can assess the behavior of features in pull requests and code reviews can identify defects before they’re integrated into the system.

For example, let’s imagine an online shopping site that involves a large database. A developer may be asked to speed up the application when a user filters items by category. At the time of the request, a new query (a request to the database for data) is sent to the database every time the filter is changed. The developers consider storing the result set of the search in the application—so instead of querying the database with new filters, the application will search the existing result set.
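The trade-off the developers are weighing can be sketched in a few lines. This is a hypothetical illustration, not the article's actual system: the `item` table, its columns, and SQLite's in-memory database all stand in for the real store.

```python
import sqlite3

# Hypothetical schema; an in-memory SQLite DB stands in for the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
conn.executemany(
    "INSERT INTO item (name, category) VALUES (?, ?)",
    [("mug", "kitchen"), ("lamp", "decor"), ("pan", "kitchen")],
)

# Original behavior: a fresh query on every filter change.
def filter_by_query(category):
    rows = conn.execute("SELECT name FROM item WHERE category = ?", (category,))
    return [r[0] for r in rows]

# Proposed behavior: fetch the result set once, then filter it in memory.
cached = conn.execute("SELECT name, category FROM item").fetchall()

def filter_cached(category):
    # No database round trip, but the cache can serve stale inventory.
    return [name for name, cat in cached if cat == category]
```

The comment on the cached path is exactly the risk the QA analyst raises next: once inventory changes in the database, the cached result set no longer reflects it.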

An informed QA analyst consults the client’s requirement documentation and finds a business requirement that the inventory on our search page must be as accurate as our team can achieve. That means not updating our result set between filters would violate the client’s business requirements.

The QA analyst happens to be working on a data-engineering course suggested by the QA manager, with the goal of supporting more database tests to document and double-check the application’s performance. While researching, the analyst finds that the item category is stored as a string/character column in the item table. In the course, the analyst learned that querying on integer values rather than character values is faster. The analyst relays that information to the developers and suggests creating a small category table and using its primary key as a foreign key in the item table. Then, instead of caching the item table, the application can cache the small category table and use the primary keys for faster queries against the item table in the database.
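The suggested schema change can be sketched as follows, again with hypothetical table and column names and SQLite standing in for the real database: a small `category` lookup table whose integer primary key becomes a foreign key in the `item` table, so filter queries compare integers rather than strings while still hitting the database on every filter change.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Small lookup table: each category name gets an integer primary key.
conn.execute("CREATE TABLE category (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
# The item table references the category by integer foreign key instead of
# repeating the category name as a string column.
conn.execute(
    """CREATE TABLE item (
        id INTEGER PRIMARY KEY,
        name TEXT,
        category_id INTEGER REFERENCES category(id))"""
)

conn.executemany("INSERT INTO category (name) VALUES (?)", [("kitchen",), ("decor",)])
conn.executemany(
    "INSERT INTO item (name, category_id) VALUES (?, ?)",
    [("mug", 1), ("lamp", 2), ("pan", 1)],
)

# The application caches only the tiny category table...
categories = dict(conn.execute("SELECT name, id FROM category"))

# ...and each filter change still queries the item table (so inventory stays
# current), but the WHERE clause compares an integer key, not a string.
def items_in(category_name):
    cat_id = categories[category_name]
    rows = conn.execute("SELECT name FROM item WHERE category_id = ?", (cat_id,))
    return [r[0] for r in rows]
```

This is the design the analyst proposes: the cache holds only the small lookup table, so the business requirement for inventory accuracy is preserved.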

Had the QA analyst not been encouraged to pursue ongoing learning and thus been in a position to suggest a change, the developers would likely have deployed the update to a lower environment. A QA without visibility into the business requirements or knowledge of database design would have passed it. Then it would hit user acceptance testing (UAT), where it would be flagged for compromising inventory integrity (the accuracy of the products shown in search results).

To reiterate, the distinction between a good QA and an exceptional QA—and thus between a good application and an exceptional application—hinges on how well-informed your QA analyst is about the industry, code, and project processes. Some of the most promising products fall short due to inadequate quality control. By following these practices, you can ensure the quality of your application early in development and at the highest level possible for your team.

Casper LeVeaux is a senior quality assurance analyst for The Lacek Group, a Minneapolis-based data-driven loyalty, experience, and customer engagement agency that has been delivering personalization at scale for its world-class clients for more than 30 years. The Lacek Group is an Ogilvy company.