How to Involve QA in Design, Test Automation

As organizations shift testing to the left, they are starting to leverage QA in new ways. Two of those ways are getting QA more involved in product design and helping to build a test automation strategy.

These tasks are traditionally under the purview of product and engineering teams, respectively, but there is obvious value in including QA teams in these processes, given QA's testing acumen and familiarity with how the product should work in the real world.

In this blog post, we’ll look at how organizations can incorporate QA into design and test automation to fully leverage QA’s skills.

QA Is Proactive in Reviewing Design

Particularly in a DevOps organization, QA testers work closely with the product team (or whoever is creating the customer requirements). This gives the QA tester an opportunity to add business value by reviewing design elements upfront.

Think about it: the QA tester should know how the application works from end-to-end, and how it works in concert with related applications. They can analyze the app's design upfront and identify any shortcomings. By finding defects or missing requirements before the application is coded, the business saves both time and expense.

In a DevOps methodology, the QA team must expand its role beyond testing only Dev changes. Now, the QA team should proactively find defects and verify deployments.

Develop a system where the whole DevOps team defines the deployment process. As you build the process, consider these questions:

  • What are we verifying during the deployment?
  • Are we manually verifying production functionality?
  • Is the team verifying deployed code in production using integrated automated test scripts?

For example, many applications embed code-based checks that send a notification when an error occurs. Most of those errors involve API connection failures, backend data-processing failures or database disconnects. The team also needs to figure out how to run a smoke-level test of the main customer workflows in production while limiting risk and not creating junk production data.
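One way to approach production smoke checks is a small runner that executes a set of read-only verifications and reports which ones failed. This is a minimal sketch; the check names are hypothetical placeholders for real read-only calls (a health endpoint, a trivial database query), chosen so that nothing creates junk production data.

```python
# Minimal sketch of a production smoke-check runner. The checks here are
# hypothetical stand-ins; in a real suite each would make a read-only call
# against the live service (e.g. GET /health, SELECT 1).

def run_smoke_checks(checks):
    """Run each named check; return (passed, failures).
    Exceptions are caught so one failing check never blocks the others."""
    failures = []
    for name, check in checks:
        try:
            if not check():
                failures.append(name)
        except Exception:
            failures.append(name)
    return (len(failures) == 0, failures)

# Read-only checks avoid creating junk production data.
checks = [
    ("api_reachable", lambda: True),     # stand-in: GET /health returns 200
    ("db_connected", lambda: True),      # stand-in: SELECT 1 succeeds
    ("login_page_loads", lambda: True),  # stand-in: GET /login returns 200
]

ok, failed = run_smoke_checks(checks)
```

Running the full list even after a failure gives the team a complete picture of the deployment in one pass, rather than stopping at the first broken check.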

Building out this process is easier said than done. Take the time as a team to plan, discuss and figure out what works best for your specific application.

QA Builds the Test Automation Strategy

The QA tester needs to analyze the application design and create the overall test strategy. Granted, the QA testers at some organizations won't actually code the automated tests themselves, but they should define what each test case covers: which requirements it tests and the expected result that proves the feature works.

Reviewing the design for test conditions is essential to releasing a quality application, and the QA tester is the one to do it. Developers know the function they are coding, but often don’t understand how the whole system functions as a customer workflow.

To be successful, the QA tester and the person coding the automated tests should collaborate to determine what needs to be tested and the priority order. Initially, focus on automating only critical items, such as ensuring the backend connections and processes are functioning. Then, build your suite to also cover the highest-priority functions for the main customer workflows.

QA teams can also get more involved in setting priorities for 'risk-based testing,' which prioritizes testing based on the risk of failure. Where a failure would have an impact on the business, input should come from product owners. However, for the more complex areas of the application that are likely to contain more coding issues, input should come from development. QA can adopt formal ways to process and manage risk-based testing and report it as a metric.
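A common formalization of this is to score each feature as likelihood of failure (development's view of code complexity) times business impact (the product owner's view), then test in descending score order. The scales and feature names below are hypothetical; the scores themselves double as the reportable metric the paragraph mentions.

```python
# Sketch of a risk-based priority score: likelihood of failure (rated by
# development) times business impact (rated by the product owner).
# The 1-5 scales and feature names are made up for illustration.

def risk_score(likelihood, impact):
    """Both inputs on a 1-5 scale; a higher score means test it sooner."""
    return likelihood * impact

features = {
    "payment processing": risk_score(likelihood=3, impact=5),
    "report export":      risk_score(likelihood=4, impact=2),
    "profile settings":   risk_score(likelihood=2, impact=2),
}

# Order the test plan by descending risk.
test_order = sorted(features, key=features.get, reverse=True)
```

Reporting these scores alongside test results gives stakeholders a simple, repeatable view of where testing effort is going and why.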

If your application changes significantly with every release, your automation strategy should include a way to maintain the test scripts so they remain valid and executable. The only errors you want to see are true defects, not automated script issues. A good rule: your automated test scripts should be at least as reliable as the code they're testing.
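One widely used way to keep UI test scripts maintainable through frequent releases is the page-object pattern: selectors live in one class per page, so a UI change means one edit instead of dozens of broken scripts. The selectors and the driver interface below are hypothetical; a real suite would pass in a Selenium or Playwright driver.

```python
# Sketch of the page-object pattern for maintainable UI tests. Selectors
# and the driver interface are hypothetical; a real suite would wrap a
# browser-automation driver such as Selenium or Playwright.

class LoginPage:
    # Centralized selectors: when the UI changes, update only this class.
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill(self.USERNAME, user)
        self.driver.fill(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

class FakeDriver:
    """Stand-in driver that records actions, for illustration only."""
    def __init__(self):
        self.actions = []
    def fill(self, selector, value):
        self.actions.append(("fill", selector))
    def click(self, selector):
        self.actions.append(("click", selector))
```

Every test that logs in calls `LoginPage(driver).login(...)`, so a renamed field breaks one selector constant rather than every script that touches the login form.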

We’re well past the era of QA being solely responsible for testing at the end of the SDLC. Incorporating QA in more areas, such as design and automation, can pay off and make your products stronger in the long run.

Dan Cagen
Product Marketing Manager