Just Enough Testing: How Many Automated Tests Should We Write?

by John (@ternarywat), January 23rd, 2022

Regardless of the team, someone is bound to ask me:


How many automated tests should we write?


You could declare that your team tests everything. Or nothing at all. Both are simple directives. I think there’s a better approach.


To me, the right number of automated tests to write is Just Enough.

Disclosure: This article includes affiliate links to any books I reference. I may receive compensation when you click on links to these products.

What is Just Enough Testing for Automated Tests?

Just Enough Testing is a practical approach to writing automated tests. The goal is for teams to do enough testing to confidently release a feature. No more. No less.


Understanding your team’s context is critical to deciding what “just enough” means.

Much like my feelings on meetings, I want to balance shipping features against the need for quality.


So how can you decide what Just Enough Testing means for your team? I have my teams discuss the following questions:


  • What phase of life is our product in? Is testing worth doing at this phase?

  • What impact will bugs have on our users?

  • How easy is it to release a bug fix?


Answering these questions helps my teams decide how much time to spend on testing. Every project requires a different amount, and understanding the impact of that decision is important for success.


Some results I’ve experienced from these questions are:


  • Changes to a workflow that affects all users need more testing before release. The goal is stability for customers.

  • Features that explore new use cases with few users need fewer tests. The goal is to learn quickly — too much testing slows this down.


How can you put this approach into practice?

How do I write Just Enough Automated Tests?

Writing automated tests can become a time sink that inflates project estimates. Most tests are easy to write, but it’s tough to quantify the value of the time investment.


To strike a balance, I use these approaches for writing automated tests:


  • New Code: Write enough to make it easy to add new tests.
  • Existing Code: Only test the code that you changed.
  • Bug Fixes: Write enough tests to prevent the bug from occurring again (see the sketch after this list).
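For the bug-fix guideline, one regression test that pins the reported behavior is usually enough. Here is a minimal sketch using pytest; the parse_price function and the blank-input bug are hypothetical, invented only to illustrate the pattern.

```python
# Minimal sketch of the "Bug Fixes" guideline, using pytest.
# parse_price and the bug it fixes are hypothetical: imagine a report
# that parse_price("") crashed instead of returning None.
from typing import Optional


def parse_price(raw: str) -> Optional[float]:
    """Parse a price string like "19.99"; blank input yields None."""
    raw = raw.strip()
    if not raw:
        return None  # the fix: blank input no longer raises ValueError
    return float(raw)


def test_blank_price_returns_none():
    # Regression test that pins the exact scenario from the bug report,
    # so the same bug cannot quietly come back.
    assert parse_price("") is None
    assert parse_price("   ") is None
```

One test per reported bug is often all it takes; the failing scenario now has a permanent guard.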


These guidelines balance the short-term need to ship with the long-term goal of quality. Bugs are inevitable. Spending too much time on testing is just as risky.


Instead, I want teams to make it easy to add tests in the future. Existing tests remove the friction of adding new ones. As new tests get added, test coverage increases over time.
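
To make “removing friction” concrete, here is a sketch of a parametrized pytest suite; once it exists, covering a new case is a one-line change. The apply_discount function and its cases are illustrative, not from any real project.

```python
# Sketch of how existing tests lower the cost of new ones: with a
# parametrized suite in place, covering another case is one more row.
# apply_discount and its cases are illustrative only.
import pytest


def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    return round(price * (1 - percent / 100), 2)


@pytest.mark.parametrize(
    ("price", "percent", "expected"),
    [
        (100.0, 0, 100.0),
        (100.0, 25, 75.0),
        (80.0, 50, 40.0),  # a new case or bug fix is just another row
    ],
)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected
```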

Conclusion

Software engineering is a job of tradeoffs. The time we invest in testing comes at the cost of shipping.


Instead of declaring a test coverage target, I challenge you to take a more practical approach. Look at the context of your teams and ask:


How can we do Just Enough Testing to confidently ship and maintain this feature?


What are your expectations when it comes to testing? How do you balance the trade-offs of shipping versus polish? Let me know on LinkedIn or Twitter.

Want to learn how to become the successful leader your team needs? Sign up for my newsletter to learn how.
