
Test Approach Mnemonic: MICROBE


With TestBash 2017 on the way, I've been reflecting on my journey as a public speaker, since being inspired by the speakers and overall experience of TestBash Brighton 2013. Looking through some older material, I discovered 'Testing is Exploratory', delivered at the Leeds Testers Gathering back in 2014, extolling the virtues of a lightweight and transparent test approach, inspired by my attendance on Rapid Software Testing the previous year...

Here's what I said then:
What is a 'Test Approach'? 
"It is the act of THINKING about the BUSINESS PROBLEM and how TESTING will contribute VALUE using your own BRAIN and that of OTHERS with the outcome of DELIVERY in mind." 
Test Approach Mnemonic:
  • Mission - the primary concern of the testing, your mum should understand.
  • Involving – involve stakeholders, discover their biases, not your own. 
  • Challenges - what questions are we seeking to explore? 
  • Risk - we are in the risk business, so what are the business risks?
  • Observable - show it rather than write it, pictures and spoken words are engaging.
  • Brief – Being concise is the real challenge, which means choices are required!
  • Epistemic Humility – Have you been humble with your knowledge? Have you allowed challenge and diversity? 
See? MICROBE reminds you to keep it small.

What about now then? I would amend my definition of a 'Test Approach':
"It is the act of CONTINUOUSLY THINKING as information EMERGES about how TESTING can provide ACTIONABLE INSIGHTS using the collective ORACLES around you to DELIVER VALUE to those who matter."

The key additions are the loud notions of 'CONTINUOUSLY', 'EMERGES' and 'ACTIONABLE INSIGHTS', which are inspired by more experience of testing in an agile context. Having been fortunate enough to work on a number of projects and products where the architecture has been allowed, for the most part, to emerge, an adaptive test approach which sharpens with change has been crucial. Creating a great test approach which provides information that is no longer relevant is waste of a high order. In terms of ACTIONABLE INSIGHTS, given we build systems iteratively, timely information from testing can actually be acted upon near the point of discovery. Make it count!

I would also disambiguate 'your own BRAIN and that of OTHERS' to 'ORACLES', as that implies more than just brains, which are a subset of possible oracles. DELIVER VALUE is a key addition. Testing, in my view, suffers from the law of diminishing returns: the more you do, the less value you are likely to get. To believe that one's testing can match the infinite variety of the world is a false position. Delaying delivery is delaying learning.

I would change the actual mnemonic in a few ways too:

  • Mission - the primary concern of the testing, as in there are many concerns, but a choice must be made. After all, we can't test it all.
  • Involving – involve stakeholders, discover their biases, and your own.
  • Challenges - what are the questions that those who matter wish to be answered? What might I need to fulfil the mission that I don't currently have?
  • Risk - we are in the risk business, so what are the business risks?
  • Observable - if you can't observe what you are testing, how effective is your testing? If I can't observe the entity I am testing (via logs for example) how can I remedy that?
  • Brief – Concise is the real challenge, how does one get a breadth of testing lenses so one can sample the product to find important problems?
  • Epistemic Humility – Have you been humble with your knowledge? Have you allowed challenge and diversity?
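
As an illustration (not part of the original talk), the mnemonic above could be captured as a lightweight checklist to run through when drafting a test approach. The prompt wordings below are my paraphrases of the bullets:

```python
# A minimal sketch: the MICROBE mnemonic as a reviewable checklist.
# The prompts paraphrase the bullets above; names are illustrative only.
MICROBE = {
    "Mission": "What is the primary concern of this testing?",
    "Involving": "Which stakeholders are involved, and what are our biases?",
    "Challenges": "What questions do those who matter want answered?",
    "Risk": "What are the business risks?",
    "Observable": "Can we observe the entity under test, e.g. via logs?",
    "Brief": "Is the approach concise enough to force choices?",
    "Epistemic Humility": "Have we allowed challenge and diversity?",
}

def review(answers):
    """Return the prompts that have not yet been answered."""
    return [f"{key}: {prompt}" for key, prompt in MICROBE.items()
            if not answers.get(key)]
```

Feeding it a partly filled-in set of answers returns the prompts still owed a conversation, which is really the point of the mnemonic: a short negotiation, not a document.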

I still love to set a mission. I like that it feels like a meaningful negotiation. Where shall we focus? What isn't important? It gets to the heart of the matter early, I find.

As for the rest, most of the mnemonic has changed apart from two areas: risk and epistemic humility. I still fundamentally believe testing is risk based, and the information exposed by testing should concern those risks to be of value. Extending that, knowing that you don't know everything is something I'm glad I've retained as my career has progressed. It's even more true now. As a proportion, I know a lot less now than I did then.

Always fun to reflect on what you used to think about testing. Try it!

Original slides here:

https://www.slideshare.net/slideshow/embed_code/key/FVzbIPcQoGWFCA

