
Tester in Development


After I shared this on Twitter a little while ago:

I got to thinking about the 'Developer in Test' pattern which has gained in popularity over the last 15 years of my testing career (but probably existed way before that), so here is my expansion on the point. 

Who in their organisation has one (or many) of these?
  • Technical Tester
  • Developer in Test
  • Software Engineers in Test
  • Software Development Engineers in Test
  • Software Engineers in Test Levels One, Two and Three

* Note 1 - As a sub-pattern, it seems that the job title for this pattern has got longer over time, although I am being a little naughty with the data. Still, would it be the first time a (sometimes) badly defined job got a lengthy title to give it some credence?
** Note 2 - I have never heard of a Programmer in Test. Let me know if anyone has seen/lived this. It may have the benefit of at least describing the primary activity of the job, although maybe testing should be that primary activity.
*** Note 3 - I would very much like to be known as a 'Tester in Development' from now on. Which would be a ludicrous title right? Suggesting that I am some kind of foreign body, rather than part of how good stuff is built? Although, read in a learning context, we are all testers in development.

I get the feeling there will be a few nods out there. 

I guess we could argue about a few of these titles. However, for me, testing is a technical profession: technical skills are acquired via learning and practice, specific to a profession or the task at hand. Hell, most people who have a profession have, at least in theory, acquired specific technical skills, rendering them technical; how proficient they are is a different debate. I don't see testing as an exception. But, you know, testers need to be more technical...

I digress.

Patterns within Patterns 

If you do have one of the specific titles above in your organisation, there may be one or more of the following patterns swirling around in that larger pattern:

Not hiring developers who value testing...

Wrestling with your developers about delivering testable chunks of functionality? Thrown over the wall at the last minute? No unit testing? 500 errors when you load the new site (I'll let you decide if I mean HTTP errors)? Unfortunately, developers who do not value testing are still very much part of the software development world. And if you don't determine their propensity to value testing at hiring time, your life will get very interesting over the next few months. So let's just hire a Developer in Test to bridge the expanding gap, rather than hiring those who view testing as a key part of building things.

Developers don't have time to create tools for testing...

Suppose you get the first bit right and you hire a bunch of developers who value testing. You, as a tester, identify a system which you need to be able to control and observe closely to discover information about risk, maybe even save a bit of time down the line with a little early discovery. But no one has time for that. Sounds like a great idea, but we need to be the first to bust open the Iron Triangle of cost, time and scope to deliver this thing. Let's get a Software Engineer in Test to do it. This might get the job done, but it also might point to a deeper sustainable pace problem.
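To make "control and observe" concrete, here is a minimal sketch of the kind of helper a tester might ask for: set the system into a given state, then watch what it reports back. Everything here is invented for illustration; the `System` class is a stand-in for a real service, not a real API.

```python
# Hypothetical sketch: a tiny "control and observe" testing helper.
# The System class is a stand-in for a real service under test.

class System:
    """Stand-in for a real service with controllable, observable state."""

    def __init__(self):
        self.state = "idle"
        self.events = []

    def set_state(self, state):
        # The 'control' half: drive the system into a state of interest.
        self.state = state
        self.events.append(state)

    def health(self):
        # The 'observe' half: expose what the system thinks is going on.
        return {"state": self.state, "transitions": len(self.events)}


sys_under_test = System()
for s in ["starting", "ready", "degraded", "ready"]:
    sys_under_test.set_state(s)

print(sys_under_test.health())  # {'state': 'ready', 'transitions': 4}
```

The point isn't the ten lines of code; it's that hooks like `set_state` and `health` have to be designed into the system, which is exactly the work that "no one has time for".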

Testers playing to weaker skills over strengths...

Programmers are, for the most part, really good at programming, and they practice lots. Testers, while a great many can program, are probably less effective at it; they practice less. I'm not saying testers shouldn't program, but the power of a joint approach to, say, building a check automation framework or a tool to observe the health of a system creates an opportunity to play to strengths on all sides. I can program, but the developers I work with are better at it than me. I encourage them to do it, and work with them to make meaningful checks and information diviners. Or we can compartmentalise that work into a Technical Tester, reduce the level of engagement, and probably subtly miss the point from both a programming and a testing point of view.

Tool aware but technologically unaware testers...

A developer once said to me:
"Ash, now the testers have got hold of {insert popular service layer testing tool here}, it's like having a rail gun full of XML indiscriminately firing the wrong thing at the wrong endpoints."
Well, I'm all for a bit of randomness in testing; it certainly tipped out a boatload of error handling goodies. However, it points to a wider phenomenon: testers becoming tool aware but lacking technological awareness. As in, I have a means of invoking this RESTful service, but I don't understand the structure of it, its strengths and weaknesses, where I need to probe. So, this type of problem might be farmed out to a Software Developer in Test Level 3, at the expense of the learning required by others, learning which just might help them on the journey of becoming technologically aware.
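The difference between the two mindsets can be sketched in a few lines. Below, an invented `/orders` contract stands in for a real service; the endpoint, fields and schema are all hypothetical, there only to contrast blind firing with contract-shaped probing.

```python
# Hypothetical sketch: tool-aware vs technology-aware requests.
# The /orders "contract" below is invented for illustration.

# A minimal contract for an imaginary /orders endpoint: field -> type.
ORDER_SCHEMA = {"customer_id": int, "items": list, "currency": str}

def conforms(payload: dict, schema: dict) -> bool:
    """Check a payload has exactly the fields and types the service expects."""
    if set(payload) != set(schema):
        return False
    return all(isinstance(payload[k], t) for k, t in schema.items())

# Tool-aware: syntactically valid, but blind to the contract -- the
# rail gun full of XML, indiscriminately fired.
blind_payload = {"foo": "bar"}

# Technology-aware: shaped by the contract, then deliberately broken
# one field at a time to probe error handling where it matters.
good_payload = {"customer_id": 42, "items": ["widget"], "currency": "GBP"}
probe = dict(good_payload, customer_id="not-an-int")

print(conforms(blind_payload, ORDER_SCHEMA))  # False
print(conforms(good_payload, ORDER_SCHEMA))   # True
print(conforms(probe, ORDER_SCHEMA))          # False, by design
```

Both approaches send "invalid" requests; only the second one knows which part of the structure it is attacking, and why.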

Local Optimisations

Technical Tester, SDiT, SDEiT Level X. Call it what you want, it's still one of my favourite local optimisations in testing: the system may need that role filled, but we fill it by giving someone a job title. Based on the patterns above, we may be missing the bigger picture.

