
The bigger the rock, the smaller the pieces need to be



You know what I really, really value in a fellow professional in the information technology delivery world? That special, magical ability to decompose a large (and potentially complex) problem into small, simple subtasks.

A child can do this, right? This is 'Being a Human Being 101.' So why is it a behaviour that eludes a large percentage of those in the information technology industry? It is a trait of the people I like to call 'people who get things done.' Not through heroism or great feats against monolithic bureaucracies, but through the simple application of critical thought.

Is there a problem here? 

People like the idea of building big stuff, stuff to "get hold of"; it's very grand to say we're building an "enterprise level" application. In that vein, I hear "well, this is a step change to the product" or "there is no value in splitting up the project into smaller deliverables" on a regular basis. The justifications of the desperate, determined to protect bloated road maps which perpetuate their own existence.

At its root, the real problem with big stuff is that it runs counter to how our brains actually work. We are overwhelmed by it; we cannot hold it within our puny cerebrums. Small stuff is natural: we can encircle it with thought and apply ourselves to it. We can be happy that it's done, or at least that it's time to stop.

If you are going to be marching for a year, you need plenty of opportunities to stop off on the way. Save it all up for one payload and you are likely to trudge forwards with your eyes to the floor for a large part of the journey. Your destination may well be on the other side of the horizon before you realise. 

So why do I see this all around me? 

Aside from my own bias, it's actually a thing which takes thought and effort. It's easier *right now* just to plough on and not consider how an entity can be decomposed. At least that shows progress, right?

Wrong. This stems from the perception that skilful decomposition initially 'slows down' a delivery while a slice of functionality is built. It speeds up your ability to generate feedback, though, which means you are more likely to deliver the right thing. Which, from experience, means you build what's needed, rather than spending time on what isn't.

Can someone be explicitly taught this ability?

I believe so, although it's rarely that simple. At its heart is the ability to recognise flows, change the angle of approach when required, and apply systems thinking. Decomposing complex systems or problems into simple rules of thumb is critical to an iterative delivery.

I always like the thought of splitting an entity by the questions you wish to answer about it. Or consider the simplest thing you can do to overcome a constraint, expose information about risk or deliver customer value. I always imagine the entity as a sphere whose surface I can approach from any angle. Eventually, I'll see the angle of approach. Hey, it's the way my mind works. I have to apply the mental brakes and think, rather than plough on. It's taken some practice and discipline on my part.

This ability enables that most precious of habits, the delivery of value. For now, the delivery of un-value is pervasive to my eyes, but I'll strive to ensure that this special but underrated ability continues to have a place in the world.
