
Train the Trainer - Course Retrospective


What's up with that then?

So, I've been charged with becoming a trainer within my organisation.

Just to set expectations here, I know a tiny amount about how to furnish humans with new knowledge, skills and attitudes. Make no mistake, if this field is an ocean, I am a puddle by comparison. I have dabbled with coaching, but any learning from me will probably have been via proximity and osmosis.

Personally, if I'm going to do something, I want to use my whole arse to do so, not just half of it. I want a set of models to apply in context and (more importantly) a strong paradigm, so when I discuss, create and iterate on training material and courses, I have a starting position to challenge and be challenged on. So, I attended the Train the Trainer course to complement my own buccaneering learning tendencies.

What did you learn that t'internet didn't know for free?

The internet probably knows some of this already, but here is what I have learnt over the last few days:

  • I was pretty worried about creating material, how much time it would take and how I would fit everything else in. It turns out the angle of my thinking wasn't right. Instead of 'how can I create course material?' I should have been thinking 'how can I create exercises which transfer the onus onto the participant to learn?' It will still be hard, but it feels better.
  • Bloom's Taxonomy - a method of classifying learning objectives, split into knowledge, skills and attitudes. Done really well, those objectives will form your assessment too. It turns out my paradigms for knowledge, skill and attitude were a bit wonky, especially with reference to the difference between skills and knowledge and how to *really* tell them apart. Here goes:
    • Knowledge - I know how to do something
    • Skill - I can practically apply my knowledge of that something
    • Attitude - I have a belief or a will to do something
    • Simple maybe, but it's what I'll take forward with me! Have a look at Bloom's, it's fascinating stuff.
    • It's an excellent lexicon for objectives too, useful in many contexts.
  • My expectations - it turns out I don't need to try to impart all my knowledge and skills within a certain time period. The same goes for my expectations of others after a training course. They might not need to be geniuses. They might need to recall some things, recognise patterns in others, and be able to apply others still.
  • Fluidity - training courses are not ironclad military exercises. They provide a scaffolding which allows room for manoeuvre, plus the ability to flex on what really matters to the participants. Simple questions at the beginning of a topic, like 'what is your experience of X?', can help to frame a session, streamlining it as appropriate to meet needs.
  • Linking objectives to activities is key. The opportunity to learn, reflect, add to our theoretical knowledge and apply that knowledge should be embedded in each activity. Whether that is simple matching of paired subjects or an attempt to build competence in complex modelling techniques, I really appreciate the set of heuristics the course furnished me with to assist.
  • Me - I'm a pushy so-and-so. If you are not careful, I'll be in there, taking over the whole show and happily reshaping things in my own glorious image. I shouldn't do that anyway, and I really, really shouldn't do it in a training context. I'm not creating Cybermen; I must curb my natural tendencies. I think this will be good for me.
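To make the knowledge/skill/attitude split above concrete, here is a toy sketch of classifying a written learning objective by its leading verb. The verb lists are purely illustrative examples I have chosen, not an authoritative Bloom's lexicon:

```python
# Toy sketch: bucket a learning objective into a Bloom-style domain
# (knowledge, skill, attitude) based on the verb it starts with.
# The verb sets below are illustrative assumptions, not a canonical list.

BLOOM_VERBS = {
    "knowledge": {"define", "list", "recall", "recognise", "describe"},
    "skill": {"apply", "demonstrate", "build", "model", "perform"},
    "attitude": {"value", "advocate", "commit", "challenge", "question"},
}

def classify_objective(objective: str) -> str:
    """Return the Bloom-style domain for an objective, keyed on its first word."""
    verb = objective.strip().lower().split()[0]
    for domain, verbs in BLOOM_VERBS.items():
        if verb in verbs:
            return domain
    return "unclassified"

print(classify_objective("Recall the three domains of Bloom's Taxonomy"))  # knowledge
print(classify_objective("Apply a matching exercise in a live session"))   # skill
```

The useful part is less the code and more the discipline it encodes: if an objective doesn't start with a clear, observable verb, it is hard to classify, and probably just as hard to assess.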
It was worthy of the investment. Now I look forward to getting the sharp nail of experience through my foot and the associated tetanus jab. Time to apply that knowledge: the real test, one might argue.

And finally, an external view on 'IT bods'...

It was wonderful to spend time with people from backgrounds whose primary focus isn't technology. Ours can be a cloistered world, and the experience certainly challenged my ability to explain the fundamentals of testing and agility in context!

Oh, and those from different career paths and domains still perceive 'IT projects' as late, of poor quality and rarely solving the original problem. Or the problem doesn't exist any more by the time we are done. Or the company doesn't. So far still to go.

