Thursday, June 20, 2013

A tale of a tester who is a tester no more

I listened in on a story today about a traditional tester in a scrum team, where the team helped the tester out of the "I'll wait for features to complete, let you show me a demo and then test what you showed me" approach and introduced this fellow to programming tasks and full team membership. End result: the fellow quit his job and changed careers out of software development.

Thinking about this makes me feel sad and upset, because to me it shows so many things that are so wrong.

I have no idea who this fellow is, but the story is either a slightly changing urban legend that started from one person, or this type of thing is becoming increasingly common. I don't know if the fellow hated his job in the first place and is truly happier with the change of career - perhaps. So I'll speculate based on what I am, what I do, and what I get to hear regularly about my work and profession: testing.

Someone external might describe my work as "waiting for the features to complete and then asking for a demo". Someone external might be puzzled why a 1-hour test for a small change may take me days, as we don't share a model of what "testing" is. I find that it's up to me to explain more of my thinking and where the time goes, but I also respect that some people might not be inclined to do all that, in an "agile" environment that is supposedly based on trust.

The testing that I do is not a wave of a magic wand over everything already done to confirm that all the things we know are true. I find it to be a somewhat complex learning process, with focus on a lot of different models. And while I work with my team early on to try to make sure we know what we're aiming to develop and deliver, given a few days for the information to sink in and for learning to happen, I find new ideas, risks, and things to consider. The implemented feature, in the product context, gives me even more ideas.

I seek to provide information about the product. Some of it is defects in the sense that the product doesn't do what we promised, but I also provide information about how the choices we've made may, in the long run, make the product less valuable to its stakeholders, and all too often information that the value just might not be there with the way we've decided to go about solving the problem. I provide that information by learning myself. If I had the answers to begin with, I'd pass them on. But saying what we should have known is so much easier in hindsight.

While I appear to be "waiting", I might actually be working on another feature, one we released earlier and I tested then, but since that time the information has sunk in deeper and I have a new idea I need to look into. Or I might be building a model that will enable me to do a better job of testing the feature, with data in the context of the product or system it's part of. The model might be a coverage map, it might be something to help me make sense of my assumptions so I can change course as I learn while I test, or it might be a model of risks, usage scenarios and value. Or I might simply be going through a 50+ page list of "so you think you're done, did you think of these" examples. Or, knowing the promises made to customers of tight schedules with compromises on some aspects of quality, I might be just as pressed for time and working on some other feature in a shallow way while waiting for this one to complete. And when my teammates show me a demo of what they completed, they'd have to think very little of me to believe I would test only the things they showed, instead of using the demo as one more practical way of learning what I should focus on to do valuable things.


While I nowadays automate some tests again, I hate the idea of dumbing down the work I do - most of it is still thinking and trying things out by hand. Squeeze the timeframe enough, and the testing I do starts to look a whole lot like what I was shown in the demo, as the thinking gets lost. And filling my days with programming tasks, to be equal with the other team members, is squeezing that timeframe. That's not the intent: I'm empowered to consider what testing needs and responsible for defending it, in an environment that trusts me to learn from my own mistakes of allocating too much time to learning something that did not turn out to be relevant.

To add a piece of my current context: when our team doesn't test, the customers don't seem to mind or complain. They mention occasional problems in passing, and we end up spending our time completing the same things again, after a considerable wait. It makes us unhappy.

If I happened to work in an environment that showed little respect for my skills and my profession, I too would rather go work in a fast-food chain. Staying sane is important. Care more, and try to believe that the people whose work you have not done - testers or product managers or programmer-developers or user experience specialists - might actually have a point that stays invisible to your eyes until you dig deep enough.