Saturday, June 29, 2013

Milestone for a test resistant team

Whenever someone asks me if we're "agile", I notice I start to feel the need to apologize. Sure, we're on that journey by some criteria, but not there on so many others.

For example, my team is somehow trying to have everyone do "testing", but their idea of testing could easily be "development buffer", "try it once" or "go for coffee, you can relax now". When they manage to find the motivation to pair up with another developer, it's a little better, but the enthusiasm never lasts long enough to repeat anything.

While agile teams can be test infected, this one is far along the route to being test resistant, and the odds of a sudden infection are not improving yet.

The team has rounds of "let's automate tests in some way" behind them. The most recent one, on unit tests, went the furthest, with some of the team managing to refactor the code to add tests (one out of nine), while others came up with all sorts of reasons why legacy code made things too difficult for them. The round before that was taking a summer intern and asking him to record tests based on test cases the team had done. Those tests were never run by anyone other than the intern.

For the unit testing effort, I negotiated 1/4 of the team's time over a 6-month period, so unlike usual, external schedule pressure was not really the problem. We created structures where we could do relevant refactoring, but too little of that happened. And we had an external coach to help us with the tests just a little, with the end result of us realizing the vastness of the refactoring / architectural change that would be needed to get where we want to be.

With the unit test automation gone and the problem of not enough testing happening, we agreed to do some group exploratory testing sessions. With a few of them under our belt, they seem to do the job for now, but the late feedback and the repetitive nature of checking similar situations are wearing out the developers. This will not last, not for this type of information need.

In one of our group exploratory testing sessions, I drifted off to look at my team mates and think about the next experiment in the effort to make things better for us. We had failed with UI automation and no one was interested in touching it. No one wanted to spend their summer on automating tests that would not be helpful. So I called a consulting colleague in testing, and we talked about the smallest possible thing and timeframe we could set up to do a Selenium proof of concept. I could have done that myself, had it not required learning I did not have time for right now. But I knew enough to explain to my boss why we'd invest in an external to do something like this right now.

The proof of concept was delivered this week, in a meeting with the whole team. It showed developers a style of tests they would accept as part of their work. This time, the external did not set up an extensive suite of fragile scripts to throw at the developers to maintain, but a few structured examples and a useful explanation of what writing these involves for our software. The external did not introduce new tools or languages, just a driver for the browser interface that could be called from the code as we know it now. Taking over a small scope from someone else created positive buzz; had the setup timeframe been longer, we'd have more implemented but less sense of ownership.

The tests were integrated right away into the local code repository the consultant had no access to. Run with the build cycle, they found a problem on their first day of existence: a side effect of a common services change that did not work in this area quite as intended, showing a big visible error message - just the type of information we usually look for in release testing. The problem was quickly fixed to get the tests running again.

For the first time, I think there is an actual chance this will fly. And next I'll need to find ways of showing what else there is in the lovely world of testing.

Thursday, June 20, 2013

A tale of a tester that is tester no more

I listened in to a story today about a traditional tester in a scrum team, where the team helped the tester out of the "I'll wait for features to complete, let you show me a demo and then test what you showed me" mode and introduced this fellow to programming tasks and full team membership. End result: the fellow quit his job and changed careers out of software development.

Thinking about this makes me feel sad and upset, because it shows so many things that are so wrong to me.

I have no idea who this fellow is, but the story is either a slightly changing urban legend starting from one person, or this type of thing is getting increasingly common. I don't know if the fellow hated his job in the first place and is truly happier with the change of career - perhaps. So, I'll speculate based on what I am, what I do, and what I get to hear regularly about my work and profession: testing.

Someone external might describe my work as "waiting for the features to complete to then ask for a demo". Someone external might be puzzled why the 1-hour test for a small change may take me days, as we don't share a model of what "testing" is. I find that it's up to me to explain more of my thinking and where the time goes, but I also respect that some people might not be inclined to do all that, in an "agile" environment that is supposedly based on trust.

The testing I do is not a wave of a magic wand on top of all things done already, confirming all the things we know are true. I find the testing I do to be a somewhat complex learning process, with focus on a lot of different models. And while I work with my team early on to try to make sure we know what we're aiming at developing and delivering, given a few days for the information to sink in and learning to happen, I find new ideas, risks and things to consider. And the implemented feature, in the product context, gives me even more ideas.

I seek to provide information about the product that may be defects in the sense that it doesn't do what we promised, but I also provide information about how the choices we've made may make the product less valuable to its stakeholders in the long run, and all too often information that the value just might not be there with the way we've decided to go about solving the problem. I provide the information by learning myself. If I had the answers to begin with, I'd pass them on. But saying what we should have known is so much easier in hindsight.

While I appear to be "waiting", I might actually be working on another feature, one that we released earlier and I tested then, but since that time the information has sunk in deeper and I have a new idea that I need to look into. Or I might be building a model that will enable me to do a better job at testing the feature, with data in the context of the product / system it's part of. The model might be a coverage map, it might be something to help me make sense of my assumptions so I can change as I learn while I test, or it might be a model of risks, usage scenarios and value. Or I might simply be going through a 50+ page list of "so you think you're done, did you think of these" examples. Or, knowing the customer promises of tight schedules with compromises on some aspects of quality, I might be just as pressed for schedule and working on some other feature in a shallow way while waiting for this one to complete. And when my team mates show me a demo of what they completed, they must really think little of me if they consider that I would test only the things they showed, instead of using the demo as one more practical way of learning what I should focus on to do valuable things.


While I nowadays again automate some tests, I hate the idea of dumbing down the work I do - most of it is still thinking and trying things out by hand. Squeeze the timeframe enough, and the testing I do starts to look a whole lot like what I was demoed, as the thinking gets lost. And filling my days with programming tasks to be equal with the other team members is squeezing the timeframe. That is not the intent: I'm empowered to consider what testing is needed and responsible for defending that, in an environment that trusts me to learn from my mistakes, like allocating too much time to learning something that did not turn out to be relevant.

To add a piece of my current context: if our team doesn't test, the customers don't seem to mind / complain. They mention occasional problems in passing, and we end up using our time on re-completing the same things, after considerable wait time. It makes us unhappy.

If I happened to work in an environment that showed little respect for my skills and my profession, I too would rather go work in a fast food chain. Staying sane is important. Care more, and try to believe that the people whose work you have not done - testers or product managers or programmer-developers or user experience specialists - actually might have a point invisible to your eyes until you dig deep enough.


Tuesday, June 18, 2013

Doing something to be successful

I had an inspiring discussion with a colleague in testing the other day. A significant project had just come to a successful conclusion with a launch, after its share of challenges in learning what to deliver. My colleague, as the test manager, had just been asked to meet up with the head of development to explain what they do in testing that makes the team so successful - in relation to others, in particular.

As I'm quick to jump to conclusions and have talked with the colleague every now and then, I started listing things they could have done:
  • Collaborate? (this is not a testers only team, but a development team)
  • Test continuously? (I know they do, having heard bits of interesting yet vague information)
  • Focus on value, not documents? (Some information is valuable, some is worth remembering)
  • Create test automation? (Couldn't expect any less from the great exploratory tester my colleague is)
  • Have good people who learn and become better? (It wasn't always easy)
My colleague replied: "All of those. Plus, in comparison to other teams, we have dedicated testers inside the project. And we don't do just test automation, we do exploratory testing too. And we have good developers that actually care about testing their own code." 

I love hearing good news. And I love delivering useful, working software with quality information.

Monday, June 17, 2013

Agile Finland 2013: Creating a Larger Community

A few weeks back, I made a personal commitment to direct my energy to the local agile community, showed up late to the Agile Finland annual meeting just in time for the vote on new executive committee members, volunteered and got elected. With agile being both a meaningless buzzword and an ideal of the type of software projects that are productive and fun to be in, I want to do something to advance the ideal and help the agile community grow in some direction.

From a discussion we had in the first executive committee, Antti Kirjavainen wrote his view into a blog post: http://learninggamedev.wordpress.com/2013/06/17/agile-finland-2013-reaching-the-larger-community/

Just looking at the picture, I find myself disagreeing. The picture takes the view of other communities that need to be introduced to the concepts of agile. The communities in the picture don't overlap; the only depicted common ground is "agile".

What I thought we were talking about back in the executive committee meeting was splitting our agile into interest groups where everyone is invited, but with natural starting points for all sorts of views into agile. It's not just development & testing (craftsmanship stuff), but that stuff is really relevant and needs some grassroots activity to build up skills. It's not just development with various agile models; there's such a thing as business agility. It's not just the people who manage to live up to all the ideals, but also those who are no longer in the waterfall world, yet not agile practice- or culture-wise. And I would definitely hope it's not about developers complaining about managers not getting it, and managers joining the conferences and meetups finding this extremely uncomfortable. All in all, I would hope we would allow and promote diversity, without the silos - bringing same and different interests and skill sets together in whatever way people find useful.

I'd like to see people becoming members. As of now, Agile Finland officially has 394 members. The LinkedIn group has 1387 members, the Yahoo Groups list has 438 members and the Facebook group has 187 members. The cost of people becoming members in any of these forums is their time. I'd like to see us focus on good value for those we have, while allowing for more people to join. It's great if we have people who never become members but join some event. But what I'd really like to see happening is people doing sessions for their own interests and needs, and making them available to others while at it. Agile Finland is a great platform to practice giving presentations, guiding dojos (practice sessions of all sorts) or facilitating discussions to get deeper into any topic close to your heart.



I think Antti's post was about Participants in the picture above. But I feel our main challenge and focus should lie with Members. We need good stuff for the Members, and we need some of the members to participate in building the good stuff. There's value in the community for both joiners and organizers. At least I feel every hour I've spent on organizing has been worth it for the discussions and lessons I get from people who did not organize but chose to commit their time and join in.

Also, I'd like to see us create a better possibility for members to advance things in the name of Agile Finland without assuming the executive committee is always involved. There are very few things that actually need executive committee decisions in running a community. Self-organizing within limits should be what the executive committee enables.

Next for me in this theme:
  • Investigate how to bring back the dojo culture for coding and testing. It might take supporting new people wanting to learn to do things like this, connecting with those who do, or something completely different. I'll organize a session to discuss this in August, and organize a testing dojo. 
  • Build larger events that would regularly serve the multiple needs of the community, sometimes with more focus on one theme and another the next time. Some might be multi-track conferences, some might be single-track. Similarly, whole-day / half-day / short sessions.
  • Take the less discussion oriented events and build them into a webinar series, where we could have the Finnish community stories represented in a large variety. I'd like to see some of these in Finnish, some in English. 
  • Introduce LAWST/LEWT-style facilitated workshops to really bring people together to share and discuss experiences. There are similarities to unconferences; the main idea is a session of peers where everyone contributes.