2013-11-13

Following on from Chris George's excellent article a couple of weeks back, and some of the other skills-based papers we've published over the last couple of months, I wanted to dig a little deeper into how different people and organisations train their testers.

This time around Leah Stockley talks about her experience of building a Testing University.

If you’re interested in contributing to the discussion, email me or leave a comment below, and I’ll be in touch.

Who are you, where do you work and what’s your role/job title?

I’m Leah Stockley, blogger at inspiredtester.com… and currently on maternity leave. I’ve been a tester for 15 years and spent the last 4 working within a large international investment bank, which I joined as a test manager. However, 18 months ago I pitched a new role to my manager, and 12 months ago he agreed to fund it. The title was Head of Strategic Project Services, which meant I was responsible for two things: 1) assisting test teams to review and improve their test strategy and approach, and 2) building the internal ‘Testing University’.

Do you actually test (are you hands-on) or are you more of a test manager (are you hands-off)? 

I spent 10 years hands-on testing (mixed with test automation) but the last 5 have been more hands-off, focusing on management, strategy and training testers. That said, the last year brought me closer to actual testing and the challenges faced by testers every day. This led me to seek opportunities to get some hands-on experience again.

What skills and attributes are you looking for when making hiring decisions?

I’m much more interested in mindset than experience. Often those with a large amount of experience (especially if gained in one company) can get hung up on a process or blinded to improvements and innovation by their previous experience. So I look for enthusiasm, logic, common sense, good communication skills and the ability and willingness to learn quickly. Of course there are times when particular technical or business skills are required, but I would rather have someone with aptitude and common sense who is a quick learner than someone with that experience but no drive to do their best for the project. Often the ‘make or break’ question for me when hiring someone is his or her answer to “Why did you become a tester?”

How do you distinguish between more junior and more senior testers? 

I guess the simple answer is years of experience… but the reality is I’m looking for ‘drive to improve’ and problem-solving abilities. I’ve seen some of the best testing skills in graduates, so I happily employ ‘juniors’ on my team, because I love their enthusiasm and find it infectious. I’ve also had the misfortune to work with some ‘junior’ testers who think that because they have less than 4 years’ experience, their job is to sit back and wait for instruction. And I’ve met ‘senior’ testers who think the number of years in the job is what counts, so don’t bother to improve. There is no place in my team for anyone like that, regardless of years or days of experience. To be honest, I think this is caused by the managers of those people, who are often responsible for creating that mentality in the more junior members of their teams… and then wonder why the people who work for them never develop the ability to think and learn for themselves once they hit the ‘senior’ milestone.

Can you describe your organisation’s approach to training testers?

Yes, because we created it for ourselves. As we embarked on the biggest rollout of Context Driven Testing (CDT) within a firm, it was essential that our testers received training in the skills we were asking them to demonstrate.

The standard training program for the bank was understandably focused on business and soft skills. We were able to apply for funding for specific courses, but external training, though needed, was very expensive and therefore limited to less than 10% of our testers globally. We initially worked with James Bach who trained, inspired and energised a few of us and let us start out by re-using some of his material to train others.

We then took this much further and created the Testing University. We created 12 of our own courses internally, ranging from 2-hour courses delivered via Webex to 2-day classroom-based courses. Initially I had to travel to several global locations to deliver the classroom training, so I also created a ‘train the trainer’ program. I was very particular that the courses we created were experiential and hands-on. Delivering this kind of training is very different to ‘giving presentations’, so we had to ensure that our internal trainers not only knew the topic well but also knew how to manage the class and ensure plenty of active involvement from the attendees.

Ultimately we created a ‘core curriculum’ for our testers, based on their roles. This was made up of mandatory, recommended and optional courses spanning across Testing (CDT, Static Testing, Excel for Testers, Combinatorial testing etc.) plus Business and Soft Skills.

We also supplemented this with internal testing challenges, competitions, ‘tester meet-ups’ and Experience Reports to encourage our teams to practice their testing skills and talk about how they test.

Why did you choose this approach, and did it work out?

Because we had a critical mass of testers, we were able to identify a core of passionate individuals who were happy to develop skills as trainers and deliver courses to their colleagues (in addition to their day-jobs).

Though cheaper than external training, this still represented significant investment, so we tried a few different ways to assess the effectiveness. We tried surveys, which helped but had low response rates. We asked people to come forward with success stories and case studies, again to a low response. We also tried ‘interviews’ but people felt like they were being audited/ judged.

The most effective approach was a year-long ‘innovation’ program. We trained up a team of ‘consultant interviewers’ who met all project teams with a goal of finding the most innovative solutions for test planning, test analysis, test design and test execution. Where innovations were identified, we encouraged the team to present experience reports to their peers, to share their solutions and gain feedback. Additionally, we had a team of journalists who helped create and publish white papers and case studies of success stories. This helped prove that skills training had achieved the desired result of helping teams improve their testing, but it had an additional benefit. We got to have conversations with those teams who had yet to take time out for the training or were unsure how to apply what they had learned. We could point them to other teams who had solved similar problems, or offer them tailored support and advice; we noticed a dramatic uptake of CDT and attendance of training due to this program. Not only did we gain the measures of effectiveness we needed to support the training spend, but we did it in a positive way that gave teams credit for the improvements they made rather than judging them for those they had not.

Do you use specific learning aids, tools, resources? If so, can you tell us about them?

We used a variety of learning aids.

Our preference was classroom-based courses, and our survey showed that the majority of attendees preferred that too, due to the focus they could apply to the topic. However, we supplemented these with VC/Webex-based sessions where feasible, to reach our global teams. These were especially effective for training on the more technical aspects of testing, as they could be very hands-on to keep people engaged in the course whilst sat at their desk. We also developed a few ‘self-study’ courses made up of slides with exercises and supported by videos where possible, especially useful as tutorials for new software (e.g. a test automation or mind-mapping tool).

We had an online repository available to the whole test centre. This contained a library of white papers, a ‘techniques’ repository to capture various approaches used by the teams (e.g. mind maps, a wiki knowledge base, etc.), a tools repository where people could share tools and utilities they created for their testing (e.g. data generators or function libraries) and, lastly, an internal ‘facebook’ where people could talk testing, share articles or ask questions of their peers.

Do you have a learning philosophy?

Personally, it’s to learn something new every day. That something can be just a small snippet! Sometimes we might want to aim for a qualification to enhance our CV but personally I have found life more interesting since setting myself a goal to learn or understand something new each day instead of limiting my learning to a training course or certificate.

Accepting that, even with 15 years’ experience, I still have much to learn has certainly reignited my passion for testing, but it applies to my personal life too, especially if that day’s snippet is a better understanding of what makes other people tick.

Is there anything else you’d like to tell us about how you think testers should develop their skills?

Start by working out what skills you need and then look for ways to attain and improve those skills in a way that suits your context. For me it’s not about attending the standard testing certification courses that are available. In a fast-paced, dynamic environment where every project had different constraints and priorities, there was little value in getting every tester certified with a standard public course. This was not going to help develop the skills and strategies needed to survive in their environment. The best way to learn or build a ‘desired skills list’ is to speak with other testers, find out how they do things (or read blogs and magazines for inspiration) and then consider what you can learn and apply from that (or seek a tailored course if needed). Lastly, practice your skills and open yourself up to your peers. Show them how you test, practice discussing your ideas, gain feedback from them. Every interaction you have with others is a chance to gain new skills, regardless of how much experience you or they have.

About Leah Stockley

Leah has nearly 15 years’ experience in software testing for consultancies, banks and software houses across multiple industries. Her context-driven testing awakening began 3 years ago. In that time she has become passionate about inspiring and coaching testers to be innovative, empowered and to continually improve the art of testing. Most recently, at a large investment bank, Leah was responsible for defining and improving the Test Strategy and Testing University across the Global Test Centre, developing solutions to the concerns that arise when introducing and applying new approaches such as exploratory testing. Leah also shares her experiences on www.inspiredtester.com.

The post Training Testers – Leah Stockley appeared first on Ministry of Testing.