I am amazed at how much emphasis people put on technical and domain skills when asking for a tester. Back in the days when I was looking for Test Analyst or Test Lead jobs, I would often be discouraged from applying because the listing would state something like, “Must be proficient in Cobol programming” (yes, I’m that old!), or, “Extensive knowledge of the mortgage industry essential”. Now, two decades later, even the briefest glance at job listings throws up “Previous experience working with CRM systems” and “Experience of working in Telecommunications sector” as essential requirements for Test Analyst roles, and similar statements can be seen in almost every listing. Even then, I used to think to myself, “Well, I’m a good tester. I can read requirements and derive tests from them. Why do I need an in-depth knowledge of the system or business area?”
More recently, as somebody who hired test analysts, I saw three distinct types of applicant: solid testers with little or no experience in the system or domain; poor testers with a lot of experience in the system or domain; and (rarely) solid testers with a lot of experience in the system or domain. Of these three, I won’t deny that the last often worked out best when I could find them, but solid testers without the system experience often proved almost as effective, and they always out-performed poor testers with system knowledge. Today, I would always recruit the career tester who demonstrates an interest in and commitment to understanding the processes of testing but lacks the specific system knowledge, over an applicant with an in-depth understanding of the system but little perceivable interest in or commitment to testing as a career. Why? Because, simply put, it’s almost always easier to teach a good tester enough about a system to test it effectively (which isn’t necessarily to an expert level) than it is to teach a system expert to be a good tester.
I have seen organisations spend enormous amounts of time and money training their testers to be experts in their systems without realising any appreciable benefit from that investment. Especially given the increasing use of collaborative working methods, testers today need a much more rounded set of skills: test process understanding, communication, stakeholder management, risk mitigation and so on. Assuming that strong system knowledge is all that is required to ensure effective testing is a dangerously naïve view.
So, what should we be looking for in our testers? In my opinion, the following attributes, in descending order of importance:
- Commitment to a career as a tester
- Understanding of how to apply effective test processes
- Appreciation of risk mitigation strategies in testing
- Communication skills: the ability to cogently explain the plans and outcomes of testing to stakeholders
- Relevant system and domain knowledge
- Relevant technical knowledge
How does this relate to what I do now: assessing quality processes and advising organisations on how they can optimise and improve the way that they work? As a TMMi Lead Assessor, I speak to people at all levels of an organisation to confirm that appropriate, effective and consistently applied processes are in place to ensure that the risks associated with software development are identified and mitigated through testing activities. The TMMi framework talks a lot about the skills I have discussed above: test process, risk mitigation and communication with stakeholders at all stages of the development lifecycle. It makes little specific mention of technical or domain skills. I know that organisations assessed at a TMMi maturity level are carrying out effective testing and, of course, that many of them will have testers with domain and system knowledge. However, I would contend that this knowledge is secondary to the implementation of the processes that TMMi describes. We need to take this on board and ensure that we recruit testers for the most relevant skills that they have.