Great article. I don't think I'm alone on this, but I'd love to see what Google QA uses for test case management... especially if it's an internally developed one. Is there any news about this?
I'd have to say that this article is most insightful! I view myself as pragmatic and somewhat of a purist when it comes to testing processes and documentation.
This is my takeaway:
For US Federal IT test contractors/consultants, I think the "10 Minute Test Plan" brings to attention a perspective that we all probably were already aware of for some time now - most test plans (documents) are nothing but "paragraphs of prose". The concept of using a "time box" forced the team members to really focus on the three areas of project management that mattered most: time, cost, and scope. Time – as limited as it was, it required finding creative and effective ways to cut corners that were never cut before. Cost – the fear of either losing your job or having your performance look less than par in comparison with coworkers. Scope – the need to focus on what was "really" important to convey to users of the test plan and ignore what was not.
I’ve worked for a variety of Federal agencies and many have templates which contractors have to tailor and adhere to. While it would be near impossible to cull the fat from these test documents, perhaps the real lesson to learn is the necessity of focusing on ONLY what is important to document and ONLY document the things that will be continuously referenced throughout the testing life cycle.
Sometimes less is more when less contains JUST the essentials.
Great experiment, has it succeeded in freeing up your testers to actually do more testing than documentation?
I agree, but the 20% will take a few more hours/days to complete. It's the fine-precision things that take a lot of time: the complicated workflows that verify edge conditions and need detailed descriptions and long lists of expected results. Generating data for a security check or configuring a system for testing is what consumes the most time.
"...plans are useless, but planning is indispensable." – Dwight D. Eisenhower
Sounds really nice, but am I right to assume that it works only for "self-contained" projects, and not really for integration projects, where the final product is a collection of smaller projects? I mean, test plans for single projects are almost always a waste of time, and can be better represented by the code itself[1], but I have yet to find a good way to reduce the size of the test plan for integration projects, where several components are released simultaneously and you should be prepared to test the scenarios you thought of in advance, even without knowledge of the implementation.
[1] Of course, there are cases where a test plan certainly helps, but I usually see the test plan as a tool for test design, not something which should live forever.
If you would share the plans and experiments, or explain how you determined that 80% was complete, this would be a much better post.
80% of the task can take 20% of the time :)
You asked some questions.
Really, isn't 80% enough?
Most of the time maybe it is. If your life, business or your livelihood depended on it, probably not. For example, if your bank calculated your balance correctly 80% of the time, is that enough?
We know full well that we are not going to test everything so why document everything?
We are not even going to test everything we think we are going to test when we exclude some things. We always seem to run out of time. The reason to have a complete list of everything we'd like to test is so that when we report the results of the test we can not only report the results, but also report what we did not test. A test report exists to enable those who make the go/no-go decision to make a well-informed decision.
But, taking into consideration my comments, this appears to be a useful exercise. Time-boxing the planning to a very short amount of time is something I'll make part of my approach. If it generates 80% completion in less than an hour, it is well worth it.
Jason: It's called Google Test Case Manager and will be mentioned/demoed at GTAC 2011.
Rich: That is precisely the idea!
James: I've heard this as "the value is in the process, not the artifact". Couldn't agree more.
Juraci: Google doesn't have any self-contained projects, so I can't say. Everything we have is integrated.
Julio: The appendix of How Google Tests Software will have a complete ACC test plan.
RentonRebel: Calculating bank balances is easy, testing is not. Apples to Grenades dude.
I'd say it takes me longer than 10min to figure out what the product/feature is about.
While I agree that #1 (Attributes) is really needed as part of a Test Plan, I claim that #2 (Components) and #3 (Capabilities) should already be there as part of the requirements.
We do waste a lot of time "planning" items which could be automatically generated by an ALM - if we just evaluate sub-features, an easy calculation can give us a rough estimate of test writing, execution, and automation effort.
That will leave us with lots of free time to really put effort into strategic planning.
The reason most plans are not maintained is that it is simply too hard to do.
Again, proper features in ALM tools can make it feasible, and reduce the time we spend "managing" papers and statistics for status meetings.
halperinko - Kobi Halperin
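The "easy calculation" Kobi describes could be sketched as below. This is a hypothetical illustration: the per-case minute constants and the `estimate` helper are invented for the example, not taken from any actual ALM tool.

```python
# Hypothetical sketch: derive rough writing/execution/automation effort
# from per-sub-feature test-case counts, instead of planning by hand.
# The per-case minute constants are invented for illustration.

WRITE_MIN = 15     # minutes to write one manual test case
EXEC_MIN = 5       # minutes to execute one test case once
AUTOMATE_MIN = 45  # minutes to automate one test case

def estimate(sub_features):
    """sub_features maps sub-feature name -> estimated test-case count."""
    cases = sum(sub_features.values())
    return {
        "cases": cases,
        "writing_hours": cases * WRITE_MIN / 60,
        "execution_hours": cases * EXEC_MIN / 60,
        "automation_hours": cases * AUTOMATE_MIN / 60,
    }

print(estimate({"login": 8, "search": 12, "checkout": 20}))
```

With numbers like these on hand, the strategic discussion can focus on where to spend the automation budget rather than on re-deriving the totals.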
Again, this sounds really nice and I'll try to use it in the future, but I still fail to see how effective (or dangerous) this would be. If you have complex systems with complex interactions and you are spending at most 10 minutes on analyzing the relationships, thinking about which parts are critical to test, and writing down your thoughts, you will certainly miss something important sooner or later.
Unless the idea is to spend 10 minutes on the writing only; then, I agree 100% :-)
One way to think about a test plan is that the result, i.e., the resulting plan, is not important, but the process of writing the test plan is. By asking people to write a test plan, we force them to think about the feature to be tested.
Another way is from the view of the test plan reviewer. As a reviewer, I like to know:
1) whether the tester understands the feature to be tested, such as user scenarios and risk areas;
2) whether the tester has a reasonable testing strategy;
so that I can get some feeling for the testing.
With this experience behind you, could you tell if complex tests were also produced during this phase?
My fear is that, in most cases, only basic and similar kinds of flows would be produced...
Also, do you have any indicators on the "productivity" of these test plans? Did the 80% find the interesting stuff you expected? The most important bugs?
This is indeed a nice experiment. Agree with you that most of the contents of a test plan are nothing but copy-paste from the previous release. Instead of doing this copy-paste, one should rather focus on the actual or important things; then it makes much more sense. Good one.
James,
How does the 10 minute test plan fit in with the CFC (Component Feature Capability) analysis?
Chris
Really liked the post! I do agree that we spend too much time on documentation like the test plan, and we could do a lot better with less.
But we have a hard time changing this, since clients and some process vigilantes keep demanding more...
I would like to ask one question: in which phase is the Test Plan created, and is the Test Strategy covered under the Test Plan or not?
Nice post. I am wondering if you are actually using this approach in daily testing activities (or whenever you need a plan)? If yes, how often and how is that working?
Thanks
This is true and really happening in most product development companies... the thought is very insightful.
Nice article. But I don't agree with the statement below:
"Anything in software development that takes ten minutes or less to perform is either trivial or is not worth doing in the first place."
Because we can do a lot of good things within 10 minutes.
You can draw a DFD, for instance...
James Whittaker has followed up on this blog post with a video presentation on the 'ten minute test plan.' Catch it in full on EuroSTAR TV: http://bit.ly/qisw0H
In the test plan, describe only the system capabilities, the functionality to be tested, and the test schedule; that will be enough to document.
I hate to create the document called Test Plan - because as James W has mentioned it becomes a dead document. In my case, even before test execution starts.
So, I'm going to try this ACC to see whether it works. I've yet to come across a practically meaningful method/approach to derive a great set of test cases.
Btw, I like this phrase in the article: "...We know full well that as we start testing, things are going to change so insisting on planning precision when nothing else obeys such a calling for completeness seems out of touch with reality"!
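For readers who have not yet tried ACC (Attribute Component Capability), a minimal sketch of the grid it produces might look like the following. All attribute, component, and capability names here are made-up examples, not taken from the article.

```python
# Hypothetical ACC sketch: attributes (adjectives the product should
# embody) cross components (nouns: the major parts) to yield
# capabilities (testable statements). All names below are invented.

attributes = ["Fast", "Secure", "Accessible"]
components = ["Search", "Profile", "Sync"]

# Capabilities live at attribute x component intersections; not every
# cell needs an entry.
capabilities = {
    ("Fast", "Search"): ["Results appear within one second"],
    ("Secure", "Profile"): ["Private fields are hidden from other users"],
    ("Secure", "Sync"): ["Data is encrypted in transit"],
}

# Each capability is a candidate test idea; the grid itself serves as
# the (ten-minute) test plan.
for (attr, comp), caps in sorted(capabilities.items()):
    for cap in caps:
        print(f"[{attr} x {comp}] {cap}")
```

The point of the exercise is that listing capabilities at these intersections is fast, and the list maps directly to test cases.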
That's great James :)
I really like this 10-minute approach to test planning. I'm going to try it on my side too, looking for fast output with the most test coverage.
I think we can also try your experiment on test case creation and their execution priority numbers.
Thanks for the great idea :)
Kapil
http://testing-mines.blogspot.in/
Nice! Love the idea! I think it's the most sensible thing I've seen that "upgrades" the thinking around test planning to a similar level as Agile did with highly collaborative planning meetings.
The comments from people who are worried about "incompleteness" assume that adding more time for analysis will get them better Test Plans, which is true, but depending on how you facilitate this meeting (I'll get to that, and maybe this is what you've done), you may get 80% meat, skip a bunch of garbage (the premise of this article), and do it all fast. By reducing the garbage, you're reducing the cost of information maintenance so that more time is spent on effective work rather than BS.
Also, I bet the effort of test plans follows the usual Quality versus Time curve, where increasing quality requires an asymptotic increase in time (getting beyond 80% Test Plan quality is very expensive and likely not worth it except in cases of life and death). So overall, the process "sounds right": getting good enough quality in ten minutes so we can get back to our workstations, and reducing the inventory of useless information which drags at us daily. If done highly collaboratively, leveraging group think, I expect the results to often outstrip the old process of working alone. I will add one important filter: the team (or a large percentage of them) must have some history with the project under test. Otherwise, the old process of working alone or outside the meeting will allow them to develop some competency. In that case they are doing more than creating test plans.
Highly Collaborative Test Planning Meeting (leveraging group think, in situ review and plan creation)
Get the group together, outline the information as Whittaker mentioned (ten-minute time box), and then ask the group to debrief for ten minutes, creating collated notes which become the test plan.
Variations:
During the ten-minute time box, do it as a team brainstorm (properly conducted brainstorms are very efficient, and 10 minutes is a lot of time for a brainstorm; many people don't know the rules of brainstorming).
During the 10 minute timebox, the facilitator silently posts each team member's artifacts (one detail per sticky).
For some apps, the 10 minute process will result in a *few* items that need further research because you don't have the right person in the room, or the topic is obscure, or it needs a PhD to research (I'm serious here). So the ten minute process will allow you to discover the *few* things that will require days to weeks. (Whereas if the team members work alone, lack of communication will drive *most* of the items to require independent research.)
Whittaker, glad you posted this!
Lance Kind
First I would like to say thank you for sharing such a time-saving and cost-saving concept. I watched a short video on this concept, and instantly felt relief. I'm now able to complete 25 test cases in one day by myself.
What video did you watch?
That incomplete 20% may have 80% of the bugs... just like the Pareto Principle :)
Our team is moving to Agile from waterfall and deciding whether we will drop written test plans, so this article is timely. I think the 10-minute idea is great. Writing something down at least forces some focused thinking. At a minimum, a bullet list of key test areas would provide a good list for reviewing the test effort with the developers and PMs.
This concept is really cool. I've been trying to see if it could be used on our projects. I've a few questions though. Do you write test plans at the application level? So, for example, you'd have a test plan for Google Plus. How does it work with features within an application, for example a comment feature? Would you write this type of test plan for things like that? Or is that too granular?
Thanks!
Hi Fayeez, see my comment below.
The 10 minute test plan was an untested theory proposed by James (who no longer works at Google) in 2011. I don't know of a single team in Google that uses this approach in practice.
On this topic, I will be posting a Google Testing Blog article on a very different approach to authoring test plans in the next several weeks. Stay tuned...
A new test planning article is now published:
http://googletesting.blogspot.com/2016/06/the-inquiry-method-for-test-planning.html