By Jim Reardon
The test plan is dead!
Well, hopefully. At a STAR West session this past week, James Whittaker asked a group of test professionals about test plans. His first question: “How many people here write test plans?” About 80 hands shot up instantly, a vast majority of the room. “How many of you get value or refer to them again after a week?” Exactly three people raised their hands.
That’s a lot of time spent writing long-winded documents, full of paragraphs of detail everyone on the project already knows, only for them to be abandoned so quickly.
A group of us at Google set about creating a methodology that can replace a test plan -- it needed to be comprehensive, quick, actionable, and have sustained value to a project. In the past few weeks, James has posted a few blogs about this methodology, which we’ve called ACC. It's a tool to break down a software product into its constituent parts, and the method by which we created "10 Minute Test Plans" (that only take 30 minutes!)
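ACC stands for Attributes, Components, and Capabilities: Attributes are the adjectives the product should embody, Components are its major nouns, and Capabilities are what the product can do at each intersection of the two. As a rough sketch (using a hypothetical note-taking app, not any real Google project), the breakdown is just a matrix:

```python
# A minimal ACC breakdown for a hypothetical note-taking app.
# Attributes are adjectives, Components are nouns, and each
# Capability lives at an (Attribute, Component) intersection.

attributes = ["Secure", "Fast", "Collaborative"]
components = ["Editor", "Sync", "Sharing"]

# Capabilities keyed by (attribute, component); these lists are
# illustrative, not exhaustive.
capabilities = {
    ("Secure", "Sharing"): ["Only invited users can open a shared note"],
    ("Fast", "Editor"): ["Typing latency stays low while editing"],
    ("Collaborative", "Sync"): ["Two users' edits merge without data loss"],
}

# Intersections with no capabilities are candidate coverage gaps.
for a in attributes:
    for c in components:
        caps = capabilities.get((a, c), [])
        print(f"{a} x {c}: {len(caps)} capability(ies)")
```

Walking every cell of the grid, as the loop does, is exactly how missing coverage areas surface: an empty cell is either genuinely out of scope or a gap nobody wrote down.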
Comprehensive
The ACC methodology creates a matrix that describes your project completely; several projects that have used it internally at Google have found coverage areas that were missing in their conventional test plans.
Quick
The ACC methodology is fast; we’ve created ACC breakdowns for complex projects in under half an hour. Far faster than writing a conventional test plan.
Actionable
As part of your ACC breakdown, risk is assessed for the capabilities of your application. Using these values, you get a heat map of your project, showing the areas with the highest risk -- great places to spend some quality time testing.
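The post doesn't spell out the scoring formula; a common convention in risk-based testing, and the assumption in this sketch, is to rate each capability's frequency of use and impact of failure and multiply the two:

```python
# Hypothetical risk scoring: each capability gets a frequency-of-use
# and impact-of-failure estimate on a 1-4 scale, and their product is
# its relative risk. The scale and formula are assumptions, not
# Test Analytics' actual internals.

capability_risk = {  # name: (frequency, impact)
    "Only invited users can open a shared note": (3, 4),
    "Typing latency stays low while editing": (4, 2),
    "Two users' edits merge without data loss": (4, 4),
}

def risk(frequency, impact):
    return frequency * impact

# Sort hottest-first to get a textual "heat map".
ranked = sorted(capability_risk.items(),
                key=lambda kv: risk(*kv[1]), reverse=True)
for name, (freq, imp) in ranked:
    print(f"risk={risk(freq, imp):2d}  {name}")
```

The top of the sorted list is where testing time pays off most.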
Sustained Value
We’ve built in some experimental features that bring your ACC test plan to life by importing data signals like bugs and test coverage that quantify the risk across your project.
Today, I'm happy to announce we're open sourcing Test Analytics, a tool built at Google to make generating an ACC simple -- and which brings some experimental ideas we had around the field of risk-based testing that work hand-in-hand with the ACC breakdown.
Defining a project’s ACC model.
Test Analytics has two main parts: first and foremost, it's a step-by-step tool to create an ACC matrix that's faster and much simpler than the Google Spreadsheets we used before the tool existed. It also provides visualizations of the matrix and risks associated with your ACC Capabilities that were difficult or impossible to do in a simple spreadsheet.
A project’s Capabilities grid.
The second part is taking the ACC plan and making it a living, automatically updating risk matrix. Test Analytics does this by importing quality signals from your project: Bugs, Test Cases, Test Results, and Code Changes. By importing this data, Test Analytics lets you visualize risk that isn't just estimated or guessed, but based on quantitative values. If a Component or Capability in your project has had a lot of code change, or has many bugs still open or not verified as working, the risk in that area is higher. Test Results can mitigate those risks -- if you run tests and import passing results, the risk in an area gets lower as you test.
A project’s risk, calculated as a factor of inherent risk as well as imported quality signals.
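The mechanics might look something like the following, where open bugs and recent code changes push a capability's score up and passing test results pull it down. The weights and the clamping floor are invented for illustration; this is my own sketch, not the tool's actual formula:

```python
# Sketch of signal-adjusted risk: open bugs and recent code changes
# raise a capability's risk, passing test results lower it.
# Weights (2, 1, 3) and the floor at zero are illustrative only.

def adjusted_risk(inherent, open_bugs, code_changes, passing_tests):
    score = inherent + 2 * open_bugs + 1 * code_changes - 3 * passing_tests
    return max(score, 0)  # risk never goes negative

# A capability with inherent risk 10, two open bugs, five recent
# code changes, and one passing test run:
print(adjusted_risk(10, open_bugs=2, code_changes=5, passing_tests=1))  # 16
```

Because the signals are re-imported over time, the same formula keeps the matrix current without anyone rewriting a document.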
This part's still experimental; we're still playing around with how best to calculate risk from these signals. However, we wanted to release this functionality early so we can get feedback from the testing community on how well it works for teams, so we can iterate and make the tool even more useful. It'd also be great to import even more quality signals: code complexity, static code analysis, code coverage, external user feedback and more are all ideas we've had that could add an even higher level of dynamic data to your test plan.
An overview of test results, bugs, and code changes attributed to a project’s capability. The Capability’s total risk is affected by these factors.
You can check out a live hosted version, browse or check out the code along with documentation, and of course if you have any feedback let us know - there's a Google Group set up for discussion, where we'll be active in responding to questions and sharing our experiences with Test Analytics so far.
Long live the test plan!
Awesome!!
I'm already creating my test project to get familiar with all the features...
I can't access Test Analytics. Can you please help me?
Great, I was really looking forward to this day!
I was already building an ACC Excel version based on the presentations...
Thanks for sharing the great tool. I will give it a try.
How do I import the test details? I'm not clear on that.
How large a product does this get used on? Yes, I know that Google has some big products, but how big and complex are the products this tool is used on?
How does this relate to requirements, or is this intended to supplement requirements?
Crazy question, but is this something that could be run inside a company's firewall instead of on Google App Engine?
I like this idea because it looks like it will make developing a rich coverage list fast. And the summary/reporting information you can generate looks useful too. And no long paragraphs are required.
Where do you deal with issues like:
- what test data will be needed
- what test equipment will be needed, e.g. mobile handsets, TVs, browser combos
- estimates of how long testing will take
- what you need to do to put together an integrated test environment -- say you have legacy systems you need to integrate with, etc.
Would you deal with these issues in another document?
Do you have any documentation on how to deploy this code to a tomcat installation or to your own appspot? What configuration or manual work is needed?
I strongly agree that the traditional way of creating a test plan is a big waste of testing resources. Its best use is helping testers think about test cases in a structured way.
In terms of increasing test coverage, an aftermath test case review works better than reviewing a plan that doesn't really last (due to the constant spec changes throughout the development process).
However, the breakdown of the attributes here appears really arbitrary to me. How we ensure the breakdown is logical and efficient is a big question. That is an inevitable problem when we are planning our test efforts.
I do not really understand the risk part. The association of actual test cases to the matrix is questionable in lots of cases. And how many test cases we should run to gain enough confidence for a box in the matrix is also vague. It is more like another nice-looking "fake" test coverage representation to me.
But nevertheless. I think it is a nice tool.
In James Whittaker's webinar "More Bang For Your Testing Buck" you can see a lot of cool graphs used in "Testify". Why is that not included in Google Test Analytics?
Methinks, well, er... can it be? Say it isn't so. Even some at Google have gotten risk wrong. Risk is: Likelihood times Consequence, each independent of the other, and independent of mitigation and test plans. As presented here, "Risk" appears to be more like "Likelihood" reduced by testing. It is a very common mistake; e.g., flying is less risky than driving. What the ACC needs is a third axis or quantifier for "Consequence", which can help prioritize the test plans. Just my humble 2c, since every organization outside of NASA seems to have their own way of defining and managing risk.
Awesome. Thanks for sharing.
All the images are broken! Could you guys fix it?
Image links are busted.
The images in the post appear to be broken. Cheers.
Thanks, the images are now fixed.
Nice information, and with the updated images it looks nicer and more informative.
Can anybody explain to me how I can export data from Test Analytics?
Hello,
Do you know if this project is still active or abandoned?
Thanks