Take your average developer and ask "do you know language/technology X?" None of us will feel any shame in admitting that we do not know X. After all there are so many languages, frameworks and technologies, how could you know them all? But what if X is writing testable code? Somehow we have trouble answering the question "do you know how to write tests?" Everyone says yes, whether or not we actually know it. It is as if there is some shame in admitting that you don't know how to write tests.
Now I am not suggesting that people knowingly lie here, it is just that they think there is nothing to it. We think: I know how to write code, I think my code is pretty good, therefore my code is testable!
I personally think that we would do a lot better if we recognized testability as a skill in its own right. Like any skill, it is not innate and takes years of practice to develop. We could then treat it as any other skill and freely admit that we don't know it. We could then do something about it. We could offer classes or other materials to grow our developers, but instead we treat it like breathing. We assume that any developer can write testable code.
It took me two years of writing tests first, with as much test code as production code, before I started to understand the difference between testable and hard-to-test code. Ask yourself: how long have you been writing tests? What percentage of the code you write is tests?
Here is a question you can ask to prove my point: "How do you write hard-to-test code?" I like to ask this question in interviews, and most of the time I get silence. Sometimes people say, "make things private." Well, if visibility is your only problem, I have a RegExp for you which will solve all of your problems. The truth is a lot more complicated: code is hard to test due to its structure, not due to its naming conventions or visibility. Do you know the answer?
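To make the structural point concrete, here is a minimal Java sketch. The names (InvoiceMailer, SmtpServer) are my own illustration, not from any real codebase: the first class is hard to test because it hard-wires its collaborator inside itself, while the second is easy to test because the collaborator is passed in. Notice that visibility never enters into it.

    // Hypothetical collaborator, for illustration only.
    interface SmtpServer {
      void deliver(String to, String body);
    }

    class RealSmtpServer implements SmtpServer {
      RealSmtpServer(String host) { /* would open a real connection */ }
      public void deliver(String to, String body) { /* would talk to the network */ }
    }

    // Hard to test: the class decides for itself which SmtpServer to use,
    // so every test drags in the real one.
    class InvoiceMailer {
      private final SmtpServer server = new RealSmtpServer("smtp.example.com");

      void send(String customerEmail, String invoiceText) {
        server.deliver(customerEmail, invoiceText);
      }
    }

    // Testable: same behaviour, but the collaborator is injected.
    // A test can hand it a fake SmtpServer and assert what was delivered.
    class TestableInvoiceMailer {
      private final SmtpServer server;

      TestableInvoiceMailer(SmtpServer server) {
        this.server = server;
      }

      void send(String customerEmail, String invoiceText) {
        server.deliver(customerEmail, invoiceText);
      }
    }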
We all start at the same place. When I first heard about testing, I immediately thought about writing a framework which would pretend to be a user so that I could put the app through its paces. It is only natural to think this way. These kinds of tests are called end-to-end tests (or scenario or large tests), and they should be the last kind of tests you write, not the first thing you think of. End-to-end tests are great for locating wiring bugs but are pretty bad at locating logical bugs, and most of your mistakes are logical bugs; those are the hard ones to find. I find it a bit amusing that to fight buggy code we write an even more complex framework which pretends to be the user, so now we have even more code to test.
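As a small, purely illustrative example (the discount rule below is my own, not from the post), a unit test pins a logical bug to a single method and a single boundary value, whereas an end-to-end test driving the whole application could only report that "some total came out wrong somewhere."

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class DiscountCalculatorTest {

      // Logic under test: orders of 100 or more get 10% off.
      static int discountedTotal(int total) {
        // An off-by-one here (e.g. "> 100" instead of ">= 100") is a logical
        // bug that a unit test at the boundary catches immediately.
        return total >= 100 ? total - total / 10 : total;
      }

      @Test
      public void appliesDiscountExactlyAtTheBoundary() {
        assertEquals(90, discountedTotal(100));
      }

      @Test
      public void noDiscountBelowTheBoundary() {
        assertEquals(99, discountedTotal(99));
      }
    }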
Everyone is in search of some magic test framework, technology, or know-how which will solve their testing woes. Well, I have news for you: there is no such thing. The secret of testing is in writing testable code, not in knowing some magic on the testing side. And it certainly is not in some company which will sell you a test-automation framework. Let me make this super clear: the secret of testing is in writing testable code! You need to go after your developers, not your test organization.
Now let's think about this. Most organizations have developers who write code and then a test organization to test it. So let me make sure I understand: there is a group of people who write untestable code and a group who desperately try to put tests around that untestable code. (Oh, and the test group is not allowed to change the production code.) The developers are where the mistakes are made, and the testers are the ones who feel the pain. Do you think the developers have any incentive to change their behavior if they don't feel the pain of their mistakes? Can the test organization be effective if it can't change the production code?
It is so easy to hide behind a "framework" which just needs to be built or bought, and then things will be better. But the root cause is the untestable code, and until we learn to admit that we don't know how to write testable code, nothing is going to change...
My experience is that making sure a design is testable inserts a constraint that helps focus on a simpler design, or one that is organized in a more straightforward way. I often have fewer interfaces and more clarity once I have focused on the issue of verifying my code.
My conclusion at the end of this article is that you've hit the nail on the head. It is tempting to think that knowing how to code is knowing how to test. I have been writing my own tests for three years and in many ways I am just getting started. There is always more that can be removed!
Cheers,
John
Thank you for that. Very true!!
Learning how to test seems to be a never-ending story.
This is a great post! Not sure how such an idea will be taken in the developer community at large. I personally don't know many developers who think (or care) about making testable code.
And... Misko, sorry, but did you read your post for spelling, text errors?
Very good points.
I think knowing or not knowing how to write tests is scenario-based and contextual rather than an absolute fact!
I enjoy writing testable code and I have seen the simpler design it produces in many of my applications, but at the same time I am not too concerned about it, because there are tools out there, e.g. TypeMock, which help you test your code without necessarily making it so-called "testable".
I prefer whatever helps produce simpler and more understandable code/design, including writing testable and reusable code, proper refactoring, etc.
You raised some very good points and I agree developers should think in terms of testing while designing the code. I would say involving a tester is always helpful in the design phase. This way the tester can think of the different hooks and cases which are difficult to test in isolation.
I would say the language is not that important when thinking of testing; it's the actual cases that matter. Somehow after reading this blog I felt there is a need for more white-box testing.
I have to agree with Sue. I really like the post but Misko's spelling/grammar could do with some QA of its own. ;-)
I knew the spelling/grammar police would come a visiting on this article.
ReplyDelete(and yes I put "a visiting" on purpose)
However I don't think it really matters and it's a shame it gets mentioned at all.
The main point of the blog post is easy to make out, and it's the content of the post that should be reviewed or commented on.
As for the post I think the poster hits the nail on the head as previously said.
A few good suggestions would be closer working between the testing and dev teams involved, and setting up peer reviews where the testers get involved at the design stage, in an ideal world before any code is written.
Virtually all devs want to write decent, bug-free code (it's a matter of pride to most of them). So I find that talking things through from a testing point of view usually sparks their interest.
A last hint is to get the devs to hand over their test plans to you as part of the integration/smoke tests.
This way you get to see their test-to-code coverage ratio.
PS - Great Post.
Interesting post here. Well done.
Dave Evans and Mike Scott here http://skillsmatter.com/podcast/open-source-dot-net/testable-software have got some interesting thoughts about testability.
Their course on Test Driven Development, which I attended last week, is pretty impressive too.
Keep up the good work...