Executable Specification


Recently I was listening to an episode of Hanselminutes, a podcast hosted by Scott Hanselman. The episode aimed at getting some insights into BDD, Cucumber, Gherkin and SpecFlow, tools commonly used for acceptance testing in DotNet projects. You can download and listen to the podcast from http://www.hanselminutes.com/default.aspx?showID=267.
While listening to the podcast, something immediately struck me: I was able to relate to it. It reminded me of two separate instances I had encountered on two different projects. In one, the lack of acceptance tests caused us a lot of headache; in the other, a similar situation was avoided by having a simple acceptance test in place. I would like to share both instances here. The point to note is that in both cases we were working in distributed teams across different geographic locations.

Case one: without acceptance tests in place

During this project my team was developing a customer-facing client application. We were consuming a service developed by another team. At the start of development both teams had agreed on the contract of the service, and each side had developed its piece of functionality. Both teams were following Agile and had done thorough unit testing of their code. These tests were passing on either side, and they ran with every check-in as part of the continuous integration process.
At the time of integration a strange thing happened. On the client side, in the code developed by my team, we were expecting a field from the service that was 2 characters long. We had built the client code assuming we would always get 2 characters from the service. Instead, what we got was a long string of many characters. There was an error in the service code: instead of passing the required code down to the client, the service was sending the namespace of the object which held the code. Seeing a long, cryptic namespace appear in place of a simple code was not a good sign. It was a small error and took very little time to fix, but it was really embarrassing for both teams because it was identified at a very late stage in the project, during the integration phase. Luckily for us the change was very minor.
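To make the failure concrete, here is a minimal sketch (not our actual code) of the kind of contract check an acceptance test could run against the live service. The names `get_product_code` and the namespace string are invented stand-ins; the stub reproduces the faulty behaviour we saw.

```python
def get_product_code():
    # Hypothetical stand-in for the real service call. The buggy service
    # returned the object's namespace instead of the agreed 2-character code.
    return "MyCompany.Services.Catalog.ProductCode"

def meets_code_contract(code):
    """The agreed contract: exactly 2 alphanumeric characters."""
    return len(code) == 2 and code.isalnum()

# An acceptance test exercising the real service boundary catches the bug:
assert not meets_code_contract(get_product_code())

# Whereas a value that honours the contract passes the same check:
assert meets_code_contract("AB")
```

A check this small, run against the actual service rather than a mock, would have flagged the mismatch on the first integrated build instead of at the integration phase.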
There have been cases in the past where substantial code changes had to be made and a lot of time was wasted redoing work. On many occasions a lot of time was spent debugging just to make things work.

Case two: with acceptance tests in place

In my current project we are using Fitnesse as an acceptance testing tool. I will not go into the details of what Fitnesse is and how to use it, because there is already a lot of material available on the Internet. This time the situation was similar: a service was providing data to a client. All the code had been unit tested. As part of our definition of done we also had to ensure that all the tests passed, including not just the unit tests but also the acceptance tests developed in Fitnesse. The Business Analysts in our team maintain a set of Fitnesse tests for each user story.
While running the Fitnesse tests, all the tests passed except one for a particular scenario. The BA who had written the story and I started investigating the reasons for the failure. It turned out that one of the values stored in a reference table in the database was wrong. In the Fitnesse acceptance test the BA had written an expectation that the service should return a value, say “ABC XYZ”. When the service call was made, the actual value returned was “ABC  XYZ”, with an additional space between the two words. After fixing the spacing issue all the tests passed and the user story was marked as Done.
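The mismatch is easy to reproduce in a few lines. This is only an illustrative sketch (the values come from the story above, and `collapse_ws` is a hypothetical debugging helper, not part of Fitnesse):

```python
import re

expected = "ABC XYZ"   # value the BA wrote in the acceptance test
actual = "ABC  XYZ"    # value the service actually returned (double space)

# An exact string comparison fails, so the acceptance test rightly fails:
assert expected != actual

def collapse_ws(s):
    """Collapse runs of whitespace into single spaces."""
    return re.sub(r"\s+", " ", s)

# Collapsing whitespace shows the strings agree otherwise, which points
# straight at the spacing in the reference data as the root cause:
assert collapse_ws(expected) == collapse_ws(actual)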


As we can see from the above instances, it can be really helpful to have a suite of acceptance tests to validate your functional requirements. Many people think that good code coverage by means of unit tests is enough, but there are scenarios where unit tests might not give you 100% coverage. Also, because dependencies are almost always mocked in unit tests, you are likely to miss scenarios which will only pop up during integration testing. Acceptance tests provide a good way of doing integration testing and validating the user requirements.
I remember having a brief discussion about acceptance testing with Colin Bird, the founder of Ripple-Rock, while I was in India. Colin's view was that we need not write an acceptance test for each and every scenario, because there is a steep learning curve involved in picking up something new like Fitnesse or any other acceptance testing tool. Instead, we can identify the scenarios which are most critical to the success of the project and turn them into acceptance tests. While doing this we should ensure that the tests we have cover roughly 80% of the scenarios. If those scenarios pass, we can say that all the minimum marketable features, or must-have user stories from the product backlog, have been successfully completed.
Acceptance tests give us early feedback during the development of user stories; we need not wait until the integration stage. They can also be run as part of the automated build process and provide a level of regression coverage as well. It is true that writing and executing acceptance tests takes more time than unit testing, but overall the benefits of having acceptance tests far outweigh the time spent developing and maintaining them. Having reaped the benefits of using acceptance tests, I would recommend them for all types of projects wherever applicable and possible.
If you happen to listen to the podcast, Scott makes the point that your user stories, which might be written on index cards or in Word documents, cannot be executed. There is no way to validate a Word document and say that the user story is working as per the specification. Specification tests are one way of turning the functional specification into executable entities and validating the functionality. Another advantage, I feel, is that when a Business Analyst writes an acceptance test it can actually help him validate his own understanding with the business users. Many times we see things lost in translation, where the business users expect something and the business analyst interprets it differently. These kinds of mismatches can be rectified early in the development cycle, as the business analyst can demonstrate his understanding to the business users before a single line of code has been written by the developers. It can save developers a lot of time, because they can assume that whatever the Business Analyst has created in the Fitnesse tests is what is expected by the business users and the end users.
Fitnesse is just one option. In the podcast Scott Hanselman also discusses other tools like Cucumber and SpecFlow. All these tools basically help you do the same thing; Fitnesse seems to be the most widely used. In my previous company I know that some folks were using Cucumber as well as SpecFlow. It's really a personal choice: you can pick and choose what suits you and your team's needs.
Until next time, Happy Programming!


  1. Good post, Nilesh. Investing in quality techniques that address the lower-level tiers have a much better return on investment. I blogged about it myself here: http://www.jessefewell.com/2011/01/22/the-quality-iceberg-game/

  2. @Jesse Thanks for the comment. I read your post and fully agree with you. In an Agile project, since the focus is on delivering working software and quality over quantity, it's very important that we build quality in right from the start. In the past I have seen people thinking about quality only during the later stages of development. In my opinion that defeats the purpose of Agile. If we don't deliver quality software then we are wasting our customer's money and the team's effort and time, both of which are invaluable to the success or failure of a project.

  3. It’s very useful, quality information shared on integration testing.
    Indeed a good share.