Yesterday I went to see Uncle Bob at the Skills Matter Exchange in London. Having read and enjoyed Clean Code and The Clean Coder, it was great to see him in the flesh.
The talk was on automated acceptance tests. Such a simple topic - we all automate our acceptance tests, don't we?
A few points I took away from the talk:
- Can we get our stakeholders to write our acceptance tests? If not, can it at least be business analysts or QAs? If it's developers, you're in trouble!
- Developers are brilliant at rationalising concepts such as "is it done?". Don't trust a developer to tell you something is done!
- Acceptance tests should be automated by halfway through an iteration at the latest if your QAs are going to have time to do exploratory testing
- The QAs should be the smartest people in your team. They should be specifying the system with the business analysts, not verifying it after it has been developed
- Your functional tests are just another component of the system. If that part is brittle, it means your project is badly designed! Go back to the drawing board.
A final point that stuck with me is that acceptance tests don't need to be black box tests. The language they are written in should be high level (it was your stakeholder who wrote it, right?). But the implementation could interact with a version of your system that has the database or HTTP layer mocked out. Think of it this way:
- How many times do you need to test the login functionality of your application? Once!
- How many times will you test it if all your tests go through the GUI/web front end? Hundreds!
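To make that idea concrete, here's a minimal sketch of the kind of test I have in mind. This is my own illustration, not code from the talk: the AccountService and AccountRepository names are hypothetical, and an in-memory repository stands in for the database, so the test expresses the business rule directly and never goes near a login screen or the web front end.

```java
// A sketch only: hypothetical AccountService/AccountRepository, with an
// in-memory test double replacing the database. The test reads at the level
// of the business rule and never drives the GUI or HTTP layer.
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.HashMap;
import java.util.Map;

class AcceptanceTestSketch {

    // Port that production code would implement against a real database.
    interface AccountRepository {
        int balanceOf(String accountId);
        void setBalance(String accountId, int balance);
    }

    // Test double: same interface, backed by a plain map instead of a database.
    static class InMemoryAccounts implements AccountRepository {
        private final Map<String, Integer> balances = new HashMap<>();
        public int balanceOf(String accountId) { return balances.getOrDefault(accountId, 0); }
        public void setBalance(String accountId, int balance) { balances.put(accountId, balance); }
    }

    // Application service holding the business rule under test.
    static class AccountService {
        private final AccountRepository accounts;
        AccountService(AccountRepository accounts) { this.accounts = accounts; }
        void transfer(String from, String to, int amount) {
            accounts.setBalance(from, accounts.balanceOf(from) - amount);
            accounts.setBalance(to, accounts.balanceOf(to) + amount);
        }
    }

    @Test
    void transferringMoneyMovesTheBalanceBetweenAccounts() {
        // Given two accounts (no login, no HTTP, no real database)
        AccountRepository accounts = new InMemoryAccounts();
        accounts.setBalance("alice", 100);
        accounts.setBalance("bob", 10);

        // When Alice transfers 40 to Bob
        new AccountService(accounts).transfer("alice", "bob", 40);

        // Then the balances reflect the transfer
        assertEquals(60, accounts.balanceOf("alice"));
        assertEquals(50, accounts.balanceOf("bob"));
    }
}
```

Login itself would still get one acceptance test that drives the real front end; everything else can go straight to the service layer and stay fast and stable.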
Hearing Uncle Bob speak reminded me that even when I'm working on a project I think is being developed in a fantastic way, with fantastic testing, I can still try to make it better.