
Unit Tests... Time Waster or Life Saver?

It's been my experience that there are two camps when it comes to writing and using Unit Tests.  Those in favour say they save time and money (saving one is saving the other), while those against complain about the time it takes and the repetitiveness of writing them.

Automated Unit Tests Save Time and Money


My first position in the Software Development industry was installing and supporting software at the Client's site.  We would download the latest build from the Development team, install it, test it, and report any issues found.  With any luck, we found and fixed any problems before the Client saw them.  Too often, though, they found them first.  It was with this first experience as a fledgling member of a new company that I started to see the challenges of NOT having high-quality, automated Unit Tests.  Too many times to count, I was left scrambling to resolve new issues in code both old and new.

With this new perspective on what DOESN'T work, I was determined to improve quality.  When I was promoted to Development, that is precisely where I started.  And that is where I ran into the challenges below:

Reluctance

I began writing Unit Tests, and encouraged other developers to do the same.  I gave several 'Lunch-N-Learn' seminars and demonstrations on how to write good Unit Tests and practice Test Driven Development, and showed the time savings associated with them.  To my surprise, most Developers still didn't want to write tests.  Writing Unit Tests was, in their minds, an unnecessary, time-consuming task that took time away from adding functionality.  After all, their bonuses were based on functionality added, not tested.

Code Coverage

As I continued to write my tests, I also added coverage reports to the build process.  Management liked being able to see how much of the code was covered by automated tests, and required the other developers to cover any new code to at least 80% line coverage.



That seemed like a good idea, and - to be honest - I was all for it.  What I didn't foresee was that the other developers continued to view writing tests as a waste of time.  They were writing their tests AFTER their functionality had been added and tested, and put the bare minimum effort into them.  As a result, their tests were a dog's breakfast: terrible.  They would write a test that invoked their code, and it would succeed as long as no errors were thrown.  They included no checks to verify that the code actually did what it was intended to do, and no testing of edge cases or failure conditions.  Yes, they met the 80% code coverage rule to appease Management, but their tests weren't actually testing anything.  The only thing those tests verified with any degree of certainty was that no error was thrown for the exact inputs the test happened to use.
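To make this concrete, here is a minimal sketch of what such a coverage-only test looks like.  The function and names are hypothetical, and Python's unittest is used purely for illustration; the same pattern shows up in any language or framework:

```python
import unittest

# Hypothetical function under test: applies a percentage discount to a price.
def apply_discount(price, percent):
    return price - (price * percent / 100)

class TestApplyDiscountBadly(unittest.TestCase):
    def test_apply_discount(self):
        # Executes the code (earning line coverage), but asserts nothing.
        # The test passes as long as no exception is raised, even if the
        # calculation is completely wrong.
        apply_discount(100, 10)

if __name__ == "__main__":
    unittest.main()
```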

False Sense of Security

From a code quality perspective, having no tests is better than having tests that don't actually test anything.  Having automated tests and the associated coverage reports gives a feeling of security about the quality of the software, but when those tests don't actually verify anything, that feeling is completely unjustified and misleading.

Unit Test Quality

Code coverage by itself cannot be used to determine the quality of automated Unit Tests.  The only way to determine how good a suite of unit tests is, is to review the tests manually.  Each method under test should have several test cases covering a variety of input parameters, each of which checks that the method did what it was supposed to do, handles errors appropriately, and verifies the outcome.
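Continuing the hypothetical example from above, here is a sketch of what a more useful suite for the same function might look like: several input values, the boundary cases, and the failure path are all asserted, not just executed:

```python
import unittest

# Same hypothetical function, now validating its input.
def apply_discount(price, percent):
    if percent < 0 or percent > 100:
        raise ValueError("percent must be between 0 and 100")
    return price - (price * percent / 100)

class TestApplyDiscount(unittest.TestCase):
    def test_typical_discount(self):
        # Verify the outcome, not just that the call succeeded.
        self.assertEqual(apply_discount(100, 10), 90)

    def test_edge_cases(self):
        # Boundary values: no discount and full discount.
        self.assertEqual(apply_discount(100, 0), 100)
        self.assertEqual(apply_discount(100, 100), 0)

    def test_invalid_input_raises(self):
        # Failure conditions are tested explicitly.
        with self.assertRaises(ValueError):
            apply_discount(100, -5)
        with self.assertRaises(ValueError):
            apply_discount(100, 150)

if __name__ == "__main__":
    unittest.main()
```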

So What's The Upside?  How Can We Change Perceptions?

There can be a lot of resistance to writing tests, and shoddy tests get written just to hit the 80% line coverage figure Management seems to have fallen in love with.  The question becomes: how can we show that this upfront effort will save time and money?  How can we show that branch coverage is a better target, and that the quality of the tests is vital to making the whole exercise worthwhile?
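As a sketch of why branch coverage is the better yardstick, consider this hypothetical function and its single test.  The test earns 100% line coverage yet exercises only one of the two branches:

```python
import unittest

# Hypothetical function: orders of $50 or more ship for free.
def shipping_cost(order_total):
    cost = 10
    if order_total >= 50:
        cost = 0
    return cost

class TestShippingCost(unittest.TestCase):
    def test_free_shipping(self):
        # This single test executes every line above (100% line coverage),
        # but the branch where order_total < 50 is never taken, so a bug in
        # the paid-shipping path would slip straight through.
        self.assertEqual(shipping_cost(75), 0)

if __name__ == "__main__":
    unittest.main()
```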

The Code - Deploy - Start Container - Navigate - Test - Debug Cycle

Develop Build Test Cycle

Following Test Driven Development practices actually saves development time for me.  Most software runs in some type of container (i.e. a web server, mobile OS, etc.), so testing it the way a User would see it requires the code to be deployed to the container and the container itself to be started.  The cycle of code --> deploy --> start the container --> navigate to the area of the application --> test can be a long one.  If you repeat this process over and over again, you can see where the time gets wasted.  However, if the code is exercised by unit tests instead, this process is reduced to a fraction of the time (even more so if mocked tests are used).  Plus, the tests could (and I argue should) be executing different scenarios, so many scenarios are tested at the same time during the development phase.  If you are using an expensive tool that does not require the container to be restarted when code changes are made, the start-up step is removed, but you are still able to test only one scenario at a time.  I have worked on projects where it was extremely difficult (and therefore time-consuming) to set up the data needed to move from one scenario to another.  Automated Unit Tests change all that.
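As an illustration of the mocked-test shortcut, here is a hypothetical sketch using Python's unittest.mock.  The external dependency (a payment gateway that would normally live behind a deployed container) is replaced with a mock, so several scenarios run in milliseconds with no deploy, start-up, or navigation step:

```python
import unittest
from unittest.mock import Mock

# Hypothetical service: in production this would call a real payment gateway
# running behind a web server; in the test the gateway is mocked out.
class CheckoutService:
    def __init__(self, gateway):
        self.gateway = gateway

    def pay(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

class TestCheckoutService(unittest.TestCase):
    def test_successful_payment(self):
        gateway = Mock()
        gateway.charge.return_value = "OK"
        service = CheckoutService(gateway)
        self.assertEqual(service.pay(25), "OK")
        gateway.charge.assert_called_once_with(25)

    def test_invalid_amount(self):
        # A second scenario in the same run: no redeploy, no container restart.
        service = CheckoutService(Mock())
        with self.assertRaises(ValueError):
            service.pay(0)

if __name__ == "__main__":
    unittest.main()
```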

Cost Savings

It is industry-wide knowledge that the cost of fixing a problem in Production code is greater than the cost of fixing it during the Development phase.

Relative Cost of a Bug Fix
Running more scenarios, more often, and as early as possible in the life of the software means bugs are found earlier, whether the testing is automated or manual.  But automated tests run faster and can run more often (e.g. on a scheduled basis), so bugs are found even earlier, and are therefore less expensive to fix.

Quality Code Stands the Test of Time

Any good piece of software will be continuously improved, added to, and restructured (the restructuring part is called Refactoring) throughout its lifetime.  The only way to ensure that it still works as intended is to test it, and to test all scenarios.  This can be done manually, but as new functionality is added to the product, that manual pass takes longer and longer each release.  If those scenarios are automated, they can be executed and the results validated very quickly.
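A small hypothetical sketch of that safety net: the implementation is restructured, the test is untouched, and a passing run is the evidence that the behaviour survived the change:

```python
import unittest

# Hypothetical refactoring: a loop-based implementation is replaced by a
# simpler built-in call. The test below does not change, so it acts as the
# safety net proving the behaviour is preserved.
def total_quantity(order_lines):
    # Original version:
    #     total = 0
    #     for line in order_lines:
    #         total += line["qty"]
    #     return total
    return sum(line["qty"] for line in order_lines)

class TestTotalQuantity(unittest.TestCase):
    def test_totals_all_lines(self):
        lines = [{"qty": 2}, {"qty": 3}, {"qty": 5}]
        self.assertEqual(total_quantity(lines), 10)

    def test_empty_order(self):
        self.assertEqual(total_quantity([]), 0)

if __name__ == "__main__":
    unittest.main()
```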

Cost of Automated VS Manual Testing

Attrition

People do not stay in an organization forever.  Be it the Developer(s) who wrote the code, the Tester(s) who tested it, or the Business People who thought it up, eventually those people move on.  Having the tests automated in code ensures that the original functionality keeps working as intended, and there is always a record of what the code does and how to use it.

Automated Unit Tests Save Time and Money

I have heard almost every reason (excuse) not to write Automated Unit Tests.  I have found each and every one of them to be completely untrue, and they are generally spoken by Developers who just don't know how to write proper tests.





Picture credit: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggjxxKISvxdmRw0TyzxyuFiRTci7Z3ikZBal77aB_UaZac0WqJelryd0cT-O8jPnwzb9JwPoIneOSx7H09_95RINBMr6Pi1MDjCgcTeYLTcjKUhYLZdf9Yvh8sptKSjfuUA-8GGCEj7zTH/s320/bigstock-Timeismoney-81634613.jpg

Picture Credit: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlzvITPXbuXoHybvBFWvNC4_gLU7dAIY0pZY7hC-UTeiGV_kg8_ZULMbcymTJUqwL5zgEqPnPzP_xLfUT65MttFs33aE_Ueqe2HT8pvbNATIEm9sCz7LEUDzkffzqNWqcmOcWM8x6YcQHS/s320/Code+Coverage.jpg

Picture Credit: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisOs1vsZHEPts57DsATS-hWq1k8nkWaiHTChbyoynadbclviq56_Ifv3tEzvnLPZ1E-fWQmRLTbJReSco5ibkvkCeju_dK3RGSkbFtlyTGkIMi-E3UCDohlcmhM92BqYuwbcLwXYV8nUQB/s200/continuous-integration.png

Picture Credit: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgM0eqeuccBCyGdZmZkVxwKMatWVAniOA36so5KJ-HLkxKD29caiDXg6F9SFVJ4oxFVpHmV8aaDCn8A7kyHWarzOsCjW2-FN9sBGg39nhGXY03ki9RP_LHPZycboBnMU9H0VgH0zT2qA8xB/s320/relativecostbugfix.png

Picture Credit: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhovhnUylpWZyZReL94JniDGPs0JrWHIvJGD-Vri3Ic9IG9_1f-J87EkGB4H5ColVyNsfGd_i-tbuJQyMMZke1shcyYQqPZQmVlLqRs3z9fIm5ZUQ_ZttXNMbsKUf_PcApva_fT2m1Yb2GN/s320/Cost+Automated+VS+Manual+Testing.png