Test-Driven Development (TDD) By Example



TDD (Test-Driven Development): Putting Automated Unit Testing at the Center of the Software Engineering Lifecycle

Test! Test!! Test!!! Today we're going to discuss TDD (test-driven development). TDD has been talked about for many years as potentially providing benefits, yet it is still not widely embraced at many payment systems companies. Why TDD?

You see, the code we put into payment systems production sometimes (often?) includes defects relative to requirements (in both design and coding).

Let's consider whether the methodology known as Test-Driven Development (TDD) would help.

But first, a bit of historical perspective and a definition of what TDD is all about.

Is the absence of TDD a problem that we should seriously undertake to remedy?

Do TDD's benefits outweigh its costs? Should it be strictly focused on functional testing, or be extended to non-functional testing?

So today we explore these questions together, from the standpoint of both practice and theory.

TDD Definition, Goals, and History

Like Agile, TDD is aimed at increasing software development productivity and software quality, and at introducing frequent checkpoints at which new customer requirements can be adopted or releases can be put into Production more quickly.

TDD's particular emphasis is to place responsibility on developers to create automated tests that not only validate software behavior, but actually define the living requirements for application change (in lieu of rigid, formal written specifications of application requirements and designs).

TDD also prescribes a particular software development process to support that objective.

Here are some of the assumptions regarding the goals of the software engineering process that underlie TDD:

1.  Requirements - We want requirements to be precise, actionable, and therefore testable.

The most precise requirement is one that is coded as a test. User stories and other written expressions of requirements are very important to get the process started, but the faster the requirement can be codified as a test, the better.

2.  Design - Code designs should reflect business requirements, as well as best design patterns and implementation techniques.

Einstein said, "Everything should be made as simple as possible, but no simpler"; accordingly, our code designs shouldn't be fancier or more complex than product requirements suggest. But requirements change, so code design shouldn't be fragile with respect to evolving requirements either.

Finally, except at a high level, code design cannot be divorced from code implementation; Agile has taught us that prototyping and testing alternative detailed designs is preferable to spending a lot of time creating extensive detailed design documents.

3.  Coding (Implementation) - Application code will be testable from the start, so that defects will be identified and fixed much earlier (certainly before we release code, if we're faithful to high levels of requirement coverage in our testing).

The code implementation will not be fragile. It will accommodate future requirements and be amenable to refactoring, as required.

It will be measurably as defect-free as possible. It will be readable and maintainable, even by beginners. Redundant, duplicate, and dead code will be eliminated.

4.  Automated Tests - Unit test code should provide as close to 100% branch and statement coverage of the application-under-test as possible.

That code should faithfully embody and validate the requirements of the application-under-test - both business requirements and technical, non-functional requirements (NFRs).

Test code should leverage test frameworks. It should not be fragile. It should accommodate future requirements and be amenable to refactoring, as required.

It should be readable and maintainable, even by beginners. In short, it should be of as high quality as the application code itself.

TDD History

Historically, it is no accident that the person who introduced the concept of Test-Driven Development is Kent Beck, one of the 17 original signatories of the Agile Manifesto that initiated Agile software development.

Beck was also the creator of Extreme Programming (XP), one flavor of Agile that gained a great deal of attention.

Beck's commitment to testing as central to the software engineering lifecycle was not only theoretical - he personally pioneered the xUnit series of testing frameworks (notably JUnit for Java).

Beck actually wrote JUnit with Erich Gamma, another notable figure, and here the plot thickens even more. Erich Gamma was one of the original Gang of Four (GoF) who started the Software Design Patterns movement by co-authoring the highly influential object-oriented design textbook Design Patterns: Elements of Reusable Object-Oriented Software.

So here we see two innovators contributing to automated testing frameworks and also introducing two of the most influential software ideas of our time - improving recurring software design best practices (Design Patterns) and improving software engineering best practices (Agile).

Like many payment systems companies, we already pursue Design Patterns and Agile to improve our engineering lifecycle; perhaps we could stand to improve by following TDD as well.

TDD Process and Assumptions

TDD accomplishes these goals with the following step-wise process:

1.   A desired application improvement, functionality, or non-functional quality is identified.

2.  An automated test case is written that defines how, in the context of the application code, that desired improvement or function would be implemented and tested in the application.

Since the automated test case relies on changes to internal interfaces of the application-under-test that haven't been built yet, the new automated test inevitably fails.

3.  A developer produces the minimum amount of application code needed to pass that test, experimenting with the implementation (and its underlying design) until all the automated tests that validate correct functioning of the improvement or function pass.

4.  Developers refactor the code (both application and test code) to reflect desirable coding standards across the application and the unit tests (satisfying the code qualities mentioned in the preceding section).

During this process, the application code and the test cases co-evolve, and all tests continue to pass on every iteration.

5.  Every check-in of new application improvement or functional code is accompanied by the full complement of new automated test code that satisfies all the code coverage and other qualities expressed earlier.

If the tests don't run successfully at check-in, then the check-in is automatically backed out, and the developer goes back to the drawing board.

6.  As additional improvement, functionalities, or non-functional quality requirements arise, the cycle iterates.

From the TDD developer's perspective, there are very stringent definitions and rules at play during this process:

A.  Test Failure - An automated test is defined to fail if it does not compile or does not run successfully.

B.  When initially writing an automated test, it must fail - If it initially succeeds, it's not actually providing validation for the new requirement (since, by definition, pre-existing tests already provided complete coverage for pre-existing application capabilities).

C.  Only write sufficient code to pass the new test - You may not write more application code than that, because the excess would not be covered by any test.

If, to satisfy the new requirement, more application code needs to be written, first go back to Step 2 and write the test for that additional application code.

D.  During the process, refactor only when you are sure it is absolutely needed to make subsequent effort efficient -

Software development is a continuous process, so continuously take the time to pause and assess, but don't over-react to the need for refactoring.

The first implementation (even though it passed the tests) may have maintainability, readability, or other challenges, but it is more important to strive for fully functioning code first (with fully covering unit tests) than it is to refactor before you have reached that milestone.

A story later in this article helps explain why.

E.  Rinse-and-repeat the process - until the requirement(s) are satisfied, test coverage is 100%, and the code is adequately refactored.
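To make Steps 2 and 3 concrete, here is a minimal sketch of a single red-green iteration using JUnit 4. The FeeCalculator class and its flat-fee requirement are invented purely for illustration; they stand in for whatever improvement your test is defining:

    // Step 2: write the failing test first. FeeCalculator does not exist yet,
    // so this test fails (it does not even compile) - the "red" state.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class FeeCalculatorTest {
        @Test
        public void flatFeeIsAppliedToSmallPayments() {
            FeeCalculator calc = new FeeCalculator();
            // Invented requirement: payments under 100.00 carry a 0.30 flat fee.
            assertEquals(0.30, calc.feeFor(50.00), 0.001);
        }
    }

    // Step 3: the minimum application code that turns the test "green".
    // No branching, no configuration - nothing the current tests do not demand.
    class FeeCalculator {
        double feeFor(double amount) {
            return 0.30;
        }
    }

When the next requirement arrives (say, a percentage fee for large payments), a new test will fail against this hard-coded constant and pull the real branching logic into existence. In this way, every line of application code is demanded by some test.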

What Is TDD, and Why TDD?

TDD stands for "Test-Driven Development". TDD produces the application source code itself, plus the code for automated unit tests; together these become the central artifacts of the software engineering lifecycle from soup to nuts, as they say.

Automated test source-code formalizes, in human readable form, the requirements for the application-under-test.

All preliminary write-ups of requirements (for example, user stories in the Agile tradition), however useful initially, are not the actual requirements to be realized by the application - the automated tests are the requirements.

That's because if you can't validate a requirement, it's not really a requirement (kind of like the old philosophical adage: if you weren't there to see the tree fall in the forest, you can't claim it fell).

Working tests never become outdated or irrelevant the way requirements documents do.

The payback from investing in thorough testing is high assurance of requirements validation if the tests are complete relative to requirements, and they pass.

Additionally, using the TDD process (especially in conjunction with Agile), the application source code in combination with the test source code formalizes, in human-readable form, the design of the application.

Again, all preliminary or high-level write-ups of design intent, however useful initially, are not the actual design as realized by the application - the code provides that.

Design documents may become outdated and irrelevant very quickly (or even lost).

TDD Advantages
Make it Work and Tested, Then Make it Right

Among the advantages of test-driven development: getting code to work (as proved by passing a set of automated tests) provides a benchmark, or reference point, for getting the code to work correctly.

This is a huge advantage of TDD - it gives us rock-solid tests by which to make the journey from "just working" code to the right code in an efficient manner.

Sometimes making it work is as easy as copying-and-pasting code, or adding a few lines to an existing method.

And although duplicate code, long methods, or long complex branches are not acceptable for Production, code that works is a starting point for validating that requirements are being satisfied, and then for testing as one refactors to meet acceptable code quality standards.

After just making it work, there may be opportunities for re-design with an entirely different internal class structure, or simply for simplifying the interfaces and/or implementations of existing methods.

The first version that just works could be a maintainability disaster, but without tests in place to facilitate refactoring, the refactoring process could likewise be time-consuming and error-prone.
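As a hedged sketch of that journey (the DiscountService class and its discount rules are invented for illustration), a first version that just works might duplicate logic across branches; once the tests pass, the refactoring step consolidates the duplication while the same tests prove nothing changed:

    // First pass: duplicated branches that nevertheless pass the tests.
    class DiscountService {
        double priceFor(String tier, double base) {
            if (tier.equals("GOLD")) {
                double discounted = base - (base * 0.20);
                return Math.round(discounted * 100.0) / 100.0;
            }
            if (tier.equals("SILVER")) {
                double discounted = base - (base * 0.10);
                return Math.round(discounted * 100.0) / 100.0;
            }
            return Math.round(base * 100.0) / 100.0;
        }
    }

    // After "make it right": duplication extracted, behavior identical,
    // and the pre-existing tests validate the refactoring mechanically.
    class DiscountServiceRefactored {
        double priceFor(String tier, double base) {
            return roundToCents(base * (1.0 - discountRate(tier)));
        }

        private double discountRate(String tier) {
            switch (tier) {
                case "GOLD":   return 0.20;
                case "SILVER": return 0.10;
                default:       return 0.0;
            }
        }

        private double roundToCents(double amount) {
            return Math.round(amount * 100.0) / 100.0;
        }
    }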

Isn't there a risk of thinking more about how to create and manage changing tests than about coding the application?

Let's think about this question in the context of stable vs. flexible code interfaces and stable vs. flexible implementations.

For functional tests, stable interfaces make things easy - if the tests are truly providing adequate coverage, implementation changes for stable interfaces will be easily validated by existing tests.

For example, for library code whose basic feature set is designed well at the outset, if the tests properly test the initial implementation, then they will be equally useful in validating underlying implementation changes.

One can have reasonable confidence that those changes will not wreak havoc with the rest of the system.

What about classes implementing business logic, which might require more flexibility at the interface level? Test-driven development helps you maintain discipline during the design process in a couple of areas.

First, when initially designing and implementing the interface, one will design it to be as stable as possible, in the interest of minimizing changes to the tests that validate it (as well as to other software that depends upon that interface).

Second, even if the interface must change in the future to satisfy new requirements, building up the new version of the test against the new interface will benefit from the existence of the earlier version.

Contrasting the old test code with the new can help isolate which parts of the implementation are affected by the interface change, so that tests can be developed to focus on those areas.

This doesn't always work, but it can help with code maintenance. There are certain pieces of code that change over time that, if not for the availability of unit tests, would be very difficult to change at all.

In short, if the original design of your application code and its tests is sound, and anticipates change so as to maximize interface stability, the process of interface or implementation change should be manageable.

TDD Unit Testing Coverage

So, how many tests do we need? How much essential code (or how many essential code branches) should be covered? 100% is the correct answer; otherwise the goals of TDD are compromised.

And when following TDD's process of writing tests first, it's much easier to achieve 100% coverage.

This will contribute to QA finding nothing wrong when validating a new build, and to deployments to Production that never fail.

Regarding the attitudes of others in the industry towards test coverage, one of our senior developers tells the following story:

Three or four months ago, he took a professional programming test designed to assess the highest level of software architecture and Java programming proficiency.

Round 1 of the test consisted of 4 Java EE (Servlet, JSP, EJB, etc.) coding questions to be solved quickly, which he did well.

Round 2 was a test of software engineering judgment, in which he was expected to transform a pre-existing software project with known defects into a high-quality and maintainable application, including possibly refactoring for Enterprise operation.

He chose to refactor first (creating distinct model, service, and DAO classes, and re-organizing artifacts into conventional Java project locations) before investing his limited time in creating unit tests; he judged that attaining 80-90% coverage on only the most important classes was good enough.

He received feedback that he had made the wrong call, and that creating much higher, more consistent test coverage was a prerequisite to the ongoing elimination of defects as the application evolved.

The lesson he took away is that if he had provided the tests first, he would have had the basis for refactoring in an error-free fashion later.

The route he chose was risky, both relative to his immediate goals of finding all the existing defects, and with respect to the goal of creating a body of unit tests by which others could validate future changes as well (a characteristic of Enterprise applications).

TDD Benefits

The same developer tells the story of asking the following questions of people who came to him for help with bugs on their projects:

Why is the code so bad? Why are they only performing minimal changes, and not refactoring the code where it obviously needs it?

Answer: we are afraid of change.

Everybody has a story about this; here is one of his own.

He saw some custom collection classes in a software platform that operated against the Object type, which he wanted to change to operate against the exact type(s) employed, using generics.

It turns out he didn't realize the component was also used by a legacy open software platform, which used a different type, so upon release, the open software product failed.

Was the mistake that he should have been afraid of change? No, the mistake was not having automated tests that would have caught the problem during pre-Production testing.
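A characterization test written from the legacy consumer's point of view is the kind of safety net that would have helped. The sketch below is hypothetical - LegacyBag and LegacyRecord are invented stand-ins for the real classes - but it shows the idea of pinning the old contract so that narrowing the accepted type fails in the lab rather than in the field:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Minimal invented stand-ins for the real platform classes.
    class LegacyRecord {
        final String id;
        LegacyRecord(String id) { this.id = id; }
    }

    class LegacyBag {
        private final java.util.List<Object> items = new java.util.ArrayList<>();
        void add(Object item) { items.add(item); }   // historically accepts any Object
        int size() { return items.size(); }
    }

    public class LegacyBagCompatibilityTest {
        @Test
        public void stillAcceptsTheLegacyPlatformsElementType() {
            LegacyBag bag = new LegacyBag();
            bag.add(new LegacyRecord("abc"));  // the element type the legacy platform stores
            assertEquals(1, bag.size());
            // Narrowing LegacyBag to a generic of only the "exact" new type
            // would stop this test compiling - failing fast, before release.
        }
    }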

With 100% code/branch coverage of automated tests, there is no fear at all to enhance the code for new functionality, refactor it to improve design or maintainability, or to improve the NFRs.

If we don't have 100% coverage (or close to it), every change has risk. Change is inevitable, but uncaught fatal bugs are not inevitable.

To be immobilized by fear will undermine our effectiveness in moving forward. The way to overcome that fear is with automated tests. TDD forces the discipline to do so, across the board.

Tests Should Be F.I.R.S.T.

What does it mean to make one's automated test code clean and readable?

The criterion is whether tests lose their usefulness and/or understandability after a few iterations. F.I.R.S.T. is a handy acronym to help you remember the important characteristics of your tests:

F.  Fast - Tests should be fast. If you can't run them quickly, then you won't find the issues early in the development cycle, where they're least expensive to deal with.

Heavy integration tests should not be run as part of a build (hence the focus should be on unit tests, not complex integration tests). It's better to run those tests separately, for example on a dedicated CI server such as Jenkins.

I.  Independent - Tests should not depend on each other. You should be able to run them in any order.

R.  Repeatable - Tests should be repeatable in any environment, whether it's Dev or QE.

S.  Self-Validating - Tests should either pass or fail. You should not have to examine a log file to figure out whether the tests pass.

T.  Timely - Write the tests before the code; otherwise, your code may not be testable.

Ideally, a test should be as simple as this: prepare input, pass it to the method under test, and compare the output of that method with the expected result.
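For instance, a test in that ideal shape might look like the following sketch (CurrencyFormatter is an invented example class, included inline so the sketch is self-contained):

    import java.util.Locale;
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Invented class under test, kept trivial for the example.
    class CurrencyFormatter {
        String format(long cents) {
            return String.format(Locale.US, "$%,.2f", cents / 100.0);
        }
    }

    public class CurrencyFormatterTest {
        @Test
        public void formatsCentsAsDollarsAndCents() {
            CurrencyFormatter formatter = new CurrencyFormatter(); // prepare input
            String result = formatter.format(123456);              // pass it to the method under test
            assertEquals("$1,234.56", result);                     // compare with the expected result
        }
    }

No database, no network, no shared state, no log files: the test is fast, independent, repeatable, and self-validating.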

Functional tests should not depend upon implementation details (although, as we will see later, non-functional tests expressly need to test the implementation details).

And if you do not follow the TDD process of writing tests first, you will likely need to write the tests eventually anyway - after you've struggled to find bugs manually because you didn't have the tests in hand earlier. So give writing the tests first a try.

TDD Test Code Should Be Readable

We write code with words and sentences, so it's only logical that developers should write code that reads like well-written prose.

There are several techniques and practices that help developers express the logic of their programs in a well-written form. Here are some of them:

  • Choose readable names for your variables, methods, and classes. Names should expose their intent.
  • Don't use Hungarian notation (at least for strongly typed languages like Java).
  • Keep your public methods short.
  • Keep your method implementations short (ideally 3-5 lines).
  • Avoid long expressions having more than 3 operators (if it's not a one-liner). You can always refactor a long expression into a new method (see the sketch after this list).
  • Divide and conquer.
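Here is a small hedged example of the last two practices (the Order, Customer, and ReviewPolicy classes and the review rule are invented for illustration): a five-operator expression is divided into named, intention-revealing helper methods:

    // Minimal invented stand-in types.
    class Order {
        double total; int items;
        Order(double total, int items) { this.total = total; this.items = items; }
    }

    class Customer {
        int years; boolean blocked;
        Customer(int years, boolean blocked) { this.years = years; this.blocked = blocked; }
    }

    class ReviewPolicy {
        // Before: one expression with five operators - hard to scan.
        boolean qualifiesBefore(Order o, Customer c) {
            return o.total > 1000 && c.years > 2 && !c.blocked && o.items > 0;
        }

        // After: the same logic, divided and conquered.
        boolean qualifies(Order o, Customer c) {
            return isEstablished(c) && isMeaningful(o);
        }

        private boolean isEstablished(Customer c) { return c.years > 2 && !c.blocked; }
        private boolean isMeaningful(Order o)     { return o.total > 1000 && o.items > 0; }
    }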

Many of these practices are documented in the books recommended at the end of this article.

Amazon's Focus on Testing Non-Functional Aspects of the Implementation

The development organization of Amazon AWS is highly focused on automated tests, but with an interesting difference from classic TDD from which payment systems companies can learn - in how automated testing might contribute to NFR validation, capacity planning for SaaS applications, and the manageability/supportability of those applications in Production.

Recently, we interviewed a Senior Manager at Amazon regarding Amazon's perspective on the role of automated testing and TDD in the development of AWS services.

That manager is responsible for implementing their new Elastic File System (EFS) service, whose sole purpose is to provide scalable file storage for use with Amazon EC2 instances.

The service creates and configures NFS-compatible file systems and mounts them to Amazon EC2 instances with a standard file system interface and file system access semantics.

File systems' functional requirements are well understood, and there are plenty of functional test suites available for them.

The manager interviewed described the functional testing challenges as relatively trivial.

Instead, most of the challenges are non-functional: EFS is designed for high availability and durability, and it provides performance for a broad spectrum of workloads and applications, including Big Data and analytics, media processing workflows, content management, and web serving.

The scale at which EFS is used in Production is extraordinary - many thousands of server and I/O infrastructure instances spread all over the world, with every aspect of the service implementation needing to handle massively concurrent operations, fast response times for each, and immense aggregate I/O throughput.

What is really difficult is automatically testing the implementation to ensure adequate attention to performance, scalability, resilience, and availability characteristics across all the use cases.

During the software development and testing process, this requires incredibly high attention to automated tests focused on non-functional characteristics of every conceivable interface or design feature that might represent a point-of-failure or unnecessary resource contention when the service is under load.

It is not possible to simulate Production loads in the lab - the Production load is simply too large and too varied.

Therefore, Amazon has created a remarkable approach to non-functional automated testing that combines aspects of TDD, aspects of capacity planning, and DevOps. Here are the essentials of that approach:

A.  Developers are responsible for defining and building test cases and tests for the service component that they are responsible for building.

Both system code and test code are, of course, heavily peer-reviewed and code-inspected. There is no separate workforce writing tests.

B.  Developers are also responsible for defining and implementing all the internal real-time metrics by which the component's NFR health should be assessed under load, whether at test time and/or at Production run time.

Examples include queue lengths, resource starvation measures, and over-consumption of scarce resources (e.g., memory, CPU, network connections, etc.).

C.  The developer is responsible for building interfaces within their component for reporting those metrics in concise form, and also interfaces for simulating load and injecting environmental faults into their component.

D.  The developer is responsible for building all the automated unit tests needed to validate correct response to failure modes, and to points of stress and contention under load, in relation to all health metrics for all functionalities supported by their component.

These automated tests not only validate correct functioning under various conditions of simulated load, stress, and contention at each of the injection points, but also validate that each of the component's health metrics stays within acceptable tolerance under load (a simplified sketch of this pattern appears after this list).

E.  When a component is checked in, so are its associated tests, and the testing frameworks ensure that all health metrics stay within tolerance while the full body of tests runs against all components that support the service, under conditions of multi-component global load.

There are literally thousands of such metrics self-reported by the application to the test frameworks, to facilitate catching and remediating impending software NFR problems before the software is released.

F.  When the component is deployed to Production, the health metrics are attached to Amazon's internal monitoring/alerting system.

There, the same thousands of metrics are self-reported by the application to the Production monitoring/alerting system, to facilitate catching and remediating impending software functional and NFR problems before they have customer impact.

These metrics act as leading indicators of the need to provision new component or infrastructure instances, as service and component capacity needs grow.

G.  Finally, the developer is responsible for writing RunBook automation that explains to new developers or Operations team members with on-call duties in Production how to interpret the health metrics, and how to go about finding and fixing problems.

The goal is to avoid reliance entirely on the original author of the code, and they usually achieve this goal.
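To make items B through D concrete, here is a deliberately simplified, hypothetical sketch - the RequestQueue component, its single health metric, its fault-injection hook, and the tolerance value are all invented, since Amazon's actual frameworks are internal - of a component that self-reports its backlog and a test that drives load against an injected fault:

    import java.util.concurrent.*;
    import java.util.concurrent.atomic.AtomicBoolean;
    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    // Invented component: queues work, reports its backlog as a health metric,
    // and lets tests simulate a slow downstream dependency.
    class RequestQueue {
        private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();
        private final AtomicBoolean slowDownstream = new AtomicBoolean(false);

        int queueDepth() { return queue.size(); }                         // health-metric interface
        void injectSlowDownstream(boolean on) { slowDownstream.set(on); } // fault-injection interface

        void submit(Runnable work) { queue.add(work); }

        void drainOne() throws InterruptedException {
            Runnable work = queue.take();
            if (slowDownstream.get()) Thread.sleep(1);                    // simulated stall
            work.run();
        }
    }

    public class RequestQueueHealthTest {
        @Test
        public void backlogStaysWithinToleranceDespiteInjectedFault() throws Exception {
            RequestQueue rq = new RequestQueue();
            rq.injectSlowDownstream(true);                 // exercise the degraded path

            ExecutorService consumer = Executors.newSingleThreadExecutor();
            consumer.submit(() -> {
                while (!Thread.currentThread().isInterrupted()) {
                    try { rq.drainOne(); } catch (InterruptedException e) { return; }
                }
            });

            for (int i = 0; i < 100; i++) rq.submit(() -> { });  // simulated load
            Thread.sleep(500);                                   // allow the consumer to catch up

            // NFR assertion: even with the fault injected, the backlog must stay bounded.
            assertTrue("queue depth out of tolerance: " + rq.queueDepth(), rq.queueDepth() < 50);
            consumer.shutdownNow();
        }
    }

In the real system, the metric would flow to the test framework and to Production monitoring rather than into a single assertion, but the shape - developer-defined metric, developer-defined fault injection, developer-defined tolerance - is the point.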

Regardless of the order of activities, Amazon's focus on automated test creation by developers is extraordinarily high compared to the payment systems status quo.

Give a Man a Fish, and You Feed Him for a Day; Teach a Man to Fish, and You Feed Him for a Lifetime

In the spirit of that old proverb: what should one do when a team member is having issues and gets stuck on a bug?

Maybe you should ask them: do you have a unit test or an integration test for this?

If they say no, then maybe you should suggest they write one and see if they can discover what's going on.

It's often faster than explaining the requirements and code to another person and expecting them to manually discover the problem.

It's often faster than building and redeploying the whole application and debugging it. Two years ago, our senior developer watched one of his teammates struggle to reproduce an issue that occurred only with multiple threads (the teammate had spent a week trying to reproduce it).

You've guessed it - the suggestion was: write a test. The teammate did, and he found and fixed the issue that same day.
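A test of that kind can be as simple as hammering the suspect code from many threads at once. The sketch below is hypothetical - Counter, increment(), and total() are invented stand-ins for the real code under suspicion:

    import java.util.concurrent.*;
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Invented class under test: a counter that is (incorrectly) not thread-safe.
    class Counter {
        private long count = 0;
        void increment() { count++; }   // read-modify-write: a classic race condition
        long total() { return count; }
    }

    public class CounterConcurrencyTest {
        @Test
        public void incrementsAreNotLostUnderConcurrency() throws Exception {
            Counter counter = new Counter();
            int threads = 8, perThread = 100_000;
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            CountDownLatch start = new CountDownLatch(1);
            CountDownLatch done = new CountDownLatch(threads);

            for (int t = 0; t < threads; t++) {
                pool.submit(() -> {
                    try {
                        start.await();                 // release all threads together
                        for (int i = 0; i < perThread; i++) counter.increment();
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    } finally {
                        done.countDown();
                    }
                });
            }
            start.countDown();
            done.await(30, TimeUnit.SECONDS);
            pool.shutdown();

            // Fails (intermittently) while the race exists; passes once increment()
            // is made thread-safe - a repeatable reproduction of a week-long bug hunt.
            assertEquals((long) threads * perThread, counter.total());
        }
    }

Once the fix lands (for example, an AtomicLong), the same test remains as a permanent regression guard.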

When your team member is able to identify the issue in their code from having written a test, (s)he will be more likely to write the next test, and the next one after that.

Acceptance Test-Driven Development
Why Not Pursue the Creation of 100% Test Coverage?

The big argument (dare we say excuse) used for not pursuing this aspect of TDD is the expense, particularly of building lots of automated tests against legacy code.

Admittedly, project timescales and maintenance budgets are limited. Sometimes legacy code is not well structured for automated testing and needs to be refactored first, exacerbating the expense.

All of which is a legitimate concern. But unless a product is already of high quality and will not change, there is often a higher, hidden cost of NOT retroactively creating automated tests that provide high code coverage.

For example: due to a poor quality track record for one of our payment system products, the test automation team went on a mission in 2015 (four years ago) to create automated functional tests for that product.

Although not at the unit-testing level (the development team was not available to do this), the test automation team achieved much higher coverage on a complex product than ever before (close to 100%), using combinatorial testing techniques.

According to the product's Architecture team, this is one of the reasons (not the only reason, but an important one) why that product's quality has shot up, making it one of the company's most reliable SaaS products today.

Think of what level of quality would be possible if this same level of coverage had been achieved at the class/method interface level throughout the product.

Best Books on Test-Driven Development
(Additional Sources of Information)

The following books on test-driven development are highly recommended:

Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin - Publisher: Prentice Hall; 1 edition (11 Aug. 2008) - ISBN-10: 0132350882 - ISBN-13: 978-0132350884

Test-Driven Development: By Example by Kent Beck - Publisher: Addison-Wesley Professional (2002) - ASIN: B00OHXRJTQ

Growing Object-Oriented Software, Guided by Tests by Steve Freeman and Nat Pryce - Publisher: Addison Wesley; 1 edition (12 Oct. 2009) - ISBN-10: 0321503627 - ISBN-13: 978-0321503626


On the related topics of legacy code and refactoring:

Refactoring: Improving the Design of Existing Code by Martin Fowler - Publisher: Addison Wesley; 01 edition (28 Jun. 1999) - ISBN-10: 0201485672 - ISBN-13: 978-0201485677

Working Effectively with Legacy Code by Michael Feathers - Publisher: Prentice Hall; 01 edition (22 Sept. 2004) - ISBN-10: 0131177052 - ISBN-13: 978-0131177055




