TDD isn’t for the Data Persistence Layer

Most of the TDD examples I see in books and videos are built around problems that can be solved with a small set of rich domain objects, and which don’t require a database or a GUI. Databases aren’t much to worry about anyway – you just stub them out, right?

But what if your database access is where a lot of the important work in your application is done? Then we need integration tests.

We have DBUnit or whatever for that. Now we are testing DAOs together with their SQL queries and/or object mappings. And at that point, there are no interesting design decisions to be made. Why? Because the structure of our database, and the structure of our domain objects, will be determined by our functional tests and requirements, not by our data integration tests. Data integration tests only validate the solution; they don’t drive it.

It’s only TDD when we use the tests to drive the design.

From an integration testing point of view, the DAOs hang together with their SQL or ORM mappings. Code coverage tools obscure this fact, because they measure coverage of the Java code in the DAO only and ignore any SQL or ORM mappings. This encourages us to test our own DAO Java code and mock out the SQL or ORM part. But without the SQL/ORM part, testing the Java part has limited value.
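
To make that concrete, here is a sketch of the kind of thin DAO this describes (the class name, table, and query are invented for illustration): the Java code is a shell around one SQL statement, so a coverage report on the Java says almost nothing about whether the query itself is right.

```java
import java.sql.Date;
import java.time.LocalDate;
import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;

// All the interesting behaviour lives in the SQL string; the Java around it
// is a thin shell. A coverage tool happily reports 100% coverage of this
// class while executing none of the query logic.
public class JdbcOverdueInvoiceDao {

    private final JdbcTemplate jdbcTemplate;

    public JdbcOverdueInvoiceDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public List<Long> findOverdueInvoiceIds(LocalDate cutoff) {
        // The WHERE clause is the real logic -- and it is exactly the part
        // that a unit test with a mocked JdbcTemplate never exercises.
        return jdbcTemplate.queryForList(
            "SELECT id FROM invoice WHERE paid = FALSE AND due_date < ?",
            Long.class, Date.valueOf(cutoff));
    }
}
```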

For example, in Lasse Koskela’s book Test Driven: Practical TDD and Acceptance TDD for Java Developers we find one of the few cases where testing the data access layer is discussed in detail. Koskela gives a good account of how to write integration tests, and of how to mock out your third-party data access layer so you only unit test the code you wrote, not the integration – and not the SQL and ORM mappings. He uses MockObjects and EasyMock to first mock the JDBC API (result sets, connections, statements, etc.), then a Spring DAO (JdbcTemplates, RowMappers), and finally a Hibernate DAO (Sessions, Queries). The tests are, of course, horrible to write and horrible to read. What is striking is that having gone to all that effort to set up his DAO unit test, he proceeds in a flash to the DAO implementation and – guess what – it looks exactly like every other DAO implementation you’ve ever seen that was not test driven. Doing TDD made no difference to the DAO implementation.
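
Koskela’s listings aren’t reproduced here, but a mock-based unit test of a hand-rolled JDBC DAO generally ends up looking something like the sketch below (Mockito is used for brevity; the DAO class, domain class, and SQL are invented). Notice how every stubbed call simply restates what the implementation is going to do anyway.

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.sql.DataSource;
import org.junit.Test;

public class JdbcPersonDaoMockTest {

    @Test
    public void findByIdMapsRowToPerson() throws Exception {
        // One mock per JDBC collaborator.
        DataSource dataSource = mock(DataSource.class);
        Connection connection = mock(Connection.class);
        PreparedStatement statement = mock(PreparedStatement.class);
        ResultSet resultSet = mock(ResultSet.class);

        // Each stub mirrors one line of the DAO implementation we are
        // about to write -- including the exact SQL string.
        when(dataSource.getConnection()).thenReturn(connection);
        when(connection.prepareStatement("SELECT id, name FROM person WHERE id = ?"))
            .thenReturn(statement);
        when(statement.executeQuery()).thenReturn(resultSet);
        when(resultSet.next()).thenReturn(true, false);
        when(resultSet.getLong("id")).thenReturn(42L);
        when(resultSet.getString("name")).thenReturn("Alice");

        JdbcPersonDao dao = new JdbcPersonDao(dataSource); // hypothetical DAO under test
        Person person = dao.findById(42L);

        assertEquals("Alice", person.getName());
        verify(statement).setLong(1, 42L);
        verify(resultSet).close();
        verify(statement).close();
        verify(connection).close();
    }
}
```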

But now, having gone to all that effort of unit testing the DAO without its SQL or mappings ever touching the database, we still have to write an integration test, which will have exactly the same expectations as the unit test! Koskela says we should only test ‘a subset of all read and write operations’ in our integration tests (p. 226). But wouldn’t it be easier and more robust to just skip the unit test and create a comprehensive integration test around the DAO – one that does test all of the read and write operations? I think the answer is obviously ‘yes’: since TDD makes no difference to the DAO implementation, nothing is lost by skipping straight to the integration test.
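
Here is a sketch of what that comprehensive integration test might look like, assuming an in-memory H2 database on the test classpath and a hypothetical JdbcPersonDao with save/update/delete/find operations. Each test runs the DAO’s real SQL against a real schema.

```java
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import java.sql.Connection;
import java.sql.DriverManager;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class JdbcPersonDaoIntegrationTest {

    private Connection connection;
    private JdbcPersonDao dao; // hypothetical DAO under test

    @Before
    public void setUp() throws Exception {
        // In-memory database whose schema mirrors production.
        connection = DriverManager.getConnection("jdbc:h2:mem:persondao;DB_CLOSE_DELAY=-1");
        connection.createStatement().execute(
            "CREATE TABLE IF NOT EXISTS person (id BIGINT PRIMARY KEY, name VARCHAR(100))");
        connection.createStatement().execute("DELETE FROM person");
        dao = new JdbcPersonDao(connection);
    }

    @After
    public void tearDown() throws Exception {
        connection.close();
    }

    // One test per read/write operation on the DAO, each exercising the
    // actual SQL rather than a restatement of it in mock expectations.

    @Test
    public void savesAndFindsById() {
        dao.save(new Person(1L, "Alice"));
        assertEquals("Alice", dao.findById(1L).getName());
    }

    @Test
    public void updatesExistingRow() {
        dao.save(new Person(1L, "Alice"));
        dao.update(new Person(1L, "Alicia"));
        assertEquals("Alicia", dao.findById(1L).getName());
    }

    @Test
    public void deletesById() {
        dao.save(new Person(1L, "Alice"));
        dao.delete(1L);
        assertNull(dao.findById(1L));
    }
}
```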

We need to free ourselves, conceptually, from the idea that the implementation of DAOs can be test driven. We can test drive the DAO interfaces, though, as we shape our domain in a TDD way. But the implementation of a DAO is not something whose design the tests will have any impact on.
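
As a sketch of that distinction (all names invented): a domain-level test like the one below is what forces a repository interface into existence and decides what it should be called and return, while saying nothing about how the eventual JDBC or ORM implementation of that interface should look.

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.math.BigDecimal;
import java.util.Arrays;
import org.junit.Test;

public class OverdueReminderServiceTest {

    // Writing this test is what drives the InvoiceRepository interface into
    // existence: the domain's needs dictate the findOverdue() signature.
    // Implementing that interface against a database is then a matter for
    // integration tests, not for TDD.
    @Test
    public void sendsOneReminderPerOverdueInvoice() {
        InvoiceRepository repository = mock(InvoiceRepository.class);
        when(repository.findOverdue()).thenReturn(Arrays.asList(
            new Invoice(1L, new BigDecimal("100.00")),
            new Invoice(2L, new BigDecimal("250.00"))));

        OverdueReminderService service = new OverdueReminderService(repository);

        assertEquals(2, service.sendReminders());
    }
}
```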

The TDD books and examples often seem to ignore the data access layer, even though in your average Java web app it contains a large part of the system’s value. The reason, though it’s rarely stated well, is that TDD is for the parts of the system where the tests will shape the design.


One thought on “TDD isn’t for the Data Persistence Layer”

  1. For a JDBC-based project, the JDBC connection can be mocked, so that tests can be executed without a live RDBMS, with each test case isolated (no data conflicts).

    It allows verifying that the persistence code passes the proper queries/parameters (e.g. https://github.com/playframework/playframework/blob/master/framework/src/anorm/src/test/scala/anorm/ParameterSpec.scala) and handles JDBC results (parsing/mapping) as expected.

    Frameworks like jOOQ or Acolyte can be used for this: https://github.com/cchantep/acolyte .
