Should Unit Tests Touch the Database?

tdd database integration

UPDATE: For the most part, I think young Phil Haack is full of shit in these first two paragraphs. I definitely now think unit tests should NOT touch the database. Instead, I pull those tests out into a separate integration test suite, as I had suggested in the last paragraph. So maybe Young Phil wasn’t so full of shit after all.

I know there are unit testing purists who say unit tests by definition should never touch the database. Instead you should use mock objects or some other contraption. And in part, I agree with them. Given unlimited time, I would gladly take that approach.

But I work on real projects with real clients and tight deadlines. So I will secretly admit that this is one area where I am willing to sacrifice a bit of purity. Besides, at some point you just have to test the full interaction of your objects with the database. You want to make sure your stored procedures are correct, etc.

However, I do follow a few rules to make sure that this is as pure as possible.

First, I always try and test against a local database. Ideally, I will script the schema and lookup data so that my unit tests will create the database. MbUnit has an attribute that allows you to perform a setup operation on assembly load and teardown when the tested assembly unloads. That would be a good place to set up the database so you don’t have to do it for every test. However, often, I set up the database by hand once and let my tests just assume a clean database is already there.
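To make that concrete, here is a rough sketch of the assembly-level setup idea. I'm showing it with NUnit's [SetUpFixture], since the exact MbUnit attribute names depend on the version you're using; the database name, connection string, and schema scripts are placeholders, not anything from the original post.

```csharp
using System.Data.SqlClient;
using NUnit.Framework;

// A SetUpFixture outside any namespace runs once for the whole test assembly.
[SetUpFixture]
public class DatabaseSetup
{
    // Placeholder connection string; point it at a local SQL Server instance.
    const string MasterConnection =
        "Server=(local);Database=master;Integrated Security=true";

    [OneTimeSetUp]
    public void CreateTestDatabase()
    {
        // Runs once before any test in the assembly. The schema and lookup-data
        // scripts would be run here as well (split on GO batch separators, or
        // shelled out to a tool like sqlcmd).
        Execute("CREATE DATABASE UnitTestDb");
    }

    [OneTimeTearDown]
    public void DropTestDatabase()
    {
        // Runs once after the last test in the assembly finishes.
        Execute("DROP DATABASE UnitTestDb");
    }

    static void Execute(string sql)
    {
        using (var connection = new SqlConnection(MasterConnection))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```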

Except for lookup data, my tests create all the data they need using whichever API and objects I am testing. Each test runs within a COM+ 1.5 transaction using a RollBack attribute so that no changes persist after each test. This ensures that every test runs against the exact same database.
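The post relies on MbUnit's COM+ 1.5 based rollback attribute; the same effect can be sketched with System.Transactions.TransactionScope, which has been the more common way to do this since .NET 2.0: open a scope in SetUp, never call Complete(), and dispose it in TearDown so everything rolls back. The Customers table and connection string below are hypothetical.

```csharp
using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class CustomerDataTests
{
    // Placeholder connection string and table name for illustration only.
    const string TestConnection =
        "Server=(local);Database=UnitTestDb;Integrated Security=true";

    TransactionScope _scope;

    [SetUp]
    public void BeginTransaction()
    {
        // Connections opened inside the scope auto-enlist in this transaction.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollBackTransaction()
    {
        // Disposing without calling Complete() rolls everything back, so each
        // test starts from the same clean database.
        _scope.Dispose();
    }

    [Test]
    public void InsertedRowIsVisibleWithinTheTest()
    {
        using (var connection = new SqlConnection(TestConnection))
        {
            connection.Open();

            using (var insert = new SqlCommand(
                "INSERT INTO Customers (Name) VALUES ('Test Customer')", connection))
            {
                insert.ExecuteNonQuery();
            }

            using (var count = new SqlCommand(
                "SELECT COUNT(*) FROM Customers WHERE Name = 'Test Customer'", connection))
            {
                Assert.AreEqual(1, (int)count.ExecuteScalar());
            }
        }
    }
}
```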

This is the reason I can be a bit lazy and set up the database by hand: none of the tests change the data in the database. I would still prefer a no-touch approach where the unit tests set up the database themselves. For that, there is TestFu, which is now part of TestDriven.Net.

From my experience, I think this approach is a good middle ground for many projects. A more purist approach might separate the tests that touch the database into a separate assembly, but still use NUnit or MbUnit to run them. Perhaps that assembly would be called IntegrationTests.dll instead of UnitTests.dll. It’s your choice.


Comments


3 responses

  1. Jeremy Miller, October 21st, 2005

    Definitely test actual data access code against the database (mocking ADO.Net itself == pain). I do exactly what you describe in the last paragraph and keep separate test assemblies for unit and integration testing, for convenience. For non-data-access code, I think the database setup overhead usually makes it more work to test against the database than to use mock objects.



    Your client/customer certainly doesn't care about your TDD purity score, so whatever gets you to done is right.

  2. Thom Lawrence, October 21st, 2005

    You're right: is it unit testing? Maybe not. Is it worth doing? Absolutely.

  3. Abhijeet P, March 29th, 2009

    In the case of using the ADO.NET Entity Framework, it seems to me that you don't have much of a choice besides connecting to a real DB to ensure that your implementations sitting on top of the Entity Framework are working as expected. Entity Framework does not support POCOs, and it's not trivial to get mock objects working with an ObjectContext that performs change tracking.
    What I'd love to see (and Phil mentions this as well) is the no-touch approach wherein the database is created as part of the test setup and dropped as part of the test teardown.
    I'm using the Microsoft test framework that ships as part of Visual Studio 2008. Is there a way to achieve this out of the box or would I need to call out to an external script for the DB creation and drop as part of setup and teardown?