Wednesday, December 3, 2008

The need for speed

I say this a lot, but I love TDD. Beyond all the ways it improves code and simplifies solutions, TDD is fast.

TDD is fun because it goes so quickly. Write a test, make it pass, refactor, run more tests.
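
In code, that cycle looks something like this rough NUnit-flavored sketch (the Price class and the numbers are invented for illustration): the test comes first and fails because the class doesn't exist yet, then just enough code gets written to make it pass before refactoring and running again.

    using NUnit.Framework;

    [TestFixture]
    public class PriceTests
    {
        // Step 1: write the test first -- it fails until Price exists.
        [Test]
        public void Tax_is_added_to_the_base_price()
        {
            var price = new Price(100m);

            Assert.AreEqual(108.75m, price.WithTax(0.0875m));
        }
    }

    // Step 2: write just enough to make it pass, then refactor and rerun.
    public class Price
    {
        private readonly decimal amount;

        public Price(decimal amount)
        {
            this.amount = amount;
        }

        public decimal WithTax(decimal rate)
        {
            return amount * (1 + rate);
        }
    }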

I love the flow. I love adding a new interface and using it without worrying about the implementation. I love knowing that after I make a new test pass, something in the app is different and it works. I love living in the tests and only running the application before checking in.

However, the moment the tests slow down, the practices start to slip...
  1. Writing code, verifying it works by launching the app, and only then creating the tests.
  2. Making the test pass before verifying it failed.
  3. Not running all the tests before check-in.
  4. Not refactoring because it takes too long.
  5. Stopping work close to 5 because you don't want to wait for the build.
  6. Not adding tests at all because you don't want to break the build and don't want to wait to find out.
  7. It ruins all the fun.
These things all lead to poor test coverage or tests that don't actually do what they say. Yeah, on paper your coverage may look nice, but there's plenty of code you can delete without breaking any tests. Coverage makes sure a code path is followed; it does not verify the path was actually tested.
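
For example, here is a made-up C#/NUnit sketch (none of these types come from a real project): both tests light up the same lines in a coverage report, but only the second one proves anything.

    using NUnit.Framework;

    // Made-up types, purely for illustration.
    public class Order
    {
        public decimal Total { get; set; }
    }

    public class DiscountCalculator
    {
        public void ApplyDiscount(Order order)
        {
            order.Total = order.Total * 0.9m;
        }
    }

    [TestFixture]
    public class DiscountCalculatorTests
    {
        // Every line of ApplyDiscount shows up as covered, but nothing is
        // verified -- delete the method body and this test still passes.
        [Test]
        public void ApplyDiscount_runs()
        {
            new DiscountCalculator().ApplyDiscount(new Order { Total = 100m });
        }

        // The same path, actually tested.
        [Test]
        public void ApplyDiscount_takes_ten_percent_off()
        {
            var order = new Order { Total = 100m };

            new DiscountCalculator().ApplyDiscount(order);

            Assert.AreEqual(90m, order.Total);
        }
    }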

So, how do we keep our tests running quickly?
  1. Mocking. It cuts out the time spent creating dependencies, and their dependencies, and the rabbit hole of dependencies that nobody can even see (see the sketch after this list).
  2. Use a one-to-one ratio of test assemblies to application assemblies -- don't waste time waiting for things to build that you're not testing.
  3. Keep interfaces and implementations in separate assemblies. Dependents should only be recompiled when the contract changes, not the implementation. This lets you finish using a change without worrying about the implementation. Sure, you could just ReSharper the implementation into existence, but are you really going to add tests for a new method right now? If you do, you break the rhythm; if you don't, you have to somehow remember that there is unimplemented code somewhere your tests won't pick up... you'll start thinking about the implementation, or you'll just forget and break the app without anyone knowing.
  4. Simplify. Keep tests short. Only create items in setup that are used in all tests. Avoid factories for creating concrete classes; they hide dependencies and make it too easy to use real implementations instead of mocks. Avoid repeating yourself in tests: don't assert the same thing twice. A second identical failure doesn't tell you anything new, but it does slow everything down.
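
Here's roughly what items 1 and 4 can look like in practice, assuming NUnit and Moq; the interface and classes below are invented for the example.

    using Moq;
    using NUnit.Framework;

    // Invented contract -- kept in its own assembly, so dependents only
    // rebuild when the interface changes, not when the real sender does.
    public interface IMailSender
    {
        void Send(string to, string subject);
    }

    public class WelcomeService
    {
        private readonly IMailSender mail;

        public WelcomeService(IMailSender mail)
        {
            this.mail = mail;
        }

        public void Welcome(string email)
        {
            mail.Send(email, "Welcome!");
        }
    }

    [TestFixture]
    public class WelcomeServiceTests
    {
        private Mock<IMailSender> mail;
        private WelcomeService service;

        // Setup creates only what every test needs: the mock and the subject.
        [SetUp]
        public void SetUp()
        {
            mail = new Mock<IMailSender>();
            service = new WelcomeService(mail.Object);
        }

        // No mail server, no real implementation, no rabbit hole --
        // one short test, one assertion.
        [Test]
        public void Welcome_sends_a_welcome_mail()
        {
            service.Welcome("kid@example.com");

            mail.Verify(m => m.Send("kid@example.com", "Welcome!"), Times.Once());
        }
    }

Nothing real gets built or touched here: the contract, the mock, and one assertion are enough, which is also why the run stays fast.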
In general, follow the best practices of test writing! They're there for a reason, and there's no point in reliving the pain of the nice people who already figured them out. Take advantage, and skip the crappy tests, the pain, and the failure.

7 comments:

hammett said...

Share some code dude :-)

Seriously, would love to see a screencast exemplifying this proposed separation, and the working process.

Unknown said...

Reading your post brought up a trauma. Here's the story, I'll try to make it short.

We were writing a library. There was a similar library that provided a subset of the services that our library provided.
Our library didn't use the existing library for good reasons.

At one point I realized that I could test the core services of our code by comparing data returned by our API with the data returned by the other library's API. This was quite nice because I didn't have to compute the expected values.

Then someone decided to step it up a notch. He wrote a testing infrastructure (quite complicated; it had its own DSL) that made sure the two libraries were exactly the same for more than 100,000 different inputs (read from a huge input file). Testing became terribly slow. Most people stopped running the tests before committing.

At the next phase, he added code that extended the other library with all the extra services that our library had. This allowed him to compare *every single* API. Testing was even slower. But more importantly, the logic of the extra services was now implemented twice: (1) in our code; (2) in the extension to the other library. Of course, the logic in the extension was more buggy than in our library (which makes sense: our library provided higher level abstractions).

At this point we stopped trusting the tests, and we stopped trusting our code.

Kyra said...

Hey, is that anime-looking girl on your gamercard from a (gasp) ...jrpg?! Which one?!

wendy said...

Hey Kyra! No it's not... but there are some pretty good jrpgs on the Xbox. Aren't there jrpgs for you to play in Japan?

popdelart said...

Hi Wendy Friedlander, test-driven development is great; code has never been so much fun.

btw, I'm looking for your company website.

Is it www.agileusa.com?
It seems outdated and the e-mail is not responding.

/cheers

wendy said...

popdelart,

Glad you're enjoying!

My company doesn't have a website, I always think about making one, just haven't :P

Jonathan Beckett said...

Just ran into your blog completely by accident while looking for something else...

Excellent article about efficient development practices - adding you to my feed reader right now :)