I love TDD. And yet I have seen many attempts to live by its principles fail, most often because developers' work did not fit its rigid bounds. In the quest for balance, I think I have found the perfect set of heuristics to guide the code-first versus test-first decision.
The rules are quite simple, and as I state them, I will also offer a few points in their defence.
New code is always code-first.
When I am starting a new application - a microservice or a library - I often have only a vague idea of how exactly its internals are going to work. Any test at that stage will only slow me down without providing any real benefit.
Bug fix is always test-first.
In the modern world of an ever-growing number of inter-dependencies between services, it gets ever harder to reproduce the issue from the ticket exactly as it happened in production. For example, if Cassandra suffered a split-brain and the application did not handle it gracefully, reproducing that for real is far more work than writing a small unit test in which a mock behaves like a split-brain data storage exactly when we want it to.
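To make that concrete: here is a minimal sketch of such a test-first fix, with all names (`read_balance`, `read_all_replicas`, the timestamped-value shape) invented for illustration - the point is only that a `Mock` can impersonate a disagreeing cluster on demand, no real Cassandra required.

```python
from unittest.mock import Mock

class InconsistentReadError(Exception):
    """Raised when equally fresh replicas disagree."""

def read_balance(storage, user_id):
    # Hypothetical fix under test: the buggy version returned the first
    # replica unconditionally; this one resolves by timestamp and refuses
    # to guess when equally fresh replicas disagree.
    replicas = storage.read_all_replicas(user_id)  # [(timestamp, value), ...]
    newest_ts = max(ts for ts, _ in replicas)
    freshest = {value for ts, value in replicas if ts == newest_ts}
    if len(freshest) > 1:
        raise InconsistentReadError(user_id)
    return freshest.pop()

# The test-first part: a mock plays the split-brain storage.
storage = Mock()
storage.read_all_replicas.return_value = [(100, 50), (200, 75)]
assert read_balance(storage, "alice") == 75  # freshest replica wins
```

Written before the fix, this test fails against the buggy code and pins the expected behaviour down; the mock gives us a split-brain on every run, something no staging cluster will do.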
Refactoring is test-first, but here test-first means not only writing new tests; it also means deleting old ones that no longer apply.
As trivial as it sounds, having no tests is better than having broken ones. The goal is to have only the new tests failing until the refactoring is complete, and once it is complete, all tests are green. Working on the tests before the refactoring provides a perfect launchpad for entering the flow - a crucial condition for a speedy refactoring.
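One way to keep that window honest (a sketch of the idea, not a prescribed tooling choice - the function and its flaw are invented) is to mark the new, not-yet-passing tests as expected failures with the standard library's `unittest`; they stay visible without breaking the build, and removing the decorator is the definition of done:

```python
import unittest

def word_count(text):
    # Old implementation, about to be refactored: it splits on single
    # spaces only, so runs of whitespace produce phantom words.
    return len(text.split(" "))

class WordCountTests(unittest.TestCase):
    def test_single_spaces(self):
        # Old behaviour that still applies after the refactoring: keep it.
        self.assertEqual(word_count("a b c"), 3)

    @unittest.expectedFailure
    def test_collapses_whitespace(self):
        # New test, red until the rewrite lands. Obsolete tests get
        # deleted outright rather than left broken.
        self.assertEqual(word_count("a  b \t c"), 3)
```

The suite passes today, the intent of the refactoring is already recorded in code, and any old test that contradicts the new behaviour is deleted, not commented out.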
Performance optimization is test-only.
Meaning that a test suite is written, in a dedicated benchmarking framework, to prove that the new code performs better. It is then thrown away (you should always throw away proof-of-concept code), and the new code is written inside the application.
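The framework itself is not named in the post; as a stand-in, a throwaway proof can be as small as a stdlib `timeit` comparison. The implementations here are invented for illustration:

```python
import timeit

# Hypothetical optimization under proof: replace a linear membership
# scan with a hash lookup. Both versions live side by side only inside
# this disposable benchmark; the winner is rewritten in the application.
HAYSTACK_LIST = list(range(10_000))
HAYSTACK_SET = set(HAYSTACK_LIST)

def contains_old(needle):
    return needle in HAYSTACK_LIST   # O(n) scan

def contains_new(needle):
    return needle in HAYSTACK_SET    # O(1) hash lookup

old = timeit.timeit(lambda: contains_old(9_999), number=2_000)
new = timeit.timeit(lambda: contains_new(9_999), number=2_000)
print(f"old: {old:.4f}s  new: {new:.4f}s")
assert new < old, "prove the claim before touching the application"
```

Once the assertion holds, this file has served its purpose and is deleted; only the faster approach, reimplemented properly, enters the codebase.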
This covers 99% of the code changes I make. Unit testing a bug provides me with a cozy environment to fix it. Not unit testing new code allows me to write it quickly. Of course I will write the tests after the fact, but only when I am positive the solution works and know exactly which parts are critical and deserve the best tests.
At no point am I even looking at test coverage reports.
Properly done, TDD is a blessing to work with. Just recently I was working on a complicated bug involving timezones and DST transitions. Another developer had written a set of tests around the issue, outlining exactly the expected behaviour in every possible case and state. It was not an easy bug to fix, but having the tests meant that I spent zero time preparing for the fix. All the time I spent on that issue was productive - the kind of work that makes me proud of my trade. I cannot express enough how grateful I was for those tests.
How often do you hear a developer say he or she really enjoyed fixing a bug?