R + F + M => $$$
For one thing, they recognized that not all code is equally valuable.
Dan's team didn't refactor each feature to near-perfection before committing it and trying it out in production. Many features would change or get deleted within a few weeks of implementation. Time spent refactoring features that soon become obsolete is worse than wasted: reducing duplication means introducing indirection, which makes other code less simple.
Don't get me wrong - Dan's team refactored constantly. But they spent their refactoring energy on code that had proved its value, rather than on today's hot new thing. They put module-level tests around the new stuff, and deeper and more detailed tests around code that turned out to be critical. They let time pass before determining which code was important. Just as we developers suck at predicting performance problems as we code, and are lousy at predicting change, we also stink at predicting which pieces of code are important.
What if we could measure the value of code the way retailers evaluate customers?
There are metrics we could use to suggest an expected value of code - the importance of the code to the business and to developers.
R - Recency: the more recently the code was written, the lower its expected value. Our VCS can tell us.
F - Frequency: how often is the code executed in production? We could sample this with a profiler.
M - Modification: how many times has the code changed? VCS has the answer.
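As a sketch, R and M can come from the version-control log and F from a production profiler. Here's a toy scoring function illustrating the idea - the inputs stand in for data you'd pull from `git log` and a sampling profiler, and the weights are my own made-up assumptions, not a prescribed formula:

```python
from datetime import datetime

def code_value_score(commit_dates, executions_per_day, now=None):
    """Rough expected-value score for a piece of code.

    R (recency): days since the last change - code that has survived
        untouched for a long time is presumed more valuable.
    F (frequency): how often it runs in production, e.g. sampled by
        a profiler.
    M (modification): how many commits have touched it, per the VCS log.

    The weights below are illustrative assumptions only.
    """
    now = now or datetime.now()
    recency_days = (now - max(commit_dates)).days  # R: older survivor => higher
    modifications = len(commit_dates)              # M: changed often => higher
    return recency_days * 1.0 + executions_per_day * 0.5 + modifications * 2.0
```

In a real setup, `commit_dates` might come from `git log --follow` on a file, and `executions_per_day` from whatever sampling profiler you run in production.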
What if these three metrics were compared against estimates of localized technical debt? Tools like SONAR quantify technical debt, but they say nothing about code value. If we could stack up code coverage, style violations, and complexity against the importance of a piece of code, this would help us focus our refactoring efforts.
What of the low-value, high-technical-debt code? The ugly stuff that's still accumulating in our codebase because it is not important enough to fix? Delete it. A premise of the delayed-refactoring strategy is that some code will go away. Pruning little-used features out of the codebase is necessary to keep it sane. Deleting code is the best refactoring: it's like declaring technical-debt bankruptcy, but without the bad credit rating.
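The triage the last two paragraphs describe could be sketched like this - a hypothetical helper combining a value score with a debt estimate (the kind of number a tool like SONAR produces), with thresholds that are purely illustrative:

```python
def triage(value, debt, value_threshold=100.0, debt_threshold=50.0):
    """Toy triage: where should refactoring energy go?

    value: an expected-value score for the code (e.g. from R/F/M metrics).
    debt:  a localized technical-debt estimate (e.g. from SONAR).
    Thresholds are made-up for illustration.
    """
    if debt < debt_threshold:
        return "leave it alone"      # clean enough, whatever its value
    if value >= value_threshold:
        return "refactor"            # proven value: worth paying down the debt
    return "delete candidate"        # ugly and unimportant: prune it
```

The interesting bucket is the last one: code that scores low on value and high on debt is exactly the stuff worth deleting rather than fixing.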
Google does this, nixing the least-used features and products. We think, "Why did they take it away? It was already working, it wasn't costing them anything!" but it was. Because every new feature comes with that silent requirement "... and everything else works too." And the rat's nest grows.
Another potential problem: what if we go back to refactor and we can't figure out what the code does? Ay, there's the rub. Delaying refactoring doesn't mean writing cryptic code. It is even more essential that your code convey its intent with good naming and concrete expressions of what you're doing. Avoid abstraction. Let repetitive code live a while longer. Keep it contained, though: favor adding code over changing existing code in many places, even if that means you're copying and pasting.
The Last Responsible Moment for design is later than you think, and not all technical debt is created equal. There is a cost to careful design, and we can weigh this cost against a value only measured in production.
 Premature Optimization Is The Root Of All Evil -- Knuth
 You Ain't Gonna Need It -- XP
 It's like promising your kid chocolate, and then by snacktime she decides she doesn't like chocolate anymore and you get to eat the chocolate. Good-bye, bad code. Hello, faster compile times!
 There's this great song by Bobaloo that goes,
"My hair had a party last night.Feature-heavy applications feel like that. Gotta brush 'em out and split 'em up. Or cut 'em off.
When I laid down everything was all right.
It started out friendly but there must have been a fight!
My hair had a party last night."