I seem to have been around this cycle enough times that I'm starting to detect a pattern. I thought I'd share my observations and see what other people have seen. It's kind of a frustrating one for me - I'm a huge fan of continuity - because it seems to imply that all commercial projects are doomed to die.
So in phase one, we have the wild west. The project begins life - sometimes with formal blessing and sometimes without. The developers are the die-hards who believe in the project and put crazy amounts of effort into putting it together. There's minimal oversight at this point - the team trusts each other. The team is usually in constant communication about what they are doing and what needs to be done next, but it's hardly formalized.
In phase two, the efforts of the developers have paid off. The project has crossed a major milestone, been revealed to all, and possibly even filled a hole that the company needed filled. Everyone is very excited, not least because the project seems to have come out of nowhere to meet a need. Now management is involved in taking this generally-still-rough-around-the-edges project and "finishing" it. Sometimes this is also called productization. But the main point is that someone is now in charge of choosing a future for the project. Everyone is still pretty happy and additional developers are often added. Source control becomes formalized. Everyone wants to use the product, although it requires updates to meet their exact needs (which the developers are happy to do).
In phase three, the product is more or less mature. It still has rough edges, since the resources planned for the grand designs in phase two were diverted to unexpected feature requests. New developers are added to increase the head count and try to catch up - costs increase. The product sees massive growth in its feature set but little polishing. New features are shipped early so the team can get back to its main tasks; they frequently cause bugs that end users see, which leads to tighter code quality standards (such as mandatory code review) and a decline in the product's reputation.
In phase four, upper management begins to ask the project management why the product is still so unpolished and why it has so many bugs, despite the time and money put into it. Demands for better tracking of resources and money are put into place. Bug tracking and time tracking become formalized, and statistics are often added to the regular process, increasing the time spent managing the product. Developers are told to focus only on what the manager dictates needs to be done, not on the side issues they consider important. Small bugs linger and user satisfaction begins to fall. The developers begin to complain that they need permission to work on the aging architecture and to fix the bugs that hurt user satisfaction.
Phase four may linger for a long time, but eventually it gives way to phase five. Management decides that the list of things the lead developers say need attention is too large to handle in a cost-effective manner. The project is put into bug-fix-only mode while a replacement is designed and built from scratch. Note that although there are often some planning sessions, the replacement project usually starts right back in phase one.
I've been all the way through this loop a few times, and been at various points of it on various other projects, and I am becoming convinced that it's inevitable. Look, software is expensive and it's very difficult to quantify. It's hard to plan, it's hard to maintain, and it's hard to get right. We need to stop pretending that there must be a silver bullet out there if we just find the right way to manage the project.
The truth is that phase one probably accomplished 80% of everything the product needed for its entire lifespan. That should really be the statistic we pay the most attention to. No matter what some people who don't write real software for a living may claim, software is not something you can plan out to the smallest detail, because unlike so many disciplines that can (I'm thinking of architecture here), SOFTWARE DOES NOT EXIST IN THE REAL WORLD. That's right - it's completely virtual. It doesn't obey any laws of nature.
To write software, a person needs to have a firm grasp on how to tell an imaginary concept to do a real world task. They need to be able to abstract thousands of steps, put them in the right order, and then be able to anticipate all the wrong ways that a person will try to interact with those thousands of steps. BUGS ARE INEVITABLE. You can mitigate some of them (see earlier blogs), but by god stop pretending you can prevent them all. Assume that EVERY LINE OF CODE WILL NEED ATTENTION SOMEDAY.
It's also important to keep in mind that if you are not a user of the software and you are not a developer of the software, you are not qualified to determine what the software NEEDS. If you're a manager and you came in to manage the product before you even knew it existed, then get over yourself. You need to buddy up with one of the developers, understand what it does, and work with them when you're making decisions. It's all too easy to focus on the wrong parts of the product, or to choose a forward path that runs completely against all of the design to date (I've lived through this one specifically). That will make the product worse, not better. Even better, become a user. Use the software daily. Make it something you HAVE to use. Then you'll better understand why those little usability bugs are a big deal. ;)
I'm pretty convinced that the longer a project survives before reaching phase three, the more successful its life will be. It's good to have some structure - source control is critical. Code reviews are useful. Even a bug tracker is a good thing to have. But when you get into late phase three and phase four, all of these useful things start to be used against the project. Why is there so much code churn? Why are so many people wasting time with code reviews? Why are there so many issues in the bug tracker? Good things become bad because they are seen as wasting money, when in fact they are preventing waste by catching issues before they are seen in the field.
What's your experience with the commercial software life cycle?