How many times have you seen the latest technology injected into a project, used for the duration of a feature or release, and then left to wither? I've seen it more times than I'd like, and it's got to stop.
Don't get me wrong. I love new technology -- a lot. I love learning how it works, and I love figuring out where it's appropriately used. I also know that you can't just throw new technology into a software project without a plan, and yet I see it happen over and over.
Last week, I saw someone try to shove Entity Framework into a project on the sly, as if nobody was going to notice. Chaos ensued when this plan blew up, and repairs are now being made. The bungle itself was alarming, but I was even more disturbed to reflect on how many checks and balances had already failed before the right people learned what was going on and why it was a bad idea.
This is a failure on multiple levels.
First, developers themselves should know better than this. Nominally, EF was chosen in this case because it was supposed to help the team deliver a feature more quickly. As developers, we've all seen this argument fail dozens of times, and yet we fail to learn our lesson. New technology certainly improves our craft over time, but the first time we use any new tool, we run into teething problems. If we grab a brand-new, shiny box of tech goodness off the shelf and honestly think it's going to work perfectly the first time we plug it in, we should be hauled out back and bludgeoned.
Next failure: architectural guidance. In this case, there are architectural standards that cover data access, so at first glance, this would appear to be an open-and-shut case. In practice, though, the standards are poorly socialized and badly out of date. In short, they have the appearance of irrelevance, so it's easy for developers to discount them. Architectural standards need to live and breathe; they need to evolve and grow so that designs can adopt new technologies at a measured pace. To do less is to erect a static roadblock in front of developers. The developers will drive around it.
Finally, management allowed this failure in a couple of ways. Early in the process, a dysfunctional conversation occurred. Management needed a feature by such-and-such a date. Development thought this wasn't nearly long enough. Wringing of hands and gnashing of teeth ensued, and eventually, the developers capitulated, claiming that we could make the date, but only by departing from our normal development standards and using this new tech tool instead. Some form of this conversation has been responsible for more software disasters than I could count.
No matter how much time we put into defining our processes, no matter how many years of experience we draw upon, and no matter how many times it's proven that shortcuts kill, we keep getting suckered into them.
Personally, I draw a couple of conclusions from this. First, we just need a little more personal integrity and discipline. That's sort of a cheap shot, but it's true. The second conclusion, though, is more of a reminder to us as an industry: if we're so desperate that we'll take long shots like this, despite the odds, then the state of the industry must be pretty bad indeed. As an industry, we need to acknowledge that we're causing this sort of reaction, and we need to find a way to be more productive, more reliably.
But not by throwing out the process just when we need it most.