Software Development And The Sunk Cost Fallacy

A large group of lumberjacks are cutting down trees in a forest and doing a really good job of it. Trees are falling left, right and center; everyone is working with focus and tenacity. During a lull, one of the lumberjacks climbs up to the top of the biggest tree still standing and sees that, in all the confusion to get the work started, they've made a bit of a mistake and should in fact have been cutting down the forest next door to the one they're in. So, he yells down to the rest of the team:

Hey guys, I think we're in the wrong forest!

To which the foreman yells back:

Shut up! We're making progress!

I don't remember where I first saw this story (I guess that's the one downside to reading a lot of books :)). But this story (or one very similar) is actually a commonly used leadership metaphor, often used to demonstrate the difference between leadership and management. As interesting as that may be (and it is actually interesting enough that I might look at it further at a later date), to me it also demonstrates another common phenomenon – that of the sunk cost fallacy. You see, the foreman wasn't really an idiot; he knew the climbing lumberjack was right. But the team had invested so much time and effort already, it would be a shame to waste it. Then there is explaining the mistake to upper management – that could get awkward, plus think of all the paperwork. No, it is way too embarrassing and costly to admit the mistake; better to pretend that this is the way things were meant to happen all along. Hopefully it might all turn out for the best and we'll be able to salvage the current situation.

The Sunk Cost Fallacy

The sunk cost fallacy, also known as the Concorde fallacy, is a very interesting phenomenon. What it basically boils down to is that it is human nature to throw good money after bad. The more resources (time, money) we invest in something, the more likely we are to stick with it despite all the indicators of our venture being a failure (I am not going to give generic examples; you're welcome to check out the links above). It is therefore no big surprise that when it comes to software development, we're not immune. In fact we take the sunk cost fallacy to new heights of awesome :).

In the world of finance, the sunk cost is accepted. If the money is spent and can't be recouped, it can no longer influence any further decisions. In software, on the other hand, once we have spent any kind of effort/money on a feature/project, we just can't let go. We would much rather delude ourselves and everyone around us that we can still turn everything around and make it all come out for the better. Developers do it, managers do it; it's an industry-wide trend. No matter how flawed the product vision turns out to be, we are much more likely to try to adapt the whole ecosystem to the flawed product/feature rather than starting over from scratch and building something that will better fit the ecosystem we have. More than that, we will go to great lengths – bring in extra people, do overtime, whatever it takes – as if we could make an incorrect decision right by sheer force of will and sweat. I am not just talking at the project level; even at the code level we (developers) will often stick with a technology/library choice through thick and thin, long past the time we should have abandoned it and found something that fits our needs better. There are always good reasons to justify all this, but what it comes down to in the end is self-delusion – the sunk cost fallacy at work.

Lalalala, I Don't Want To Hear It

I once worked on a product that started out as an offshoot of a bigger project. Initially it was only adapted to work as part of that bigger project, but after a while a decision was made to turn it into a framework. There was nothing wrong with the idea. This would have been a perfect point to abandon the existing code; it was unwieldy and difficult to work with. It would have made sense to merge it into the bigger project and start from scratch. But no one was willing to make that decision: money had been spent on development, it was practically a framework already, it just needed a little extra effort and then we'd be able to use it on all the other projects in the company and save all kinds of money. A business proposition too good to refuse! Except, in trying to munge the existing code into a framework, everything became even more unwieldy, and no project really wanted to use it. But it was decided that the situation could still be salvaged; all we needed to do was turn the framework into a full-blown product. Once again, not a bad idea on its own, and had we started from scratch here, we may have come up with some cool stuff, but we couldn't just abandon all that 'good' code. So, we went on another retrofitting exercise, building more code on top of ever shakier foundations. At this point the 'product' had a terrible reputation within the company; no one except management wanted to have anything to do with it, but that just meant it was pushed even harder.

The company wasn't willing to make the necessary decision to either abandon the project altogether, or to start over and build something solid. The money that had already been invested was guiding all the decisions. Recouping losses was the word of the hour, and in pursuit of that goal even more money was sunk into a venture that didn't deserve it. If this were an isolated incident it wouldn't be so bad, but this story probably sounds awfully familiar to many, and there are more than a few horror stories that are way worse.

Delusion Industry Wide

How about an example of sunk cost self-delusion at an industry-wide level? You need go no further than JavaScript. I don't want to bad-mouth the language; it has done some decent service and I won't deny that it has some good parts. But as that photo I saw a few weeks ago clearly demonstrates (the one comparing the slim JavaScript: The Good Parts to the hefty JavaScript: The Definitive Guide), there are the good parts and then there is the rest.


People tolerate JavaScript, but most developers don't really love it. It can be unwieldy and hard to use, some things just don't make sense, and yet it has become the de-facto language of web 2.0. It's not as if there are no better scripting languages around, and even if there weren't, it's not like we couldn't come up with something better if we tried. But all the browsers already support it (well, kinda – none in quite the same way) and there are all the frameworks that make everything much better. No, there is way too much invested in it; I am sure we can make it perfect eventually. In the meantime it makes a lot more sense, and is in fact much easier, to put millions of developers through daily pain rather than phasing out JavaScript in favour of something better (anything would probably do). Seems like the sunk cost fallacy at its finest. Admittedly though, I am no JavaScript guru, so if you are, please come along and tear my argument apart.
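To make 'the rest' concrete, here is a small sample of the coercion quirks people usually mean. This is a sketch of standard JavaScript behaviour, not anything specific to one browser:

```javascript
// Loose equality (==) coerces its operands in surprising ways.
console.log([] == false);  // true  -> [] becomes "" and then 0
console.log("0" == false); // true  -> both sides coerce to the number 0
console.log(null == 0);    // false -> null is only loosely equal to undefined

// The + operator mixes addition and string concatenation.
console.log(1 + "2"); // "12" -> concatenation
console.log(1 - "2"); // -1   -> numeric subtraction

// Strict equality (===) skips the coercion entirely.
console.log("0" === 0); // false
```

This is exactly the kind of thing the 'good parts' advice addresses: stick to === and treat == as one of the parts best avoided.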

At the end of the day, there may not be much that each one of us can do to effect grand change at an industry-wide level, but we CAN influence the projects we work on. If you see the beginnings of the sunk cost fallacy rearing its ugly head, don't stay silent. It is never too late to remove a technology or library if it was clearly the wrong choice. Be ruthless: if it doesn't do the job the way it should, get rid of it and find something better; there are always alternatives. For management the advice is the same, just at a higher level. Don't keep throwing good money after bad. Remember, if all the other lumberjacks (including the foreman) had just stopped and taken notice when the first one told everyone of their error (regardless of the embarrassment and inconvenience), they could have avoided cutting down the rest of the magical woods – home of the unicorn (because that was what they were chopping). Instead they could have started chopping the dark, dreary forest, like they should have been doing in the first place, and thereby made life better for everyone, including themselves.

Image by amateur_photo_bore

  • Steve Conover

    I don’t think the js example works. A sunk cost fallacy occurs when you argue that you should stay with what you’re doing simply because of all you’ve invested in the past and not because it’s the best choice starting right now.

    My best choice for doing development on the browser client side, no matter how much I’ve invested thus far, is Javascript.

    Javascript seems to me to be more of a collective action problem, or (closely related) an example of game theory/Prisoner’s Dilemma: if you and I and every other developer could enter into a contract (a contract with real penalties, that’s enforceable) that says “we shall not use javascript, we shall use client-side X (ruby,python,whatever)” – the world would change pretty quickly. Something approaching this seems to be happening with ie6 right now.

    Incidentally I think there’s a good-sized group of people who disagree with your conclusion about js (though I’m sure they’d agree with some of your points that led up to it). There’s a lot of energy around node.js these days; Javascript was chosen in part because the language naturally supports expressing callbacks.


    • Hi Steve,

      The JS example is not an example of a sunk cost fallacy on your part, but on the part of the whole industry. Your only choice for rich client side development is Javascript because, no matter how bad javascript has been over the years, it was still supported by the browser vendors. No one was brave enough to make the decision to get rid of it and stop supporting it, which might have prompted the rest of the industry to follow.

      • Mannemerak

        No, many tried to move away. Even the almighty Microsoft tried with ActiveX, but failed (debatable if this was any better than JS); SUN with Java (again…debatable).
        However you look at it, JS is not the wrong forest; it is probably not the best one to cut down, but not so bad as to abandon it and take the risk with something unknown.
        I agree with the game theory explanation.

        • You’re right, I was discounting things like ActiveX and Java applets, both of which were failures for different reasons, in my opinion. One for not acknowledging the fact that there is a world outside Microsoft tech, the other for trying to take the same road that Flash later took more successfully (but still not nearly successfully enough :)).

          • ActiveX and Java were never meant to be replacements for JavaScript but for HTML altogether. JavaScript is a means of automating the existing web. Neither ActiveX nor Java interact with the page itself but exist within it inside a container. Very different concepts.

      • Rick Knowles

        Hmmm … I’d argue that your admitted lack of deep understanding of javascript is misinforming your choice of example.

        It’s fairly common for people who don’t get javascript to blast it as a toy language, but once you’ve studied design patterns and compiler design / language theory rigorously you begin to see that javascript is by far a superior language to many of the so called Enterprise languages like Java / Ruby / C#. Python is about the only language I would put on par with javascript.

        Don’t let the o’reilly book covers fool you – there aren’t any better choices than javascript for the web environment: both for individual developers and the industry as a whole.

        • Hi Rick,

          You have added another voice in defense of javascript, which makes me lean even more towards making it my next language to learn (properly :)).

          • Sam Asjubas

            I agree with Rick. Javascript is like Python with a syntax that looks like C or Java. It’s actually one of my favorite languages to code in, and the only annoying problems come from cross browser inconsistencies.

        • I think you have Python and Ruby in the wrong order.

          Ruby is by far more expressive than Python.
          The similarities between ruby and python are staggering too.

          If anything Ruby has more in common with JS than python does.

          • Dave

            This is easily confirmed by studying how Javascript frameworks like Prototype adapted Ruby-style concepts such as a built-in iterator (each()) and as a result added much more Ruby-isms like collect(), with very little effort.

            Javascript can be one of the most annoyingly painful languages in the world to learn largely because 99.999% of the tutorials out there were written by 12 year olds making “kewl pages that do flashy things” rather than real applications. Take a look at something like jQuery to see how incredibly simple-yet-powerfully-expressive Javascript can really be with the right framework, or check out the Yahoo stuff to see “big architecture” in action, all in Javascript.

            For that matter, since you are considering learning Javascript, I urge you to read Crockford’s writings. He calls Javascript “Lisp in C clothing”, and he makes excellent points. He also essentially invented JSON and the OO inheritance mechanism that enabled large-scale Javascript frameworks like Prototype.

            JavaScript: The World’s Most Misunderstood Programming Language

            The Little JavaScripter (porting The Little LISPer tutorial from 1974)
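            For the curious, the inheritance mechanism Dave credits to Crockford is roughly the following pattern – a sketch of the `beget` helper from Crockford's writings, which ES5 later standardised as `Object.create`:

```javascript
// Crockford-style "beget": create a new object whose prototype
// is an existing object, instead of using classes and constructors.
function beget(proto) {
  function F() {}
  F.prototype = proto;
  return new F(); // same idea as ES5's Object.create(proto)
}

var animal = {
  name: "generic animal",
  describe: function () { return "I am " + this.name; }
};

var cat = beget(animal); // cat inherits describe() via its prototype
cat.name = "a cat";

console.log(cat.describe());    // "I am a cat"
console.log(animal.describe()); // "I am generic animal"
```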

          • Hey Dave,

            At this point I am almost 100% certain that javascript will be the next language I will look at deeply. I’ve gotten my hands on JavaScript: The Good Parts; now all I have to do is find time to actually read it :). Oh well, I will get to it eventually.

      • Ted Matherly

        I’m not a programmer, really, but what you’re describing, Alan, is not a sunk cost issue. The rationale you’re describing here is consistent with the reasons I’ve always heard for using JS, which is a network effect. People use it, for better or worse, effectively because other people use it. It’s widely supported and can do the job it needs to do (even if somewhat inelegantly). As another example, people don’t develop applications for the iPhone because they’ve invested a lot of time into learning how to program for the iPhone; they develop for it because it’s ubiquitous.

        It’s an interesting argument, but I don’t think anyone would rationalize their use of JS as being because programmers as a whole have sunk a lot of resources into its development and implementation. That said, the sunk cost fallacy is just as pervasive as you describe, and I’m sure there are plenty of other places in software development where it comes up.

  • Marco

    I don’t think JavaScript is a good example; clearly its biggest “bad part” is not a fault of the language itself, but rather of the browser makers. There are some inconsistencies, sure, but it’s very hard to get different browser makers working together to fix them, and you’d get the same problems there whatever the language.
    With regards to the books, you shouldn’t compare apples and oranges. I own both and they serve different purposes.

    I’m under the impression that the JavaScript *hate* days are over now that we have libraries that fix browser inconsistencies, and the more powerful language features are now widely known by the community, but I’m a bit biased as I work w/ JavaScript every day :)

    • As I said, I am no javascript guru, so maybe I should just bow to superior wisdom :), but it seems to me that javascript is a language that was quickly hacked together and almost accidentally worked out to be not as bad as it could have been. It gained a bit of traction due to no viable alternatives and now has too much momentum and code behind it for a superior language to come along and displace it.

      • Dave

        Javascript was created by Netscape during its crazy startup phase, when programmers were sleeping under their desks (literally) and they wanted a way to provide scripted interaction with the page objects. They got lucky and had a designer who knew what he was doing. Back then it was called Livescript. Then they nearly screwed the whole pooch by renaming it Javascript as a marketing ploy, to play off the growing Java name recognition. It worked too — millions of people conflated the two, to the chagrin of Java and Javascript developers everywhere.

        I believe another language could displace it, but it would have to be adopted by all the big browser players (Microsoft, Mozilla, Apple, and now Google), and realistically would have to become an official standard to really gain any traction with the web elites. From that point I expect it would be about five years before you started seeing Javascript displaced in significant applications, and even then it would be phased in alongside Javascript rather than replacing it 100% at once.

        Kind of like the other comment regarding not throwing out decades-old legacy code. Because by that point, Javascript WILL be a “decades old” language… :)

  • Al

    I’m sorry, but it is stupid to use that photo as evidence of *anything*.

    Have you read Crockford’s book? “JavaScript: The Good Parts” is an incredibly dense book; Crockford respects the reader’s intelligence. I frequently see books 5 times bigger cover 1/2 the volume of content.

    Most developers who think they don’t like JavaScript actually don’t like the DOM and how browsers ignore standards. I’m not a JS guru either, but in my opinion, JavaScript is a beautiful little multiparadigm dynamic language. You can use OOP patterns; you can use Functional Programming patterns.

    In my opinion, it is somewhat naive to use JavaScript as an example of the dangers of the sunk cost fallacy. JavaScript has some bad parts and some neutral parts that are easily abused, but what language *is* perfect?

    Download jQuery and look through the source. It really showcases how elegant JavaScript can be. Then download Node.js or fab or any of the new generation of out-of-browser JavaScript libraries. There are some pretty cool things happening in that space.

    However, it is valid to criticize JS for not evolving quickly enough, but much of that is due to its ubiquity and its dependence on the evolution of browsers, which is incredibly slow.

    • I have looked at “JavaScript: The Good Parts” and I agree that it is an excellent book – as you said, very dense. However I disagree that javascript is a beautiful language; you can use functional and OO patterns in many other languages as well, languages that are better designed, with more thought behind them. This is just a gut feel reaction for me though, so I may once again be wrong here :).

      The very fact that we had to have jQuery come along is an interesting case. We no longer code in javascript, we code in jQuery; no other language I know of needs such a crutch. But, as people have said, that may be more an example of browser incompatibility issues than of javascript crappiness.

      • Marco

        Alan, jQuery (and other libraries) mainly exist because of issues with the DOM, not problems with JavaScript as a language. The DOM specification is pretty much language independent (or at least should be), but it’s not consistently implemented across browsers, and it’s too verbose.
        Just look at what can be done with projects like Node.js, maybe it will change your mind a bit about JavaScript, but if you don’t like the language syntax or features nothing can be done about that. I don’t like Java, or PHP, and there’s nothing people can tell me about them that will change my mind.

        • I always keep my mind open, I do have strong opinions, but they are weakly held :) (don’t remember where I saw that analogy :)). Perhaps I will learn Javascript as my next language (either that or give Scheme another bash), at which point I might indeed change my mind, but right now I’ve seen way too many WTF’s regarding Javascript, to leave any kind of positive impression.

      • BlackAura

        jQuery isn’t there to make JavaScript (the language) more usable. It’s there to provide a simpler interface to the browser’s APIs (the DOM, events, and AJAX), which works correctly across different browsers.

        It also implements a few JavaScript features that are missing or broken in some web browsers, like a foreach loop and a JSON parser, and adds a couple of extra features, like the effects system.

        If you’re only using modern web browsers (latest Firefox, Safari, Chrome, Opera, and apparently IE9), you don’t actually need jQuery. These browsers already have nearly everything that jQuery provides, and they behave consistently enough that you don’t need to abstract the differences between them.

        Even in those browsers, jQuery is useful. DOM manipulation code is so clumsy to do manually.

        jQuery’s just a framework to help writing client-side web applications. Same as Rails is a framework for writing server-side web applications in Ruby, or ASP.NET / ASP.NET MVC / Castle / whatever are for C# / VB.Net, or any of the frameworks for Java.

        That said… Some bits of JavaScript can be a bit nasty. Getting anything resembling traditional OO working is fraught with problems, and you need to get it wrong plenty of times before you work out how to get it right. jQuery really does nothing to shield you from those kind of problems.
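        One concrete instance of the ‘fraught with problems’ point above – a sketch of a classic prototype trap, my example rather than the commenter’s:

```javascript
// Anything placed on the prototype is shared by ALL instances,
// so a mutable default like an array becomes accidental shared state.
function Team() {}
Team.prototype.members = []; // looks like a per-instance field, but isn't

var a = new Team();
var b = new Team();
a.members.push("alice");
console.log(b.members); // ["alice"] -- b sees a's mutation

// The fix: initialise per-instance state inside the constructor.
function SafeTeam() { this.members = []; }
var c = new SafeTeam();
var d = new SafeTeam();
c.members.push("carol");
console.log(d.members); // [] -- instances stay independent
```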

  • Hi Alan,

    Nice article. What do you suggest as the antidote to the SCF? Telling managers to be “more aware of it” feels like a solution bound to fail. :(

    • Hi Dave,

      You’re right, that is not the answer. Rather, you and every other developer need to be vigilant and speak up if you see it happening – not just as a single voice, but as a whole team. It is easy to ignore one developer or even two, they are just troublemakers, but when the whole team speaks, a manager would have to be stupid not to listen. That’s one way :).

  • Jay

    I feel the same way about C++

  • I can relate to everything about this article… until you bring up Javascript. Although plenty of developers don’t understand how Javascript works (Prototype-based inheritance, the fact that it’s a functional language, closures and “this”, etc.), that doesn’t make it a bad language, a waste of time, or a wrong choice.

    The language a developer chooses to use is a wrong choice *if and only if* he doesn’t understand it well enough to write decent code, and at that, there’s something to be said about working at a language and learning how to kick ass with it.

    Finally, it should be brought up that no one’s desire should be to make Javascript (or any other programming language) work perfectly. Any idiot could tell you that *every* programming language has its strong and weak points and trying to make Javascript “perfect” (or, again, any other language) is a complete joke.
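    The closures-and-`this` gotcha mentioned above, sketched in a few lines (the names are illustrative):

```javascript
var counter = {
  count: 0,
  inc: function () { this.count += 1; return this.count; }
};

counter.inc(); // 1 -- called as a method, `this` is counter

var inc = counter.inc; // extracting the method detaches `this`
// inc(); // would NOT update counter -- `this` becomes the global
//        // object (or undefined in strict mode), not counter

var bound = counter.inc.bind(counter); // the standard fix
bound(); // 2 -- `this` is pinned to counter again

console.log(counter.count); // 2
```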

    • Hi Josh,

      Perhaps javascript wasn’t the best example, but it is not about developer choice when it comes to javascript. Developers really never had a choice when it comes to rich client side development. And javascript has a LOT of things wrong with it, but the industry as a whole never really had the guts to replace it with something better or at least provide an alternative (besides the whole flash debacle), due to the early traction that it received.

      • There were alternatives to Javascript, but both were IE-specific ones – VBScript and JScript. Both (thankfully) seem to have died out. I agree on the lack of credible alternatives, though I think that might also be due to the awesomeness of Javascript :-).
        Interestingly, just before this post I was reading about an idea to implement the .NET CLI, which is an ECMA standard, in the browser engine.

  • Matt Jankowski

    The current goodness or badness of Javascript, as it can be used on the web via browsers, isn’t really a good example of sunk cost, so much as it is a market outcome, or a constraint/externality.

    For example – let’s say I want to start making cars and sell them in the US – but I want them to run on electricity! Well, that’s going to be tough, because there are decades of infrastructure in place around petroleum powered cars. Is that an “industry wide sunk cost”? No…well, what if we all agree that electrical cars are better and more efficient, and less noisy, and honestly, look more futuristic? No, it’s still not a sunk cost – it’s the result of tons of successful endeavors, each of which was better than whatever it replaced.

    That being said – I agree with your general assessment that developers can be more proactive (hell, more responsible!) with how they approach calling out sunk costs when they see them.

    I was talking to someone this weekend and we made the observation that “good developers” will find the time to “secretly” refactor needy code, despite the instructions of their superiors, and that if you could somehow capture “covert” code improvement, it would be a great way to find good developers. I suspect the same is true for the willingness to bitch about and call out silly commitments to obviously broken projects and codebases.

    • Hi Matt,

      Yeah, I am starting to think javascript was not the best example, although most of the points I made are still valid, I feel. You’re right in that it is more of a market outcome; still, it never hurts to bring up the fact that javascript is not as great as some people would have you believe. Too much javascript love around lately :).

  • David

    I agree at a project level, but isn’t it also a question of what control/influence you have?

    So in your example, how would you replace JavaScript? A website needs to support the browsers its audience uses. It’s taken this long just to get a version of Internet Explorer that (broadly) renders CSS correctly, so how would you go about (a) producing this better and presumably open source scripting language (I’m assuming here that you’re dismissing plug-in based solutions like Flash and Silverlight), and (b) making MS and everyone else support it?

    Or am I missing your point?

    • Hi David,

      You’re right, it is difficult. The place to start is not to replace javascript, but to provide a viable alternative, and yeah, we won’t consider flash a viable alternative. If even one major browser provides an alternative scripting language for client side dev, it will be a start. Some people don’t care about cross browser compatibility and they may choose to use the new language. After a while there may be sufficient traction that more people will start adopting it, and other browsers may start supporting it to remain competitive.

      This doesn’t make it less difficult, but it is a start. The alternative is to never do anything, in which case the best you’re ever going to get is the current status quo, or custom hacks built on top of existing foundations not designed to support those hacks – not a good situation.

  • Mike

    Well said! I have also experienced this numerous times. And the problem is it goes into a cycle, oh we spent $1mil on X – can’t throw it away now. So another 500k is spent. But now we’ve spent $1.5mil on X – there’s no way we can throw that money away!! *sigh*

    Also totally agree with you regarding Javascript, I’d even go further and include HTML, Javascript and CSS. They just simply were not designed to do what we are now doing with them – and new functionality and workarounds keep being gaffa taped on top. Hopefully HTML5 will fix this (not holding my breath though) – and now there’s some okish alternatives like Flex and Silverlight.

    anyway good read ;)

    • Hey Mike,

      Yeah it’s quite a vicious cycle, which is why one should be very careful about considering cost already incurred when making future decisions.

      As I said, there has been entirely too much javascript love going on lately, which just goes to show that you can even start to think that torture isn’t so bad when you suffer it for long enough :).

  • Unlike most everyone else I agree with the JS example. As someone who has read most of the ‘big’ JS book (and promptly forgotten it) I can confirm that there is an awful lot of whale-guts in there. The extent of the language is huge and so is the cognitive burden of comprehending it.

    Whilst I also agree with your central point, in my experience the sunk-cost fallacy happens real slow. Now, if you were actually to spot the sunk-cost fallacy emerging over a period of time then there’s another problem. What will it cost to fix the code that led to the sunk-cost fallacy? Let’s call this the salvage cost (be it money, sanity or time).

    For it to be truly worthwhile abandoning the sunk-cost, the salvage cost must be less than the *future* sinking-costs by some measure. This must be true because if you have sunk all your costs, and there is nothing left to pay, then abandoning it all just because it was a mistake is an even bigger mistake. No one would ever do this I hope. But if there are STILL costs to pay but they’re not very big, then you really need to estimate your salvage costs as carefully as you can. Again by some appreciable measure.

    Now, I’m willing to believe that the same lack of foresight that led us to the sunk-cost fallacy might also lead us to make a bad estimate of the true salvage-cost. Which in turn leads me to the conclusion that your conclusion “it’s never too late to fix it” is perhaps misleading. Sometimes it REALLY IS too late.

    Thankfully I think on large projects, like JS, evolution will save us. Like a living-organism, JS will continue to evolve until it reaches the limits of its capability and then it will either sink without trace or become something new and better.

    Now, if we can only wait/live that long :)
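    The salvage-cost rule above can be captured in a couple of lines – a hypothetical sketch (the function and names are mine, not the commenter’s):

```javascript
// The sunk cost deliberately never appears in the comparison:
// abandon only when salvaging is cheaper than the costs still to come.
function shouldAbandon(salvageCost, remainingFutureCost) {
  return salvageCost < remainingFutureCost;
}

console.log(shouldAbandon(200, 500)); // true  -- cutting losses is cheaper
console.log(shouldAbandon(800, 500)); // false -- finishing is now the rational path
```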

    • Hi Steve,

      That’s very insightful, I have to agree with pretty much everything you said. You’re right in that sometimes it IS too late to fix it, but I would argue that is because you have become aware of engaging in the sunk cost fallacy a bit too late. Or even more accurately, you have been aware for a while, but couldn’t bring yourself to do anything about it, until circumstances have forced you to face reality.

      At this point it may be too late to fix it, but the best course of action is probably to cut your losses and go on your merry way, unless as you mentioned the salvage-cost is bearable. However, once you find yourself in a morass, it is very hard to climb out and not be badly affected. So even though you may be able to salvage, you should be planning to salvage what you can and then cut your losses, rather than salvaging to try and continue operations indefinitely.

  • Matt

    In my (somewhat limited) experience, the cost often has less to do with dollars and more to do with ego, pride, and reputation. Admission of being wrong in software design and decisions is not treated as an evolutionary step forward in the design process, it’s treated as a problem.

    On the other hand, how can you blame them? Most developers are not very good at what they do. From the management perspective, developing software is a painful experience in most enterprise settings. It’s a place of bad design, poor execution, and massively missed timelines. The cost of a mistake is not just the sunk cost, it’s the rework cost and the massive risk of sinking cost into the next mistake.

    It’s sad that the sight of a piece of “working” software outweighs all other considerations. And frustrating though it may be, like most problems in software development, the source of the problem can eventually be traced back to developers.

    • Hey Matt,

      Very good points, I have seen/heard of plenty of projects where any kind of working software was such a massive step forward, that everyone was quite willing to overlook anything else to get it.


  • David

    Nice blog – couldn’t agree more with whatever you said. Keep up the good work!


  • TonyB

    From the book “I’ve been thinking” by New Zealand politician Richard Prebble:

    ” The Post Office told me they were having terrible problems tracking telephone lines … They found an excellent program in Sweden which the Swedes were prepared to sell them for $2m …. So the managers decided to budget $1m for translating into English and another $1m for contingencies.

    … But, as the general manager explained, it had turned out to be more expensive than the contingency budget allowed and they needed another $7m

    “How much”, I asked, “have you spent on it so far?”

    “Thirty-seven million dollars” was the reply.

    “Why don’t we cancel the programme?” I asked

    “How can we cancel a programme that has cost $37m?” they asked

    “Do you believe the programme will ever work?” I asked

    “No, not properly”

    “Then write me a letter recommending its cancellation and I will sign it”

    The relief was visible. I signed the letter, but I knew I needed new managers. …. “

    • I can’t even say how many times I have witnessed or heard of a similar situation – still a great story :).

  • John Armstrong

    You really have to balance the concept of Sunk Cost with the concept that Real Developers Ship.

    All software sucks. Digital technology is a horrible abstraction for the continuous nature of the real world. You’ll never meet a developer who can truly say ‘Hey, that’s exactly what we meant to build, it’s utterly awesome and has no compromises or terrible little bits under the hood’.

    It all sucks. Real managers (and developers) balance general suckiness vs business goals.

    Sure, the sunk cost may be $1.5 MILLION DOLLARS but if your product is grabbing $10 MILLION dollars of revenue -per year- for 10 years AFTER the ‘sunk cost’, it’s actually a fantastic business decision.

    Your sunk cost examples only discuss internal IT projects vs a shipping concern. In fact, if that $1.5 MILLION DOLLAR internal IT project is supporting $25 MILLION dollars of business -per year- maybe it’s not so bad? For a valid comparison you have to examine the business context the investment lives within.

    You also do not mention that Greenfield development (the only alternative to a Sunk Cost from the perspective of ‘we are building it ourselves’) is almost inevitably doomed to failure.

    Joel Spolsky has a great article on this :

    “When you throw away code and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work.”

    • Hi John,

      I get your points, but I actually disagree with Joel regarding throwing away code. He is looking at it as a business owner, which inevitably means you cringe at throwing away anything you spent money on. You’re in fact not throwing away any knowledge if you retain the good people who built the original code in the first place. Throwing away bug fixes is not such a bad thing. In fact, I throw away the code I write day-to-day quite regularly, and believe the code I write to replace it is better as a result.

      The “years of programming work” mentality is precisely what sunk cost is all about. It doesn’t matter if you’ve done years of programming work; you can’t get those years back, and you shouldn’t use that reasoning to guide future decisions. If the current product is crap, unwieldy and hard to maintain, and the only thing you get from it is “potential future benefits” – get rid of it and do it better.

      • John Armstrong

        First off, ‘looking at it as a business owner’ is absolutely the right thing to do. We do not live in a platonic world of software for software’s sake. We need to eat, and we eat by selling our software.

        More generally, how much code you get to toss depends on your scale. Sure, throw away code you write ‘day to day’ and when the team is small and discrete in time and space you can throw more and more away. We throw away a lot of new code on a day to day basis. No Sunk Costs on New Code.

        But it doesn’t scale on a legacy platform. Software lives a lot longer than we think. For example, in my last position I managed a 10-year-old code base serving millions of page requests a day. Throwing that away is not an option – you can’t replace the train while it’s running on the tracks. Ten years of business rules developed by 4 VPs/CTOs with different philosophies, compounded by a founder’s initial brilliance (in 1999), insane early architectural decisions, and probably 15 different developers over the years, mean that Joel’s points (re: the arrogance of thinking one can replicate years of accumulated knowledge) hold just fine for long-term / large-scale development.

        Additionally, any such development effort scales beyond the ability of any one team member to have perfect recall of all past business decisions. Imperfect documentation, fire-fighting, many developers on a project, and a constantly evolving business landscape create an environment where simply “retaining the good people who built the original code in the first place” is not enough to preserve any reasonable window into the many business situations that led to something being the way it is on any given day.

        I also think you may be misinterpreting Joel’s point on ‘bug fixes’. It’s not just the bug fix; it’s what the bug fixed, what business logic lives under that, how many customers depend on it, and why it was put there in the first place that really matter. It’s the complex interactions of countless instances of these attributes in large, actively used systems that make it inadvisable to throw it all away.

        This is not to say that identification of wasted effort and a concentrated focus on rebuilding is not absolutely critical! You always have a list of ‘things to throw away’, but you do them in line with the business efforts, one at a time, and this requires a massive level of investigation so that you can understand things like why, on every 5th Tuesday, the date for customers using MyFooBits(int count) who have the setting “DoesFooBits==true” set on any object that was instantiated before 2006 is automatically set to NOW()-1, but only when run from cron with a specific flag set (almost a real example, sadly enough).

        (See Working Effectively with Legacy Code by Michael Feathers for a lot more discussion on this topic.)

        Also keep in mind that very large systems that serve very large audiences have unintended ‘features’ (I call them ‘experiential features’). Things customers do that were never intended and thus are undocumented and unknown but absolutely critical functionality for that customer. You can’t Greenfield these ‘experiential features’ and to lose them means you lose customers. No big deal when someone is giving you $25/month but when you just cashed a check for $20k for a 2 year contract this sort of thing matters.

        So I agree with you, in a world unbounded by finances you are absolutely right. You are also right that, on a day forward basis you should strive to throw away junk and build it better. Unfortunately in large scale, commercially driven, decade scale development, this perspective, applied uniformly without regard to business objectives, can be extremely destructive.

        And before you say ‘decades! please! nothing lives for decades’ – in my 15 years of work I’ve spawned or worked on at least 3 decade-old platforms. It happens a lot more than we think.

        • John Armstrong

          Holy moly that was long. Did not mean to write such a diatribe :)

          Just strong feelings on the topic after recently leaving 2 years of dealing with a massive, horribly architected, horribly scaling legacy system that customers absolutely loved and adored. Sorry if it came off as sanctimonious or too ‘veteran of the coding wars’-ish.

          • Haha, not a problem – I actually really appreciate hearing a strong opinion, especially if it brings to light points that may not have been considered, and I am sure other people do as well. Cheers for sharing your thoughts.

        • You’re right in that software does live on for years. I would say though that it often lives on for way longer than it should. Of course once you have a system that is a decade old with perhaps hundreds of different people’s input plus everything else you’ve mentioned, you really have no choice but to keep it, as you don’t have the capability to replace it. The thing is – it should perhaps have been replaced many years before the situation got to this point. Or at the very least the time should have been invested over the years to incrementally improve the system until it is actually decent. If this wasn’t done, then you’re right, it is a rock and hard place situation.

          • John Armstrong

            “Or at the very least the time should have been invested over the years to incrementally improve the system until it is actually decent”

            Exactly, this is the opposite of ‘throw it all away’. Incremental improvements happen constantly with an eye towards throwing away small bits with the knowledge that eventually all of those small bits add up to big improvements.

            I guess I don’t consider this approach ‘the very least’ – it’s a viable software business strategy employed by any organization that has a living product that has been in production for more than a few years and is actively used by customers.

            It is also important to remember that today’s ‘enlightened coding practice’ is tomorrow’s stupid! stupid! stupid! decision. This leads inevitably to the fact that any implementation that outlives the fad that spawned it becomes legacy code. Since it outlived the fad, it’s obviously useful and being used. Do we toss it because it’s no longer in vogue, or do we ‘incrementally improve it’ to get it closer to current best practices? We all know the answer here.

            Good post, it’s been fun to think about!

          • Mike R

            Great article Alan, and valuable insight from John – I wish there was more constructive forum debate like this!

            One point that I’d like to add is that we often tend to overestimate how much of the cost is sunk into the code itself. The most expensive part of developing software is gaining a deep understanding of both the problem and the solution domains.

            If you still have those people with the understanding, then a code re-write would be significantly cheaper. For a complex decades-old system, the cost of gaining a deep and broad understanding would be huge. The evolution of those systems tends to slow more and more until they are replaced by a revolution.

            BTW, I agree with your JavaScript example (and HTML for that matter), but I find the more interesting debate is around if/when/how to re-write.

    • Rob Whelan

      Ha! I had that same exact link ready to post once I finished reading through the comments, amazed that no one was pointing out the massive pitfalls of “starting from scratch” on a project of any significant size.

      Joel’s article is also somewhat one-sided, but his points are valid, and this was a tidy way to put the reason for the skewed evaluations: “It’s harder to read code than to write it.”

      The proper way to “throw away” a huge, ugly mess of code in most cases is to refactor it gradually into non-existence, not throw it away first and *then* see if you can reproduce functionality you may not have even fully understood.

      Another way to put it — the sunk cost fallacy absolutely applies to software development decisions, and if you have access to *reliable* estimations of effort required for the various different options, the decision is easy (and every once in a while it’ll be the right thing to jettison an entire codebase). But if you rely on the little-more-than-gut estimates (and natural excitement at starting fresh, with noble aims of “doing it right this time”…), sometimes you’re really going to suffer in the long run… and due to budget overruns, last-minute hacks, desperate fixes, etc. after a few years your new project will be just as much an ugly mess as the one you replaced (but that ugly mess exchange will have been so very much more expensive than working to clean up the first mess…).

      You mentioned day-to-day throwing away of code in a comment below — that’s great; that’s refactoring, and that’s the real secret to good architecture & code, not starting from scratch every time it feels like a good idea.

      [reading more comments… apologies if I’m just rewording John’s points too much here]

  • MisterMister

    As others, I disagree with your JavaScript example.

    I think there are two important points.

    1) I think you have an exaggerated sense of how “bad” JavaScript is. It has its problems, but it’s really a nice language and its ecosystem is improving.

    2) That’s not *precisely* the issue.

    The whole crux of what you are talking about is this: “Would it have made more objective sense to start over?” What you are talking about is people letting their emotional and financial investment cloud their judgement in the aforementioned decision.

    In your first example, there was a point where an objective party would say: “quit while you are ahead” or “stop throwing good money after bad”.

    In order for JavaScript to be an example of what you are talking about, there has to be a point where it would have made objective, good sense to abandon it.

    The problem is that that point doesn’t exist – there was no point at which it was both feasible to abandon JavaScript and simultaneously worth the huge hassle, except perhaps the very beginning…

    Yes, in theory we could start throwing weight around and try to force the entire industry to something better….but you have to ask yourself if that makes sense, and I don’t think it does because JavaScript isn’t THAT bad and definitely not that hopeless.

    It makes more sense to try and fix JavaScript than it does to try to move to something else (at this point).

    There’s always going to be some hypothetical better way to do it….the whole point is to know when you should give up on what you are doing now and THAT’S when you start worrying about said “better way”. Otherwise you’ll be forever chasing rainbows and the moving target that is the “better way”.

  • It’s not fair to always pick on JavaScript. JavaScript is actually a quite good language, and projects such as node.js and the V8 engine have picked up a good amount of steam. I’ve also seen interesting use cases of using JavaScript for configuration files in Java projects via Rhino.

    What people hate is JavaScript and the DOM. That’s what sucks.

  • Ken Jackson

    Alan, stick to your guns on JavaScript. It really is an ugly language. Sure it’s powerful, but so is x86 assembly language, and I don’t write in that anymore either (at least not by choice).

    But like x86 assembly, the people who have to write in it tend to defend it. And they can show you cool stuff that can be done with it (I did some really cool stuff with Logo when I was 6), but it doesn’t change the fact that JavaScript violates several of the key tenets of a good language: basic things should be easy, simple mistakes should be hard to make, and difficult things should be possible. Of these, JavaScript really only satisfies the last one.
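    To illustrate the “simple mistakes should be hard to make” point, here are a few classic JavaScript behaviours (all standard language semantics, runnable in any browser console or Node; the `config` object is just a made-up example):

    ```javascript
    // Classic JavaScript gotchas: simple mistakes fail silently
    // instead of loudly, which is what makes the language feel unsafe.

    // Loose equality (==) coerces types in surprising ways.
    console.assert((0 == "") === true);            // "" coerces to 0
    console.assert((null == undefined) === true);  // yet null == 0 is false

    // The + operator concatenates strings, while - converts to numbers.
    console.assert(("5" + 1) === "51");
    console.assert(("5" - 1) === 4);

    // typeof null is a decades-old quirk, kept for compatibility.
    console.assert(typeof null === "object");

    // Accessing a misspelled property is not an error; it yields undefined,
    // and the resulting NaN propagates silently through arithmetic.
    const config = { timeout: 30 };
    const total = config.timeOut + 10;  // typo: timeOut vs timeout
    console.assert(Number.isNaN(total));
    ```

    None of these throws an error – they all fail quietly, which is exactly the opposite of “simple mistakes should be hard to make”.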


  • I see a problem with this sunk cost fallacy: the alternative could be much worse. What would happen if, industry-wide, there were a tendency to just dump emerging projects because they had certain flaws? Nothing would ever reach maturity, and endeavors would be killed at the first signs of trouble, never getting the opportunity to become useful through their good parts.

    In my opinion, it’s not the sunk cost that is the problem; it’s the fact that we expect stuff to be all lean and have no flaws in the first place. Software is hard and complex and must constantly accommodate hard-to-implement and complex changes. All this takes more and more money as more and more functionality is desired from the same product. Killing the product and starting all over again will:
    a) cost at least as much as it did to get to the current stage;
    b) have to be repeated as many times as necessary when new functionality is added over time and the current version becomes bloated.

    So, as you can see, there’s money down the drain both ways; the best you can save is some degree of frustration for a while.

  • I don’t like JavaScript either, because it sucks.
