Imagine doing estimations, management "transforms" them into commitments, then those things don't happen, then you have a review meeting to "analyze" why the estimations were wrong, then you have a third meeting to re-estimate the estimations and do it all over again. Bonkers :))
Review meeting??? No need. You have to finish the work within the estimates, working overtime and on weekends (unpaid), so, in the end, all went well 😏
When you give estimations, remember to add in your error bars, and remember to tell them that your error bars are measured in orders of magnitude. When things turn out wrong and you have to explain how wrong, you just measure how many orders of magnitude it was off, compare that to how unlikely it was to be that far out, and then find that you have insufficient evidence to prove that the outcome was not within the model of your prediction.
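That tongue-in-cheek check is easy to make literal. A tiny sketch (my own illustration, not anything from the video): measure the miss in orders of magnitude rather than in hours.

```python
import math

def orders_of_magnitude_off(estimate_hours: float, actual_hours: float) -> float:
    """How many orders of magnitude the actual effort differed from the estimate."""
    return abs(math.log10(actual_hours / estimate_hours))

# An estimate of 10 hours that turned into ~120 hours of actual work:
error = orders_of_magnitude_off(10, 120)
print(f"{error:.2f} orders of magnitude off")  # ~1.08
```

With error bars quoted that way, almost any outcome really is "within the model of your prediction".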
Discovering this channel and listening to these lectures is like coming up for air, whilst drowning in a sea of "this is fine" fire memes that are actually my day-to-day reality.
First question I always asked was "who are you giving me for the team?" It makes a big difference. I know who's productive and who isn't. In bigger organizations that's hard to know unless you are organized in smaller teams. Next question is "has this been done before?". My boss once asked for an estimate and told me he thought 2 man-years. I ended up telling him 6 man-years and he said the customer wouldn't pay that much. I told him "then we don't want the contract". The customer paid for it, much to the surprise of my boss. It ended up taking us 5.5 man-years and we made a little extra profit. That solution is still in use some 20 years later.
We had a bright guy who estimated (say) a year to do a project. Management asked him: what if we gave you three guys? He said: "Five years. First they have to do my PhD, then we can build it in a year."
@@stevecarter8810 That's just arrogant. PhDs really think they are special, but if they are as good as they believe they are, they should be able to explain and instruct others to do what is needed. If they CAN'T, that's an indicator of their incompetence. Why have PhDs if they are unable to share their knowledge in a feasible manner? To have code monkeys with a degree? Unless the task is really difficult and requires a high degree of specialization for everyone involved... but we are talking about the top 0.2% of task difficulty here - tasks so complicated that a PhD cannot be expected to lead them. I personally work (and have worked) on tasks that normally require "experts" in that category, and I aim for the highest performance.
@@brianviktor8212 Hmmm. Someone calling someone else arrogant for allegedly doing something in circumstances he personally didn't witness, taking the piss on PhDs, and ending with "I often do an Expert's job and I'm Super Good At It". Well, that sure paints a picture.
@@maianoguillaume My comment is from 5 months ago... It's a worldwide, single-realm, non-instanced, very high performance MMO with a distributed architecture that allows the addition of physical game servers to the server cluster, where an algorithm ensures the server load is automatically distributed, given a dynamic cluster system. It uses my own TCP/UDP communication library, which is also quite powerful. It is done. It took me 2 months to do all the logic. I just need to do some more bugfixing, testing and client implementation. This technology competes with the best of the best that exists, and could take #1... if it were a competition. It had a level of difficulty you cannot even imagine. Gemini rated it 12/10. Not that it matters... each single attribute would take teams of developers months of work. And the entire thing? It doesn't even exist (yet). Anyway, regarding PhDs - my point remains unchanged.
I love that you've framed it as the decision of which thing to build. So much of the toxic estimating is "How much can we do in this sprint?" or "Will I get this next week?". When I've found estimates valuable, it's been in decision making - and usually with an order-of-magnitude style of sizing: is this "hours, days, weeks, months, or quarters?" On the estimation of value, I have not found the estimates themselves particularly valuable, but the process of discussing them with a couple of product managers has uncovered interesting and valuable disagreements.
I look at this problem from a different, but complementary angle: a buy vs build (or create) mindset. I've seen decision makers demanding estimates because they have a buy mindset. This is the thinking we do when buying anything, from a book to big corporate software: how much does it cost, how long will it take to deliver, and do I want it badly enough to put up with those two? This works because the thing is already built. The risks the builder took and the unknowns discovered are already built into the price. We just need to decide if we can afford it. When building something new, there is no way to know for sure how much it will cost and how long it will take. And approaching the decision to build something with the mindset of buying something makes everything worse. Decision makers need to be educated into shifting their mindset if they want to build software and really understand the risks and uncertainty that come with it.
Buy vs build is often not an honest exercise; people often fail to account for the complexity of integrating the software. Even then, most vendor software gets you 85% of the way there, so you need to plug the remaining 15%, and when you don't own the code that is a serious issue. I've been on both ends, both in the enterprise as a developer and at the vendor. Vendors are about selling; they typically will not customize for customers, and in general the sales people lie about their capabilities. Unless the vendor package does exactly what you need it to do, there is no "buy" - you'll be building regardless of whether or not you buy the vendor's package.
That's a really good point. In one of the links Dave shared, that author writes about looking at assets from a 'total cost of ownership' view. This applies equally well to software. Looking at the TCO, building and owning the software has a lot of uncertainties but also a lot of opportunities. Both of which are hard to price into the value before you start any work
However, this is the reality of business. Big corporations want to know the cost and delivery date before signing any contract, and trying to convince them that “we’ll deliver value every month and charge you a small fee” will not get you the contract and in fact will damage your reputation, making you look like amateurs. So what do you do then?
@@Jedimaster36091 The reality is the software company is taking the initial risk. There needs to be a product before you can sell it, and typically if you are selling “to be built”, there will be tollgates with acceptance criteria and a prescribed payment schedule; even flexible dates and times must be mutually agreed, and missing dates usually carries a financial penalty. These types of arrangements are more a partnership and less a product purchase relationship, as the software firm will not take on all the risk. Typically in these circumstances a product that does what is needed doesn’t exist. I’ve never encountered a vendor willing to do a fixed-price delivery for a new build that wasn’t insanely high and without loads of conditions; they all want to do T&M, and if not, it is more of a partnership. The bigger question on the build is how you measure it as a business success. A product can be fantastic, but if it costs more to build than you could ever recover from sales, it’s a failure. So when deciding business and product viability you need to have some idea of cost to deliver vs revenue. I think there can be some SWAG of what a build will cost and how long it will take to go live given a determined team size. Where estimation goes off the rails is when you try to get more micro and accurately determine time and cost for individual components. I feel like experienced software people can tell you with 99% certainty that they can deliver something under a specified cost and schedule using a worst-case-scenario estimate, which is what should be done.
I've worked on 2 projects in which I can estimate that 80% of the time was spent on estimations, deadline negotiation, explanations on why the deadlines were missed, re-estimation, and anxiety-coping activities to deal with the stress of missing those deadlines.
In both situations, the stakeholders came from construction backgrounds (where estimations and detailed planning make sense) and did not accept the team's "software is different" reasoning.
I'm so annoyed with people who believe they are improving their estimating skills when, in reality, they are becoming more adept at manipulating their lunch breaks to consistently finish work exactly on time.
I'm always thinking about giving a lecture to my dear colleagues about how developing software actually works, so that they can see how stupid their ideas are. People think we are some kind of craftsmen pulling up walls or something. What really bothers me is the idea of learning from past estimates. That is insane, because everything we do is new. And if we were really doing the same things over and over again, we'd be doing it wrong!
"giving a lecture to my dear colleagues about how developing software actually works so that they can see how stupid their ideas are" -- that approach will ruin your career.
That's exactly why I'm not going to do it. But if you do not communicate where things are going in the wrong direction, nothing is going to change. So the "lecture" has to come in small doses, and it should be respectful after all. The hardest part is to present an alternative that people can agree on, so you can only move forward step by step. That is often quite frustrating, and in my case leads to these kinds of thoughts all the time. Nevertheless, you have to remain professional.
@@madmanX1314 Things aren't going to change regardless. You want to dive down the rabbit hole of power struggles that have existed for longer than mankind, have fun. But it won't get you anywhere.
Well, be ready to get a lecture in return on how company management, resource allocation, and marketing fund allocation work. I have done both things, and I will say software development is not harder than navigating business and government regulations that demand beforehand estimation of cash flow.
Even simple tasks can have massively wrong time estimates. I worked on something recently that I thought would take just a few hours and instead it took 3 days to do. Once I was actually building the system I noticed that the data I was working with had a number of mathematical properties that made the problem much harder to solve. Trying to do that analysis ahead of time to come up with an estimate would have been most of the time to solve the problem. At least in my case, nobody really cared that it took longer they just cared that it was as correct as we could reasonably make it.
Hah, I was once asked to add a field in a Java entity because a new column had just been added to a table in a database. Shouldn't take very long, right?

First, nobody really knew how it was going to be used or for what reason. But we needed it, apparently. So it took a lot of digging. I asked about the data type. I was told it would always be a 1 or 2 digit number. So I added an integer field in the model. But it didn't work when I tried testing it. The column had not been added in the test environment. I told the database team to add it in the test environment, and that took a long time for them to do. I then had to connect to the database, but there were access issues. Eventually I got access, and I saw a bunch of 9 digit numbers and letters and got a bit confused. I was told it was just for testing and I should expect 1-2 digit numbers only in production. I was told to just deploy my change, because it was a simple change. It didn't work. The column was defined as a char type in production. I fixed that, but still it didn't work. My conversion from String to Long somewhere else in the code failed when the number was only 1 digit, because they would pad it with a space in front. So I had to trim the number before converting.

A bunch of other things happened as well that I just can't remember. I think I suddenly lost access to my work accounts, or a certificate expired, or some transfer job failed and needed to be rerun correctly. And during this entire time, every little change had to go through a tedious process of tickets and hours of waiting for each reply. It took almost 2 god damn weeks to add an integer.
Assigning story points feels like an exercise in futility, especially in my current project, where estimations only exist to decorate decisions that have already been made by the client. They're more of an excuse to justify development time than anything.
I am using a simple tool that looks quite similar to WSJF, but not with the same purpose. I ask stakeholders to give me their perspective:
- What benefits would we like to create?
- What happens if we don't do this soon?
- What risks can we predict?
- What's the simplest way to achieve this?
What typically happens is that the answer is, "We don't know." I then take it from there... Divide and conquer, slice and dice, experiment, inspect and adapt. But as a side note, it's amazing how common the pattern is: "Developer, I need you to tell me EXACTLY how long this will take!" - "I don't know. Can you tell me more about what 'this' is?" - "I don't know."
My rule is... estimate the time you think it will take to do. Then multiply by the number of BIG questions you don't know the answer to, because the client doesn't know either. Then multiply by 2, and you'll get a pretty good number of hours for what you need to finish the project. 🙂
That’s pretty much been my method too. And you know what? I can count on one hand the times those estimates have been off in the last 5 years. Meanwhile, I keep seeing dev teams underestimating over and over again, and having to work weekends to satisfy leadership, because the dev team is too timid to give a more reasonable estimate.
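For what it's worth, that rule of thumb can be written down directly. This is just the commenter's heuristic as I read it - treating "BIG questions" as a count of major unknowns is my interpretation, and the figures are made up:

```python
def rough_estimate_hours(base_hours: float, big_unknowns: int) -> float:
    """Gut-feel estimate, scaled by the number of unanswered BIG questions, then doubled."""
    multiplier = max(big_unknowns, 1)  # with no unknowns, keep at least the base estimate
    return base_hours * multiplier * 2

# 40 hours of gut feel, 3 big open questions the client can't answer either:
print(rough_estimate_hours(40, 3))  # 240.0
```

The doubling at the end is doing the same job as the wide error bars discussed elsewhere in this thread: absorbing the unknowns you can't enumerate.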
A technique that has worked for me is to start off with a high margin of error. When stakeholders balk and request a more refined estimate, tell them that will require investigation and perhaps a prototype. That itself has a cost, but usually it's a tiny fraction of the entire project. The act of developing the prototype will not only uncover where most of the work is going to be, it usually results in a demo, which can help the stakeholders refine their estimate of the final VALUE of the feature.
So my takeaway, as an absolute junior dev: in order to solve the problem of estimation, estimates must be very 'coarse'. It is not wise to try to estimate the benefit/profit of a feature down to the cent; rather, it should be in very rough terms (e.g. the benefit of Feature 1 is in the ballpark of a few hundred thousand dollars a year). Similarly, the cost must also be estimated in a very ballpark manner. And if the benefits far outweigh the cost, those features are to be prioritized first. The smaller this benefit-cost ratio, the lower down the priority list the feature goes.
I had a boss once who expected me to be able to just come up with an estimate 90-95% likely to be correct, with a day or two of precision. He expected that for everything we ever considered doing - including large projects. I couldn't get my head around how he would use such a number even if it were possible. He hadn't a clue what the dang feature was, what it meant, what value it provided, or anything. I got the impression he worked for a company before that did the same thing over and over again. He couldn't understand that a feature that may sound, in English, similar to something else can be totally different from the other task. Integrate with some software that we don't know the protocol for, or whether it even has a protocol? Sure, we've integrated with things before, how hard could it be?
Managers crave these numbers because it gives them a sense of control. They crave it so much that they deny the impossibility, like the dictator who denies that good and decent people could ever want to be rid of him.
Here is a way to "solve" said issue: tell him that such an estimate will be very expensive to calculate - specifically, comparably expensive to making the thing - and that he should be sure he wants to pay that beforehand, and that providing an accurate estimate of that estimate is similarly expensive. If he wants to go ahead anyway, the idea is simply to develop the entire thing while measuring it, and then provide the measured time as the estimate. That is the most reasonable way I can come up with to fulfill his demand, and he was warned that it would be that expensive.
@@sorcdk2880 Yes, we had several discussions over how long it would take to come up with the estimate. This started with something reasonable, like taking weeks to develop the plan and estimate, and ended with needing something on the spot. It is important to reason with people, but at some point you come to the conclusion that the person isn't trying to be reasonable. I ended up switching jobs.
I have worked for 40+ years in product development, SW and HW. I have concluded that guessing (estimation) by comparison is much more accurate than guessing by synthesis or detailed analysis. Business owners want to know how long something will take, and I've found that thinking "we recently did product A in 18 months... this product B is a bit simpler, but we have lost a team member, so maybe it will take somewhere between a year and two years! Product C is waaay more complex and has aspects we are unfamiliar with - it will take 3 to 5 years" works much better than a Microsoft Project plan with every task defined and resource-levelled etc. That will usually give an estimate that needs to be multiplied by Pi to get something about right! :) And it's a waste of time creating the plan just to get a terrible estimate. Has anyone else had success using this kind of simple aid to guessing?
I really like the ideas from How to Measure Anything. The key is that error bars are not a problem unless they affect the decision, and if they do, there is a way to price the value of new information. So the "learn more" part is about buying information. Reinertsen is of a similar mind. Our optimization is around U-shaped curves, so there is a large range where performance is good. Thus simple models and measures are normally good enough.
Came to the comments looking for "The Principles of Product Development Flow" by Reinertsen - really a must-read if you're working in any kind of development. How to Measure Anything was eye-opening too. And good point about models having appropriate precision for the organization - of course NASA or the software designers for a nuclear plant need tighter and more precise estimates, but most companies are not that...
Estimations are only valid if you have already done something very, very similar. That said, speaking now from the POV of an owner of a software company, they are needed, because money allocation depends on them. So we really need to know a low and a high bar for an expected release.
You can say you need these estimate, but they’re still mere guesses, guesses that if treated as commitments get turned into lies. Stop trying to do that. Switch to learning. Ask the business how much it’s willing to spend to learn the next smallest thing you can find out for them that will help them know whether to spend more money, finding out more. Get out of the feature business, and into the learning business.
@@TrackedHiker I know MORE about development than any of my employees. I had your mentality, yet like ALL business owners I came to understand that it cannot be that way. Simple as that. There is a reason why all developers who start their own companies change their opinions... because it is not an opinion, it is a fact. I bet you would be angry if your boss refused to say how much you will earn each month until after the month ended. Yet we also do not know how much money will come into the company beforehand... but we HAVE TO predict! We can lose a client, we can fail to get new ones, lots of things can happen. The business is what pays the bills; if there is ONE facet of companies that will not change, it is that. The business is the goal, development is just a tool. You do not change the objective just to make the tool easier.
@@tiagodagostini Why don't you spend your own money if a developer goes over your oh so precious budget/estimations? Surely you understand a developer can't estimate new features accurately, since you know so much about development.
Still a little lacking in answers or approaches for situations where estimates are definitely required, such as contract bids, yearly budget planning, comparisons to buying, new staffing requirements, etc. Of course if you don't estimate, estimating is easy. :)
I guess you didn't watch the video? There is no way to increase the accuracy of estimates. The data to back that claim has existed since the 1990s and was described very well by Steve McConnell in "Rapid Development". Working within "error bars" that make sense for estimation, using estimates to prioritise work rather than to draw Gantt charts, and, if you really want to increase the illusion of rigor, trying CD3 would be my advice. Sure, we are pushed to make estimates sometimes, but don't buy into the illusion that you can do that with any accuracy; there is no solution to that. Accuracy in predicting the future (estimates) requires that we understand the problem, which means we have already solved it, and there is no point in solving the same problem twice in software, because once you have a solution, you can clone the answer for essentially zero cost - so we are ALWAYS building something new, or we are being dumb!
@@ContinuousDelivery Sure, we cannot accurately estimate the effort of a software project. But that doesn't stop potential clients from demanding it. When it comes to deciding which software vendor to choose, which one will the client likely pick: the one who says "we cannot estimate the cost and delivery because it doesn't make sense (or we're agile)", or the one who gives some estimate? My bet is on the latter. Doesn't matter if the estimate is realistic or not, it is now written in the contract. And I would argue that this is the norm in most organizations.
I think the point is that what you are describing, @@Jedimaster36091, is a business problem, not an engineering one. It sounds like your business operates on the basis of time-and-materials contracts, so it needs to work out how much a project will cost and how long it will take. What @Continuous Delivery is saying is that it sucks to be in that situation, but no matter how much you 'need' high-precision estimates, they are not possible. So the business has a problem, not the engineers.
Here's the deal: some items can be estimated more accurately than others, but in general estimation is about throwing out the biggest number you can justify, so you can keep your team and don't get pushed into taking on more than you can deliver. Generally the goal is to quote your best guess for the longest it could possibly take; that way you have some spare capacity to refactor, or a buffer should things go wrong. In general you can guess "is the ask small?" (it rarely is) or is it massive, and the more novel or unknown it is, the bigger the estimate needs to be. Developers don't want to estimate, we want to ship code, but like death and taxes we are always asked for an estimate, and the earlier we learn how to respond to these requests the better. I have long thought the best model would be to fund a system based not on the what but on its importance, and try to maximize the output, rather than think you can fund by feature - because in my experience most companies' baseline funding isn't even enough to keep the lights on.
This is a good video for whoever is managing the schedule. Whenever I tell these people how difficult estimation is, they just tell me to give a high-level estimate, and in the end I have to rush to meet the timeline they promised to stakeholders.
It pains me to say that many of your videos present problem statements (sometimes even solutions) that most developers already know, but would have absolutely zero chance of implementing at their organization. So watching these is often just an exercise in frustration. I hope management watches them, but it's not likely.
Great point. All these videos make perfect sense, but in a way they are just repeating what has already been said and is known. The real question is how to actually cause this change. Because as a simple developer, I seem to have little impact on these things.
Isn't that one of the most fundamental questions one would ask during planning? "Why should we do this and what are alternatives (which include doing nothing)?"
@@noli-timere-crede-tantum sure but I like the idea of turning the question around. I'm thinking about stuff that can often get kicked down the road like technical debt
I was in a project management training where they told us two things:
- you can estimate a general task by estimating the best scenario, the worst scenario, and your gut feeling (most probable scenario);
- this will never work with software. :)
My issue is that I understand that software estimation is futile, but we have contractors, we have a budget plan, we have deadlines, etc... I can't tell management that I have no idea about the launch date. What I can say is that I have only a vague idea about the initial (MVP) functionality. Usually our initial expectations are very high - we want software that can solve world hunger and world peace while exploring deep space - and my job is to cut the not-so-important functions during the project to meet the deadline. For this, we need some kind of estimation. (Which we have to update weekly.) In the end, we will have software that is ready by the deadline, we can give it to the users, we can try it, and it has some "Coming soon" screens.
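The best/worst/gut-feeling trio from that training is the classic three-point (PERT) technique. A minimal sketch of the arithmetic; the example figures here are made up:

```python
def pert_estimate(best: float, most_likely: float, worst: float) -> tuple[float, float]:
    """Three-point (PERT) estimate: weighted mean and a rough standard deviation."""
    mean = (best + 4 * most_likely + worst) / 6
    std_dev = (worst - best) / 6  # conventional approximation of the spread
    return mean, std_dev

mean, sd = pert_estimate(best=10, most_likely=20, worst=60)
print(f"{mean:.1f} days, give or take {sd:.1f}")  # 25.0 days, give or take 8.3
```

The asymmetry is the point: a long "worst" tail drags the mean well above the gut feeling, which matches the experience most of this thread describes.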
Rather than attempting to forecast the value of the feature for the value provider, I tend to lean toward estimating the value for the customer. E.g. if you think your solution is going to save time, then this becomes something quantifiable. So, when you have several features that are supposed to save time, calculate those savings for your customer base: how many people are actually going to use this, how much time it is supposed to save, approximate their hourly rate, and calculate the value over, say, 1 year. This way you can calculate the cost of delay and CD3 from the customer's standpoint, while your own business benefit becomes a derivative of your customer's success. There is another benefit in this analysis, as it requires the product team to really dig into the value hypothesis definition and learn more about their customers. On the other side, the question of how much it is going to cost us to implement a feature is still a guess, but in this case "How to Measure Anything" provides an alternative: instead of giving a straight-up number, you work with 90% confidence intervals. E.g. you're 90% confident that this feature can be implemented within a range of 1 week to 6 weeks. Providing this type of estimate is a trainable skill and gives you higher accuracy and precision together.
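A sketch of that customer-side cost-of-delay arithmetic; all figures and names here are hypothetical, just to show the shape of the calculation:

```python
def weekly_cost_of_delay(users: int, minutes_saved_per_week: float, hourly_rate: float) -> float:
    """Customer value lost for every week the feature is NOT shipped."""
    return users * (minutes_saved_per_week / 60) * hourly_rate

def cd3(weekly_cod: float, duration_weeks: float) -> float:
    """Cost of Delay Divided by Duration: higher score = schedule it sooner."""
    return weekly_cod / duration_weeks

# Feature saves 500 users 10 minutes a week at a $50/h blended rate;
# 90% confident the build takes 1 to 6 weeks, so use the midpoint for ranking.
cod = weekly_cost_of_delay(500, 10, 50)
print(cd3(cod, duration_weeks=3.5))
```

Ranking features by this score rather than by raw value is what pushes small, high-value slices to the top of the queue.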
It is very similar to a standard Jeff Patton conference talk. He calls it a "bet", and he draws the quadrants and calls the expensive low-value section Stupid. I've seen him do it in two or three separate conference videos on YouTube. He also mentions CarMax in those videos.
The rough cost-value matrix that was shown in the video is very useful, but you can go further. Any feature of large size can be sliced up into smaller capabilities that provide incremental value. Repeat the rough cost-value classification for each slice and use that with stakeholders to help decide what should be in and out of scope. The smaller the slice, the more you can make intelligent tradeoffs about team size, cross-team dependencies, and timing. Stop this process once you have enough confidence to get started, then prioritize the riskiest or highest value work first and adjust as you go along, just as he said.
I am talking about slices of value, not necessarily implementation details, a la Neil Killick. The point is that not every capability we imagine in a large feature may be needed, and when we decompose and estimate cost-value for more fine-grained capabilities, our stakeholders may come to a very different decision on what feature they want built. The more incrementally we can slice it, the more choices we have. Prioritization schemes like CD3 and WSJF have the flaw that they tend to bias against any large feature, since the duration uncertainty is so much higher that large features never get into the top right of the cost-value matrix. Slice capabilities thinner, and it will be easier to identify which riskiest-assumption tests we want to run early to validate our assumptions. In general, the goal of any estimate is to be useful, not accurate. If the features are all of a reasonable scope to begin with, then we don't need to decompose further. There are many cases, though, as others have mentioned, where you need more confidence for budgeting, contracts, or cross-team dependencies to make a decision to proceed, and decomposition to a finer grain may be called for.
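For illustration only, the coarse cost-value sorting described in this thread could be as simple as a quadrant lookup. The labels are my own, not from the video:

```python
def classify(value: str, cost: str) -> str:
    """Place a feature (or a thin slice of one) in a rough cost-value quadrant."""
    if value == "high" and cost == "low":
        return "do it now"
    if value == "high" and cost == "high":
        return "slice it thinner and re-estimate"
    if value == "low" and cost == "low":
        return "maybe later"
    return "don't do it"  # low value, high cost: the 'Stupid' quadrant

for feature, value, cost in [("export to CSV", "high", "low"),
                             ("rewrite billing", "high", "high"),
                             ("animated logo", "low", "high")]:
    print(feature, "->", classify(value, cost))
```

Note the "slice it thinner" branch: per the comment above, big high-value items aren't rejected, they're decomposed until some slice lands in the top-right quadrant.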
I am not a computer scientist (or engineer). In my electronics background, the definitions of accuracy and precision are a little different: a measurement is precise if it is repeatable, but it does not need a lot of resolution. Anyway, I enjoyed the video, and your channel.
Again, all those arguments are relevant in a world where you build software for the consumer with an unclear scope. But if you build software to implement a necessary business process which must be ready in 2024 to comply with a law, there is no "let's find out". Time is fixed, scope is fixed (kinda); the question is who has a convincing plan to deliver for less money. Stuff like this is the reality in most large companies and in almost all public projects. And you can sugar-coat it and do it in iterations and discuss a couple of nice-to-have features and prioritize based on business value (more like critical path), but you need to deliver the software on time, because the law does not wait for you, and neither does the new Mercedes S-Class unveiling and order start.

And as much as I hate it, for that you need a plan, and a plan means estimations, and thanks to Parkinson's law we will almost always under-estimate. And I've seen a Thoughtworks project fail miserably, forcing their customer to use the legacy system for 5 more years, because "something" disruptive was promised and built in a very agile way, but nothing was even close to replacing the legacy system or even parts of it (not because TW did a really bad job, but because of the reality of those giant backend worlds of core banking systems or plant management).

That "business value" discussion has also caused more harm than good in projects I reviewed, because many young developers and "architects" somehow learned (with the help of "scrum masters") that only something visible in the UI for the consumer is a feature and fun to implement. So they constantly roll out new buttons in the UI with a sloppy backend implementation, while any technical work and refactoring is thrown into "technical stories" which will never be prioritized because: "no business value".
I wish our industry had a more established common knowledge and definitions of what the buzzwords actually mean, and made everyone learn them (computer science, hello?!). If real architects and engineers (or doctors) had such different understandings of the absolute basics of their profession, we would be doomed. Not sure we deserve the title "engineer", considering how hard we all "wing it".
You definitely work really close to me, given your experiences are pretty similar to mine. :) I have had the same thoughts about the same topics. Here are some of my ideas. First, you talk about fixed-scope and kinda-fixed-scope projects, where something definitely needs to be on time. Does it really? If you buy a car, you buy a car that has already been manufactured. It's there as is; getting it back to the factory to upgrade hardware is expensive, while getting a software upgrade is close to free. The software is not ready at the vehicle's launch. So what? If it's the best software on the market 3 months after launch, I will buy the car with the best software any time after the launch. I'm not a car guy myself, but I know people working in car software dev at VW, and it sounds like 1970s development methods: really long cycles, essentially people who know how to build physical stuff trying to manage software dev the same way, when it is not the same. And people driving these vehicles talk all the time about how shit the software looks, feels and behaves. Alternatively: what does estimating do for you here? The estimate is always wrong, and features are always delivered far too late with far too little quality. So from my PoV estimating is always wrong, so why even bother? For the "big legacy systems": I have worked on several projects trying to replace old legacy systems, up to 50 years old. An iterative approach is the only thing that can work. We took small business processes from the legacy system and replaced what they were supposed to handle in a new system. This still took quite a lot of time, but the customer saw: "This new small system works!" And we got to work on the next part. And the way to choose which piece to start with: estimate value vs cost, just like in the video. In the beginning you talk about scenarios of fixed scope. Developing a whole system to be legally compliant is still a big project that can be broken down into smaller pieces.
These pieces have different value and costs associated with them. If you already have a system that needs to be updated to be compliant, the value is not having to pay a fine. This fine (+ image damage? E.g. not being GDPR conformant?) would probably be higher than any other story in the project, so it makes sense to focus on everything connected to it. Does that make any sense? Happy to hear your thoughts. :)
@@KiddyCut I think you misunderstood the first use case, with the car. What @vanivari359 meant is that if a car company is planning to release its next-gen vehicle on a certain date, then everything needs to be ready by then, including the software. If this vehicle will have the next-gen autopilot software, and this is a big marketing point, then the software development team has a fixed delivery date and scope. Nothing less than that would be acceptable to the business. After-sale upgrades are just that: upgrades and bug fixes, not delivering the actual autopilot feature.
I agree that software companies that build software for online users (e.g. Spotify, Facebook, Google, Forex) could use a pure agile approach of prospecting and small increments deployed to production, where estimation is less important. But for all the other companies, all-or-nothing seems to be the norm - they either get everything they (think they) want at an agreed price and date, or they won't choose the software vendor.
I would love it if sources were cited in the description. For example, you make the claim at around 8:15 that research shows estimates are out by a factor of four. Which specific research paper or book says this? (Apologies in advance if it's there already and I missed it :P)
We sometimes concentrate on the wrong areas while neglecting fixes staring at us in plain sight. In a rapid development environment (using C++, gcc, Linux), the metrics looked good, so we pushed 3 deployments (A, B, C), when we had a regression in function. A had a long-standing bug, B did not for 6 months, then C reintroduced it. Something had changed, but the code the error was traced to hadn't changed. TL;DR: the machine for release B had an upgraded compiler, and the dev tasked with the build used his own machine by accident, while the A and C deployments used the same device. Turns out B's compiler rev should have been adopted earlier, as it fixed an esoteric language issue, but our Config Management missed the bulletin.
Although I also don't see the point of estimations, it appeared to me that the proposed model suits prioritization more than "predicting" the effort it will take to build something. I mean, when you put features into the Build slot it means they are ready to be worked on, but when people estimate a feature they want a measurement of how hard it will be to develop.
Software estimations are not any more inaccurate than any other kind of estimation. The few bits of scientific data created on the matter support this. Inaccuracy versus actuals is kind of the definition of "estimation". Estimation is a science; it has methodologies separate from any specific domain. No estimation can ever claim to be 100% accurate to future actuals.
If you had a magic crystal that could either let you complete projects as fast as possible, OR give you the foresight to know exactly how long the project would take, which would you choose? Assume the quality of the project is the same either way. You'd think the sensible choice would be to be fast, even if you didn't know when the project would be done. And yet companies seem to always, always wish for the second option.
+2 Managers occasionally feel a need to justify their salary and that's how you get estimation mania and burndown charts shared across all levels and people being taken into micromanagement because something happened that nobody had foreseen.
I haven't done any real coding in about a decade but I always found the concept of estimations in software development completely unreasonable. You would need to have zero understanding of programming to even believe an estimation is a thing that can apply to it.
I think estimates are good on the very small scale. Having a rough idea how long individual tasks will take helps keep me from overloading a sprint. Other than for very small increments of time, estimates are pretty worthless. Any estimate longer than an 8-hour day is as good as fiction. Once it's longer than a day, it's a maximum amount of time to spend before re-evaluating whether it's worth continuing. It's useless for any kind of important goal setting. Keeping sprints serving-sized is about all it's good for.
Practically nothing that has to go through a formal decision to develop is going to end up costing just a beer, for the simple reason that the overhead of such a decision is way higher than the cost of a beer, not to mention the overhead for deployment and integration if your organization has too much red tape there. Also, the car, vacation and month's salary are comparatively close, especially once you take variation into account - the variations are big enough that those 3 could end up in any permuted order, with cheap used cars easily costing less than a month's salary, and an extended weekend trip by car to visit family costing far less than a 6-week summer vacation far away, staying in hotels or on a cruise ship. To fix it, I would probably remove the vacation one, specify that the car has to be a new, decently good car, and then below a month's salary add the price of a computer. The computer is really useful as a category, because a lot of things might need someone to get a new computer, and if it would make sense for a customer to set up a computer in connection with the service you provide, you already know that they would value the service at least as highly as a computer, because they would pay that as an additional cost. It also handles the fact that most actual tasks fall well short of a month of work and well above the price of a beer.
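To make the idea concrete, here is a minimal sketch of the kind of cost-category scale this comment proposes, with "a new computer" slotted in below a month's salary. The price figures are my own illustrative placeholders, not numbers from the video; the point is only that the buckets sit roughly an order of magnitude apart, so a log-scale comparison picks the bucket.

```python
import math

# Hypothetical cost categories on a roughly logarithmic scale, following
# the commenter's suggestion. Prices are made-up placeholders.
CATEGORIES = [
    ("a beer", 5),
    ("a new computer", 1_500),
    ("a month's salary", 5_000),
    ("a new car", 40_000),
    ("a house", 400_000),
]

def nearest_category(cost):
    """Pick the category closest on a log scale, since the buckets are
    meant to be separated by orders of magnitude, not fixed amounts."""
    return min(CATEGORIES,
               key=lambda c: abs(math.log(cost) - math.log(c[1])))[0]

print(nearest_category(2_000))      # -> "a new computer"
print(nearest_category(1_000_000))  # -> "a house"
```

Because the comparison is logarithmic, a £2,000 task lands in the "computer" bucket rather than "month's salary", even though it is numerically closer to the latter minus a bit: what matters is the ratio, not the difference.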
Respectfully, the solution given at the end of the video is for prioritization, not estimating. I get that it's difficult to estimate, but I've never heard a good answer to this problem. It's critical for organizations (especially smaller companies) to understand when a feature or a set of features can, and most likely will, be completed. I'm all ears for any solutions to this specific problem.
Still waiting for what to replace this with... You have a keen sense of what the problem is and are able to articulate it very well. Now what? What replaces it?
Well, what most people seem to mean when they ask a question like that is "yes, but how can I perfectly predict the future anyway?", and my point is that you can't; it is irrational and impossible, so you have to find a way to deal with the uncertainty. For short-term, in-project steering, count stories, not story points or days; this has a number of benefits. It encourages you to make smaller stories, and that increases the accuracy of any predictions you make on that basis. But even more important, change the focus of planning from accurately predicting an end-date and a cost to organising work around how much to invest to determine feasibility. Treat this more like a start-up, with seed funding to see if the idea is viable, then round 1 funding to get some basic work done and see if the idea works, and so on. Or, if you are very rich and believe in what you want to do strongly enough, just take the punt, forget all about the estimates and do the work. This last one is the Apple/SpaceX model.
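One hedged sketch of what "count stories, not story points" can look like in practice is a Monte Carlo forecast driven by historical weekly story counts: instead of a single date, you get percentiles. The throughput numbers below are invented for illustration, and the function name is mine, not anything from the video.

```python
import random

# Hypothetical history: stories completed in each of the last 8 weeks.
WEEKLY_THROUGHPUT = [3, 5, 2, 4, 6, 3, 4, 5]

def forecast_weeks(backlog_size, history, trials=10_000, seed=42):
    """Monte Carlo forecast: repeatedly sample past weekly throughput
    until the backlog is exhausted, recording how many weeks that took.
    Returns the 50th/85th/95th percentile completion times in weeks."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        remaining, weeks = backlog_size, 0
        while remaining > 0:
            remaining -= rng.choice(history)  # one simulated week
            weeks += 1
        results.append(weeks)
    results.sort()
    return {p: results[int(trials * p / 100)] for p in (50, 85, 95)}

print(forecast_weeks(40, WEEKLY_THROUGHPUT))
```

The output is deliberately a range, not a commitment: "we have an 85% chance of finishing within N weeks, given our recent throughput" is a more honest steering statement than a single end-date, and it automatically rewards slicing work into smaller stories, because more data points tighten the spread.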
Good old, well-known theory, but the true challenge is in estimating the business value and having a common definition of value categories (you cannot use beer/house :) )
As always, a well reasoned and insightful video Mr Farley. I can see how I could use the matrix towards the end of the video in a B2C context. Do you have any advice for B2B contexts? I work in a sector where the products are (generally) free to end-users and our customers work for businesses looking to reach/retain those end-users.
I'd use the idea of "value" in that case as an analogy, you can have the notion of "value" without it being monetary. A game that is fun to play has value. So grade the "value" based on fun, or engagement or whatever it is that makes your software desirable for its users.
The last 3/4 of this video requires estimating both cost and value, immediately after pointing out that these are just guesses, and typically wildly wrong. These two positions are at odds with each other. If estimates are wildly wrong, which they are, then anything based on them is waste. GIGO.
Great tool, thanks for sharing! I wonder where stuff like refactoring would fall. Sometimes it's hard, and the benefits might not be reflected directly to end users, but overall I think improving the DX ends up benefiting the end user through fewer bugs and faster iteration.
I think of refactoring as prepping and painting the bodywork being repaired. Nobody wants grinder marks and rust on their car. Unrefactored code is seven centimeters of body filler smoothed over and painted.
How do you deal in an interview when the job description clearly says: Provide accurate development estimates in support of feasibility assessments and planned development activities.
People who ask for estimates on projects that will take weeks, months or years, organise meetings that take 30 to 60min. More often than not these run over, and that’s a very short timeframe to estimate with only 1 or 2 things to discuss.
In a hardware company, by the time you learn a feature is too expensive for software, you may have spent an enormous amount of time and money in other departments (e.g. Hardware engineering, research, manufacturing). Unfortunately, answering “what next” for software doesn’t solve this.
Once you understand the cost of delay you have to determine how much you reckon it will take to create the solution which requires - estimates. Circular discussion.
Wherever I worked, the estimations were never used to decide whether we should work on something, but to know roughly when it could be expected, i.e. do we manage to deliver it in Q1 or Q3? I think the problem was never deciding what has value to work on - that came top-down (another discussion altogether, I guess).
I agree, but I think this has more to do with the "something new" factor than software specifically. That's why it's harder to meet estimates for a custom home vs. a production home construction project. Cookie-cutter software projects are easier to estimate correctly as are cookie-cutter construction projects. If your sprints are frequently filled with spikes, it may be a sign that Kanban is the ticket.
Yes, the problem is that "cookie-cutter software projects" make no sense at all, whereas "cookie-cutter construction does". The difference is in the cost of reproduction, for software it is essentially free, because we can perfectly clone any sequence of bytes, representing a system, for essentially zero cost, so why on earth would we ever make the same software again, because we can just copy it. So it is ALWAYS something new, it is ALWAYS at some level a custom project. Which as you said, are impossible to estimate with any degree of accuracy, whatever the field.
I usually share your videos with my managers to show them that the problems we have can be fixed, or at least mitigated, using CI/CT/CD. However, this time it is hard. In the automotive industry we have to make sure that the car moves/behaves in a safe manner. So each feature is worth the same, because it has to be done anyway; the order of tasks is defined by the customer. But it is also standard to plan based on those inaccurate estimates and to escalate through multiple levels of HiPPOs if the plan does not hold. Then re-estimates are requested, which waste even more time, although they are more accurate due to the narrowed cone of uncertainty... and for the next delivery, even more planning and tracking (aka interruptions) is implemented, just to make sure. Any suggestions?
No Jeff definitely did the "stupid zone" drawing for the last 10+ years at least. Not sure why he didn't recognize it but hey good on you for asking him haha
I was on a government program and was part of the proposal team bidding the software. We had a bunch of metrics we had to use, like complexity and lines-of-code counts. We finally came up with the software part and it was $12 million. Management said that was too much and that I only had $9 million to work with (would have been nice to know that up front). So we went back and cut a bunch of time from each of the pieces. They questioned whether the tasks could be done in so little time, because that increased risk. I said: you only gave me so much money to work with, so I had to cut from somewhere. This is the stupid crap we have to work with as engineers...
Huh, I have practised this with product managers (digital agency PMs) to explain why something is a bad/good idea, never knew there was a method that used it too (rule 34?). Particularly the cost to not implement a change.
Such a difficult topic within my organisation, one team follows an agile approach but consistently delivers years later than they originally communicated, and another team refuses to give any estimates at all unless under duress to give T-shirt sizes.... it's difficult to base a business around this that has customers that need to do things by a certain point... What's the answer?
I don't think that there is an answer, if we constrain the problem to only fixing a price. Let's be clear though: for small, simple, low-cost things that are very similar to work we have done before - well, OK, we have some basis to make a guess. It will be wrong, but commercially it is easier to say "I will make you a WordPress-based website for £500" than "I will build you a healthcare system for £50 million". The first one MAY be close enough, and you will do enough of them that sometimes you will do it with £125 worth of effort and sometimes £2000, and it will work out as long as it is more often less than more - incidentally, those are the error bars for estimation at the start of a software project: 1/4x to 4x! But we can see from real-world projects that bigger projects are ALWAYS WRONG, yet because the numbers are so big and scary, people want more precision, even though bigger projects are more uncertain, because they are less similar to one another and there are always huge unknowns at the start. My view is that anything we do to increase the precision is a mistake, because the error bars are so huge that precision isn't what we need. So more subjective, less precise is better, and best of all, to my mind, is the realisation that organising this through incremental/venture-capital-style funding models for big projects is by far the saner response to the reality of what SW dev really is. The trouble is that customers and businesses are not always, or even usually, rational. So crossing your fingers and guessing is all there is.
I was afraid you would say that! I work in the broadcast industry and we sell a system that automates transmission and media management, the entire EU broadcast industry works by responding to tenders and agreeing to fixed prices & timescales .... stability is obviously paramount. I've been watching your videos (which are awesome) on DevOps/CD as I'm looking to create an organisational step change in the quality of code that is installed by moving away from the traditional waterfall (over the wall!) approach we currently have...... I'm technically an Ops guy, but do have R&D within my org @@ContinuousDelivery
I completely agree. The business model has to change. The days of easy business decisions of just doing a cost/value analysis are gone. In my experience, the business is usually trying to decide between one major direction and another. They want to know broadly how big/costly the work will be until they start to get a return, so they can decide which direction to take. They want a position of evidence to fall back on, so they can be accountable to the money men. E.g. we decided to go with option A because the engineers said it would cost x and take y. When that turns out to be wrong, which it will be, they blame the engineers' bad estimates. What business should do in this situation is perhaps decide their strategic commercial goals based on other factors, like market potential, and just start building software. There is no safety net here. A bigger mistake would be committing to years of development and big contracts when there are so many unknown unknowns, and feeling like it's all going to work out because of some estimates that took the engineers a day or two to come up with and are treated as gospel. @@ContinuousDelivery
I think benefits to customers in terms of money are useful, but not the whole picture. Some changes are more about enabling and are very hard to attach a £value to.
In general, the longer something will take to develop, the greater the error in being able to estimate time. And the error isn't something like 180 days +/- 10 days. It's more like 170 days to 2 years. It's that bad when the thing you want to do is complex enough that it might take a half a year. That half a year estimate can easily balloon out to 1 year, 2 years, etc. So why bother estimating it? You do it because managers demand it and feel it's reasonable that a professional developer like you should be competent enough to get it right! Expectations of the developer's ability to estimate time are high. And developers are pressured into giving a number and adhering to it, with their reputation and job on the line. No pressure! In my experience, there is just one type of software developer that is good at estimating how long some code will take to develop. They keep expectations low and scope very narrow. They never take on more than could be done in one week, maybe two at the most. They never talk about long term goals other than to say we'll do this very short term work one step at a time and then "together" (manager + developer + team) we will decide later on what the long term goals should be. And so they string the project along one step at a time making absolutely no long term plans or time estimates. When asked to do so, they just avoid being made responsible for it like a software architect would, and they will only commit to short term things. It's really about setting boundaries and sticking to them. Managers love those kinds of developers. These developers seem very professional compared to those who want to plan deep into the future. That's because the ones that plan long term things are usually spectacularly bad at time estimates. Managers want developers to not only make these estimates but commit to them and never change them. 
If they're missing an intermediate goal, from a manager's perspective, that developer better be working around the clock to make up for the schedule slip. It's hilariously wrong headed for managers to do this. But they all pretty much do it. What is needed is a middle ground between those two types of people. You want someone who's architecting the whole project out into the future many years. But you can't expect any of your time estimates out past a couple weeks to be meaningful. At the same time, you can't just have developers who only take on things that take them 1 to 2 weeks. You want someone who is responsible for long term planning and execution at all points in between. And as a manager, you have to accept that whatever time estimates are given for the completion of the project, they are subject to change and should not be considered a commitment at all by any of your employees. That last part is a hard sell for managers. Most will say that's unacceptable. And around we go with each employee tap dancing around their managers with language that doesn't commit to anything. The ones foolish enough to commit to long-term goals get burned out and leave the company, get laid off, or are given horrible work that nobody wants.
Where do you stand on Prototypes using tools like figma? Prototype that has tested really well with users and we have figured out the areas of value. How can we work out how long to build those areas and release? Client is asking when they can have the system? What can I tell them without any estimate?
What can you honestly tell them *with* an estimate? It is not that I am fundamentally against the idea of estimates, I just don't believe that you can estimate software projects, unless they are exceedingly simple, and very very similar to what you have done before. And I would argue that if you are building something "very very similar to what you have done before" then you are doing it wrong, because you should have abstracted the learning and made something generic that you can make more easily. I completely concede that all of this is a somewhat academic position to hold, which is why I am not actually a strong proponent of the #NoEstimates movement. My position is a little different. My experience is that the best teams and orgs grow out of estimates, when they are allowed to by circumstance. But the real world is often irrational. So yes, customers will ask for estimates, even though this is an irrational, and deeply sub-optimal approach. I confess I think of estimates and the effort expended on them in the same way that I think about Witch-doctors, Astrology and Horoscopes. They are pseudo-scientific nonsense. But some people still like them. The trouble with estimates is that we are still in a world that assumes that they are real, so in some parts of the software business, you have to do the witch-doctor dance and read the runes to win the sale. Despite this, I still think it healthy, and more intellectually honest, to recognise that this is what you are doing, and so it is only sensible to not waste too much time and effort on the ceremonies surrounding estimation, because they don't represent reality, they are part of the planning theatre that exists in our industry. We aren't always free to do the rational things. Just to be clear, I have been one of the lead estimators, for often very big projects, in every org that I have worked in for the last 20 years. 
I used to try hard to be accurate, and learned that I was no more accurate when I tried hard than when I simply guessed, but you still have to give your guesses the illusion of rigour if you are selling things. Many orgs selling services often don't even try to be accurate with their guesses; they just try to undercut the competition. This is all crazy, but it is still the industry we work in.
@@ContinuousDelivery Thank you, I do appreciate the time you have taken to put a response together. Having trodden through the trenches of software development myself for many years and been "the coder", I recognize the sentiment of #NoEstimates and the frustration and stress that giving estimates can put on a software team. And I agree with the deep suspicion of an "accurate estimate"; it is as good as staring at the stars and trying to work out the distances between them with any great accuracy. However, having said all that, we do not work in a vacuum: software projects have budgets, and businesses have expectations that all need managing, all without the project being cancelled and effort wasted if it is not delivered on time (or within budget). I have no answers, just experience of what, as you correctly say, is the crazy world of software development. I do like the approaches in your video and will try some of them out, so thanks for illuminating those ideas. One question I do have, going back to my initial comment and a grey area (for me) at the moment: UI/UX designers are building prototypes of applications, websites and systems, then testing them with clients and doing the whole incremental iteration of the prototype before saying to the developer team, in a Picard voice, "make it so". Where do you stand on this methodology? I see it becoming more common over the past few years, and it does put pressure on the developers to somehow estimate turning the prototype into the finished article, a real application.
Maybe I'm missing something, but even in the proposed system, we still have to estimate the cost of building the thing we want to build, and to guess how much value it can bring to the user. So, it seems that we're still back at square one in trying to estimate cost and value. Sure, it's way simpler to use and look at compared to fancy, complex and over-engineered mathematical models, but I don't see how that helps us solve the problem at heart: better estimation.
It depends on what you mean by "better" if you mean "more precise", which is what most biz people want, I think that is looking for the wrong "better". This is "better" because it is less precise, but more accurate as a result. It admits the error-bars inherent to estimation. So it is fast and easier to place ideas in the relevant box, and as soon as you do that, you know what to do. I think that is indeed "better". There is no solution to "more accurate" or "more precise" it is like wishing that fairies were real. It may sound like a nice idea, but it is imaginary.
@@ContinuousDelivery What I was asking was more confined to a single aspect of the problem, for example how can we say that feature X will cost us a beer, a vacation or a house? In the end we have to make such an estimation, even if it's broader than some fancy model with a ton of variables. And that's what I'm missing: why does that feature cost us a beer instead of a vacation? How do we estimate *that*?
I am sort of in the same boat here... the storyline is "yes, estimates are always wrong, so let's use a simple way to estimate". It doesn't solve anything fundamental... except maybe restricting people from treating the estimate as a science (which in my opinion is the biggest problem). Even going from time estimates to story-point estimates had the same foundation. This video doesn't give solutions for the fact that there is a need for some kind of predictability, as explained in some of the comments. Because you have a dependency on another department (Sales, Marketing, etc.), another company or vendor, customer demand, legislation, competition... there are probably more reasons to need some kind of indication of when you will be done with something. And we probably all ask the same question when we hire people to do something for us in our personal lives: when will my car be fixed, when will my new house be done, when can I expect my package to be delivered, etc. Yes, building features in software is more complex than delivering a package... but that doesn't mean some predictability is not wanted within our world of software delivery. We cannot just say: hey, software dev is too complex, forget about when you are going to get it... it is done when it is done! #NoEstimates!! It is just not realistic.
What is the source for the 4x error claim? I really would like to read the entire thing and bring it up to clients as a "fair warning" when giving estimates.
I first read about it in the book "Rapid Development" by Steve McConnell (who led the team that developed Excel at Microsoft). The research wasn't his, though; it has been around for a very long time. Here is his take: www.construx.com/books/the-cone-of-uncertainty/ Here is Wikipedia on the topic: en.wikipedia.org/wiki/Cone_of_Uncertainty Scholarly article on addressing the cone of uncertainty: www.researchgate.net/profile/Pongtip-Aroonvatanaporn/publication/220883691_Reducing_estimation_uncertainty_with_continuous_assessment_tracking_the_cone_of_uncertainty/links/54bfdfe00cf28eae4a6635d2/Reducing-estimation-uncertainty-with-continuous-assessment-tracking-the-cone-of-uncertainty.pdf
Once everyone in the business understands that there is risk in the estimates of value and costs, the next debate is always: whose risk is it? If the devs underestimate, shouldn't they work overtime? Of course not - but then who is to blame? I can't work in a toxic culture like that anymore, where anything that goes wrong or not to plan has to be someone's fault.
Hi Dave, quick question! In agile, you should receive feedback from the customer about what's the next biggest priority in development. I try to follow that rule precisely; I think it's great. But how do I prioritize the kind of work that the user doesn't understand? For example, the user wants to write email campaigns, and he cares about the email title, the addresses, the volume, etc. But I, as a programmer, know that we must add safety features, like throttling, DDoS prevention, time limits, etc. I tried explaining those concepts to the customer, but I can see they don't really understand them - it goes right over their heads. So how do I prioritize work on that?
It depends 😉 Mostly, for the things that are essential, don't surface them; implement them as part of whichever story the user cares about that makes most sense. You take responsibility for building good software, including adding the features that must be there for the system to be safe and usable. Ask the user when you need help deciding how far to go. If the system is always behind a firewall, it may not need as much security. Where you need to expose ideas to users to get such a steer, describe the ideas from their perspective. Don't say "DDoS protection" or "throttles"; instead ask what they'd like to happen when the system is under attack, or when they get shut down for spamming people. You may need to explain that if the system were under attack they wouldn't be able to send messages, and if they were deemed to be spamming they'd be blacklisted. These are real-world things, not technical esoterica. I think there is ALWAYS a user need hidden beneath any technical story. Find that, and talk to them about that.
Thank you very much for your answer. I think it's really simple, now that you've described it so simply :D @@ContinuousDelivery So what you're saying is: "find a way to translate the technical necessity into a user need", and then try to prioritize from that using client feedback? And there's always a way to translate that?
@@danielwilkowski5899 Yes, but also, some technical things you don't need to ask permission for. An author doesn't ask "and would you like me to spell things correctly too, because that will take more time". So take responsibility for doing what is essential to this being a good system. Take responsibility for writing code that is easy to change in future; don't skimp on testing or refactoring because "no one told me to do it"; don't skimp on essential security; ask the users when there is a real choice to be made. The other things are just "the cost of doing business". I have electricians in my house at the moment, and they aren't asking me if I want a safe version or an unsafe version 😉
Amusing, but silly. Yes, estimations are incredibly hard, and they often make no sense. I'm with you up to that point. Beyond that, good luck dragging product owners or managers into a meeting to explain how estimates are silly as you pull up a little beer-car-house matrix. I predict you're not doing your career in that company any good. And if you are bidding on contracts, the contract will go to the company that can convince the customer with clear estimates, however nonsensical and inaccurate they are.
I feel like here you're mostly striking a balance between overly complicated estimations and market research. How can a product owner estimate the ratio between least work and most gain if they don't know which epics imply the least amount of work? They still need some T-shirt sizes. What happens to the bold ideas that require a substantial amount of work vs. small ideas with small impact and quick implementation times? How could you know which one is better without some wrong estimates to base your calculations on?
It isn't backwards. The horizontal axis is time, the vertical axis is the accuracy of your guess. At time zero, when we haven't started yet, the data say we are out by a factor of 4x, so if our prediction says 100 days it could be 25 days (x1/4) or 400 days (x4). Our predictions are certainly accurate when we finish, so as we approach the finish point, the right-hand edge of the graph, our prediction intersects with the actual duration of the project.
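The 4x factor described in that comment can be sketched in a few lines. This is purely illustrative: the linear decay of the factor over the project's life is an assumption for the sketch, not a claim about the real shape of the curve.

```python
# Illustrative sketch of the "cone of uncertainty" arithmetic above:
# at the start, actuals can be ~4x above or below the estimate, and the
# factor shrinks toward 1x as the project approaches completion.
# The linear decay of the factor is an assumption for illustration only.

def uncertainty_range(estimate_days, fraction_complete, initial_factor=4.0):
    """Return (low, high) bounds around an estimate of remaining duration."""
    # Factor decays from initial_factor at the start to 1.0 at the finish.
    factor = initial_factor - (initial_factor - 1.0) * fraction_complete
    return estimate_days / factor, estimate_days * factor

low, high = uncertainty_range(100, 0.0)
print(low, high)  # 25.0 400.0 at time zero, as in the comment
```

At the finish line (`fraction_complete=1.0`) the range collapses to the estimate itself, matching the right-hand edge of the graph.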
Estimations are hard. But what grinds my gears are devs who say "yes, I'll be done tomorrow or this week" and then repeat that for 3-6 months. How can you keep saying that with a straight face?
Why are you saying that regulation-related features have no value? They actually do, not only for reducing risk, but also for the benefit of consistency; the resulting system actually has better quality in various dimensions.
One second after you give a range of time estimates, your boss takes the lowest one and demands that you commit to meeting that schedule. After all, it is YOUR estimate. Making an accurate estimate requires knowing all the problems you will have and how long each will take to solve. Like that ever happens. Deadlines do have a way of motivating you to finish up. Doesn't the product owner require timebox commitments from the developers before they start? It would seem obvious to put usage counters on each function (object) so that you can prioritize refactoring and bug fixes. Yet nobody ever talks about doing this.
Imagine doing estimations, management "transforms" them into commitments, then those things don't happen, then you have a review meeting to "analyze" why the estimations were wrong, then you have a third meeting to re-estimate the estimations and do it all over again. Bonkers :))
When you say 'imagine', you mean 'remember', right?
What do you mean "imagine"? -_-
Review meeting??? No need. You have to finish the work within the estimates, working overtime and on weekends (unpaid), so, in the end, all went well 😏
When you give estimations, remember to add in your error bars, and remember to tell them that your error bars are measured in orders of magnitude. When things turn out wrong and you have to explain how wrong, you just measure the number of orders of magnitude you were off, compare it to how unlikely it was to be that far out, and then find that you have insufficient evidence to prove that the outcome was not within the model of your prediction.
My team
Discovering this channel and listening to these lectures is like coming up for air, whilst drowning in a sea of "this is fine" fire memes that are actually my day-to-day reality.
The first question I always asked was "who are you giving me for the team?" It makes a big difference. I know who's productive and who is not. In bigger organizations that's hard to know unless you are organized in smaller teams. The next question is "has this been done before?" My boss once asked for an estimate and told me he thought 2 man-years. I ended up telling him 6 man-years, and he said the customer wouldn't pay that much. I told him "we don't want the contract then". The customer paid for it, much to the surprise of my boss. It ended up taking us 5.5 man-years, with a little extra profit. That solution is still in use some 20 years later.
We had a bright guy who estimated (say) a year to do a project. Management asked him what if we gave you three guys? He said: five years. First they have to do my PhD, then we can build it in a year
@@stevecarter8810 That's rather just arrogant. PhDs really think they are special, but if they are really as good as they believe they are, they should be able to explain and instruct others to do what is needed. If they CAN'T, that's an indicator of their incompetence. Why have PhDs if they are unable to share their knowledge in a feasible manner? To have code monkeys with a degree?
Unless the task is really difficult and requires a high degree of specialization for all involved... but we are talking about the top 0.2 percentile of task difficulty here - tasks that are so complicated that it cannot be expected that a PhD can lead it. I personally worked (and still do) on tasks that normally require "experts" in that category, and I aim for highest performance.
@@brianviktor8212 IIRC it was about schedulability of hard real-time systems in multicore environments, so yeah, a 0.2 percenter
@@brianviktor8212 Hmmm. Someone calling someone else arrogant for allegedly doing something in circumstances he personally didn't witness, taking the piss out of PhDs, and ending with "I often do an expert's job and I'm Super Good At It".
Well, that sure paints a picture.
@@maianoguillaume My comment is from 5 months ago...
It's a worldwide, single-realm, non-instanced, very high performance MMO with a distributed architecture that allows the addition of physical game servers to the server cluster, where an algorithm ensures the server load is automatically distributed via a dynamic cluster system. It uses my own TCP/UDP communication library, which is also quite powerful.
It is done. It took me 2 months to do all the logic. I just need to do some more bugfixing, testing and client implementation. This technology does compete with the best of the best that exists, and could take #1... if it were a competition.
It had a level of difficulty you cannot even imagine. Gemini rated it 12/10. Not that it matters... each single attribute would take teams of developers months of work. And the entire thing? It doesn't even exist (yet).
Anyway, regarding PhDs - my point remains unchanged.
I love that you've framed it as the decision of which thing to build. So much of the toxic estimating is "How much can we do in this sprint?" or "Will I get this next week?". When I've found estimates valuable, it's been in decision making, and usually with order-of-magnitude style sizing: is this hours, days, weeks, months, or quarters? On the estimation of value, I have not found the estimates themselves particularly valuable, but the process of discussing them with a couple of product managers has uncovered interesting and valuable disagreements.
I look at this problem from a different, but complementary, angle: a buy vs. build (or create) mindset. I've seen decision makers demanding estimates because they have a buy mindset. This is the thinking we do when buying anything, from a book to big corporate software: how much does it cost, how long will it take to deliver, and do I want it badly enough to put up with those two? This works because the thing is already built. The risks the builder took and the unknowns they discovered are already built into the price. We just need to decide if we can afford it.
When building something new, there is no way to know for sure how much it will cost or how long it will take. And approaching the decision to build something with the mindset of buying something makes everything worse. Decision makers need to be educated into shifting their mindset if they want to build software and really understand the risks and uncertainty that come with it.
Buy vs. build is often not a truthful exercise; people often fail to account for the complexity of integrating the software. Even then, most vendor software gets you 85% of the way there, so you need to plug that remaining 15%, and when you don't own the code that is a serious issue.
I've been on both ends, both in the enterprise as a developer and at the vendor. Vendors are about selling; they typically will not customize for customers, and in general the sales people lie about their capabilities.
Unless the vendor package does exactly what you need it to do, there is no "buy"; you'll be building regardless of whether or not you buy the vendor's package.
That's a really good point. In one of the links Dave shared, the author writes about looking at assets from a 'total cost of ownership' view. This applies equally well to software. Looking at the TCO, building and owning the software has a lot of uncertainties but also a lot of opportunities, both of which are hard to price into the value before you start any work.
However, this is the reality of business. Big corporations want to know the cost and delivery date before signing any contract, and trying to convince them that “we’ll deliver value every month and charge you a small fee” will not get you the contract and in fact will damage your reputation as amateurish. So what do you do then?
@@Jedimaster36091 The reality is the software company is taking the initial risk. There needs to be a product before you can sell it, and typically if you are selling “to be built” there will be tollgates with acceptance criteria and a prescribed payment schedule; even flexible dates and times must be mutually agreed, and missing dates usually carries a financial penalty. These types of arrangements are more a partnership and less a product-purchase relationship, as the software firm will not take on all the risk.
Typically in these circumstances a product that does what is needed doesn’t exist. I’ve never encountered a vendor willing to do a fixed-price delivery for a new build that wasn’t insanely high and loaded with conditions; they all want to do T&M, and if not, it’s more of a partnership.
The bigger question on the build side is how you measure it as a business success. A product can be fantastic, but if it costs more to build than you could ever recover from sales, it’s a failure. So when deciding business and product viability you need to have some idea of cost to deliver vs. revenue.
I think there can be some SWAG of what a build will cost and how long it will take to go live given a determined team size. Where estimation goes off the rails is when you try to get more micro and accurately determine time and cost for individual components. I feel like experienced software people can tell you with 99% certainty that they can deliver something under a specified cost and schedule using a worst-case-scenario estimate, which is what should be done.
I've worked on 2 projects in which I can estimate that 80% of the time was spent on estimations, deadline negotiation, explanations on why the deadlines were missed, re-estimation, and anxiety-coping activities to deal with the stress of missing those deadlines.
So sad to hear
In both situations, the stakeholders came from construction backgrounds (where estimation and detailed planning make sense) and did not accept the "software is different" reasoning from the team.
I'm so annoyed with people who believe they are improving their estimating skills when, in reality, they are becoming more adept at manipulating their lunch breaks to consistently finish work exactly on time.
So true
I'm always thinking about giving a lecture to my dear colleagues about how developing software actually works, so that they can see how stupid their ideas are. People think of us as some kind of craftsmen pulling up walls or something. What really bothers me is the idea of learning from past estimates. That is insane, because everything we do is new. And if we were really doing the same things over and over again, we'd be doing it wrong!
Don't make yourself a target at work. Let people with power be morons; it's how they got the ability to throw you under the bus to begin with.
"giving a lecture to my dear colleagues about how developing software actually works so that they can see how stupid their ideas are"
-- that approach will ruin your career.
That's exactly why I'm not going to do it. But if you do not communicate where things are going in the wrong direction, nothing is going to change. So the "lecture" has to come in small doses, and it should be respectful after all. The hardest part is to present an alternative that people can agree on, so you can only move forward step by step. That is often quite frustrating, and in my case it leads to these kinds of thoughts all the time. Nevertheless, you have to remain professional.
@@madmanX1314 Things aren't going to change regardless. If you want to dive down the rabbit hole of power struggles that have existed for longer than mankind, have fun. But it won't get you anywhere.
Well, be ready to get a lecture on how company management, resource allocation and marketing fund allocation work in return. I have done both things, and I will say software development is not harder than navigating business and government regulations that demand beforehand estimation of cash flow.
Even simple tasks can have massively wrong time estimates. I worked on something recently that I thought would take just a few hours and instead it took 3 days to do. Once I was actually building the system I noticed that the data I was working with had a number of mathematical properties that made the problem much harder to solve. Trying to do that analysis ahead of time to come up with an estimate would have been most of the time to solve the problem.
At least in my case, nobody really cared that it took longer they just cared that it was as correct as we could reasonably make it.
Hah, I was once asked to add a field to a Java entity because a new column had just been added to a table in a database. Shouldn't take very long, right?
First, nobody really knew how it was going to be used or for what reason. But we needed it, apparently. So it took a lot of digging. I asked about the data type. I was told it would always be a 1 or 2 digit number. So I added an integer field in the model.
But it didn't work when I tried testing it. The column had not been added in the test environment. I asked the database team to add it in the test environment, and that took a long time for them to do. I then had to connect to the database, but there were access issues. Eventually I got access, and I saw a bunch of 9-digit numbers and letters and got a bit confused. I was told it was just for testing and I should expect 1-2 digit numbers only in production. I was told to just deploy my change, because it was a simple change.
It didn't work. The column was defined as a char type in production.
I fixed that, but still it didn't work. My conversion from String to Long somewhere else in the code failed when the number was only 1 digit because they would pad it with a space in front. So I had to trim the number before converting.
A bunch of other things happened as well, and I just can't remember. I think I suddenly lost access to my work accounts, or a certificate expired, or some transfer job failed and needed to be rerun correctly.
And during this entire time, every little change had to go through a tedious process of tickets and hours of waiting for each reply. It took almost 2 god damn weeks to add an integer.
Assigning story points feels like an exercise in futility, especially in my current project, where estimations only exist to decorate decisions that have already been made by the client. They're more of an excuse to justify development time than anything.
What a great metaphor (or so): "decorate decision" 😄 love it!
I am using a simple tool that looks quite similar to WSJF, but not with the same purpose.
I ask stakeholders to give me their perspective:
- What benefits would we like to create?
- What happens if we don't do this soon?
- What risks can we predict?
- What's the simplest way to achieve this?
What typically happens is that the answer is, "We don't know."
I then take it from there ... Divide and conquer, Slice and dice, experiment, inspect and adapt.
But as a side note, it's amazing how common the conversation is: "Developer, I need you to tell me EXACTLY how long this will take!" - "I don't know. Can you tell me more about what 'this' is?" - "I don't know."
My rule is: estimate the time you think it will take. Then multiply by the number of BIG questions you don't know the answer to, because the client doesn't know either. Then multiply by 2, and you'll get a pretty good number of hours for finishing the project. 🙂
That’s pretty much been my method too. And you know what? I can count on one hand the times those estimates have been off in the last 5 years. Meanwhile, I keep seeing dev teams underestimating over and over again, and having to work weekends to satisfy leadership, because the dev team is too timid to give a more reasonable estimate.
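The rule of thumb above is easy to write down. To be clear, the multipliers are the commenter's heuristic, not a validated model; the numbers in the example are made up.

```python
# A sketch of the rule of thumb above: gut estimate, multiplied by the
# number of BIG unanswered questions, then doubled as a safety margin.
# The heuristic and its multipliers come from the comment, not from data.

def rough_estimate(gut_hours, big_open_questions):
    # At least one multiplier even when every question is answered.
    return gut_hours * max(1, big_open_questions) * 2

print(rough_estimate(40, 3))  # 240
```

Note how quickly the number grows with open questions; that is arguably the point of the heuristic, since unknowns dominate software schedules.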
A technique that has worked for me is to start off with a high margin of error. When stakeholders balk and request a more refined estimate, tell them that will require investigation and perhaps a prototype. That itself has a cost, but usually it's a tiny fraction of the entire project. The act of developing the prototype will not only uncover where most of the work is going to be, it usually results in a demo, which can help the stakeholders refine their estimate of the final VALUE of the feature.
So my takeaway, as an absolute junior dev: in order to solve the problem of estimation, estimates must be very 'coarse'. It is not wise to try to estimate the benefit/profit of a feature down to the cent; rather, it should be in very rough terms (e.g. the benefit of Feature 1 is in the ballpark of a few hundred thousand dollars a year). Similarly, the cost must also be estimated in a very ballpark manner. And if the benefits far outweigh the costs, the feature is prioritized first. The smaller this benefit-cost ratio, the lower down the priority list the feature goes.
I had a boss once who expected me to be able to just come up with an estimate 90-95% likely to be correct, with a day or two of precision. He expected that for everything we ever considered doing -- including large projects. I couldn't get my head around how he would use such a number if it were even possible. He hadn't a clue what the dang feature was, what it meant, what value it provided or anything. I got the impression he worked for a company before that did the same thing over and over again. He couldn't understand that a feature that may sound in English to be similar to something else, can be totally different than the other task. Integrate with some software that we don't know the protocol for or if it even has a protocol? Sure, we've integrated into things before, how hard could it be?
Managers crave these numbers because it gives them a sense of control. They crave it so much that they deny the impossibility, like the dictator who denies that good and decent people could ever want to be rid of him.
Here is a way to "solve" said issue: tell him that such an estimate will be very expensive to calculate - specifically, comparably expensive to building the thing itself - and that he should be sure he wants to pay for that beforehand; providing an accurate estimate of that estimate is similarly expensive. If he wants to go ahead anyway, the idea is simply to develop the entire thing while measuring it out, and then provide the measured time as the estimate. That is the most reasonable way I can come up with to fulfill his demand, and he was warned that it would be that expensive.
@@sorcdk2880 Yes, we had several discussions over how long it would take to come up with the estimate. This started with something reasonable, like taking weeks to develop the plan and estimate, and ended with needing something on the spot. It is important to reason with people, but at some point you come to the conclusion that the person isn't trying to be reasonable. I ended up switching jobs.
I have worked for 40+ years in product development, SW and HW. I have concluded that guessing (estimation) by comparison is much more accurate than guessing by synthesis or detailed analysis. Business owners want to know how long something will take, and I've found that thinking "we recently did product A in 18 months... this product B is a bit simpler, but we have lost a team member, so maybe it will take somewhere between a year and two years. Product C is waaay more complex and has aspects we are unfamiliar with - it will take 3 to 5 years" works well.
Has anyone else had success using this kind of simple aid to guessing?? It works much better than a Microsoft Project plan with every task defined, resources levelled, etc. That will usually give an estimate that needs to be multiplied by Pi to get something about right! :) It's a waste of time creating the plan just to get a terrible estimate.
Sounds like Scrum with story points?
I really like the ideas from "How to Measure Anything". The key is that error bars are not a problem unless they affect the decision, and if they do, there is a way to price the value of new information. So the "learn more" part is about buying information. Reinertsen is of a similar mind. Our optimizations are around U-shaped curves, so there is a large range where performance is good. Thus simple models and measures are normally good enough.
Came to the comments looking for "The Principles of Product Development Flow" by Reinertsen - really a must-read if you're working in any kind of development. How to Measure Anything was eye opening too. And good point about models having appropriate precision for the organization - of course NASA or software designer for a nuclear plant need tighter and more precise estimates, but most companies are not that...
Estimations are only valid if you have already done something very, very similar. That said, speaking now from the POV of a software company owner, they are needed, because money allocation depends on them. So we really do need to know a low and a high bar for an expected release.
You can say you need these estimates, but they're still mere guesses, guesses that, if treated as commitments, get turned into lies.
Stop trying to do that. Switch to learning. Ask the business how much it’s willing to spend to learn the next smallest thing you can find out for them that will help them know whether to spend more money, finding out more.
Get out of the feature business, and into the learning business.
@@TrackedHiker I know MORE about development than any of my employees. I had your mentality, yet like ALL business owners I came to understand that it cannot be that way. Simple as that. There is a reason why all developers who start their own companies change their opinions... because it is not an opinion, it is a fact.
I bet you would be angry if your boss refused to say how much you would earn in a month until after the month ended. Yet we also do not know how much money will enter the company beforehand... but we HAVE TO predict! We can lose a client, we can fail to get new ones; lots of things can happen.
The business is what pays the bills; if there is ONE facet of companies that will not change, it is that. The business is the goal; development is just a tool. You do not change the objective just to make the tool easier.
@@tiagodagostini Why don't you spend your own money if a developer goes over your oh so precious budget/estimations?
Surely you understand a developer can't estimate new features accurately, since you know so much about development.
@@lynic-0091 Stay in your little square, dude. You clearly have no grasp about business.
Still a little lacking in answers or approaches for situations where estimates are definitely required, such as contract bids, yearly budget planning, comparisons to buying, new staffing requirements, etc. Of course if you don't estimate, estimating is easy. :)
Yeah, agreed
I guess you didn't watch the video?
There is no way to increase the accuracy of estimates. The data to back that claim has existed since the 1990s and was described very well by Steve McConnell in "Rapid Development".
My advice would be: work within "error bars" that make sense for estimation, use estimates to prioritise work rather than to draw Gantt charts, and if you really want to increase the illusion of rigour, try CD3.
Sure, we are pushed to make estimates sometimes, but don't buy into the illusion that you can do so with any accuracy; there is no solution to that.
Accuracy in predicting the future (estimates) requires that we understand the problem, which means we have already solved it, and there is no point in solving the same problem twice in software, because once you have a solution you can clone the answer for essentially zero cost - so we are ALWAYS building something new, or we are being dumb!
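For readers unfamiliar with CD3 mentioned above: it stands for Cost of Delay Divided by Duration, and it schedules the highest ratio first. A minimal sketch, where the feature names and all the numbers are invented for illustration:

```python
# A minimal sketch of CD3 (Cost of Delay Divided by Duration) as a
# prioritisation aid. Features and numbers are made up for illustration.

features = [
    {"name": "A", "cost_of_delay_per_week": 10_000, "duration_weeks": 4},
    {"name": "B", "cost_of_delay_per_week": 3_000,  "duration_weeks": 1},
    {"name": "C", "cost_of_delay_per_week": 8_000,  "duration_weeks": 8},
]

def cd3(feature):
    # Value lost per week of delay, divided by how long the work takes.
    return feature["cost_of_delay_per_week"] / feature["duration_weeks"]

# Schedule the highest CD3 first: short urgent work beats long urgent work.
for f in sorted(features, key=cd3, reverse=True):
    print(f["name"], round(cd3(f)))  # B 3000, A 2500, C 1000
```

Note that B wins despite having the lowest weekly cost of delay, because it is so quick to deliver; that is the behaviour CD3 is designed to produce.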
@@ContinuousDelivery Sure, we cannot accurately estimate the effort of a software project. But that doesn't stop potential clients from demanding it. When it comes to deciding which software vendor to choose, which one is the client likely to pick: the one who says "we cannot estimate the cost and delivery because it doesn't make sense (or we're agile)", or the one who gives some estimate? My bet is on the latter. It doesn't matter whether the estimate is realistic or not; it is now written in the contract. And I would argue that this is the norm in most organizations.
I think the point is that what you are describing, @@Jedimaster36091, is a business problem, not an engineering one. It sounds like your business operates on the basis of time-and-materials contracts, so it needs to work out how much a project will cost and how long it will take. What @Continuous Delivery is saying is that it sucks to be you in that situation, but no matter how much you 'need' high-precision estimates, they are not possible. So the business has a problem, not the engineers.
It's so easy to confuse how much time you would like to spend on a project with how long it really takes.
Here's the deal: some items can be estimated more accurately than others, but in general estimation is about throwing out the biggest number you can justify so you can keep your team and don't get pushed into taking on more than you can deliver. Generally the goal is to quote your best guess for the longest it could possibly take; that way you have some spare capacity to refactor, or a buffer should things go wrong. In general you can guess "is the ask small?" (it rarely is) or is it massive, and the more novel or unknown the work, the more massive the estimate needs to be. Developers don't want to estimate, we want to ship code, but like death and taxes we are always asked for an estimate, and the earlier we learn how to respond to these requests the better.
I have long thought the best model would be to fund a system not based on the what but rather on its importance, and try to maximize the output, rather than think you can fund for features, because in my experience most companies' baseline funding isn't even enough to keep the lights on.
About the "Jeff Patton Model", I saw this model too on Jeff's presentation here on youtube. It was some conference, maybe YOW. Love this model!
From 1:00 to 1:27 is a fantastic sentence. We should all adopt it.
Your videos are always pure gold - thank you very much.
This is a good video for whoever manages the schedule. Whenever I tell these people how difficult estimation is, they just tell me to give a high-level estimate, and in the end I have to rush to meet the timeline they promised to stakeholders.
It pains me to say that many of your videos present problem statements (sometimes even solutions) that many developers already know, but would have absolutely zero chance of implementing at their organization. So watching these is often just an exercise in frustration.
I'd say I hope management watches these but not likely.
Great point. All these videos make perfect sense, but in a way they are just repeating what has already been said and is known. The real question is how to actually cause this change, because as a simple developer I seem to have little impact on these things.
I like the idea of looking at how much it costs to not do something
Isn't that one of the most fundamental questions one would ask during planning? "Why should we do this and what are alternatives (which include doing nothing)?"
@@noli-timere-crede-tantum Sure, but I like the idea of turning the question around. I'm thinking about stuff that often gets kicked down the road, like technical debt.
@@shau4744 understood. Doesn't change the concept, though. Basic trade-off analysis
That is the coolest t-shirt I've seen.
I was in a project management training, where they told us two things:
- we can estimate a general task by estimating the best scenario, the worst scenario, and our gut feeling (the most probable scenario).
- this will never work with software. :)
My issue is that I understand that software estimation is futile, but we have contractors, we have a budget plan, we have deadlines, etc. I can't tell management that I have no idea about the launch date.
What I can say is that I have only vague idea about the initial (MVP) functionality.
Usually our initial expectations are very high: we want software that solves world hunger and world peace while exploring deep space, and my job is to cut the not-so-important functions during the project to meet the deadline.
For this, we need some kind of estimation. (Which we have to update weekly.)
In the end, we will have software that is ready for the deadline; we can give it to the users and try it, and it has some "Coming soon" screens.
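One common way to combine the three numbers mentioned above (best case, worst case, most probable) is the PERT weighted average; whether it actually helps for software is exactly what the trainers doubted. The inputs below are invented for illustration.

```python
# PERT three-point estimation: a weighted average of best case,
# most probable case, and worst case. Inputs here are made-up days.

def pert_estimate(best, most_probable, worst):
    # The most probable value is weighted four times as heavily.
    return (best + 4 * most_probable + worst) / 6

def pert_std_dev(best, worst):
    # A rough spread measure often quoted alongside the estimate.
    return (worst - best) / 6

print(pert_estimate(10, 20, 60))  # 25.0
print(pert_std_dev(10, 60))       # ~8.33
```

The spread figure is the honest part: a standard deviation of 8+ days on a 25-day estimate says more about the project than the 25 does.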
Rather than attempting to forecast the value of the feature for the value provider, I tend to lean towards estimating the value for the customer. E.g. if you think you're going to save time with your solution, then this becomes something quantifiable. So, when you have several features that are supposed to save time, calculate those savings for your customer base: how many people are actually going to use this, how much time it is supposed to save, their approximate hourly rate, and the resulting value over, say, 1 year. This way you can calculate the cost of delay and CD3 from the customer's standpoint, while your own business benefit becomes a derivative of your customer's success.
There is another benefit in this analysis as it requires the product team to really dig into the value hypothesis definition and learn more about their customers.
On the other side, the question of how much it is going to cost us to implement a feature is still a guess, but here "How to Measure Anything" provides an alternative. Instead of giving a straight-up number, you work with 90% confidence intervals. E.g. you're 90% confident that this feature can be implemented within a range of 1 to 6 weeks. Providing this type of estimate is a trainable skill and gives you higher accuracy and precision together.
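The customer-value arithmetic described in that comment is just a few multiplications. Every number below is a placeholder; in practice they would come from usage data and customer research.

```python
# A sketch of the customer-value arithmetic described above:
# users x time saved x hourly rate x weeks per year.
# Every input is a placeholder assumption, not real data.

users = 500                  # people expected to use the feature
minutes_saved_per_week = 15
hourly_rate = 60.0           # approximate loaded cost of the users' time
weeks_per_year = 48

annual_value = users * (minutes_saved_per_week / 60) * hourly_rate * weeks_per_year
print(round(annual_value))  # 360000
```

A figure like this then feeds the cost-of-delay side of a CD3 calculation: delaying the feature a week forgoes roughly `annual_value / weeks_per_year` of customer value.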
It is very similar to a standard Jeff Patton conference talk. He calls it a "bet", and he draws the quadrants and calls the expensive low-value section Stupid. I've seen him do it in two or three separate conference videos on YouTube. He also mentions CarMax in those videos.
It *is* Jeff Patton's model, I say that in the video, and yes he does call it a bet.
The rough cost-value matrix that was shown in the video is very useful, but you can go further. Any feature of large size can be sliced up into smaller capabilities that provide incremental value. Repeat the rough cost-value classification for each slice and use that with stakeholders to help decide what should be in and out of scope. The smaller the slice, the more you can make intelligent tradeoffs about team size, cross-team dependencies, and timing. Stop this process once you have enough confidence to get started, then prioritize the riskiest or highest value work first and adjust as you go along, just as he said.
Except that now you have to predict all of the pieces, and the riskiest may not be related to the highest value, which is where CD3 comes in.
I am talking about slices of value, not necessarily implementation details, a la Neil Killick. The point is that not every capability we are imagining in a large feature may be needed, and when we decompose and estimate cost-value as you have done for more fine-grained capabilities our stakeholders may come to a very different decision on what feature they want built. The more incremental we can slice it, the more choices we have.
Prioritization schemes like CD3 and WSJF have the flaw that they tend to bias against any large feature, since the duration uncertainty is so much higher that they never get into the top right of the cost-value matrix. Slice capabilities thinner, and it will be easier to identify what riskiest-assumption tests we want to run early to validate our assumptions.
In general, the goal of any estimate is to be useful, not accurate. If the features are all a reasonable scope to begin with then we don’t need to decompose further. There are many cases though, as others have mentioned, where you need more confidence for budgeting, contracts, or cross-team dependencies to make a decision to proceed, and decomposition to a finer grain may be called for.
I am not a computer scientist (or engineer). In my electronics background the definitions of accuracy and precision are a little bit different. A measurement is precise if it is repeatable, but it does not need a lot of resolution. Anyway, I enjoyed the video, and your channel.
Nice idea for user-facing features. I am wondering how you would go about justifying enabling features (like improving a delivery platform)?
Same way. All features have users, for some features the users are other programmers.
Again, all those arguments are relevant in a world where you build software for the consumer with an unclear scope. But if you build software to implement a necessary business process which must be ready in 2024 to be compliant with a law, there is no "let's find out". Time is fixed, scope is fixed (kinda); the question is who has a convincing plan to deliver for less money? Stuff like this is the reality in most large companies and in almost all public projects. And you can sugar-coat it and do it in iterations and discuss a couple of nice-to-have features and prioritize based on business value (more like critical path), but you need to deliver the software on time, because the law does not wait for you, and neither does the new Mercedes S-Class unveiling and order start.
And as much as I hate it, for that you need a plan, and a plan means estimations, and thanks to Parkinson's law we will almost always under-estimate. And I've seen a ThoughtWorks project fail miserably, which forced their customer to use the legacy system for 5 more years, because "something" disruptive was promised and built in a very agile way, but nothing was even close to replacing the legacy system or even parts of it (not because TW did a really bad job, but because of the reality of those giant backend worlds of core banking systems and plant management).
That "business value" discussion has also caused more harm than good in projects I reviewed, because many young developers and "architects" somehow learned (with the help of "scrum masters") that only something visible in the UI for the consumer is a feature and fun to implement. So they constantly roll out new buttons in the UI with a sloppy backend implementation, while any technical stuff and refactoring is thrown into "technical stories" which will never be prioritized because: "no business value".
I wish our industry had a more established common body of knowledge and definitions of what the buzzwords actually mean, and made everyone learn them (computer science, hello?!). If real architects and engineers (or doctors) had such different understandings of the absolute basics of their profession, we would be doomed. Not sure if we deserve the title "engineer" considering how hard we all "wing it".
You must work really close to me, given your experiences are pretty similar to mine. :)
I have also had the same thoughts about the same topics. Here are some of my ideas:
First, you talk about fixed-scope and kinda-fixed-scope projects, where something definitely needs to be on time. Does it really? If you buy a car, you buy a car that has already been manufactured. It's there as is; getting it back to the factory to upgrade hardware is expensive. Getting a software upgrade is close to free. The software is not ready at the vehicle's launch. So what? If it's the best software on the market 3 months after launch, I will buy the car with the best software any time after the launch. Not a car guy myself, but I know people working for VW in car software dev, and it sounds like 1970s development methods: really long cycles, essentially people who know how to build physical stuff trying to manage software dev the same way, when it is not the same. And people driving these vehicles talk about how shit the software looks, feels and behaves all the time. Alternatively: what does estimating do for you here? The estimate is always wrong and features are always delivered far too late with far too little quality. So from my PoV estimating is always wrong, so why even bother?
For the "big legacy systems": I have worked in several projects trying to replace old legacy systems, up to 50 years old. An iterative approach is the only thing that can work. We took small business processes from the legacy system and replaced what they were supposed to handle in a new system. This still took quite a lot of time, but the customer saw: "This new small system works!" And we got to work on the next part. And the way to choose which system to start with: Estimate value vs cost, just like in the video.
In the beginning you talk about scenarios with fixed scopes. Developing a whole system to be legally compliant is still a big project that can be broken down into smaller pieces. These pieces have different values and costs associated with them. If you already have a system that needs to be updated to be compliant, the value is not having to pay a fine. This fine (+ image damage? E.g. not being GDPR conformant?) is probably worth more than any other story in the project, so it makes sense to focus on everything connected to it.
Does that make any sense? Happy to hear your thoughts. :)
@@KiddyCut I think you misunderstood the first use case, with the car. What @vanivari359 meant is that if a car company is planning to release its next-gen vehicle on a certain date, then everything needs to be ready by then, including the software. If this vehicle will have the next-gen autopilot software, and this is a big marketing point, then the software development team has a fixed delivery date and scope. Nothing less than that would be acceptable to the business. After-sale upgrades are just that: upgrades and bug fixes, not delivering the actual autopilot feature.
I agree that software companies that build software for online users (e.g. Spotify, Facebook, Google, Forex) could use a pure agile approach of prospecting and small increments deployed to production, where estimation is less important. But for all the other companies, all-or-nothing seems to be the norm - they either get everything they (think they) want at an agreed price and date, or they won't choose the software vendor.
I would love it if sources were cited in the description. For example, you make the claim at around 8:15 that research shows estimates are out by a factor of four. Which specific research paper or book says this? (Apologies in advance if it's there already and I missed it :P)
I do add the sources, but I did miss this one, sorry. It is there now. I first saw it in Rapid Development by Steve McConnell amzn.to/38OKgtP
We sometimes concentrate on the wrong areas, while neglecting fixes staring at us in plain sight. In a rapid development environment (using C++, gcc, Linux), the metrics looked good, so we pushed 3 deployments (A, B, C), when we had a functional regression. A had a long-standing bug, B did not for 6 months, then C reintroduced it. Something had changed, but the code the error was traced to hadn't changed. TL;DR: the machine for release B had an upgraded compiler, and the dev tasked with the build had used his own machine by accident, while the A and C deployments used the same device. Turns out, B's compiler rev should have been adopted earlier, as it fixed an esoteric language issue, but our config management missed the bulletin.
Although I also don't see the point of estimations, it appeared to me that the model proposed suits prioritization better than "predicting" the effort it will take to build something.
I mean, when you put features into the Build slot, this means that they are ready to be worked on, but when people estimate a feature they want a measurement of how hard it will be to develop.
Reg changes where there is a defined fine are probably the only ones where you have an accurate business value - it's the cost avoidance of the fine!
I love the stock video clips. They really catch me off guard
Software estimations are not any more inaccurate than any other estimation. The few bits of scientific data created on the matter support this. Inaccuracy vs actuals is kind of the definition of "estimation". Estimation is a science. It has methodologies separate from any specific domain. No estimation can ever claim to be 100% accurate to future actuals.
I disagree with almost everything on your channel, but I really like this 😊
If you had a magic crystal that could either let you complete projects as fast as possible, OR give you the foresight to know exactly how long the project would take, which would you choose? Assume the quality of the project is the same either way.
You'd think the sensible choice would be to be fast, even if you didn't know when the project would be done. And yet companies seem to always, always wish for the second option.
"as fast as possible" is arbitrary.
+2 Managers occasionally feel a need to justify their salary, and that's how you get estimation mania, burndown charts shared across all levels, and people being micromanaged because something happened that nobody had foreseen.
Asking to hit like and subscribe at the beginning of a video is asking to *estimate* if the video and the channel are worth it before having seen it.
I'd like to tell my boss I belong to the no estimates movement 😂
I haven't done any real coding in about a decade but I always found the concept of estimations in software development completely unreasonable. You would need to have zero understanding of programming to even believe an estimation is a thing that can apply to it.
I think estimates are good on the very small scale. Having a rough idea how long individual tasks will take keeps me from overloading a sprint. Beyond very small increments of time, estimates are pretty worthless. Any estimate longer than an 8-hour day is as good as fiction. Once it's longer than a day, it's a maximum amount of time to spend before re-evaluating whether it's worth continuing. It's useless for any kind of important goal setting. Keeping sprints to a sensible serving size is about all it's good for.
Practically nothing that requires a formal decision to develop is going to end up costing just a beer, for the simple reason that the overhead of such a decision is way higher than the cost of a beer, not to mention the overhead of deployment and integration if your organization has too much red tape there.
Also, the car, vacation and month's salary are comparatively close, especially once you take variations into account, which are big enough that those three could be in any permuted order, with cheap used cars easily costing less than a month's salary, and an extended weekend trip by car to visit family being far cheaper than a 6-week summer vacation far away, staying at hotels or on a cruise ship.
To fix it, I would probably remove the vacation one, specify that the car has to be a new, decently good car, and then below a month's salary add the price of a computer. The computer is really useful because a lot of things might need someone to get a new computer, and if it would make sense for a customer to set up a computer in connection to the service you provide, you already know they would value your service at least at the cost of a computer, because they would pay that as an additional cost. It also handles the fact that most actual tasks fall well short of a month of work and well above the price of a beer.
"The measure of intelligence is the ability to change" -- Albert Einstein
That's what's written on his t-shirt 😍
When you estimate, always multiply by 3. If you're dealing with hardware-software development: good luck.
Respectfully, the solution given at the end of the video is for prioritization, not estimating. I get that it's difficult to estimate, but I've never heard a good answer to this problem. It's critical for organizations (especially smaller companies) to understand when a feature or a set of features can, and most likely will, be completed. I'm all ears for any solutions to this specific problem.
That's because there is no good solution. The best solution is to avoid them and "prioritise rather than plan".
Still waiting for what to replace this with... you have a keen sense of what the problem is and are able to articulate it very well.
Now what?
What do we replace this with?
Well, what most people seem to mean when they ask a question like that is "yes, but how can I perfectly predict the future anyway", and my point is that you can't; it is irrational and impossible, so you have to find a way to deal with the uncertainty. For short-term, in-project steering, count stories, not story points or days; this has a number of benefits. It encourages you to make smaller stories, and that increases the accuracy of any predictions you make on that basis. But even more important: change the focus of planning from accurately predicting an end-date and a cost to organising work around how much to invest to determine feasibility. Treat this more like a start-up, with seed funding to see if the idea is viable, then round 1 funding to get some basic work done to see if the idea works, and so on. Or, if you are very rich and believe in what you want to do strongly enough, just take the punt, forget all about the estimates and just do the work. This last one is the Apple/SpaceX model.
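The "count stories, not story points" idea above is often operationalised as a simple throughput forecast: sample from the weekly story counts the team has actually delivered and see when the backlog runs out. A minimal sketch, with invented throughput numbers (`forecast_weeks` is a hypothetical helper, not anything from the video):

```python
import random

def forecast_weeks(throughput_history, stories_remaining, trials=10000):
    """Monte Carlo forecast of weeks needed to finish `stories_remaining`
    stories, resampling from past weekly story counts."""
    rng = random.Random(42)  # fixed seed so the sketch is repeatable
    results = []
    for _ in range(trials):
        done, weeks = 0, 0
        while done < stories_remaining:
            done += rng.choice(throughput_history)  # sample a past week
            weeks += 1
        results.append(weeks)
    results.sort()
    # Report 50th and 85th percentile weeks, not a single "the" date.
    return results[len(results) // 2], results[int(len(results) * 0.85)]

# Six weeks of history: 3, 5, 2, 6, 4, 3 stories finished per week.
p50, p85 = forecast_weeks([3, 5, 2, 6, 4, 3], stories_remaining=40)
print(p50, p85)  # median and 85th-percentile week counts, a range not a promise
```

The output is deliberately a range: smaller stories make the throughput history less noisy, which is exactly why counting stories rewards slicing work thinner.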
A good and well-known theory, but the true challenge is in estimating the business value and having a common definition of value categories (you can't use beer/house :) )
Very similar to Weighted Shortest Job First (WSJF) from Reinertsen and Leffingwell in SAFe.
As always, a well reasoned and insightful video Mr Farley. I can see how I could use the matrix towards the end of the video in a B2C context. Do you have any advice for B2B contexts?
I work in a sector where the products are (generally) free to end-users and our customers work for businesses looking to reach/retain those end-users.
I'd use the idea of "value" in that case as an analogy, you can have the notion of "value" without it being monetary. A game that is fun to play has value. So grade the "value" based on fun, or engagement or whatever it is that makes your software desirable for its users.
@@ContinuousDelivery Love this idea - thank you for your insight!
The last 3/4 of this video requires estimating both cost and value, immediately after pointing out that these are just guesses, and typically wildly wrong. These are at odds with each other. If estimates are wildly wrong, which they are, then anything based on them is waste. GIGO.
Great tool, thanks for sharing! I wonder where stuff like refactoring would fall; sometimes it's hard, and the benefits might not be reflected directly to end users, but overall I think improving the DX ends up benefiting the end user through fewer bugs and faster iteration.
I think of refactoring as prepping and painting the bodywork being repaired. Nobody wants grinder marks and rust on their car.
Unrefactored code is seven centimeters of body filler smoothed over and painted.
Really really great video. Thanks Dave
How do you deal in an interview when the job description clearly says:
Provide accurate development estimates in support of feasibility assessments and planned development activities.
People who ask for estimates on projects that will take weeks, months or years organise meetings that take 30 to 60 minutes. More often than not these run over, and that's a very short timeframe to estimate in, with only 1 or 2 things to discuss.
In a hardware company, by the time you learn a feature is too expensive for software, you may have spent an enormous amount of time and money in other departments (e.g. Hardware engineering, research, manufacturing). Unfortunately, answering “what next” for software doesn’t solve this.
Once you understand the cost of delay you have to determine how much you reckon it will take to create the solution which requires - estimates. Circular discussion.
Wherever I worked, the estimations were never used to decide whether we should work on something, but to know roughly when it could be expected, i.e. do we manage to deliver it in Q1 or Q3?
I think the problem was never deciding what was valuable to work on; that came top-down (another discussion altogether, I guess).
I agree, but I think this has more to do with the "something new" factor than software specifically. That's why it's harder to meet estimates for a custom home vs. a production home construction project. Cookie-cutter software projects are easier to estimate correctly as are cookie-cutter construction projects. If your sprints are frequently filled with spikes, it may be a sign that Kanban is the ticket.
Yes, the problem is that "cookie-cutter software projects" make no sense at all, whereas "cookie-cutter construction does".
The difference is in the cost of reproduction. For software it is essentially free, because we can perfectly clone any sequence of bytes representing a system for essentially zero cost, so why on earth would we ever make the same software again when we can just copy it? So it is ALWAYS something new; it is ALWAYS, at some level, a custom project. Which, as you said, is impossible to estimate with any degree of accuracy, whatever the field.
Love the HiPPO acronym 🤣🤣🤣
I'll be sure to use it.
Not knowing how much something cost even after the project has finished is something serious. That happens everywhere, all the time.
I usually share your videos with my managers to show them that the problems we have can be fixed, or at least mitigated, using CI/CT/CD.
However, this time it is hard. In the automotive industry we have to make sure that the car is moving/behaving in a safe manner. So each feature is worth the same, because it has to be done anyway. The order of tasks is defined by the customer. But it is also standard to plan based on those inaccurate estimates and to escalate through multiple levels of HiPPOs if the plan does not hold. Then re-estimates are requested, which waste even more time, although they are more accurate due to the narrowed cone of uncertainty... and for the next delivery, even more planning and tracking (aka interruptions) is implemented, just to make sure.
Any suggestions?
No, Jeff has definitely done the "stupid zone" drawing for the last 10+ years at least. Not sure why he didn't recognize it, but hey, good on you for asking him haha
I really like the beer-house analogy. No brain needed to explain this to stakeholders!
I was on a government program and was part of the proposal team bidding the software. We had a bunch of metrics we had to use, like complexity and lines-of-code counts. We finally came up with the software part and it was $12 million. Management said it was too much and that I only had $9 million to work with (would have been nice to know that up front). So we went back and cut a bunch of time from each of the pieces. They questioned whether the tasks could be done in such little time, because that increased risk. I said you only gave me so much money to work with, so I had to cut from somewhere. This is the stupid crap we have to work with as engineers...
Hey Dave, where did you get the information that estimation is around 4x off at the start of a project?
Huh, I have practised this with product managers (digital agency PMs) to explain why something is a bad/good idea, never knew there was a method that used it too (rule 34?).
Particularly the cost to not implement a change.
Any comments about cost-of-delay or weighted cost-of-delay as opposed to stakeholder value?
"The measure of intelligence is the ability to change" - Albert Einstein
Such a difficult topic within my organisation, one team follows an agile approach but consistently delivers years later than they originally communicated, and another team refuses to give any estimates at all unless under duress to give T-shirt sizes.... it's difficult to base a business around this that has customers that need to do things by a certain point... What's the answer?
I don't think that there is an answer, if we constrain the problem to fixing a price. Let's be clear though: for small, simple, low-cost things that are very similar to work that we have done before, well, ok, we have some basis to make a guess. It will be wrong, but commercially it is easier to say "I will make you a WordPress-based website for £500" than "I will build you a healthcare system for £50 million". The first one MAY be close enough, and you will do enough of them that sometimes you will do it with £125 worth of effort and sometimes £2000, and it will work out, as long as it is more often less than more. Incidentally, those are the error bars for estimation at the start of a software project: 1/4x to 4x!
But we can see from real-world projects that bigger projects are ALWAYS WRONG. And because the numbers are so big and scary, people want more precision, even though bigger projects are more uncertain, because they are less similar to one another and there are always huge unknowns at the start.
My view is that anything we do to up the precision is a mistake, because the error-bars are so huge that precision isn't what we need. So more subjective, less precise is better, and best of all, to my mind, is the realisation that organising this through ideas like incremental/venture-capital style funding models for big projects is by far the more sane response to the reality of what SW dev really is.
The trouble is that customers and businesses are not always, or even usually, rational. So crossing your fingers and guessing is all there is.
I was afraid you would say that! I work in the broadcast industry and we sell a system that automates transmission and media management, the entire EU broadcast industry works by responding to tenders and agreeing to fixed prices & timescales .... stability is obviously paramount. I've been watching your videos (which are awesome) on DevOps/CD as I'm looking to create an organisational step change in the quality of code that is installed by moving away from the traditional waterfall (over the wall!) approach we currently have...... I'm technically an Ops guy, but do have R&D within my org @@ContinuousDelivery
I completely agree. The business model has to change. The days of easy business decisions based on a simple cost/value analysis are gone. In my experience, the business is usually trying to decide whether to take one major direction or another. They want to know broadly how big/costly that work will be until they start to get a return, so they can decide which direction to take. They want a position of evidence to fall back on, so they can be accountable to the money men. E.g. we decided to go with option A because the engineers said it would cost x and take y. When that turns out to be wrong, which it will, they blame the engineers' bad estimates.
What business should do in this situation is perhaps decide their strategic commercial goals based on other factors, like market potential, and just start building software. There is no safety net here. A bigger mistake would be committing to years of development and big contracts when there are so many unknown unknowns, and feeling like it's all going to work out because of some estimates that took the engineers a day or two to come up with and are treated as gospel.
@@ContinuousDelivery thanks for the video.
Thank you so much, your team is telepathic 😂
I think benefits to customers in terms of money are useful, but not the whole picture. Some changes are more about enabling and are very hard to attach a £value to.
In general, the longer something will take to develop, the greater the error in being able to estimate time. And the error isn't something like 180 days +/- 10 days. It's more like 170 days to 2 years. It's that bad when the thing you want to do is complex enough that it might take a half a year. That half a year estimate can easily balloon out to 1 year, 2 years, etc. So why bother estimating it? You do it because managers demand it and feel it's reasonable that a professional developer like you should be competent enough to get it right! Expectations of the developer's ability to estimate time are high. And developers are pressured into giving a number and adhering to it, with their reputation and job on the line. No pressure!
In my experience, there is just one type of software developer that is good at estimating how long some code will take to develop. They keep expectations low and scope very narrow. They never take on more than could be done in one week, maybe two at the most. They never talk about long term goals other than to say we'll do this very short term work one step at a time and then "together" (manager + developer + team) we will decide later on what the long term goals should be. And so they string the project along one step at a time making absolutely no long term plans or time estimates. When asked to do so, they just avoid being made responsible for it like a software architect would, and they will only commit to short term things. It's really about setting boundaries and sticking to them.
Managers love those kinds of developers. These developers seem very professional compared to those who want to plan deep into the future. That's because the ones that plan long term things are usually spectacularly bad at time estimates. Managers want developers to not only make these estimates but commit to them and never change them. If they're missing an intermediate goal, from a manager's perspective, that developer better be working around the clock to make up for the schedule slip. It's hilariously wrong headed for managers to do this. But they all pretty much do it.
What is needed is a middle ground between those two types of people. You want someone who's architecting the whole project out into the future many years. But you can't expect any of your time estimates out past a couple weeks to be meaningful. At the same time, you can't just have developers who only take on things that take them 1 to 2 weeks. You want someone who is responsible for long term planning and execution at all points in between. And as a manager, you have to accept that whatever time estimates are given for the completion of the project, they are subject to change and should not be considered a commitment at all by any of your employees.
That last part is a hard sell for managers. Most will say that's unacceptable. And around we go with each employee tap dancing around their managers with language that doesn't commit to anything. The ones foolish enough to commit to long-term goals get burned out and leave the company, get laid off, or are given horrible work that nobody wants.
“The measure of intelligence is the ability to change - Albert Einstein” nice shirt! And great content!
Can you share the source for the research on the "factor of 4" errors at the early project stage, at 8:14 in the video? Thanks!
Where do you stand on prototypes built with tools like Figma? Say a prototype has tested really well with users and we have figured out the areas of value. How can we work out how long it will take to build and release those areas? The client is asking when they can have the system. What can I tell them without any estimate?
What can you honestly tell them *with* an estimate?
It is not that I am fundamentally against the idea of estimates, I just don't believe that you can estimate software projects, unless they are exceedingly simple, and very very similar to what you have done before. And I would argue that if you are building something "very very similar to what you have done before" then you are doing it wrong, because you should have abstracted the learning and made something generic that you can make more easily.
I completely concede that all of this is a somewhat academic position to hold, which is why I am not actually a strong proponent of the #NoEstimates movement. My position is a little different. My experience is that the best teams and orgs grow out of estimates, when they are allowed to by circumstance.
But the real world is often irrational. So yes, customers will ask for estimates, even though this is an irrational, and deeply sub-optimal, approach. I confess I think of estimates, and the effort expended on them, in the same way that I think about witch-doctors, astrology and horoscopes. They are pseudo-scientific nonsense. But some people still like them. The trouble with estimates is that we are still in a world that assumes they are real, so in some parts of the software business you have to do the witch-doctor dance and read the runes to win the sale. Despite this, I still think it healthy, and more intellectually honest, to recognise that this is what you are doing, and so it is only sensible not to waste too much time and effort on the ceremonies surrounding estimation, because they don't represent reality; they are part of the planning theatre that exists in our industry. We aren't always free to do the rational things. Just to be clear, I have been one of the lead estimators, for often very big projects, in every org that I have worked in for the last 20 years. I used to try hard to be accurate, and learned that I was no more accurate when I tried hard than when I simply guessed, but you still have to give your guesses the illusion of rigour if you are selling things. Many orgs selling services don't even try to be accurate with their guesses; they just try to undercut the competition. This is all crazy, but it is still the industry that we work in.
@@ContinuousDelivery thank you I do appreciate the time you have taken to put a response together.
Having trod through the trenches of software development myself for many years and been "the coder" I recognize the sentiment of #NoEstimates and the frustration and stress giving estimates can be on a software team. And I agree with the deep suspiciousness of an "accurate estimate", it is as good as being able to stare at the stars and try to work out the distances between them with any great accuracy.
However, having said all that, we do not work in a vacuum. Software projects have budgets and businesses have expectations, and all of that needs managing without the project being cancelled and the effort wasted if it is not delivered on time (or within budget).
I have no answers, just experience of what, as you correctly say, is the crazy world of software development. I do like the approaches in your video and will try some of them out, so thanks for illuminating those ideas.
One question I do have, and it goes back to my initial comment and a grey area (for me) at the moment: UI/UX designers are building prototypes of applications, websites and systems, then testing them with clients and doing the whole incremental iteration of the prototype, before saying to the developer team in a Picard voice "make it so". Where do you stand on this methodology? I see it becoming more common over the past few years, and it does put pressure on the developers to somehow estimate turning the prototype into a finished article, a real application.
Maybe I'm missing something, but even in the proposed system, we still have to estimate the cost of building the thing we want to build, and to guess how much value it can bring to the user. So, it seems that we're still back at square one in trying to estimate cost and value.
Sure, it's way simpler to use and look at compared to fancy, complex and over-engineered mathematical models, but I don't see how that helps us solve the problem at heart: better estimation.
It depends on what you mean by "better". If you mean "more precise", which is what most biz people want, I think that is looking for the wrong "better".
This is "better" because it is less precise, but more accurate as a result. It admits the error-bars inherent to estimation. So it is fast and easier to place ideas in the relevant box, and as soon as you do that, you know what to do. I think that is indeed "better". There is no solution to "more accurate" or "more precise" it is like wishing that fairies were real. It may sound like a nice idea, but it is imaginary.
@@ContinuousDelivery What I was asking was more confined to a single aspect of the problem: for example, how can we say that feature X will cost us a beer, a vacation or a house? In the end we have to make such an estimation, even if it's broader than some fancy model with a ton of variables. And that's what I'm missing: why does that feature cost us a beer instead of a vacation? How do we estimate *that*?
I am sort of in the same boat here. The storyline is "yes, estimates are always wrong, so let's use a simple way to estimate". It doesn't solve anything fundamental, except maybe stopping people from treating the estimate as a science (which in my opinion is the biggest problem). Even going from time estimates to story-point estimates had the same foundation. This video doesn't give solutions for the fact that there is a need for some kind of predictability, as explained in some of the comments. Because you have a dependency on another department (Sales, Marketing etc.), another company or vendor, customer demand, legislation, competition... there are probably more reasons to want some kind of indication of when you are done with something. And we probably all ask the same question when we hire people to do something for us in our personal lives: when is my car fixed, when is my new house done, when can I expect my package to be delivered, etc. Yes, building features in software is more complex than delivering a package, but that doesn't mean some predictability is not wanted within our world of software delivery. We cannot just say "hey, software dev is too complex, forget about when you are going to get it, it is done when it is done! #NoEstimates!"
It is just not realistic..
Lol. HiPPO. That gives me more than a chuckle.
@5:23 is where the actual answer is
What is the source for the 4x error claim? I really would like to read the entire thing and bring it up to clients as a "fair warning" when giving estimates.
I first read about it in the book "Rapid Development" by Steve McConnell (who led the team that developed Excel at Microsoft). The research wasn't his, though; it has been around for a very long time.
Here is his take: www.construx.com/books/the-cone-of-uncertainty/
Here is Wikipedia on the topic: en.wikipedia.org/wiki/Cone_of_Uncertainty
Scholarly Article on Addressing the cone of uncertainty: www.researchgate.net/profile/Pongtip-Aroonvatanaporn/publication/220883691_Reducing_estimation_uncertainty_with_continuous_assessment_tracking_the_cone_of_uncertainty/links/54bfdfe00cf28eae4a6635d2/Reducing-estimation-uncertainty-with-continuous-assessment-tracking-the-cone-of-uncertainty.pdf
Once everyone in the business understands that there is risk in the estimates of value and costs, the next debate is always: whose risk is it? If the devs underestimate, should they work overtime? Of course not, but then who is to blame? I can't work in a toxic culture like that anymore, where anything that goes wrong or doesn't go to plan has to be someone's fault.
The problem is that stakeholders don't understand the risks associated with software development.
Hi Dave, quick question! In agile, you should receive feedback from the customer about what's the next biggest priority in development. I try to follow that rule precisely; I think it's great. But how do I prioritize the kind of work that the user doesn't understand? For example, the user wants to write email campaigns, and cares about the email title, the addresses, the volume, etc. But I, as a programmer, know that we must add safety features, like throttling, DDoS prevention, time limits, etc. I tried explaining those concepts to the customer, but I can see they don't really understand them; it goes right over their heads. So how do I prioritize work on that?
It depends 😉
Mostly, for the things that are essential, don't surface them; implement them as part of whichever story the user cares about that makes most sense. You take responsibility for building good software, including adding features that must be there for the system to be safe and usable. Ask the user when you need help deciding how far to go. If the system is always behind a firewall, it may not need as much security.
Where you need to expose ideas to users to get such a steer, describe the ideas from their perspective. Don't say "DDoS protection" or "throttles"; you could ask what they'd like to happen when the system was under attack, or when they were closed down for spamming people. You may need to explain that if the system was under attack they wouldn't be able to send messages, and if they were deemed to be spamming they'd be black-listed. These are real-world things, not technical esoterica. I think there is ALWAYS a user need hidden beneath any technical story. Find that, and talk to them about that.
Thank you very much for your answer. I think it's really simple, now that you've described it so simply :D
@@ContinuousDelivery So what you're saying is "find a way to translate the technical necessity into a user need", and then try to prioritize from that using client feedback? And there's always a way to translate that?
@@danielwilkowski5899 Yes, but also, some technical things you don't need to ask permission for. An author doesn't ask "and would you like me to spell things correctly too, because that will take more time". So take responsibility for doing what is essential to this being a good system. Take responsibility for writing code that is easy to change in future, don't skimp testing or refactoring because "no one told me to do it", don't skimp on essential security, ask the users when there is a real choice to be made.
The other things are just "the cost of doing business". I have electricians in my house at the moment, they aren't asking me if I want a safe version or an unsafe version 😉
Estimations are wrong only 193.47% of the times
Surely you mean 193.476% of the time 🤣🤣
Amusing, but silly. Yes, estimations are incredibly hard, and they often make no sense. I'm with you up to that point. Beyond that, good luck dragging product owners or managers into a meeting to explain how estimates are silly, as you pull up a little beer-car-house matrix. I predict you're not doing your career path in that company any good. And if you are bidding on contracts, the contract will go to the company that can convince the customer with clear estimates, however nonsensical and inaccurate they are.
I feel like you're mostly striking a balance here between overly complicated estimations and market research. How can a product owner estimate the ratio between least work and most gain if they don't know which epics imply the least amount of work? They still need some T-shirt sizes. What happens to the bold ideas that require a substantial amount of work, versus small ideas with small impacts and quick implementation times? How could you know which one is better without some wrong estimates on which to base your computations?
I love your content dude
Why is your cone of uncertainty backwards?
It isn't backwards. The horizontal axis is time; the vertical axis is the accuracy of your guess. At time zero, when we haven't started yet, the data say we are out by a factor of 4x, so if our prediction says 100 days, it could be 25 days (×1/4) or 400 days (×4). Our predictions are certainly accurate when we finish, so as we approach the finish point, the right-hand edge of the graph, our prediction intersects with the actual duration of the project.
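A minimal sketch of the arithmetic described above, assuming a symmetric multiplicative error factor (the 4x figure comes from the comment, not a universal constant):

```python
def estimate_range(estimate_days, factor=4.0):
    """Bounds implied by a symmetric multiplicative uncertainty factor,
    as in the cone of uncertainty: the true duration is assumed to lie
    between estimate/factor and estimate*factor."""
    return estimate_days / factor, estimate_days * factor

# The 100-day example from the comment above: 25 to 400 days.
low, high = estimate_range(100)
print(low, high)  # 25.0 400.0
```

As the project progresses, the factor shrinks toward 1, which is why the cone narrows to a point at completion.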
Estimations are hard. But what grinds my gears are devs who say "yes, I'll be done tomorrow or this week" and then repeat that for 3-6 months. How can you keep saying that with a straight face?
Why are you saying that regulation-related features do not have value? They actually do, not only for reducing risk, but also for the benefit of consistency; the resulting system actually has better quality in various dimensions.
The best thing is when I am asked to estimate how long fixing this bug will take. I don't know. Somewhere between 5 minutes and a week, I guess.
I came across a company a couple of years ago, who had had 2 people working on a single, nasty bug, for 2 years!
After 17 seconds: no! Estimation and guessing are NOT the same!
Should I continue listening...?
Sounds similar to the three-point estimate derived from PERT charts: en.m.wikipedia.org/wiki/Three-point_estimation
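For reference, the standard PERT three-point formula weights the most-likely estimate four times as heavily as the extremes. A quick sketch (the sample numbers are illustrative, not from the video):

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    """PERT weighted mean E = (O + 4M + P) / 6, plus the conventional
    standard deviation (P - O) / 6."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

# e.g. optimistic 5 days, most likely 10, pessimistic 23:
mean, sd = three_point_estimate(5, 10, 23)
print(round(mean, 2), sd)  # 11.33 3.0
```

The output is still a single number with an error band, so it shares the weakness discussed in the video: it looks rigorous without making the underlying guesses any more accurate.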
One second after you give a range of time estimates, your boss takes the lowest time and demands that you commit to meeting that schedule. After all, it is YOUR estimate. Making an accurate estimate requires knowing all the problems you will have and how long each will take to solve. Like that ever happens. Deadlines do have a way of motivating you to finish up. Doesn't the product owner require timebox commitments from the developers before they start? It would seem obvious to put usage counters on each function (object) so that you can prioritize refactoring and bug fixes, yet nobody ever talks about doing this.
I think teams are capable of working diligently and hard, but without a deadline everything always gets over-engineered.