## On Estimating

*Charlie Tanksley*

> Has anyone ever worked somewhere that was good at setting timelines for shipping software (‘good’ implying they actually hit the timelines)?

*Steve Wan*

> I have Charlie. :)

*Charlie Tanksley*

> Really? Any idea what made it successful? It feels impossible to me.

*Steve Wan*

> I think what worked for us was a combination of things. We worked in 2-week sprints, and estimates included QA time before we could consider a story closed. We tracked story points very closely, and at the start of every sprint the higher-complexity stories were worked first.

*Charlie Tanksley*

> Was it a long project? Did you hit the overall timeline?

*Steve Wan*

> That bought us time if there were things to flesh out, possibly pair on, etc.

*Charlie Tanksley*

> Cool

*Hez Ronningen*

> How long was the estimation good for? 1 month? 3?

*Steve Wan*

> Timelines were probably 2-3 months

> The estimation was part of sprint planning and for the 2 week sprint

*Hez Ronningen*

> no estimation / promises for the 3 month project?

*Steve Wan*

> There were epics for certain items, filled with stories that had rough estimates from the leads, to fill out the project timeline.

> I'd say that not all the features got in, but the PO had to choose what was in / out for that particular release...assuming there were no dependencies on the stories.

*Charlie Tanksley*

> That’s pretty good.

*Steve Wan*

> But our product owner was also very engaged and understood things in the sense that if something was overly complicated to do

*Charlie Tanksley*

> We are trying to get more accurate but it doesn’t seem promising.

*Steve Wan*

> He was willing to sacrifice certain parts

> And have them worked on in the next release

*Steve Wan*

> Also we always seemed to estimate higher

> So in agile points, if the group was split between 3s and 5s, the story would get a 5
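
A tiny sketch of Steve's tie-breaking rule, purely illustrative (the vote values are assumed planning-poker points, not anything from the chat):

```python
def story_points(votes):
    """When planning-poker votes are split (say, some 3s and some 5s),
    take the highest vote on the table rather than averaging down."""
    return max(votes)

story_points([3, 5, 3, 5, 5])  # => 5: the story gets the higher value
```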

> The team was about 7 devs and 2 QA

*Charlie Tanksley*

> That’s helpful. Thanks, @steve!

*Steve Wan*

> No prob. Estimation is hard.

*Brian Shirai*

> @steve was anyone tracking the business impact of hitting those "estimates" and why there was business value to estimating? was anyone tracking "cost of delay" or "time to value"?

> I have a really hard time finding a mechanism in the market that rewards "accurate estimating", but there is definitely value to actually shipping, and cost of delay and time to value are not difficult to measure

*Steve Wan*

> @brixen the value was to our customers / sales / implementation folks. Life sciences software is kind of a niche market, so there was definite value in getting features out. Internally, there was the morale boost when we did ship something on time, and generally a positive impact on our customers

*Brian Shirai*

> there's undeniably value in keeping a promise, but I'm talking about the value of making the promise in the first place

*Steve Wan*

> That's a hard one for me to know, really - what happens in those negotiations. But I think the value there was that we truly wanted to help our customers do their work efficiently? Tracking samples through a lab workflow is probably really mind-numbing, tedious, and error-prone work. We tried to simplify that work.

> That's me being naive and hoping that the world is great, but I do know there were always sales targets per quarter that GenoLogics had to achieve, and I'm sure those promises, "value"-wise to the company, meant getting paid and keeping shareholders happy.

*Brian Shirai*

> well, that's mixing a ton of stuff, so "value" comes to mean nearly "anything good", which doesn't really aid understanding

> one of the things from Alan Cooper's first edition of About Face has stuck with me for 20+ years, "the user's goals are simple: get a reasonable amount of work done, and not look stupid. Of those, not looking stupid is the most important" (summary / approximate quote, I don't have that edition anymore)

> the customer is the most valuable to any business, and solving the customer's problem efficiently and at as low a cost as possible is always a winning approach

> it's possible to deliver a product without making estimates, but it's not really possible to do anything with estimates by themselves. So if they are an enabling function, they should have measurable value, and I'm curious whether anyone ever measures it

> because I have never once seen it measured

> it's valuable in manufacturing because coordination is actually the problem being solved (i.e., when will this material I need get here so I can process it)

> building digital products has nearly no coordination problem to solve because data can always be fabricated
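
A minimal sketch of what "data can always be fabricated" means in practice, with invented names: a dependency that isn't built yet gets replaced by a stub returning canned data, so neither team has to wait on the other.

```python
# Hypothetical: the real inventory service doesn't exist yet, but we can
# fabricate its responses and build our side of the integration today.
class FakeInventoryService:
    def stock_level(self, sku):
        # Canned data standing in for the unbuilt upstream system.
        return {"sku": sku, "available": 42}

def can_fulfill(order, inventory=FakeInventoryService()):
    return inventory.stock_level(order["sku"])["available"] >= order["qty"]

assert can_fulfill({"sku": "ABC-123", "qty": 10})
```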

*Steve Wan*

> I don't know how / if the estimates were tracked. All I remember was that on a specific date, we'd be delivering X number of features. What I do know is that the Director of Dev would sit with the product owner to determine if X features were reasonable to fit into that timeline. I also know that those parties had to compromise - whether that meant the date would need to move or scope gets cut and is targeted for a later date.

*Brian Shirai*

> [on the topic of measuring things](https://codeclimate.com/velocity/), and yes, I think this solidly belongs in #antidaymaker
> primarily due to the very idea that you can measure an arbitrary codebase against some "industry average"

*Matthew Lyon*

> [some thoughts on estimates](https://twitter.com/mattly/status/250278697771356160)
> Matthew Lyon@mattly
> There are really four types of lies: Lies, Damned Lies, Statistics, and Estimates.

> there’s nothing wrong with an individual estimate, per se

> it’s when you start thinking of them in aggregate that you get in hot water

> You can’t just estimate a bunch of tasks with numbers and then _add those together_ and treat the result as if it means something
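
A hedged illustration of the trap Matthew is describing, assuming right-skewed task durations (tasks can blow up to several times their typical length, but can't take negative time): summing each task's "likely" value produces a total the real project exceeds most of the time.

```python
import random

random.seed(1)

def actual_duration():
    # Right-skewed (lognormal) task duration: median ~5 days,
    # with occasional 15-20 day blowouts and no negative tail.
    return random.lognormvariate(1.6, 0.6)

TASKS = 20
summed_estimates = TASKS * 5  # naive total: 20 medians of ~5 days = 100

trials, overruns = 10_000, 0
for _ in range(trials):
    if sum(actual_duration() for _ in range(TASKS)) > summed_estimates:
        overruns += 1

print(f"the project exceeds the summed estimates in "
      f"{100 * overruns / trials:.0f}% of simulations")  # ~85% here
```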

> the problem with estimates is that they’re fractally wrong

> you’re looking from where you are now to where you want to be and see a straight line

> without all the twists and turns along the way. And while you can _plan for_ those twists and turns with some thinking to increase the accuracy of your estimate, I’d argue that _the accuracy of your estimate is logarithmically related to the time you spent making it_
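
Read literally, the claim is that accuracy grows roughly like `a + b * log(t)` in the time `t` spent estimating: doubling the effort buys only a fixed increment of accuracy, i.e. steeply diminishing returns on estimation work.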

> and then many of those twists and turns can only really be discovered in the trenches

*Charlie Tanksley*

> Yea. That all sounds right to me.

*Matthew Lyon*

> so when you take these things that are fundamentally lies and start adding them together

> you’re committing one of the grave sins of statistics

> it’s like adding averages together to get a sum or a new average, which is one of the reasons why I’ve become an advocate of using t-shirt sizes to estimate things

> `[2xs xs s m l xl 2xl]`

> and if you need anything larger than that then you haven’t broken the thing up enough
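
One way to make the non-additivity concrete, as a hypothetical sketch: model the sizes as an ordered enum that deliberately refuses addition, so the "grave sin" fails loudly instead of silently producing a number.

```python
from enum import IntEnum

class Size(IntEnum):
    """Ordered t-shirt sizes: comparable, deliberately not summable."""
    XXS, XS, S, M, L, XL, XXL = range(7)

    def __add__(self, other):
        raise TypeError("t-shirt sizes don't add; break the work up instead")

assert Size.M < Size.XL   # ordering is meaningful
# Size.M + Size.L         # raises TypeError by design
```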

> see also: [TechCrunch On the dark art of software estimation](https://techcrunch.com/2016/04/30/estimate-thrice-develop-once/)
> "How long will it take?" demand managers, clients, and executives. "It takes as long as it takes," retort irritated engineers. They counter: "Give us an estimate!" And the engineers gather their wits, call upon their experience, contemplate the entrails of farm animals, throw darts at a board adorn…

*Charlie Tanksley*

> Hrm. Very interesting.

> So what about for a ‘big’ project? Do you just not estimate it because it is bigger than 2xl?

*Matthew Lyon*

> I’d argue that you have to break it up into smaller, estimatable chunks but: *you cannot add those estimates together*

*Charlie Tanksley*

> so what do you do when you have 20 estimated chunks and the PM or whatever asks ‘but when will the whole thing be done?’

*Matthew Lyon*

> I explain this to them and tell them that what I’m about to say is a lie

*Charlie Tanksley*

> haha okay

> I like that

*Matthew Lyon*

> they’re asking me to answer the equivalent of, “when will the next Cascadia subduction zone earthquake happen?”

> there is also, I believe, an inverse relationship between a focus on estimates and quality

> when you’re rushing to get something done because someone made an estimate and based a deadline on it, you start cutting corners

> you’re aware of some of those corners, but not all of them. If you’re disciplined, you’ll log the ones you’re aware of, and find ways to become aware of the ones you’re not.

> this was the whole point of the style of testing on IQ Analytics

> but when you’re under pressure, a lot of that stuff that isn’t essential to the stated goal of getting something that works to an (often poorly thought-out) specification goes out the window

*Brian Shirai*

> @charlietanksley estimating comes down to this: "how can I make these lies tell the truth?" But most people see it as "how can I be sure these truths don't tell me a lie?", and that's why I push back so hard on what value the estimating itself, as an activity, has for the work being done

> in manufacturing, it has very high value because the properties of matter dictate that material be colocated with the machines processing it

> absolutely inescapable

> working with information does not have the same requirement for coordination, and this is even true when integrating with a system or organization not under your control

> another property of information is that it is _fractal_: you can complete a task with a spectrum of information

> as an example, consider car manufacturing for the past century. 100 years ago, we were making cars, and that's been true at every point in the past 100 years, so the information to make a car has been there the whole time, but the information to build a Tesla is pretty recent

> most of the time, when working with information, the insistence on an estimate presumes that a knowable, fixed amount of information is both necessary and required to "complete" the task according to the "estimate"

*Charlie Tanksley*

> hrm interesting

> I think it’s the first part I’m still trying to get my head around: making the lies tell the truth.

*Brian Shirai*

> the thing that is most ironic about estimating is that it's most often requested _before_ really understanding the problem, but if you push back and say, "I need to study this first", you'll get a "we don't have time for that"

> no time for doing something that leads to understanding, but plenty of time to sit around and lie to each other :thinking_face:

> @charlietanksley if you have historical data that tells you how long a task will take, there is 1. zero need to estimate, and 2. pretty high reliability (because it's actually been done)
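
A minimal sketch of the historical-data point, with invented numbers: given cycle times of tasks that actually finished, a forecast is just a percentile lookup, with no fresh guessing required.

```python
import math

# Cycle times (days) of previously completed tasks -- invented data.
history = [2, 3, 3, 4, 4, 5, 5, 6, 8, 13]

def percentile(data, pct):
    """Nearest-rank percentile of already-observed durations."""
    ordered = sorted(data)
    return ordered[math.ceil(pct / 100 * len(ordered)) - 1]

print(f"50% of past tasks finished within {percentile(history, 50)} days")
print(f"85% of past tasks finished within {percentile(history, 85)} days")
```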

> if you start with the presumption that your estimate is not a lie, you are sunk from the start, unless any sort of rigor is considered optional

> Here's a simple heuristic:
>
> 1. Am I working with matter? If yes, known processes can provide reasonably good estimates. Counterpoints: Tesla's manufacturing of the Model 3, and every time Amazon's delivery estimate is wrong. The latter is especially worth understanding, because even when Amazon gets you your package sooner than estimated, _their estimate is wrong_!
>
> 2. Am I working with information? If yes, don't estimate; build something using the least amount of information needed to complete the task. Instead of wasting time estimating, take the time to try to understand the problem, and continually ask this one question: "what do I not know?"