Forecasting Error: Not Accounting For Scope Increase

Posted in Featured, Reference |

Initial estimates of the amount of work for a project or idea lack detail. Attempting to forecast using historical rates will be in error if:

1. The granularity of work breakdown differs from historical samples.

2. The required features aren’t completely fleshed out.

3. The project ignores system and operational issues: new environments, changes to current environments, and security or performance audits and requirements.

If every project needed to be completely understood in every detail, forecasting would take too long. The failure rate of waterfall projects showed that even attempting to completely design and understand a project up front doesn’t improve the likelihood of delivery.

Tracking the increase in scope, and its causes, across previous projects allows ideas to be forecast with some certainty about the likely scope increase. The recommended technique is to keep clear records of the total scope of each project, categorized by work-item type. Some categories we recommend are:

1. Split (straight split of known work)

2. Discovered Scope (scope found only after delving into the detail)

3. New Requirement (added features, nothing to do with the original idea)

4. Adopted Work (work the team took on that isn’t actually part of the project)

By tagging each backlog item with these categories, a growth rate from the original amount of work can be computed. This adjustment can then be applied to new ideas when forecasting quantitatively. These metrics are also good to put targets around. None of these categories is bad by default; it’s just good to know where the scope increases are coming from and to manage and consider them when forecasting proposals.
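As a minimal sketch of this calculation, assuming completed backlog items are exported as a list of tags per item (the tag names, counts, and the new idea’s estimate below are illustrative placeholders):

```python
from collections import Counter

# Tags for each completed backlog item of a finished project. "original"
# marks items known at the idea stage; the rest use the categories above.
# All values here are illustrative placeholders.
items = [
    "original", "original", "original", "original", "original",
    "split", "split", "discovered_scope", "discovered_scope",
    "new_requirement", "adopted_work",
]

counts = Counter(items)
original = counts["original"]
added = sum(n for tag, n in counts.items() if tag != "original")

# Growth rate: total delivered scope relative to originally known scope.
growth_rate = (original + added) / original  # 11 / 5 = 2.2x here

# Apply the historical growth rate to a new idea's raw estimate.
new_idea_items = 40
adjusted_forecast = new_idea_items * growth_rate
print(f"growth {growth_rate:.1f}x, adjusted forecast {adjusted_forecast:.0f} items")
```

In practice, growth rates per category and per project give a range rather than a single number, and that range can feed a probabilistic forecast rather than a single multiplier.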

 


Free Tools and Resources

Posted in Featured, Reference, Tools |

We often build custom tools and spreadsheets during our consulting work. We offer these to the community for free under a Creative Commons Attribution-NonCommercial-ShareAlike license. Please help us keep these resources free and updated by abiding by the conditions of this license.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

These tools are kept under version control on GitHub here –

https://github.com/FocusedObjective/FocusedObjective.Resources

 


Agile Forecasting Error: Assuming Uniform Throughput or Velocity

Posted in Featured, Forecasting, Reference |

When planning, whether on the back of a napkin or using more statistical methods, some estimate of “how long” is often needed. Precision needs differ; for example, it might be good enough to know the calendar quarter to confirm that a new website will be in place well in advance of a promotional period. Other people might want more clarity because they are making trade-offs between multiple options. We often get asked to confirm or build forecasting models for these organizations, and we see some common errors that cause erroneous forecasts.

This post and subsequent posts will outline some of the common ones we see.

Assuming uniform throughput, and that team performance is the only factor for throughput

Whether forecasting using velocity or story count over time, the amount of work being completed is a measure of throughput, or in plain terms, a rate of completed work over time. We often see organizations treat this rate as within the team’s control and use it as a measure of progress and performance. Sometimes it is, but if we plot the throughput history, notable areas of instability and step-changes are evident. Knowing the sources of these, and how to adjust throughput forecasts for them in advance, is key to improving any method of delivery estimation.

Figure 1 – Throughput run charts like this show discontinuities caused by factors other than team completion rate; these need to be considered when forecasting.
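The data behind a run chart like Figure 1 can be derived directly from completion dates. A minimal sketch, assuming a list of completion dates is available (the dates are placeholders):

```python
from collections import Counter
from datetime import date

# Completion dates of finished work items; illustrative placeholder data.
completions = [
    date(2014, 6, 2), date(2014, 6, 4), date(2014, 6, 5),
    date(2014, 6, 11), date(2014, 6, 12), date(2014, 6, 25),
]

# Weekly throughput: count of items completed per ISO week.
per_week = Counter(d.isocalendar()[1] for d in completions)

for week in sorted(per_week):
    print(f"week {week}: {per_week[week]} items completed")
    # Note: weeks with zero completions are absent from the Counter and
    # would need filling in before plotting a run chart.
```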

Common causes we see and adjust for –

1. Team forming stages and other phases – Teams often take on new technologies and team members at the beginning of a new piece of work. It should be expected that the early storming phase for teams, and new investigative development, will be slower than when a team is long-running and stable. We start with an adjustment of minus 50% for the first 20% of a project.

2. Calendar events – Depending on region and country, there are often whole calendar periods where throughput drops dramatically and recovers slowly. Depending on the granularity of our forecasts, we consider known holidays where no one will work, long weekends where people take vacation to extend a single day’s holiday into a week or more, and the biggest factor of all, December. If an organization has a “use it or lose it” vacation policy, a lot of technical staff end up using that vacation in December, and some combine it with new vacation and extend into January. We see roughly four weeks of almost no progress in some organizations. The impact cascades if teams have tight dependencies. Forecasting over these periods is challenging.

3. Organizational changes – Employee concern and stress during leadership changes and re-orgs are another step-function factor in throughput impact. Even rumors can be seen in throughput run charts. Expect a 20% decrease, recovering over one to two months depending on how well the change is accepted and communicated. For large companies, we assume there will be at least one of these in every six-month period.

4. Changes in the way work is sub-divided or described – This is an obvious one, but often overlooked. New processes, constraints, or motivations will change the way work is sub-divided. Throughput or velocity captured at one granularity is not going to forecast work at a different granularity. We often adjust for this by taking a sample of prior work and having teams break it down using the new process to find a multiplying factor. Performing this regularly on samples of work from each quarter going back 12 months helps normalize a throughput run chart back to a “real rate of progress.” This re-plots historical throughput at a granularity similar to that used today, with the aim of isolating team process improvements rather than work-size anomalies.

These are just a start of the factors that influence throughput in ways that make it a poorer predictor of the future than it could be. They apply no matter what unit of completion rate is used, be it velocity or story count.
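As a sketch of how adjustments like these might enter a forecast, the following Monte Carlo simulation resamples historical weekly throughput and applies the minus 50% team-forming adjustment to roughly the first 20% of the simulated project. The history, backlog size, and percentile are assumptions for illustration, not a definitive model:

```python
import random

history = [6, 8, 5, 9, 7, 4, 8, 6, 7, 5]  # completed stories per week (illustrative)
backlog = 120                             # remaining stories, after scope adjustments
trials = 10_000

avg = sum(history) / len(history)
forming_weeks = 0.2 * (backlog / avg)  # approximate first 20% of the project

weeks_needed = []
for _ in range(trials):
    remaining, week = backlog, 0
    while remaining > 0:
        week += 1
        rate = random.choice(history)  # bootstrap a week's rate from history
        if week <= forming_weeks:
            rate *= 0.5                # team-forming adjustment (minus 50%)
        remaining -= rate
    weeks_needed.append(week)

weeks_needed.sort()
print(f"85th percentile completion: {weeks_needed[int(0.85 * trials)]} weeks")
```

Calendar events and re-org impacts could enter the same loop as week-specific multipliers applied to the sampled rate.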

 


Paper: The Economic Impact of Software Development Process Choice – Cycle-time Analysis and Monte Carlo Simulation Results

Posted in Featured, Forecasting, Reference |

Troy Magennis has recently written a paper on the economic impact of software development process choice, using a variety of cycle-time analysis and Monte Carlo techniques. We would like feedback on this paper, especially contrary views.

Download the paper here: The Economic Impact of Software Development Process Choice – Cycle-time Analysis and Monte Carlo Simulation Results

Abstract:

IT executives initiate software development process methodology change with faith that it will lower development cost, decrease time-to-market and increase quality. Anecdotes and success stories from agile practitioners and vendors provide evidence that other companies have succeeded following a newly chosen doctrine. Quantitative evidence is scarcer than these stories, and when available, often unverifiable.

 

This paper introduces a quantitative approach to assess software process methodology change. It proposes working from the perspective of impact on cycle-time performance (the time from the start of individual pieces of work until their completion), before and after a process change.

 

This paper introduces the history and theoretical basis of this analysis, and then presents a commercial case study. The case study demonstrates how the economic value of a process change initiative was quantified to understand success and payoff.

 

Cycle-time is a convenient metric for comparing proposed and ongoing process improvement due to its easy capture and applicability to all processes. Poor cycle-time analysis can lead to teams being held to erroneous service level expectations. Properly comparing the impact of proposed process change scenarios, modeled using historical or estimated cycle-time performance, helps isolate the bottom-line impact of process changes with quantitative rigor.

This paper will be presented at the HICSS Conference in January 2015.
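For readers who want a feel for the kind of comparison the abstract describes, here is a minimal sketch comparing cycle-time percentiles before and after a process change. The samples and the choice of the 85th percentile are placeholders, not data from the paper:

```python
# Cycle times in days for items completed before and after a process
# change; values are illustrative placeholders, not the paper's data.
before = [12, 18, 9, 25, 14, 30, 11, 16, 22, 13]
after = [8, 11, 7, 15, 9, 19, 10, 12, 14, 9]

def percentile(sample, p):
    """Nearest-rank percentile of a sample."""
    s = sorted(sample)
    return s[min(len(s) - 1, int(p * len(s)))]

for label, sample in (("before", before), ("after", after)):
    print(f"{label}: median {percentile(sample, 0.5)}d, "
          f"85th percentile {percentile(sample, 0.85)}d")
```

Setting a service level expectation from the wrong summary, for example a pre-change mean, is exactly the kind of erroneous expectation the abstract warns about.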

 

 


Seminar: Dial “M” for Metrics: Agile Metrics and Data Science Full Day

Posted in Announcements, Events |

We are proud to work with great partners. In this power-day of Agile metrics and analytics, Focused Objective’s Troy Magennis has teamed up with Larry Maccherone, formerly of Rally Software, where he led the team responsible for their metrics and analytics features.

This first seminar will be held in Seattle on 14th October 2014. It is very competitively priced; you would rarely get two industry leaders for one day at this price. Please book early, because we are limiting the number of attendees to ensure quality interactions.

 

Dial “M” for Metrics is a one day seminar that covers the entire lifecycle of an Agile metrics process, from metric selection through to reporting data in compelling ways. After this seminar, participants will be well versed in all aspects of Agile metrics and basic data science for use in their own IT organizations.

The day is structured as multiple lecture-style presentations from industry experts on all metric topics, from selection, capture, and analysis to forecasting and reporting. The presenters are all experienced in the practical application of metrics for IT organizations, from team to portfolio and organization perspectives.

This seminar is suited to you if you –

  • Struggle to know what metrics are useful and which ones are misleading
  • Suffer from poor data quality caused by poor capture and gamed values
  • Find that analyzing and presenting data to get action is harder than it ought to be

Target Audience –

  • Executives or managers wanting a better understanding of Agile metrics and analysis
  • People responsible for Agile project planning and reporting
  • People interested in expanding their knowledge of Agile metrics and analytics who are using Scrum, Lean, Kanban or ScrumBan as their IT process

Topic areas –

  • Metric Selection – picking the right metrics that add value and avoid gaming or poor performance
  • Metric Capture – identifying errors and cleaning noisy or gamed data
  • Metric Analysis – forecasting and interpreting variation in data that is significant
  • Metric Presentation – presenting data to get action and avoiding common presentation mistakes
  • Making Analytic Decisions / Q & A – wrap-up combining all the practices into a useful management program
 
Coffee provided (it is Seattle). Lunch will be on your own at one of many downtown locations. We suggest joining your colleagues for networking time or joining the speakers for vigorous discussions. This is not a sales conference. This is a training course with lots of time for specific questions. The speakers will be available during the breaks, and specifically at lunchtime, for one-on-one advice.
 
Detailed Agenda
Subject to change on the day based on audience feedback!

90 min – Metric Selection (Larry Maccherone)

  • Seven Deadly Sins of Metrics
  • Metric Selection Strategies: ODIM/MIDO
  • Big Data Science versus Tactical Metrics
  • Value of Different Metrics vs Cost of Acquisition

60 min – Metric Capture (Troy Magennis)

  • Capturing Data Correctly
  • Cleaning Data from Errors and Bias
  • How Much Data is Needed
  • Capturing Data Context

90 min – Metric Analysis (Troy Magennis)

  • Analyzing Data Integrity (error checking)
  • Analyzing Variability
  • Forecasting and Prediction (Regression, Monte Carlo)
  • Limits of Certainty
  • Sensitivity Analysis

60 min – Metric Presentation (Larry Maccherone)

  • Presenting accurate and compelling data
  • Lying with Statistics and knowing others have
  • Comparing Data Across Teams Safely
  • Presenting uncertainty in results

90 min – Making Analytic Decisions / Q & A (Larry and Troy)

  • Starting Organization Wide Metric Programs
  • Quantifying the Impact and ROI of Metric Programs
  • Industry Progress – What tools do things well
  • Where does Big Data fit with Agile

 

Forecasting Hands-on Tutorial (optional, runs Wednesday)
Agile Forecasting, by Troy Magennis. This will be a hands-on day building probabilistic forecasting models for Agile (Scrum, Kanban, ScrumBan) teams and projects. Bring a PC, or a Mac running Windows in an emulator, if you want to follow along during the exercises. By the end of the day, attendees will be well versed in building probabilistic models and forecasting their own IT projects.
About the Speakers
Larry Maccherone

Larry is an industry-recognized Agile thought leader. He most recently served as Rally Software’s Director of Analytics and Research, where he led a team using big data techniques to draw interesting insights, produce software development performance metrics, and build products that allow Rally customers to make better decisions with data.

Before coming to Rally Software, Larry worked at Carnegie Mellon with the Software Engineering Institute for seven years, conducting research on software engineering metrics with a particular focus on reintroducing quantitative insight into the agile world.

Larry is an accomplished author and speaker, presenting at major conferences for the lean and agile markets over the last several years, including the most highly rated talk at Agile 2013.

Agile 2014 Interview

Troy Magennis – Focused Objective LLC

Troy has been involved with technology companies since 1994, fulfilling roles from QA through to CTO over the last 20 years. Troy speaks at many Agile conferences and has played an Agile training and mentoring role for executives in small and large organizations (Walmart, Microsoft, Sabre, Siemens being previous clients).

Troy currently consults for and trains organizations wanting to improve decision making on software portfolio and project questions through Agile and Lean thinking and tools, applying Scrum and Lean techniques appropriately, where they will deliver the biggest benefit, with quantitative rigor.

Agile 2014 Interview

Agile 2014 slides

Interview “Projects at Work”

 
