Tools for comparing schedules have not evolved like other schedule analytics tools and remain primitive: they simply show the differences between two schedules. Analysts are still left with the difficult and expensive task of performing a forensic schedule analysis to determine why a key milestone slipped. Steelray has developed a forensic schedule analysis software tool based on a methodology from AACE Recommended Practice 29R-03, MIP 3.4. The tool delivers in seconds answers that would otherwise take hours or days to discover.
The Problem
In 2018, I attended the PM College of Scheduling conference in Vancouver, British Columbia. I had the opportunity to speak with several schedule analysts and asked them all the same two questions: What is still difficult for you? Where is software still falling short for you? And while I received various answers, a common pattern emerged: schedule comparison tools were still in the stone age.
It's useful to understand why these analysts use schedule comparison tools. Often, they compare schedule updates to answer a simple question:
Compared to the previous schedule update, why did the finish date move?
And while existing tools provide them with an exhaustive list of differences between updates, they are still left with the lion’s share of the work (typically several days) necessary to answer the question.
The tool most often referenced was Claim Digger, the schedule comparison tool included in the Oracle Primavera P6 Visualizer. Claim Digger compares two updates of a schedule and outputs a series of tables listing the differences it found.
Claim Digger is very good at what it does (comparison), but it doesn’t get you very far toward answering the key question of why the finish date moved. The bulk of the analysis is still left to the analyst to figure out. Other tools perform a similar comparison, but none attempt to answer the most critical question.
So how do analysts arrive at the correct answer? The most common technique is to perform a forensic schedule analysis. We start with the previous update, apply each change, and observe the impact on the finish date. When a change impacts the finish date, we record that change and its effect. When all the changes have been applied, we end up with the current update and a table with the answers: which changes impacted the finish date, and by how much.
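In code, this replay technique can be reduced to a simple loop. The sketch below is purely illustrative: the `Change`, `replay_changes`, and `finish_date_of` names are hypothetical, and a real tool would run a full CPM recalculation after each change rather than a simple function call.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Change:
    """One difference between two schedule updates (illustrative only)."""
    description: str
    apply: Callable[[Any], None]  # mutates the working schedule in place

def replay_changes(schedule, changes, finish_date_of):
    """Apply each change in turn and record its impact on the finish date.

    `schedule` starts as the previous update; after every change has been
    applied, it should equal the current update.  `finish_date_of` stands
    in for a full CPM recalculation of the network.
    """
    impacts = []
    finish = finish_date_of(schedule)
    for change in changes:
        change.apply(schedule)
        new_finish = finish_date_of(schedule)
        delta = new_finish - finish
        if delta:  # record only changes that moved the finish date
            impacts.append((change.description, delta))
        finish = new_finish
    return impacts
```

The output is exactly the table the analyst is after: each recorded change paired with the extent of its impact.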
Schedule delay analyses are difficult. Besides the precise work of applying each change to the schedule, the analyst must deal with concurrent changes, calendar changes, relationship changes, and more. These analyses are also time-consuming and thus expensive. So, we began by asking a simple question:
Why, in 2018, is forensic schedule analysis not being performed by a computer?
We made the decision to attempt to solve this problem in software. Our initial goal was to develop a better Claim Digger: a tool that not only compares schedule updates but also performs an objective delay analysis to answer precisely why the finish date moved. Over the years, we realized we were building something much more ambitious.
Our development was a true research & development project. We didn’t know whether it was a solvable problem, how long it would take to solve, or the issues we would encounter along the way.
We had to tackle some sticky problems along the way.
Four years later (2022), we released the technology as a product called Steelray Delay Analyzer (SDA). SDA is used daily by schedule analysts and other project controls professionals.
Our Approach
We began by learning everything we could about forensic schedule analysis and deciding which method made the most sense. The primary guiding resource we used was AACE® International Recommended Practice No. 29R-03, Forensic Schedule Analysis, and specific to that paper, the method implementation protocol (MIP) 3.4: Observational / Dynamic / Contemporaneous Split.
Using that method, we start with the previous update and apply progress-related changes to create a “half-step” schedule, observing the impact on the finish date. We then apply the schedule revisions separately and record the effect of those revisions. To properly handle concurrent changes, we take advantage of the fact that a computer is doing the calculations and apply the changes fractionally in “windows” as small as one minute.
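A minimal sketch of this half-step split might look like the following. All names here are hypothetical, the fractional windowing that handles concurrent changes is omitted for brevity, and the schedule is mutated in place where a real implementation would work on copies.

```python
def analyze_period(previous, progress_changes, revision_changes,
                   finish_date_of, apply_change):
    """Split one update period per MIP 3.4 (simplified sketch).

    Pass 1 applies only progress (actuals) to build the "half-step"
    schedule; pass 2 layers the non-progress schedule revisions on top.
    Finish-date movement is attributed to each pass separately.
    """
    base_finish = finish_date_of(previous)

    # Pass 1: progress only -> the half-step schedule
    for change in progress_changes:
        apply_change(previous, change)
    half_step_finish = finish_date_of(previous)

    # Pass 2: schedule revisions applied on top of the half-step schedule
    for change in revision_changes:
        apply_change(previous, change)
    current_finish = finish_date_of(previous)

    return {
        "progress_impact": half_step_finish - base_finish,
        "revision_impact": current_finish - half_step_finish,
    }
```

Splitting the two passes this way is what lets the analysis report separately how much of the finish-date movement came from progress versus from revisions.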
We contacted several of the paper’s authors and involved them early in the process. We followed an agile software development methodology, creating prototypes that we could show them. Their feedback informed many of our design and functionality decisions. Their encouragement sustained us through a long and challenging development process, and we are indebted to them.
Design Philosophy
We adhered to several design principles while developing the product.
Real-time
First, fast performance was paramount. Any analysis is an attempt to answer one or more questions, and the best analysis tools are low friction: the tool gets out of the way and lets you explore and analyze the data as close to the speed of thought as possible. If you must wait several minutes for the software to finish its computations, your train of thought is interrupted. With schedule analysis, you are essentially solving mysteries about what happened, and we wanted that process to be as frictionless as possible.
Objectivity
Some of the people we worked with related disputes in which each side hired a schedule analyst to serve as an expert witness. In some cases, each expert analyzed the same set of schedule updates and arrived in court with a very different story. We resolved that applying the tool to the same schedule updates must always produce the same results; we treated delay analysis as a math problem. We made several design decisions to prevent anyone from manipulating the analysis to tell a different story.
Transparency
It was also an important design consideration that the tool “show its work.” Because the software and analysis may receive great scrutiny, we didn’t want the software to be a black box that outputs answers without showing how it arrived at those answers.
The Solution
The resulting product, Steelray Delay Analyzer, is the culmination of our efforts. It performs a delay analysis on Primavera schedules exported as XER files.
Early prototypes produced numeric tables with the results, and we focused our efforts on proving the accuracy and validity of the results. As we showed the technology to early access users, we received feedback that the delay analysis table itself would not be enough. The analyst would need to “see” the results for themselves; the tool would have to provide visualizations that helped explain the results. And with the delay analysis engine, computationally impractical visualizations became attainable.
The above chart shows a baseline schedule and 29 updates, resulting in 29 comparison periods and 29 delay analyses. Each update contains approximately 1,000 activities, and the entire analysis completed in under five seconds.
The 3D Gantt
Gantt charts use the Y-axis to display the WBS and the X-axis to show the timeline. The data date (or status date) appears as a vertical line in the chart: the single point in time when the schedule was last updated. The delay analysis data delivers a powerful capability: you can now reconstruct what the schedule looked like at any point throughout the update period. With this capability, we realized we could do something unique with the tool. We asked: What if you could drag that data date line forward and back in time and see the Gantt update in real-time? Could we perform the calculations fast enough to render the Gantt chart? And by adding “point in time” as a third dimension, what could you do with a 3D Gantt?
We achieved this capability, and it continues to surprise us how useful this feature is for understanding how the changes in an update period impacted the schedule.
The data date, shown above as the gold vertical line in the Gantt, can be dragged left and right, and the Gantt (values on the left and diagram on the right) will update in real-time. The critical path (black) and delays (red) embedded in the WBS bars will also update in real-time as you move through time.
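One way to support a draggable data date is to replay timestamped changes up to any chosen moment. The sketch below illustrates that idea with hypothetical names; it is not Steelray's actual implementation, which must also recalculate the critical path fast enough for real-time rendering.

```python
import bisect

def schedule_at(base, timestamped_changes, t, apply_change, copy_schedule):
    """Reconstruct the schedule as of moment `t` within an update period.

    `timestamped_changes` is a list of (timestamp, change) pairs sorted by
    timestamp.  Changes at or before `t` are replayed onto a copy of the
    previous update, which is what allows a dragged data-date line to
    re-render the Gantt chart at any point in time.
    """
    snapshot = copy_schedule(base)
    timestamps = [ts for ts, _ in timestamped_changes]
    cutoff = bisect.bisect_right(timestamps, t)  # changes with ts <= t
    for _, change in timestamped_changes[:cutoff]:
        apply_change(snapshot, change)
    return snapshot
```

Because the base schedule is copied rather than mutated, dragging the line back and forth simply reconstructs a fresh snapshot for each position.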
The Future
We’ve only begun exploring what we can do with this technology. We continue to tinker with the visualization and other capabilities of the tool. We believe that this tool will have applications beyond forensic schedule analysis. Before tools like Steelray Delay Analyzer, the analysis was so difficult and time-consuming that it didn’t gain traction among most project controls professionals. We know of very few who were willing to invest a few days to learn the answer to why the finish date moved. With SDA, the answers are reached within seconds, and we are hopeful that this practice will become commonplace with the time and expense barrier removed.
For More Information
If you’re interested in the technology, please contact Steelray Software at sales@steelray.com or call us at +1 (404) 806-0160.