
The art of the possible

Imagine a world where we could predict future project performance, head off risks before they escalate and improve the certainty of forecasts. Is this a holy grail that remains out of reach or a realisable future?

Most project management organisations have a reporting or dashboarding capability. This generally provides an insight into project performance based on historical data; it is a rear-view mirror. Advancements in Tableau, Power BI and other tools have helped transform the range of insights that can be acquired from this data, with monthly releases and an increasing array of custom visuals. But, for most, these tools only provide a descriptive analytics capability – describing what happened.

Predictive analytics uses statistical models and forecasting techniques to provide a probabilistic assessment of what may happen in the future. Prescriptive analytics takes it a step further by using optimisation, machine learning and simulation to inform the action that should be taken to create a positive impact on project delivery. Is this a pipe dream?

Not if we integrate:

  • predictive project analytics;
  • maturity models;
  • assurance toolkits; and
  • common causes of failure.
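
Before going further, it helps to make the distinction between predictive and prescriptive concrete. Below is a minimal sketch of the predictive side: a Monte Carlo simulation over three-point task estimates that yields a probabilistic completion date rather than a single-point forecast. The tasks and figures are illustrative assumptions, not real project data.

```python
import random

# Illustrative three-point estimates (optimistic, most likely, pessimistic) in days.
# The tasks and figures are assumptions for this sketch, not real project data.
tasks = {
    "groundworks": (10, 14, 25),
    "steel frame": (20, 28, 45),
    "fit-out": (30, 40, 70),
}

def simulate_duration(estimates):
    """Sample one possible total duration, drawing each task from a triangular distribution."""
    return sum(random.triangular(low, high, mode) for (low, mode, high) in estimates.values())

# Predictive analytics in miniature: run many scenarios and report the spread, not a single number.
runs = sorted(simulate_duration(tasks) for _ in range(10_000))
p50, p80 = runs[len(runs) // 2], runs[int(len(runs) * 0.8)]
print(f"P50 completion: {p50:.0f} days, P80 completion: {p80:.0f} days")
```

The prescriptive step would then search across sequencing or resourcing options to improve that distribution, rather than simply report it.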

We need to separate out the predisposition of a project for failure from the early warning signs. They are closely related, but different. We know that the Caribbean is predisposed to tropical storms in late summer. The evidence tells us this. But we don’t know what the impact of the storm will be until we study the sea temperature, which tends to be a lead indicator for the force of the storm. The meteorological model provides us with real-time early warnings of the storm, how it builds and its trajectory. This analogy provides us with insights into how we could integrate the data and methods that span maturity models through to common causes of failure. We can also supplement this with real-time performance data on resource against plan, sentiment analysis and other leading indicators. By connecting these insights, we have the potential to generate accurate predictions on future performance.

Start-ups like nPlan and ALICE are bringing machine learning into the scheduling domain. They aggregate data from thousands of schedules and use it to develop an optimised schedule for a new construction project. They also have the potential to identify where schedule variance is most likely, enabling project managers to intervene and prioritise effort accordingly; they are making prescriptive analytics a reality. This takes the concept of reference-class forecasting to a new level.
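
A crude illustration of the reference-class idea, assuming we hold the schedule overrun ratios of comparable completed projects (the figures below are invented), is to position a new estimate against that empirical distribution rather than against our own optimism:

```python
# Reference-class forecasting in miniature: uplift a new estimate using the
# empirical distribution of overruns observed on comparable past projects.
# The overrun ratios below are invented for illustration.
historical_overruns = [1.05, 1.10, 1.12, 1.20, 1.25, 1.30, 1.45, 1.60, 1.75, 2.10]

def reference_class_forecast(base_estimate_days, overruns, percentile=0.8):
    """Uplift a base estimate to the chosen percentile of observed overrun ratios."""
    ranked = sorted(overruns)
    uplift = ranked[min(int(len(ranked) * percentile), len(ranked) - 1)]
    return base_estimate_days * uplift

print(reference_class_forecast(200, historical_overruns))  # P80-style forecast: 200 x 1.75 = 350 days
```

The machine learning approaches go much further, working at activity level across thousands of schedules, but the principle of anchoring forecasts in measured outcomes is the same.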

By allocating tasks to individuals via apps, teams can update the schedule in real time. Construction companies are also exploring the use of drones to video progress and compare it against building information modelling (BIM) models, providing real-time feedback on delivered performance. By aggregating the measured performance from delivered projects, it is possible to link the schedule to the weather forecast and automatically reschedule to take account of weather events in advance. This is scenario analysis on the fly. The life of a scheduler is about to change significantly.
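
As a sketch of that scenario analysis, assuming each activity carries a weather-sensitivity flag and a daily forecast is available from an external feed (both assumptions for illustration), an automatic reschedule can be as simple as slipping exposed work past forecast bad-weather days:

```python
from datetime import date, timedelta

# Hypothetical inputs: a forecast feed and a flag marking weather-sensitive activities.
bad_weather_days = {date(2024, 3, 4), date(2024, 3, 5)}          # e.g. high winds forecast
schedule = [
    {"task": "pour slab",    "start": date(2024, 3, 4), "weather_sensitive": True},
    {"task": "internal M&E", "start": date(2024, 3, 4), "weather_sensitive": False},
]

def reschedule(activities, bad_days):
    """Slip weather-sensitive activities forward until they clear forecast bad-weather days."""
    for activity in activities:
        while activity["weather_sensitive"] and activity["start"] in bad_days:
            activity["start"] += timedelta(days=1)
    return activities

for activity in reschedule(schedule, bad_weather_days):
    print(activity["task"], activity["start"])   # 'pour slab' slips to 6 March; 'internal M&E' stays put
```

A production scheduler would, of course, respect logic links, resource constraints and working calendars; the point is that the decision can be triggered by data rather than by a weekly progress meeting.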

If we can apply this to scheduling, why not apply it to risk management (identifying typical risks, their probability, likely impact and what triggers them), resource management (areas of pressure, bias in resource estimates, knowledge/insights on demand) and benefits management (which benefits are difficult to realise, benefits that tend to be oversold, effective management action)? The opportunities are tremendous.
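
Taking risk management as the example, and assuming access to a pooled historical risk register (the records below are invented), even a simple aggregation begins to surface which risks typically materialise and what they tend to cost:

```python
from collections import defaultdict

# Invented records from a pooled risk register: (risk category, did it occur?, cost impact in £k).
history = [
    ("ground conditions", True, 250), ("ground conditions", False, 0),
    ("ground conditions", True, 400), ("late design change", True, 120),
    ("late design change", False, 0), ("late design change", False, 0),
]

stats = defaultdict(lambda: {"n": 0, "hits": 0, "impact": 0})
for category, occurred, impact in history:
    stats[category]["n"] += 1
    stats[category]["hits"] += int(occurred)
    stats[category]["impact"] += impact

for category, s in stats.items():
    probability = s["hits"] / s["n"]
    mean_impact = s["impact"] / s["hits"] if s["hits"] else 0
    print(f"{category}: probability {probability:.0%}, mean impact £{mean_impact:.0f}k when it occurs")
```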

Our profession arguably remains sceptical. We might think: ‘Projects have a number of moving parts and every situation is unique.’ And yet, we know from experience that some variables have a greater impact on outcomes than others. Predictive analytics is able to decode the interrelationships between these variables, prioritise and extract leading indicators, and correlate them with known variance. Such tools will never be ‘right’, but they provide an evidence-based analysis of future project trajectory: a data-driven second opinion.
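
A minimal sketch of that decoding step, assuming a table of candidate leading indicators alongside the observed schedule variance of completed projects (the numbers are invented), is to rank the indicators by how strongly they track the outcome:

```python
from statistics import correlation  # available from Python 3.10

# Invented observations per completed project: candidate leading indicators and the outcome.
projects = [
    {"team_churn": 0.05, "late_approvals": 2, "schedule_variance": 0.04},
    {"team_churn": 0.20, "late_approvals": 9, "schedule_variance": 0.30},
    {"team_churn": 0.12, "late_approvals": 4, "schedule_variance": 0.15},
    {"team_churn": 0.30, "late_approvals": 1, "schedule_variance": 0.28},
]

outcome = [p["schedule_variance"] for p in projects]
for indicator in ("team_churn", "late_approvals"):
    values = [p[indicator] for p in projects]
    # Pearson correlation as a first-pass ranking of which indicators matter.
    print(f"{indicator}: r = {correlation(values, outcome):+.2f}")
```

In practice we would reach for regression or richer models over far more features, but the principle is the same: let measured variance, not intuition alone, rank the indicators.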

We need academia, industry (large and small) and professional bodies to work together to demonstrate the art of the possible and help deliver a step change in productivity. Government has established the vision for a future of artificial intelligence (AI): it is our collective gift to deliver. The important areas will be:

  • Training: A key challenge for our community is to understand when we can trust the data, and the implications of ignoring it. Gut instinct is a useful moderator, but it can often be influenced by bias. There will be an increasing need for project managers to temper their system-one thinking against the evidence from system two. If they ignore the analysis and the situation worsens, how culpable is the project manager for their decisions? The implications for our profession are significant. Project managers will need to become adept at understanding data, model optimisation and using data to shape key investment decisions.
  • Data improvement: The concept of predictive analytics is a fascinating one, but it is unlikely to be delivered overnight. We need to iterate through the use cases and assess whether our data is good enough to support the requisite analysis (a first pass of the kind sketched after this list). What are the implications of improving the data, and can it be automated? What is the cost/benefit?
  • Data trusts: Predictive analytics is driven by data – lots of data. The more data we have, the better the performance of the models. But the challenge for start-ups is getting access to this data. It can be incredibly time-consuming. Those holding the data must grapple with the General Data Protection Regulation and data security considerations before they contemplate release. The concept of data trusts will help to ease this burden, but progress can happen at a glacial pace. I recently appealed to the Information Commissioner’s Office to gain access to data from the Cabinet Office. The exemptions it claimed were overruled on the basis of public interest, and the appeal was successful. This aligns with the government’s open-data policy and opens up the potential for a raft of similar freedom of information requests. These are journeys that are extremely inefficient and frustrating for everyone involved. We need to find new methods that also capture data from private-sector organisations. But we must make data trusts a reality for the benefit of the community, our profession and society.
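
On the data improvement point, a first automated pass need not be sophisticated. As a sketch, assuming schedule exports with a handful of fields we care about (both the records and the required fields below are illustrative), we can simply report how complete each field is before any modelling is attempted:

```python
# Assess whether the data can support the analysis: field-level completeness of schedule exports.
# The records and required fields are illustrative assumptions.
required_fields = ("task_id", "baseline_finish", "actual_finish", "owner")
records = [
    {"task_id": "A1", "baseline_finish": "2024-05-01", "actual_finish": "2024-05-09", "owner": "JS"},
    {"task_id": "A2", "baseline_finish": "2024-06-15", "actual_finish": None, "owner": None},
]

for field in required_fields:
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    print(f"{field}: {populated / len(records):.0%} populated")
```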

AI for project management needs focus. It’s not something we can deliver solely from within our profession. The shape of project management is rapidly changing. Predictive analytics is already here; use cases will proliferate. As project managers, we need to understand the art of the possible and how it could shape our approach. Models need data to reach a level of accuracy where we trust the results. We need to work collegiately to develop these data sets for the benefit of our profession, and data trusts will be key. We need to become increasingly proficient in understanding our use cases, the data required to satisfy them, the performance of algorithms and how we can turn analysis into actionable insights.

The future is here. How will you engage with it?

Martin Paver is director and managing consultant at Projecting Success
