
How Do You Know Your Project is on Track? – The Role of Data, James Lea, 24 May 2018


At tonight’s event, held at BAWA, James Lea looked at how to monitor and control projects and the importance of using the right data. This was an interactive session: the audience sat around tables, cabaret style, to encourage a workshop environment.

James set the scene by explaining that the role of project management is to deliver the required benefits to the client, to an agreed time and cost. The question is how much confidence the project manager has in doing so: is the journey misty and unclear, or are there clear aims and objectives?

The audience were asked to discuss what makes a good project, how they felt about that project and what made them feel that way. Ideas shared included having the right tools; feeling supported, trusted and enabled; clear goals; clear communication from the leadership; and alignment between personal and project or corporate goals and objectives.

Measurement is important to track progress against your plan, to inform decision making and to provide confidence to stakeholders that their interests are being looked after, especially the client.

The key question is what should be measured. The Integration Definition for Function Modelling (IDEF0) is a useful tool to understand the inputs, controls, resources and outputs of a function or activity. Outputs can be useful and value-adding, or they can be waste, such as rework or back-flow. Earned value management (EVM) is also a useful tool to measure the percentage completion of tasks and activities.
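As a rough illustration (not from the talk, and with invented figures), EVM compares the value of work actually completed against what was planned and what it cost. A minimal sketch of the standard variances and performance indices:

```python
# Minimal earned value management (EVM) sketch. All figures are
# illustrative, not taken from the presentation.
# PV: planned value of work scheduled to date
# EV: earned value = % complete x budget at completion
# AC: actual cost incurred to date

def evm_metrics(pv, ev, ac):
    """Return the standard EVM variances and performance indices."""
    return {
        "SV": ev - pv,    # schedule variance (negative = behind plan)
        "CV": ev - ac,    # cost variance (negative = over budget)
        "SPI": ev / pv,   # schedule performance index (<1 = behind)
        "CPI": ev / ac,   # cost performance index (<1 = over budget)
    }

# A task budgeted at 10,000 that is 40% complete:
budget = 10_000
ev = 0.40 * budget                       # earned value = 4,000
m = evm_metrics(pv=5_000, ev=ev, ac=4_500)
print(m)  # SV = -1000, CV = -500, SPI = 0.8, CPI ≈ 0.89
```

Here the task is both behind schedule (SPI below 1) and over budget (CPI below 1), which is exactly the kind of early signal the talk argued measurement should surface.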

The audience were then asked to discuss what measures they used, and also what could be measured but isn’t, and why not. Ideas shared included resources used (time, budget), lead and lag indicators for outputs, milestones achieved, and intangibles such as staff morale and customer satisfaction. Things that were not measured included rework, benefits, lessons logged but not learned (e.g. not resulting in a change request), and the cost of the learning curve for new staff.

James then looked at a number of case studies to illustrate some useful examples of data measures. Case 1 illustrated a graph of input measures and how well they correlated with effort; input measures can provide a really useful leading indicator, allowing anomalies to be spotted and addressed early. Case 2 looked at software faults, with a graph of the number of investigations against the time each took; by understanding the level of rework, the overall process was streamlined. Case 3 showed how a product breakdown structure illustrated alignment with strategic goals. Case 4 showed how a review team was able to improve their own performance by monitoring the time and resources needed for each review. Case 5 used a milestone trend chart to show how tasks were not being finished, which gave the teams feedback to improve their prediction of task completion.
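The milestone trend chart in Case 5 works by recording, at each reporting date, the currently forecast completion date of each milestone: a flat line means the forecast is holding, while a line that keeps rising shows slippage. A small sketch of that idea, with invented milestone names and dates:

```python
from datetime import date

# Milestone trend sketch: for each milestone, record the forecast
# completion date given at each successive reporting period.
# A forecast that moves later every period signals slippage.
# Milestone names and dates are invented for illustration.

def is_slipping(forecasts):
    """True if every successive forecast is later than the previous one."""
    return all(later > earlier for earlier, later in zip(forecasts, forecasts[1:]))

trend = {
    "Design review": [date(2018, 6, 1), date(2018, 6, 15), date(2018, 7, 1)],
    "Factory test":  [date(2018, 9, 1), date(2018, 9, 1), date(2018, 9, 1)],
}

slipping = [name for name, forecasts in trend.items() if is_slipping(forecasts)]
print(slipping)  # ['Design review']
```

Plotting these series (reporting date on the x-axis, forecast date on the y-axis) gives the trend chart itself; the point of the case study was that the team, seeing their own lines rise, improved their predictions.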

The challenge is how to motivate individuals and teams to want to measure project data and their own performance. The ‘magic’ happens when teams own their data and can see how it relates to their performance, i.e. they have a feedback loop which they own and can use to encourage team spirit and friendly competition. The primary feedback loop should always be for the team; a secondary loop (using the same data) can be used for upwards reporting.

Counting is an art. James emphasised the need to be aware of politics when shining a light on reality – it is not always comfortable, and can be seen as a threat if not handled sensitively. Data in its raw state is not very helpful; it is more understandable when presented graphically, which can help initiate conversations with the project team. A no-blame culture is essential to get buy-in. Several visualisation tools were described.

Data do’s: Design your project around data. Explore correlations – these can be really valuable if leading indicators can be found. Reinforce a no-blame culture. Anonymise the data.

Data don’ts: Publish without buy-in. Present tables of raw numbers – data is more powerful when visualised in a graph.

The final exercise was for tables to discuss what they would do differently tomorrow at work. People were challenged to come up with an action plan.

In summary, data is a means to an end. People need information, preferably in visual form, to help start real dialogue, to understand why, and to make quality decisions. Data is most useful to the team for monitoring their own performance; its secondary purpose is upwards reporting. Measuring systems need to be designed as part of project initiation and should form part of normal project routines, with the necessary resources included in the budget. A no-blame culture is essential to build trust and honesty in gathering and reporting performance data and information.


The presentation is available on the APM website’s SlideShare page, as well as in the copy below.


Martin Gosden
SWWE Branch Chairman 

1 comment


James Lea, 15 June 2018, 09:35 AM

    Please get in touch with James Lea if your APM branch or organisation would like to receive this presentation - I am happy to run it again.