Measures for assurance - user experience

I was delighted to see the Measures for Assurance toolkit made available to members and covered in an excellent article in the Winter edition of Project. I have heard that the toolkit has been downloaded by many APM members.

As one of the authors of the toolkit I wanted to share my experience of measuring assurance and to learn from others.

The measures are based on a structure of 10 criteria; within each, guidance is provided on what good looks like, based on expert judgement and evidence. The group of experts who developed the tool blended expertise from a wide range of sectors - IT, buildings, infrastructure, railways, manufacturing, petrochemicals and, last but not least, oil and gas - my own area of experience.

The 10 criteria for assurance cover all aspects of projects - from client and scope to governance. From my experience of oil and gas, the criteria fit very closely with the way I have seen and practised assurance, though there are differences in the precise terminology used based on company and industry practices. The biggest difference is the emphasis on safety for a high hazard industry such as oil and gas; the assurance I have seen always starts with safety, particularly process safety, which in simple terms means ‘keeping the hydrocarbon in the pipes and vessels’, as well as personal safety for construction on mega project sites.

My experience of assurance in the oil and gas company I worked for was that it was carried out by a central group, with support from discipline experts, and normally timed to be completed in the run up to the main stage gates. The assurance was based on 10 project principles which closely match the 10 criteria listed in the toolkit. Different disciplines were involved in specific criteria, particularly HSSE (Health, Safety, Security and Environment), which always came first, and Project Services, which covered a deep dive review of the project cost estimate and schedule. The assurance results were reported to the stage gate and shown on a Red/Amber/Green dashboard. Progress through the stage gate would depend on satisfactorily addressing the risks identified by the assurance process.
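
To illustrate the dashboard idea, here is a minimal sketch in Python (purely hypothetical; the criterion names and statuses are invented for the example and are not taken from the toolkit) of how results against a set of criteria might be rolled up into an overall Red/Amber/Green status for a stage gate:

    # Hypothetical sketch of a Red/Amber/Green (RAG) assurance dashboard.
    # Criterion names and statuses are illustrative, not taken from the toolkit.
    RAG_ORDER = {"Green": 0, "Amber": 1, "Red": 2}

    # Assurance findings recorded against each criterion ahead of a stage gate.
    assurance_results = {
        "HSSE": "Green",
        "Client and scope": "Amber",
        "Governance": "Green",
        "Cost estimate and schedule": "Red",
    }

    def gate_status(results):
        """The worst individual criterion drives the overall gate status."""
        return max(results.values(), key=lambda s: RAG_ORDER[s])

    for criterion, status in assurance_results.items():
        print(f"{criterion:30s} {status}")

    overall = gate_status(assurance_results)
    print(f"Overall stage-gate status: {overall}")
    if overall != "Green":
        print("Risks must be satisfactorily addressed before passing the gate.")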

I expect larger companies with experience of projects will have their own framework for assurance, with terminology based on company and industry history and practice. However, the Measures for Assurance framework, along with the guidance that supports assurance judgements with evidence of best practice, can be a useful benchmark. I would encourage anyone involved in project assurance to review the toolkit.

To conclude this blog with a request: it would be great to hear of best practice from others and to receive suggestions for improving the toolkit. I am keen that we offer an Excel and/or Word version of the toolkit as a template for users to adopt and adapt. There is also talk of setting up a user group and holding a launch event in the new year; please look out for the invite.

2 comments

  1. Richard Renshaw, 13 January 2017, 09:11 PM

    In my opinion the toolkit is very good; thanks to all the team involved and in particular the lead authors, yourself and Geoff Newton. As a suggestion, could you consider establishing a mini site to enable consolidation of case studies and provide the capability to identify patterns as lessons learned across industries? One option could be to mirror the P3M3 organisational maturity model approach, wherein confidentiality is maintained for individual organisations and the name of each organisation is possibly assigned a code number to enable benchmarking. To enable future comparisons of apples to apples, I thought it important to map out the generic roles of organisations across the project life cycle. As an example, for the future you could gather benchmarking feedback when using the toolkit from organisations involved in:
    - strategic, master planning and front end conceptual design
    - schematic and detailed design
    - delivery
    - design and delivery
    - asset management
    - operations.
    In addition, consider having versions of the toolkit as:
    - a light version, to be used as a health check
    - the full toolkit, for conducting audits relative to the quality system.
    I wish you all good luck and wish to volunteer as a supporter for implementation in any manner. Thank you.

  2. Tim Podesta, 09 February 2017, 02:27 PM

    Richard, thank you for your feedback and suggestions. I will take them up with the SIG committee and also refer to the feedback we received at the evening event we held in London on 7 February. It would be good to have your support as we look to develop a user group to share experience and promote the use of best practices.