How can government learn more from project delivery data?

An industry commentator offers his opinion on the big question of data...

I’ve spent the last 12 months on a quest to discover how government manages the experience acquired from project delivery.

Terry Williams (2008) collated many of the arguments that help to illustrate the challenges with learning lessons from project delivery, summarising that “it is not clear that current techniques actually achieve their purpose”. He found “a range of issues… that need addressing”. Has anything changed?

I have submitted a series of freedom of information (FOI) requests in an attempt to gain access to this experience and share it beyond organisational stovepipes for the benefit of society. I appreciate that this is a costly process at a time when government can least afford it, but these costs are inconsequential when compared to the costs of not leveraging experience.

The benefits of sharing these lessons are not limited to the departments that identify them; they extend across organisational boundaries, into the supply chain and across industry. If government can shave five per cent off project delivery costs as a consequence of these lessons, then maybe many of the cuts to social care could be reversed. It is a cause worth fighting for.

So, what have I discovered so far? Although there are pockets of good practice, there are only a handful of organisations that commit tangible effort to leveraging their experience. Even then, the product of their effort is difficult to measure.

The return on investment (ROI) is difficult to quantify, which results in initiatives losing momentum over time – particularly when compared to competing priorities. The process of managing lessons learned remains a significant challenge for most departments.

Processes

Four departments replied regarding their lessons learned processes. For some, the process is mandated and policed, but for others, it isn’t enforced. For one department, it is left to the discretion of individual projects:

  • “[The department] does not have any policies or processes on lessons learned. Lesson learned methodology is standard programme and project management practice which programmes and projects deploy as they see fit.”

For government, the ultimate lessons learned review is a public inquiry. The inquiry into Edinburgh Trams is reported to have cost £7.2m, increasing at £100,000 per week. If there is a justification to “establish why the project incurred delays, cost considerably more than originally budgeted for, and delivered significantly less than was projected”, then is there an obligation to conduct similar reviews on other projects?

The next level down is a review by the National Audit Office, but as I found with the South East Flexible Ticketing Programme, although the highlight report is published, the results of the underpinning forensic examination of the programme tend to be limited to a small circle of people.

The third level down is driven by departmental policy, which is variable. If public money is being spent, is there an obligation on the public sector to leverage this experience for public benefit? Does the same argument apply to private-sector projects, particularly where shareholder capital is at risk?

Identification and access

The overriding principle of lessons learned is that the lessons are discoverable in a form for others to do something with. Three responses to my FOI requests illustrate the magnitude of the challenge:

  • Department one: “…our estimation is that locating, retrieving and extracting the lessons data from centrally held information on projects/programmes (more than 50 in number) will take more than 100 hours of work…”
  • Department two: “…would require [the department] to go through seven years of SharePoint records and then identify the necessary reports which contained the lessons learned information. There are also other change and transformational programmes that have been undertaken by other teams across [the department] who will have similar paper and electronic records…”

I asked for the lessons learned on a major IT project and its two successors – a project reputed to be one of the government’s top 10 project failures. It took the department nearly four months to provide an answer, citing the need to conduct a lengthy public-interest test. Department three finally reported:

  • “[Department] staff have already spent approximately 124 hours on the handling of your request.”

It was eventually rejected on cost grounds (the Freedom of Information Act 2000 allows rejection for responses requiring more than 24 hours of effort) and no information was made available; it felt like a scene from Yes, Minister. Yet this is a project that regularly featured in the national press for cost and schedule overruns.

I acknowledge that some departments seek to reduce the burden of FOI requests by highlighting the work involved, but how can public bodies leverage this information when it can take up to four weeks to extract it? Wouldn’t the public expect this information to be available within a matter of minutes and also shared beyond organisational silos?

Conversely, a £50m failed project had produced a 96-page report that forensically examined the lessons learned. The initial FOI request to access this report was refused, but access was granted following an appeal.

It is an excellent report that has value far beyond the boundaries of the project or the department. This report would otherwise have remained buried within the project, or constrained to a very small circle of people. How many more examples like this exist?

I also wrote to the Infrastructure and Projects Authority to gain access to assurance reports, but this was rejected outright. While I empathise with its need to have a safe space to discuss project issues, should these reports be embargoed in perpetuity?

They provide not only insights into how experience could be leveraged, but also valuable findings for the supply chain on risks to delivery, which for some could be catastrophic. I agree that it’s a delicate subject, but I’m not convinced that the answer is to lock everything away. The Information Commissioner will rule on that one.

Quality

Where lessons have been recorded, they vary significantly in quality. In one instance, on a failed £100m project, a lesson on stakeholders was documented simply as 'stakeholders'. Another project recorded a lesson as 'benefits clearer'.

A department that had a project with a documented cost increase of £2.6bn provided a set of lessons amounting to a page of A4. Very few consider the use case and how the experience will be leveraged, which often means that it can degrade into a box-ticking process.

This results in anodyne statements of the obvious that are difficult to exploit, something that Williams commented on in 2008. There are some notable exceptions, but not many.

Perspective

Many of the lessons are introspective, written from the perspective of the person drafting the document. The bulk of the investment is delivered through the supply chain, yet the majority of lessons are aimed at procurement officials rather than delivery organisations.

I acknowledge that if suppliers share lessons then they may lose their competitive advantage, but with the appropriate controls and incentives, middle ground can surely be found. If suppliers wish to engage in the delivery of public contracts, then they have an obligation to ensure that they are contributing to improvements in delivery productivity.

Exploitation

There is a significant variation in how experience is leveraged. I haven’t yet been able to identify an organisation that is able to demonstrate the ROI in such a capability, which ultimately influences the extent to which it endures.

One of the best organisations I identified has recently reduced its central team by 75 per cent because of cost reductions and competing priorities. Another organisation contractually mandated its suppliers to conduct a lessons learned review at the end of each project; it ended up with cabinets and SharePoint folders full of documentation that it struggled to leverage.

The way that technical experience is leveraged often differs from portfolio, programme and project management experience, yet processes suggest a 'one size fits all' solution.

I appreciate that some practitioners favour a community-of-interest approach to sharing knowledge, believing that lessons learned databases do not work. Although I agree that this is an essential component of the solution, the answer resides in creating the evidence to invest in improvement, encouraging others to avoid what is eminently avoidable, and leveraging good practice and previous investment.

Government invests a huge amount of money in programmes and projects. The £455bn cost of the Major Projects Portfolio represents the tip of a very large and expensive iceberg of publicly funded investments. With continual cutbacks, there is pressure to focus on delivering projects, rather than the luxury of project reflection.

If this is the consensus, then we should stop the lessons learned processes because they aren’t delivering the value we expect from them. But public resistance to this would be significant. Ultimately, government has a duty to ensure that lessons are being captured from this investment and used to improve the productivity of future delivery.

The same could be said for the private sector, which has a duty to shareholders and other stakeholders.

I have collated more than 15,000 lessons, and read hundreds of research papers on the subject. I am increasingly convinced that the answer lies in the data, building on the excellent work by NASA and others – providing tailored evidence, at the point of need, and helping to nudge people towards leveraging the pool of project delivery experience.

It is the data that enables senior-level decisions to be taken on how to prioritise investment on improvements. The skill is in configuring the data to reflect the use case and having a large enough pool of data to provide relevant insights.
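To make the idea of 'configuring the data to reflect the use case' concrete, here is a minimal, hypothetical sketch of how a pool of recorded lessons might be filtered so that only experience relevant to the current project surfaces at the point of need. All field names, tags and example lessons are illustrative assumptions, not drawn from any real departmental dataset.

```python
# Hypothetical sketch: surfacing relevant lessons from a tagged pool.
# Tags (e.g. discipline, delivery stage, project type) are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Lesson:
    text: str
    tags: frozenset  # attributes describing where the lesson applies


def relevant_lessons(pool, context_tags, min_overlap=2):
    """Return lessons whose tags overlap the current project context
    by at least `min_overlap` attributes."""
    return [l for l in pool if len(l.tags & context_tags) >= min_overlap]


pool = [
    Lesson("Agree benefits baselines before contract award",
           frozenset({"benefits", "procurement", "it"})),
    Lesson("Stage-gate reviews need empowered attendees",
           frozenset({"governance", "assurance"})),
    Lesson("Pilot data migration early on legacy IT estates",
           frozenset({"it", "migration", "delivery"})),
]

# A new IT procurement project with a benefits focus:
context = frozenset({"it", "procurement", "benefits"})
for lesson in relevant_lessons(pool, context):
    print(lesson.text)
```

The point of the sketch is the design choice, not the code: lessons only become 'tailored evidence at the point of need' once they carry structured attributes that can be matched against a project's context, and once the pool is large enough that a match is likely to exist.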

Some of the tools emerging in data science help us to tackle this challenge, but they only form part of an end-to-end solution. By leveraging project delivery experience, we have an opportunity to transform our profession.

It’s a challenge we must grasp with both hands. Can we afford to do otherwise?
