Martin Paver on why educating, reskilling and collaboration are key to implementation
An APM blog posted in March suggested that “next year could be a turning point for project management and AI”. It is an interesting article and worth a read. Here I want to delve deeper into the notion that next year is the turning point and whether limiting it to artificial intelligence (AI) overly constrains the opportunity.
Robotic process automation and data science
In many sectors, robotic process automation (RPA) is already removing the burden of repetitive work with a focus on the automation of workflows (although the higher-end capabilities can execute machine-learning algorithms). Projects are laden with processes, many of which could be automated today. The only obstacles to implementation are understanding, skills, budget and, most importantly, desire.
But RPA is only one part of the spectrum. We can rapidly extend into data analytics and data science. At the fifth Project:Hack community hackathon, the winning team created a tool to check for errors in specifications. They scraped the ISO and British Standards Institution websites and compared the information there with the standards in the specification, highlighting any that were out of date or inaccurate.
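A minimal sketch of that idea in Python, assuming we already hold a lookup of current standard editions (in practice this is what the scraping step would populate); the standard names, years and specification text below are purely illustrative:

```python
import re

# Hypothetical map of current standard editions. In practice this would be
# populated by scraping the ISO / BSI websites, as the hackathon team did.
CURRENT_EDITIONS = {
    "ISO 9001": "2015",
    "BS 5975": "2019",
}

def find_outdated_references(spec_text):
    """Flag standard references whose cited year is not the current edition."""
    outdated = []
    # Match references of the form 'ISO 9001:2008' or 'BS 5975:2019'
    for match in re.finditer(r"\b((?:ISO|BS) \d+):(\d{4})\b", spec_text):
        standard, year = match.groups()
        current = CURRENT_EDITIONS.get(standard)
        if current and current != year:
            outdated.append((standard, year, current))
    return outdated

spec = "Welds shall comply with ISO 9001:2008 and BS 5975:2019."
print(find_outdated_references(spec))  # [('ISO 9001', '2008', '2015')]
```

The real tool would need to handle the many citation formats found in specifications, but the core check is just this comparison between what is cited and what is current.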
We have used Python for a wide range of applications, such as topic modelling (a type of text-mining tool) and error checking. It is all possible today and only limited by our imagination.
Network theory and AI
We then move into network theory, where we look for relationships between data. We tend to store data in silos and lose the connections between them. Even a single point of truth may not have the requisite connectivity in the data to enable complex queries. For instance, if I want to understand the key risks with a certain supplier at a specific stage of a project, including their impact, I need to correlate:
- the supplier database;
- the work breakdown structure;
- the risk register; and
- the cost and schedule variance associated with each risk, linked to any compensation events or claims.
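To make the correlation concrete, here is a sketch using pandas joins over small invented tables; every table, column and value is hypothetical, and a graph model would express the same relationships more directly:

```python
import pandas as pd

# Hypothetical data silos, linked by shared keys (supplier_id, wbs_id, risk_id)
suppliers = pd.DataFrame({"supplier_id": [1, 2], "supplier": ["Acme", "BuildCo"]})
wbs = pd.DataFrame({"wbs_id": [10, 11], "stage": ["design", "construction"]})
risks = pd.DataFrame({
    "risk_id": [100, 101, 102],
    "supplier_id": [1, 1, 2],
    "wbs_id": [10, 11, 11],
    "description": ["late drawings", "steel shortage", "crane availability"],
})
variances = pd.DataFrame({
    "risk_id": [100, 101, 102],
    "cost_variance": [5000, 20000, 8000],
})

# Correlate the silos, then ask: which risks sit with supplier 'Acme'
# at the construction stage, and what is their cost impact?
joined = (risks
          .merge(suppliers, on="supplier_id")
          .merge(wbs, on="wbs_id")
          .merge(variances, on="risk_id"))
query = joined[(joined["supplier"] == "Acme") & (joined["stage"] == "construction")]
print(query[["description", "cost_variance"]])
```

The query itself is trivial; the hard part in practice is that the keys linking the silos rarely exist, which is exactly the connectivity problem described above.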
We also need to capture the data dynamically so that we can see how the project evolved and understand chains of events and vectors.
We haven’t yet touched on AI. We can introduce AI using off-the-shelf algorithms, many freely available. Some have been trained on vast data sets, such as translation or image recognition tools. But we often lack the data to train algorithms on project delivery challenges. For instance, it would be great to use AI to identify lead indicators of health and safety issues on a construction site. But the Health and Safety Executive only collates reportable incidents; it does not capture data on safety audits, safety observations or levels of training. The same applies to risk, schedules, costs, benefits, commercial disputes, logistics, quality and so on.
We need to be careful not to conflate data analytics and AI
AI in project delivery will only become commonplace when we address:
- Awareness: Educating the community. There are lots of people who use language loosely, which may confuse rather than educate. There are people across the globe who are pushing the boundaries on project data analytics, sharing the latest ideas and trying to put this into terms that we can all understand. But they are still a rare breed.
- Accessibility: There is a perception that this is all very difficult and practitioners need a PhD first. We need to continue working hard to change these perceptions. What starts off as something very scary rapidly becomes digestible and within reach.
- Competence: I see three schools of thought emerging on developing competence within an organisation:
1. ‘Sheep dip’ everyone in a one-week course on data science, from the marketing team to project managers.
2. Create a centralised team of ‘data ninjas’ who work on projects on demand.
3. Upskill project delivery professionals. Project delivery is different: data doesn’t stream out of a control panel; it is messy, incomplete and often full of errors. If we don’t educate project professionals, we will never create the culture within the business needed for change. That change must come from within, supported by training such as the project data analyst apprenticeship.
- Data: We need to provide access to data. If we constrain data to the big software vendors, they will dominate and create monopolies at the expense of innovation. We need to work collegiately to pool data for the benefit of the collective, democratising data to enable access to researchers, innovators and industry professionals.
A systems approach
It is possible to implement RPA and data analytics today. But if we want to unlock the potential of AI, we need a systems approach. We can deliver it piecemeal, but we’ll get there a lot quicker if we can create critical mass across multiple sectors and professions. That takes time and a lot of resilience. The tide is turning, but we need to think more strategically, rather than bolting on software solutions and waiting for the magic to happen. We need to educate, upskill, reskill, experiment and collaborate.
Read more on data and AI in projects by Martin in the APM blog archive, including his recent post Data analytics must drive a new world order.
You may also be interested in APM’s Projecting the Future series of papers.