Introduction
Almost everything we do in life has risks attached, whether that’s something adventurous such as rock climbing or a day-to-day activity such as crossing the road.
For each activity, we do what we can to eliminate or mitigate the risks. In the above examples, we do that by having the right climbing equipment, training and supervision in place, and by establishing that there’s no oncoming traffic before stepping into the road.
The same is true for construction projects. Every activity needs to be managed so that it does not add risk. That risk could be:
- Safety – someone will be injured during construction or operation
- Financial – the total cost of the project will exceed the budget and stakeholders will lose money on it
- Programme – the project will overrun
- Reputational – the project will harm the standing of one or more parties to it
Ideally, these risks would be eradicated entirely. In reality, that is rarely possible – and the risk that something undesirable will happen must be weighed against the resources needed to eliminate it.
Construction is a risky business

In his 1994 report Constructing the Team, Sir Michael Latham wrote: “No construction project is risk-free. Risk can be managed, minimised, shared, transferred or accepted. It cannot be ignored.”
Several typical features of construction-sector practice can multiply risks. These include:
- A long supply chain
- Lack of repetition
- Multiple interfaces
- A wide client pool
- Time pressure
- Budget pressure
- Lack of integration between design, construction and operation
- Procurement methods
Many of these are systemic issues that need to be tackled at an industry-wide level. For example, more collaborative working and alliance-style contracts would improve integration between design, construction and operation, reducing the number of interfaces.
But most projects still involve a client procuring a design team that hands over the design to a separate construction team, which then passes the completed asset to another distinct operations or maintenance team. Even when a design is commissioned through a contractor – for example, on a design and build form of contract – the two parties often remain separate and the design engineer simply hands the design on to the contractor.
These are only the basic interfaces. Even simple domestic-scale projects involve different subcontractors, with information passing backwards and forwards between them. As ICE Fellow John Carpenter states in his book Designing a Safer Built Environment: “Risk does not recognise size of project and risk thrives at interfaces. If there is a weakness, it will find it.”
The Swiss cheese model
A good way to visualise risk is by using the so-called Swiss cheese model, which is based on work by psychology professor James Reason. In his 1990 book Human Error, he examined factors that led to disasters such as the nuclear meltdown at Chernobyl, the gas leak at Bhopal and the fire at King’s Cross Tube station.
Reason made a distinction between ‘active’ and ‘latent’ errors, likening latent errors to pathogens in the body that can lie dormant for a long time and don’t cause illness because the system has a series of defences. They become evident only when they combine with other factors to breach those defences.
Fig 1: Swiss Cheese model
How 'holes' in a project's line of defence could lead to failure
Graphic taken from the ICE’s 2018 In Plain Sight report
This idea of ‘lines of defence’ can be applied to any situation where there is a risk of failure or a catastrophic event, with each line of defence represented as a slice of cheese. Any vulnerabilities in these lines of defence can be shown as holes in the slices, making them look like Swiss cheese. If the holes in the different slices line up, any threat can pass all the way through, enabling the catastrophic event or failure to occur.
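The mechanics of the model can be sketched in a few lines of code. This is a toy illustration only, not engineering software: the defence names and ‘hole’ labels below are hypothetical examples, chosen to echo the structural-collapse scenario discussed in this article.

```python
# Toy illustration of the Swiss cheese model: each line of defence is a
# 'slice', and each slice has 'holes' (vulnerabilities). A threat only
# becomes a failure if it finds a hole in every slice it meets.
# All defence names and hole labels here are hypothetical examples.

def threat_passes(slices, threat):
    """Return True if the threat slips through a hole in every slice."""
    return all(threat in holes for holes in slices.values())

defences = {
    "clear client brief": {"unstated load change"},
    "competent design":   {"unstated load change", "wrong design code"},
    "suitable materials": set(),                     # no known holes
    "safe construction":  {"unstated load change"},
}

# The 'suitable materials' slice has no hole at this position,
# so the threat is stopped before it becomes a failure.
print(threat_passes(defences, "unstated load change"))  # False

# But if that slice also develops the same hole, the holes line up:
defences["suitable materials"].add("unstated load change")
print(threat_passes(defences, "unstated load change"))  # True
```

The point the sketch makes is the same one the model makes visually: no single hole causes the failure; it takes an aligned hole in every slice, which is why adding or strengthening any one line of defence can stop the threat.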
For example, if the risk is that part of a structure will collapse under a specific load condition, the lines of defence that should prevent that include:
- The client making it clear how the structure will be used
- The designer being sufficiently competent for the level required, obtaining the appropriate load information and using the correct design codes
- The materials being suitable for the required loading
- The contractor using the material specified and building it in a safe way
- The structure being used and operated in the way it was anticipated
- Any maintenance or refurbishment not violating the design assumptions
A hole could appear in any of these lines of defence – for example, if the designer makes an assumption about how the structure will be built. If this is picked up at the next line of defence – i.e. the contractor goes back to the designer to ask if any assumptions have been made – the threat has been stopped. But, if not picked up at this stage, it creates another hole and the threat can get through, which in this case might mean that the contractor chooses a construction method that puts too much load on the structure while it is being built.
As the Swiss cheese model shows, an error, oversight or event can occur anywhere in the process, and the appropriate lines of defence need to be in place to prevent that from turning into a catastrophic failure. The more complex the process, the more possible it is that apparently unrelated and often small errors will combine to disastrous effect.
Why failures happen

Analyses of construction disasters show that they almost always occur as a result of several factors and are rarely down to a single mistake by one person. In a situation where somebody has made a mistake, it’s likely that there were other factors that either contributed to the mistake (such as time pressures) or meant that it wasn't spotted in time.
The ICE’s In Plain Sight report identified that the same blind spots and vulnerabilities appeared in all major incidents, including:
- A failure to identify all of the risks in the system
- Incomplete or inadequate information
- Poor-quality supervision and assurance
- Design flaws
- Operations not in harmony with design intent
- Inadequate governance and a poor organisational culture
- A failure to share lessons from past incidents and near misses
Looking at how these issues are interlinked, let’s take the example of incomplete or inadequate information. Designers often have limited time to produce their designs. As a result, on the date in the programme when a contractor expects to receive information from the designer, they may get drawings that are only partially complete. This can result in two situations: either the contractor has to wait for the rest of the information to arrive, risking delays to the project; or the contractor can try to press ahead with partial information, risking an error that could be catastrophic.
Creating the right culture
When a design engineer passes incomplete information to a contractor, the actions of that individual are not happening in isolation: the engineer is working in an environment where this practice is deemed to be acceptable, even encouraged. Individual design engineers may not be comfortable sending incomplete drawings to the contractor, but the culture of the organisation they work for might make it difficult to speak out.
Kyle Clough, regional director for contractor Kier, says: “Organisations have to create an environment where individuals can call something out if they’re not happy with it.”
That organisational environment must also give people confidence in their abilities and awareness of their skills. Clough cites the example of a maintenance team sent to fix a burst water main on the side of an embankment. Having excavated to fix the pipe, they quickly realised that this would be a bigger job than first thought. Rather than continuing – and making matters worse – they acknowledged that this was not something they should be trying to fix, so they put in a temporary solution instead.
The team’s awareness of their competence mitigated the risk. In terms of the Swiss cheese model, if ‘competence’ were one of the slices, continuing to dig would have created a hole in that slice.
Almost every engineer will have been in a position where they aren’t sure whether they should say ‘no’ to something they’re being asked to do, or have reached a point where they don’t have the experience to make a decision.
Professor Julie Bregulla, director of project-based learning at the Engineering and Design Institute, says: “This is about knowing your competence and being honest with yourself, asking: ‘Am I really the right person to do this?’”
Knowing, applying and ensuring
The ICE has developed its own version of the Swiss cheese model that emphasises the importance of competence. The lines of defence are grouped under three headings: knowing, applying and ensuring.
- Knowing covers knowledge and the need for engineers to maintain their knowledge levels, including learning from past failures
- Applying is about the processes used, including standards and regulations, and the deployment of suitably qualified people and accountability of asset owners
- Ensuring focuses on the processes put in place to guarantee that the knowledge is applied
Fig 2: Knowing, Applying and Ensuring
The importance of competence in avoiding catastrophic failure
Graphic taken from ICE’s In Plain Sight report
Bregulla believes that the industry must learn from failures to prevent similar disasters from recurring. She stresses that this is something individual engineers can do, as well as the sector as a whole.
“This should be a learning industry,” she says. “If you spot something that’s not right, you absolutely have to do something about it.”
Some companies have excellent systems in place for reporting ‘near misses’ and a culture that encourages people to speak up in an attempt to cut out risky practices. There are also industry-wide confidential reporting mechanisms, such as the ICE-backed Collaborative Reporting for Safer Structures UK (CROSS-UK), which enables engineers to report anything they think is unsafe.
CROSS also publishes reports into previous failures so engineers can see if there have been any problems linked to the type of structure they are designing or building. Reading these reports should be standard practice for all engineers.
Learning from mistakes can also help to address risks to programme and budget, as well as safety.
Network Rail programme manager Lewis Atherton worked on the £1bn project to redevelop the UK rail infrastructure owner and operator’s London Bridge station. He says: “We benefited from an enormous amount of learning from other projects, such as London Heathrow Terminal 5. That project finished on time and within budget, but it didn’t work on day one because no one knew how anything worked.”
Atherton continues: “Learning from that experience, we opened the operations and control areas a year ahead and also built a mock-up to help the staff learn how it would all work. So, when the concourse was opened to the public, we had 400 or 500 staff all in position because of thinking ahead and learning from the past.”
The Swiss cheese model is a useful part of Network Rail’s safety strategy on major projects, he adds.
“It helps us to challenge people and think a bit deeper, particularly when we are having safety conversations,” Atherton says. “It is really helpful to get people to think about where the holes are, what could go wrong and what we’re trying to design out. We tend to ask a lot more questions and don’t accept the first answer.”
How do I make the Swiss cheese model work for me?
The Swiss cheese model is a useful visual aid to help identify all of the possible opportunities for risk to build up. Different organisations and individuals use it in different ways.
For example, the various slices could represent all the main players in the delivery and operation of a project – client, architect, design engineer, specialist designers, contractor, subcontractors etc. Holes in the slices could represent all of the assumptions made or information missed as responsibility for elements of the project passes from one party to another.
The ICE’s version (see Fig 2) has 13 slices grouped under three headings: knowing, applying and ensuring. Each slice represents a line of defence to prevent a threat from becoming catastrophic.
An analysis of the ICE model by Loughborough University has suggested that this could be augmented by including holistic risk assessment, influences that affect every slice and post-incident response (see Fig 3, below).
Fig 3: Proposed changes to the lines of defence
Introducing holistic risk assessment, cross-cutting influencers and post-incident response
Carpenter advocates a similar model, but with the lines of defence suited to projects of all sizes, not just major assets. In his model there are also three main themes, known as the 3Ps: People, Process and Product (see table below).
The 3Ps: People, Process and Product

| Theme | Line of defence | Detail |
|---|---|---|
| People | Competency | Individuals and teams are sufficiently competent; competence is relevant to the nature of the project |
| People | Resource | Enough people with the right distribution of competency are involved at the right time |
| Process | Brief | Adequate and clear; team members’ responsibilities are made clear |
| Process | Procedure | Choice of design methodology; assumptions; risk management; use of contemporary practice; constructability; managing interfaces |
| Process | Communication | Residual risks; maintenance and repair; assumptions |
| Process | Checks | Checking and reviewing critical work; third-party opinion on complex or unusual design/construction |
| Product | Product | Checking all products meet relevant standards and are used appropriately; extra due diligence on new products; flagging up the proposed use of new products; making use of others’ research and experience |
The Swiss cheese model can be used at a strategic level for an entire project or for an individual task. For example, if a supplier offers a substitute product, it might be helpful to use the model to identify all of the steps that must be taken to be sure the alternative will do the same job as the one that was specified:
Product
- Does the new product meet the relevant standard?
- Has it been used for this purpose before?

Competence
- Am I sufficiently competent to make this decision?

Communication
- Who else needs to know we plan to use a different product?
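These task-level checks can themselves be treated as lines of defence: a substitution should only proceed if every check passes, and the first failed check tells you which ‘slice’ stopped it. The check descriptions and pass/fail values below are illustrative, not a prescribed procedure.

```python
# Hypothetical sketch: the product-substitution checks as sequential
# lines of defence. The first failed check is the 'slice' that stops
# the substitution; if none fail, the substitution can proceed.

def first_failed_defence(checks):
    """Return the first failed check's description, or None if all hold."""
    for description, passed in checks:
        if not passed:
            return description
    return None

checks = [
    ("product meets relevant standard", True),
    ("product used for this purpose before", True),
    ("decision-maker is competent", False),  # illustrative gap
    ("affected parties informed", True),
]

gap = first_failed_defence(checks)
print(gap)  # decision-maker is competent
```

Ordering the checks matters in practice: putting the cheapest and most decisive checks first means a doomed substitution is stopped with the least wasted effort.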
What can I do to reduce risk?

There are behaviours that everyone can adopt to ensure that they aren’t creating holes in the lines of defence on every project. This checklist is a good start:
- Am I competent to do this task?
- Am I the right person to be doing this?
- Have I been given all of the information I need?
- Have I been given enough time to do it adequately?
- What assumptions have I made?
- Do I know whom to ask if I need help with the work?
- Is someone more senior going to check my work?
- Have I passed all of the relevant information on to ensure that the next person can do their job properly?
- Do I know what to do if I’m not comfortable with something I have been asked to do, or know how/where to report an unsafe practice?