Human Error


Analysing human error is a valuable tool for assessing the risk posed by both individuals and the systems in which they work. In healthcare particularly, we aim to reduce the risk of human error as far as possible, to avoid adverse incidents which can significantly impact both patients and team members.

 

James Reason highlights two approaches to human error, which we will look at in this post.

 

1. The Person Approach

 

The first is the person approach, the more traditional of the two. It focuses on acts which are deemed unsafe. These unsafe acts typically arise from what Reason calls “aberrant mental processes”(1), examples of which include forgetfulness and carelessness.

 

Measures which aim to reduce human error under the person approach involve reducing variability in performance and holding individuals accountable for their mistakes. Systems which adopt this approach often leave individuals fearing mistakes and, in many cases, result in poor reporting of errors when they do occur. Followers of this approach also treat errors as moral issues for which people ought to be punished or sanctioned – psychologists refer to this as the ‘just world hypothesis’.

 

2. The System Approach

 

The second framework for addressing human error is the system approach, which holds that humans are fallible and that errors are therefore a part of life. Errors are seen as consequences rather than causes – they stem not from the morality of the individual, nor from human nature, but from “upstream systemic factors”.

 

Reducing system errors is based on the premise that we cannot change the human condition, but we can change the conditions in which individuals work. For example, we can implement barriers or safeguards which prevent adverse outcomes should an error occur.

  

Evaluating Both Approaches


The person approach remains the more widely used of the two and fits our intuition to blame an individual when something goes wrong. Despite its popularity, however, it has significant drawbacks.

 

Firstly, effective error management requires an open and honest work environment with a culture of error reporting. This rarely develops where the person approach is employed, as individuals fear repercussions when mistakes are made. Not only does this create an environment in which errors are regarded with fear, but it also means errors go unreported and vital learning opportunities are missed.

 

Secondly, focusing on the individual means the error is examined in isolation, with little or no regard for the system or how it may have contributed to the error.

 

By looking at human error in this way, two vital features of it are missed:

 

1. Human error can be committed by even the most reliable and talented personnel.

 

2. Mistakes are, for the most part, not random; they fall into recurrent patterns. The same circumstances can provoke the same errors, regardless of which individual is involved.

 

The person approach therefore focuses too heavily on the individual and analyses the incident in isolation from the system.


The Swiss Cheese Model of Human Error

  

When we employ a system approach to human error, we can create defences, barriers and safeguards which help prevent errors from causing harm. These defensive layers can be:

 

- Engineered – e.g. alarms or warning systems

- Reliant on people – e.g. control room operators

- Dependent on procedural and administrative controls

 

Their function is to protect the system, and the individuals within it, from the consequences of errors. In most cases they achieve this; however, as with all systems, there are weaknesses.

 

In an ideal world, each barrier or defence would be impenetrable and the system would prevent human error entirely. In reality, every layer of barriers and safeguards we introduce has weaknesses. An error passing through a single weakness generally does not result in an adverse event. However, if these weaknesses build up and align, the result can be a major system failure. This is illustrated below:


[Figure: the Swiss Cheese Model – successive defensive layers, each with holes; an error causes harm only when the holes in every layer align.]
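The intuition behind the aligned holes can also be expressed numerically. The following sketch is purely illustrative and not from Reason's paper: it assumes each defensive layer fails independently with some small probability, so the chance of an error penetrating every layer is the product of the per-layer probabilities.

```python
# Illustrative sketch only (not from Reason's paper): model each defensive
# layer as an independent barrier whose "hole" is open with a small
# probability. An adverse event occurs only when the holes in every
# layer line up at the same time.

def p_error_penetrates(hole_probs):
    """Probability that an error passes through all defensive layers,
    assuming the holes in each layer occur independently."""
    p = 1.0
    for p_hole in hole_probs:
        p *= p_hole
    return p

# Three layers, each failing 5% of the time: the chance of all three
# holes aligning is roughly 0.05 ** 3 = 0.000125 - far smaller than the
# 5% risk any single barrier would leave on its own.
print(p_error_penetrates([0.05, 0.05, 0.05]))  # ~0.000125
```

The independence assumption here is the optimistic case: as the next section explains, latent conditions can weaken several layers at once, in which case the holes align far more often than this simple product suggests.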


James Reason claims that the ‘holes’ in the defences can arise for two reasons:

 

1. Active Failures – these are unsafe acts committed by individuals in direct contact with the system or patient, for example a surgeon. They usually occur due to “lapses, slips or fumbles” and have a direct, but usually short-lived, impact on the system's defences.

 

These incidents, if a person approach is taken, are considered in isolation from the system. However, Reason claims that such adverse incidents could be avoided if the causal factors within the system were rectified.

 

2. Latent Conditions – Reason describes these as the inevitable “resident pathogens” within the system. They tend to arise from strategic decisions, all of which have the potential to introduce these “pathogens” into the system.

 

Latent conditions have two kinds of adverse effects:



(i) the introduction of error-provoking conditions within the local workplace;

(ii) the creation of long-lasting holes or weaknesses in the defences and safeguards which are put in place.

 

These latent conditions may lie dormant and undetected for long periods before being triggered, at which point they can have a detrimental effect on the system as a whole. Errors of this kind are hard to foresee and require a proactive approach to prevent them.


You can find out more about the ‘Swiss Cheese Model’ in the short video linked at the end of this post.

 

Error Management


Error management has two components:

 

1. Limiting the incidence of errors – this will never be wholly effective, and therefore we also require:

 

2. The creation of systems which can tolerate errors occurring and “contain their damaging effects.”

 

Under the person approach, error management involves finding ways to prevent individuals from making mistakes. The system approach, in contrast, aims to create a comprehensive management programme addressing the person, the team, the workplace and the institution as a whole. By employing these two pillars of error management, systems can become more resilient to error.




Human Error in Healthcare



In healthcare, managing risk and human error is of paramount importance in ensuring the safety of patients and healthcare practitioners. A study by Thomas et al (2000) found that adverse events occurred in 2.9% of all hospitalisations in Colorado and Utah. Many of these events could have been avoided, as they resulted from preventable errors.

 

Within healthcare, we can apply Reason’s model of human error and look to reduce risk by adopting both the person and system approaches. Vincent (2) proposed an organisational model inspired by Reason’s work. It focuses on the system approach to human error and aims to reduce latent errors by concentrating on 7 key categories which can give rise to safety concerns (a hypothetical encoding of these categories is sketched after the list):

 

1. Institutional context

2. Organisational and management factors

3. Work environment

4. Team factors

5. Individual (staff) factors

6. Task factors

7. Patient characteristics
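As a purely hypothetical illustration of how these categories might be put to work, the sketch below encodes them in a small incident-reporting structure. The class and field names are invented for this example and are not taken from any real reporting system.

```python
# Hypothetical sketch: encoding Vincent's 7 categories so that incident
# reports can be tagged with the systemic factors involved. All names
# here are illustrative, not from a real incident-reporting tool.
from dataclasses import dataclass, field
from enum import Enum

class ContributingFactor(Enum):
    INSTITUTIONAL_CONTEXT = "Institutional context"
    ORGANISATIONAL_AND_MANAGEMENT = "Organisational and management factors"
    WORK_ENVIRONMENT = "Work environment"
    TEAM_FACTORS = "Team factors"
    INDIVIDUAL_STAFF_FACTORS = "Individual (staff) factors"
    TASK_FACTORS = "Task factors"
    PATIENT_CHARACTERISTICS = "Patient characteristics"

@dataclass
class IncidentReport:
    description: str
    factors: set = field(default_factory=set)

# A single patient safety incident typically involves several
# contributing factors, so each report is tagged with a set of categories.
report = IncidentReport(
    description="Medication dose error during an understaffed night shift",
    factors={ContributingFactor.WORK_ENVIRONMENT,
             ContributingFactor.TASK_FACTORS},
)
for factor in report.factors:
    print(factor.value)
```

Tagging incidents in this way reflects the WHO's observation, quoted below, that more than one contributing factor is typically involved in a single patient safety incident.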

 

Cook and Rasmussen (3) add that human error is more likely to occur when we operate at “almost capacity” and when we are pushed for time.

 

The WHO released a ‘Conceptual Framework for the International Classification for Patient Safety’ which identified contributing factors to adverse incidents(4):

 

“Contributing Factors/Hazards are the circumstances, actions or influences which are thought to have played a part in the origin or development of an incident or to increase the risk of an incident. Examples are human factors such as behaviour, performance or communication; system factors such as work environment; and external factors beyond the control of the organisation, such as the natural environment or legislative policy. More than one contributing factor and/or hazard is typically involved in a single patient safety incident.”

 

They go on to identify the role of a system approach to human error, stating:

 

“A complex relationship exists between incident type and contributing factors. The same incident or circumstance may be perceived as an incident or a contributing factor, depending on the context, circumstance or outcome. An incident always has a set of contributing factors. Although an incident can be a contributing factor to the origin or development of another incident, some contributing factors can not be incidents in their own right. An incident can therefore be designated as a principal incident type depending on context specific business rules (e.g., the incident most proximal to the identified patient outcome), design of an information system or type of data analysis.”

 

They reflect this in their model of human error which can be seen in the image below:

[Figure: the WHO model of human error from the Conceptual Framework for the International Classification for Patient Safety.]


Conclusion

In conclusion, human error is an essential consideration within healthcare. Firstly, we must employ a proactive approach to prevent errors before they occur. Secondly, we must take account of near misses and any incidents which do occur, reflecting on what led to the error and how it can be avoided in future. This is a vital part of risk management and clinical governance – by reducing human error we can raise the standard of healthcare and reduce the risk to patient safety.

 

This post is based on the work of James Reason, in particular the following paper, which is quoted above:

Reason J. Human error: models and management. BMJ. 2000;320(7237):768‐770. doi:10.1136/bmj.320.7237.768

 

References:

1. Reason J. Human error: models and management. BMJ. 2000;320(7237):768‐770. doi:10.1136/bmj.320.7237.768

 

2. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998;316(7138):1154-1157.

 

3. Cook R, Rasmussen J. "Going solid": a model of system dynamics and consequences for patient safety. Qual Saf Health Care. 2005 Apr; 14(2):130-4.

 

4. Sherman H, Castro G, Fletcher M, et al; World Alliance for Patient Safety Drafting Group. Towards an International Classification for Patient Safety: the conceptual framework. Int J Qual Health Care. 2009;21(1):2-8.

  

You can find out more about human error below:

 

https://www.youtube.com/watch?v=MfWpMrEOlJ8

 

http://aerossurance.com/helicopters/james-reasons-12-principles-error-management/

 

https://www.england.nhs.uk/wp-content/uploads/2013/11/nqb-hum-fact-concord.pdf

 

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3057365/
