Human Error

Issues relating to good tactical leadership and planning can be found here.


Human Error

Post by Ryan » Mon Jan 27, 2014 2:58 am

Leadership: Human Errors.

Some notes on human errors and error accountability.

The Emergency Environment

Environment – increased noise, reduced light, conflicting information; task priorities must be allocated in real time
Opportunities – short-term opportunities appear quickly; they need immediate intervention and thus immediate processes
Memory – recall worsens, recognition worsens, and muscle memory can become an over-relied-upon way of operating
Leadership – demand increases, snap decisions are needed, and there is less time to think; decisions usually improve when combined with experience


There are multiple types of errors; the overarching categories are:
1. Active errors.
2. Latent errors.

Active errors [sometimes called actual errors] are identifiable errors with real-time or known consequences, for example leaving the gas on in the house and then lighting a match. The error is identifiable, has consequence and thus weight, and, more worryingly, could have been fixed with little effort. In leadership, active errors occur on a near-daily basis and are possible with every order given. An example is beefing up one unit over another in preparation for a task, knowing full well that the other unit will be less prepared as a direct result of your order. Another is beefing up one unit but not supplying it with what the extra numbers require. If this goes unaccounted for, it is an active error, because your own direct decisions caused it. Call it an error in judgement, but that is exactly what happens when the knock-on effects are not accounted for.

Latent errors are harder to identify. They are usually an accumulation of small errors that combine and relate only indirectly to the eventual problem. For example, as a paramedic you check the vehicle every day. Today your check is half done when you get a call-out. You quickly check the priority components: the oxygen looks fine; the primary kit and medications look good. You arrive on scene. The patient is unconscious, and during the resuscitation you realize the oxygen in the carry-kit is empty. Time to move them to the ambulance; simple problem solving, right? You get there, and within 10 minutes that oxygen runs out too. You call for back-up. Luckily the patient starts to waken, but is still ALOC and complaining of pain. You go to your drug kit and guess what... no morphine! Latent errors add up. They are usually small and OFTEN missed. They can sit there for weeks until suddenly error after error appears.

Error Cascading

As mentioned with latent and active errors, once one error appears it tends to highlight others. When the two error types combine, things get really messy. Hazards usually do not exist in isolation; the stars line up, and before you know it the errors have converged on you. This is known as the "Swiss cheese" model of accident causation, formally the "cumulative act effect".

For example:

It's getting dark. The weather is worsening. You cannot handle the ongoing activity. You do not have night gear. You need to eat soon. You have no experience of night trekking in such conditions.

If you decide to press on anyway, you have ignored all the errors that will eventually accumulate into one giant problem.
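To make the "cumulative act effect" concrete, here is a minimal sketch (my own illustration, not from the post): each defensive layer blocks a hazard with some probability, and an accident occurs only when the holes in every layer line up.

```python
import random

def accident_occurs(block_probs, rng):
    # A hazard causes an accident only if it slips past EVERY layer;
    # each layer independently blocks it with probability p.
    return all(rng.random() > p for p in block_probs)

def accident_rate(block_probs, trials=100_000, seed=1):
    # Monte Carlo estimate of how often hazards get all the way through.
    rng = random.Random(seed)
    hits = sum(accident_occurs(block_probs, rng) for _ in range(trials))
    return hits / trials

# One 90%-effective defence lets roughly 1 in 10 hazards through;
# four stacked defences let only roughly 1 in 10,000 through.
print(accident_rate([0.9]))
print(accident_rate([0.9, 0.9, 0.9, 0.9]))
```

The trekking example is the same arithmetic in reverse: each missing layer (daylight, gear, food, experience) multiplies the odds of the holes lining up.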

Load Theory

Sometimes error cascading can be traced to one person in the team, usually someone in a leadership position with many 'stressors' contributing to decisions at once. This is the domain of cognitive load theory. Accounting for these human factors allows us to counter bad decisions before they are made. The OODA loop and the KISS principle help diminish this during times of cognitive [information] overload that demand immediate action.

Situational awareness, battlefield [big-picture] awareness, task fixation, and peripheral versus central vision recognition are all major topics here. Fixation to our front makes us miss what is above us, below us, behind us, and to our obliques. "Monitoring" in this regard is situational awareness. We are wired to fail: complex information brings traps. This is not about intelligence; it is about limitations and the strategies around them. Two conflicting outcomes or options form a dilemma... you have to account for these on top of everything else going on.

This especially happens to medics: they fixate on one thing, for example the bleeding, and overlook additional precautions, such as allergies to the medication they are giving the patient.

Risk Mitigation

So how do we prevent error cascading?
Risk mitigation and golden rules.

Typical risk mitigation is a process of:

Identifying the Risk.
Eliminating the Risk.
Substituting a Task around the Risk.
Engineering Solutions around the Risk.
Adapting Behaviour to the Risk.
Adopting PPE to the Risk.
Further Contingency Planning.
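The steps above follow a priority-ordered hierarchy of controls: work down the list and apply the strongest feasible control, keeping contingency planning regardless. A hypothetical sketch (the control names are mine, for illustration):

```python
# The hierarchy of controls as an ordered list;
# controls earlier in the list are preferred.
HIERARCHY = [
    "eliminate",    # remove the hazard entirely
    "substitute",   # work around the hazard a different way
    "engineer",     # engineered barriers around the hazard
    "behaviour",    # adapt how people act around the hazard
    "ppe",          # personal protective equipment (last line of defence)
]

def select_control(feasible):
    """Return the strongest feasible control for a risk.
    `feasible` is a set of control names that are possible here."""
    for control in HIERARCHY:
        if control in feasible:
            return control
    return "contingency-only"  # nothing feasible: plan for the consequence

print(select_control({"ppe", "behaviour"}))  # behaviour outranks PPE
print(select_control(set()))                 # contingency-only
```

The point of the ordering is that the further down the list you land, the more the outcome depends on fallible humans rather than on the hazard being gone.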

Golden rules:

Always operate within your limits.
Sleep when you can.
Eat and drink when you can.
Obtain knowledge and experience before you go.
Maintain fitness.
Communicate regularly.
Metacognition... think about your thinking.
And best of all - build a culture of safety and control measures.

Control measures and safety measures must:

Be robust, resilient methods of operating.
Be firm foundations for innovation, presented with confidence.
Flatten the hierarchy to allow decisions by any member. Keep roles dynamic. Hierarchy gradients shift.
Open channels of communication and keep them open. Allow for discussion, even friction, to limit animosity.
Include well-defined roles and best practices, held to a standard. No 'filler' information.
Share responsibility to all involved.
Train teams not just individuals. This includes out of the comfort zone with each other.
Standardize until you absolutely have to improvise [there is increased risk with improvising, especially on the fly].
Understand the body. Being scared is normal; neurobiology underpins the decisions we make. Mistakes still happen on straightforward tasks, and sometimes it is NOT a cognitive failure.
Search for PROGRESS rather than BLAME. Search to CORRECT mistakes.

STOP Strategy

Stop, Think, Observe, Plan.

WIN Strategy

What's Important Now.


Other Industries - How do they account for human errors?

Surgery/Medicine Industry: Human Errors

Atul Gawande's checklist - a checklist for every component needed to conduct a task.
Flattening the hierarchy, especially in emergency surgeries.
Without adding drugs or technology we can save lives by waking up to the experience of others.
Simply adding better methods of doing things.
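The mechanism behind a checklist is deliberately simple: nothing proceeds until every required item is confirmed. A minimal sketch (the item names are invented for illustration):

```python
def run_checklist(items, confirmed):
    """Return the items still unconfirmed; the task may only
    proceed when this list is empty."""
    return [item for item in items if item not in confirmed]

# Hypothetical pre-operative checklist items.
PRE_OP = ["patient identity", "surgical site marked", "anesthesia check",
          "airway risk assessed", "antibiotics given"]

missing = run_checklist(PRE_OP, {"patient identity", "anesthesia check"})
print("proceed" if not missing else f"hold: {missing}")
```

The value is not in the code but in the discipline it encodes: memory under stress is not trusted, so the check is externalized.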

I'd like you to watch this video and recall the story told in it.

The Elaine Bromiley Case.

29th March 2005.
Anesthetist could not intubate.
Surgeon could not intubate.
ENT could not intubate.
25 minutes passed.
A nurse pre-emptively brought a tracheotomy kit and phoned the ICU.
Despite this, alternative solutions like a tracheotomy were overlooked.
Decision making was compromised as was cognition in this critical event.
Elaine died after being left in a coma for 13 days.

Aviation Industry: Human Error

Civil aviation.
Flight simulators, regular training, complex scenarios, all potential emergencies.
A subculture of being cautious and procedural.
Confirmation systems via a partner [cross-checking].
Checklists for vital system checks.
Human memory is fallible, so systems help; being process-driven yields better outcomes.

Firefighter Industry: Human Errors

Complex scenario-based training in real-time observed by supervisors keeping check.
Starts with basic scenarios and works up to multitudes of scenarios and complex environments.
Similar to the Military with training ranges and killhouses.

Formula One Industry: Human Errors

Pit crews function under high stress in real time, aiming to complete everything properly in limited time (under 6-8 seconds, sometimes under 4).
Multitudes of specialists come together around a single unit and perform a complex task in real time, in a short period.
Every member of the team has a very simple but very specific clearly defined task; one person in charge.

Military Industry: Human Errors

Hostage rescue training every day in killhouses with multitudes of scenarios that are outcome driven.
Allows autonomy but keeps it within boundaries.
OODA loop: Observe, Orientate, Decide, Act.
AARs, Update Briefs, BUBs.
Mission profiles and recon allow an estimation of what assets are needed.
A combination of most of the industries listed.
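The OODA loop named above can be sketched as a plain control loop (a hypothetical skeleton of mine, not doctrine): observe, orientate, decide, act, then observe again.

```python
def ooda_loop(observe, orient, decide, act, done):
    """Run Observe-Orientate-Decide-Act until `done(state)` is true.
    Each stage is a caller-supplied function; state flows through them."""
    state = observe()
    while not done(state):
        picture = orient(state)   # fuse observation with experience/context
        action = decide(picture)  # choose among the available options
        act(action)               # execute, changing the environment
        state = observe()         # re-observe and go around again
    return state

# Toy usage: the "environment" is a counter we drive down to zero.
env = {"n": 3}
result = ooda_loop(
    observe=lambda: env["n"],
    orient=lambda n: {"remaining": n},
    decide=lambda picture: "decrement",
    act=lambda action: env.__setitem__("n", env["n"] - 1),
    done=lambda n: n == 0,
)
print(result)  # 0
```

The structural point is the re-observation after every act: autonomy stays within boundaries because each decision is checked against a fresh picture, not a stale one.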
CQB-TEAM Education and Motivation.

"Pragmatism over theory."
"Anyone with a weapon is just as deadly as the next person."
"Unopposed CQB is always a success, if you wanted you could moonwalk into the room holding a Pepsi."
