Most accidents are caused by human error: 75-90%
How? They are not. It's a design problem. There must be another underlying factor.
When an accident is thought to be caused by people, we blame them and then continue to do things just as we have always done.
Why does investigation not work?
- Most accidents do not have a single cause; there are multiple things that went wrong
- It stops as soon as a human error is found
Discovering a human error is just the beginning of the analysis
Five Whys, keep asking "why"
People also tend to blame themselves: "It was my fault, I knew better." - not helpful for preventing the error
People deliberately push themselves beyond reasonable limits:
- stay up far longer than is natural
- try to do too many things at the same time
- drive faster than is safe
Most of the time we manage okay - we are even rewarded and praised for our heroic efforts
But when things go wrong and we fail, the same behaviour is blamed and punished
People knowingly take risks
- Drive too fast in the snow
- Agree to do some hazardous act even though you think it foolhardy
- Drive with too little sleep
- Work with coworkers even though you are ill (potential infection risk)
- Go through a red light
Slips
- A slip occurs when a person intends to do one action and ends up doing something else
- Action-based
  - e.g., I poured some milk into my coffee and then put the coffee cup into the refrigerator
- Memory-lapse
  - e.g., I forgot to turn off the gas burner on my stove after cooking dinner
Mistakes
- A mistake occurs when the wrong goal is established or the wrong plan is formed
- Rule-based
- Knowledge-based
  - e.g., the weight of fuel was computed in pounds instead of kilograms
- Memory-lapse
  - e.g., a mechanic failed to complete troubleshooting because of a distraction
Quite often the interference comes from the machines we are using: the many steps required between start and finish of the operations can overload the capacity of short-term memory
Mode errors: the confusion that results when operators believe the system to be in one mode, when in reality it is in another
- e.g., an alarm clock's a.m. and p.m.
- e.g., vim's input mode, set paste, etc.
Mode error is design error. Designers must try to avoid modes. Modes need to be visible.
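A minimal sketch of "modes need to be visible" in software terms, using an invented two-mode editor (the names here are made up for illustration, not taken from vim): the current mode is shown on every prompt and announced when it changes, instead of living only in hidden state.

```python
from enum import Enum

class Mode(Enum):
    NORMAL = "NORMAL"
    INSERT = "INSERT"

class TinyEditor:
    """Hypothetical editor used only to illustrate visible modes."""

    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self.buffer: list[str] = []

    def prompt(self) -> str:
        # The mode is shown on every interaction, so the user never has to
        # remember which mode the system is in (knowledge in the world).
        return f"[{self.mode.value}] > "

    def toggle_mode(self) -> None:
        self.mode = Mode.INSERT if self.mode is Mode.NORMAL else Mode.NORMAL
        # Announce the change immediately: feedback at the moment the mode flips.
        print(f"-- now in {self.mode.value} mode --")

    def handle(self, text: str) -> None:
        if self.mode is Mode.INSERT:
            self.buffer.append(text)
        else:
            print(f"(ignored in NORMAL mode: {text!r})")

editor = TinyEditor()
print(editor.prompt())   # [NORMAL] >
editor.toggle_mode()     # -- now in INSERT mode --
editor.handle("hello")
```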
People tend to rely upon remembered experience rather than on more systematic analysis
Trouble comes when the situation is mistakenly interpreted: e.g., a club caught fire, and the security guards didn't let people out because sometimes people leave without paying
This is difficult to avoid/detect
Designers should provide as much guidance as possible
Ideally these would be automated, but as problems become complex that gets hard; good procedural manuals will be helpful
Memory-lapse mistakes occur when an interruption makes us forget the evaluation of the current state of the environment
environment: The goals, plans, system status
Time and economic pressures: deadlines, bosses, parents, customers, costs
Checklists are useful. Working through one with two people is good; adding more people is bad because responsibility diffuses: "other people will check"
Social pressures often make it difficult for people to admit to their own errors. If people report their own errors, they might be fined or punished
Hospitals, courts, police systems, utility companies - all are reluctant to admit to the public that their workers are capable of error
We need to make it easier to report errors, for the goal is not to punish, but to determine how it occurred and change things so that it will not happen again
Toyota: jidoka, andon, poka-yoke - automate error detection and reporting, even though it stops production; then prevent the same error from recurring through error-proofing
NASA: semi-anonymous reporting, to avoid punishment
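A rough software analogy for the poka-yoke / jidoka idea above (the part names, tolerances, and log structure are invented for illustration): a check that refuses to pass a defective item downstream and records what happened for later analysis, rather than letting the error continue silently.

```python
from dataclasses import dataclass

@dataclass
class Part:
    part_id: str
    length_mm: float

class LineStopped(Exception):
    """Raised to halt the (toy) line when a defect is detected."""

defect_log: list[dict] = []   # stands in for an error-reporting system

def check_part(part: Part, expected_mm: float = 100.0, tolerance_mm: float = 0.5) -> Part:
    """Poka-yoke style gate: a defective part cannot move on unnoticed."""
    deviation = abs(part.length_mm - expected_mm)
    if deviation > tolerance_mm:
        # Record the error automatically so it can be analysed, not hidden.
        defect_log.append({"part": part.part_id, "deviation_mm": deviation})
        # Stop the line (jidoka): better a visible stoppage than a hidden defect.
        raise LineStopped(f"part {part.part_id} out of tolerance by {deviation:.2f} mm")
    return part

try:
    check_part(Part("A-17", length_mm=101.2))
except LineStopped as stop:
    print(stop)          # the andon moment: everyone can see something went wrong
    print(defect_log)    # the data needed to keep asking "why"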
- Understand the causes of error and design to minimize those causes
- Do sensibility checks. Does the action pass the "common sense" test? (sketch below)
- Is the number correct? e.g., USD 100 vs. JPY 10,000
- Make it possible to reverse actions - to "undo" them - or make it harder to do what cannot be reversed
- Make the item being acted upon more prominent: change the color, enlarge it
- Make the operation reversible: auto-save, so that closing a document by accident does not lose work
- Add constraints to block errors: a confirmation such as "Do you want to delete this file? Are you sure?"
- Make it easier for people to discover the errors that do occur, and make them easier to correct
- Don't treat the action as an error; rather, try to help the person complete the action properly. Think of the action as an approximation to what is desired
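A minimal sketch of two of the strategies above, sensibility checks and reversible ("undo") actions; the function names, threshold, and file-store class are invented for illustration.

```python
def passes_sensibility_check(amount: float, typical_amount: float, factor: float = 20.0) -> bool:
    # Flag an amount wildly out of line with what the user normally sends,
    # instead of silently accepting a likely currency or typing error.
    return amount <= typical_amount * factor

class FileStore:
    def __init__(self) -> None:
        self.items: dict[str, str] = {}
        self.trash: dict[str, str] = {}

    def delete(self, name: str) -> None:
        # Deletion is reversible: the file is only moved to the trash.
        self.trash[name] = self.items.pop(name)

    def undo_delete(self, name: str) -> None:
        self.items[name] = self.trash.pop(name)

store = FileStore()
store.items["report.txt"] = "contents"
store.delete("report.txt")          # a slip here costs nothing...
store.undo_delete("report.txt")     # ...because it can be undone
print("report.txt" in store.items)  # True

# 10,000 entered where 100 was meant (e.g. a JPY/USD mix-up) fails the check:
print(passes_sensibility_check(amount=10_000, typical_amount=100))  # False
```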
- Accidents usually have multiple causes, whereby had any single one of those causes not happened, the accident would not have occurred - Reason's Swiss Cheese Model of Accidents (toy calculation below)
- Add more slices of cheese
- Reduce the number of holes (or make the existing holes smaller)
- Alert the human operators when several holes have lined up
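A toy calculation of why the Swiss cheese advice works, assuming (unrealistically) independent layers of defense that each fail with some probability; the numbers are invented for illustration.

```python
def accident_probability(hole_probabilities: list[float]) -> float:
    """An accident gets through only if the holes in every slice line up."""
    p = 1.0
    for hole in hole_probabilities:
        p *= hole
    return p

layers = [0.1, 0.1, 0.1]                         # three defenses, each failing 10% of the time
print(accident_probability(layers))              # ~0.001
print(accident_probability(layers + [0.1]))      # add a slice of cheese  -> ~0.0001
print(accident_probability([0.05, 0.05, 0.05]))  # make the holes smaller -> ~0.000125
```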
The cost of an interruption is far greater than the time lost in dealing with the interruption itself: there is also the cost of resuming the interrupted activity
People can lose competency if sleep deprived, fatigued, or under the influence of drugs
The far greater percentage of accidents is the result of poor design, either of equipment or, as is often the case in industrial accidents, of the procedures to be followed
Sometimes the problems do not arise in the organization but outside it - natural disasters for example
The question is how to design and manage these systems so that they can restore services with a minimum of disruption and damage.
When automation works, it is wonderful, but when it fails, the resulting impact is usually unexpected and, as a result, dangerous
People: flexible, versatile, and creative
Machines: rigid, precise, and relatively fixed in their operations
Difficulties arise when we do not think of people and machines as collaborative systems
What we call "human error" is often simply a human action that is inappropriate for the needs of technology. As a result, it flags a deficit in our technology. It should not be thought of as error. We should eliminate the concept of error: instead, we should realize that people can use assistance in translating their goals and plans into the appropriate form for technology.
Given the mismatch between human competencies and technological requirements, errors are inevitable. Therefore, the best designs take that fact as given and seek to minimize the opportunities for errors while also mitigating the consequences.
Make actions reversible; make errors less costly. Here are key design principles:
- Put the knowledge required to operate the technology in the world
- Don't require that all the knowledge must be in the head
- Allow for efficient operation when people have learned all the requirements, when they are experts who can perform without the knowledge in the world, but make it possible for non-experts to use the knowledge in the world
- This will help experts who need to perform a rare, infrequently performed operation or return to the technology after a prolonged absence
- Use the power of natural and artificial constraints: physical, logical, semantic, and cultural
- Exploit the power of forcing functions and natural mappings
- Bridge the two gulfs, the Gulf of Execution and the Gulf of Evaluation
- Make things visible, both for execution and evaluation
- On the execution side: provide feedforward information - make the options readily available
- On the evaluation side: provide feedback information - make the results of each action apparent (sketch below)
- Make it possible to determine the system's status readily, easily, accurately, and in a form consistent with the person's goals, plans, and expectations
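A small sketch of the feedforward and feedback principles above, using an invented fan-speed setting as the example: the valid options are shown before the action, and the resulting state is reported right after it.

```python
VALID_LEVELS = ("off", "low", "high")   # invented example setting
state = {"fan": "off"}

def set_fan(level: str) -> None:
    # Feedforward: make the options readily available before the action.
    if level not in VALID_LEVELS:
        print(f"'{level}' is not a valid level; choose one of {VALID_LEVELS}")
        return
    state["fan"] = level
    # Feedback: make the result of the action apparent immediately.
    print(f"fan is now '{state['fan']}'")

set_fan("medium")   # Gulf of Execution: the valid options are shown, not guessed
set_fan("low")      # Gulf of Evaluation: the new status is visible right away
```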
We should deal with error by embracing it, by seeking to understand the causes and ensuring they do not happen again. We need to assist rather than punish or scold.