Improving System Safety With Human Factors Approach

Introduction

With the dynamic nature of technology, many safety-critical systems are being developed with the clear intention of improving service delivery and, in other cases, the speed and quality of production. Rail, avionics, medicine, and automotive are some of the domains where safety-critical systems are commonly used. However, several variables can interfere with the reliability of these systems. The saying "to err is human" is in most cases associated with accidents. Human errors are inevitable because the systems are designed by humans and used by humans. In this regard, minimizing the occurrence of human errors is vital during system design, and this is most easily achieved when an analysis of human factors is conducted before the system is designed. Human error can be viewed from two different angles: the person view and the system view. The person view focuses on errors that result from carelessness and could easily have been avoided by the perpetrator. The system view, on the other hand, concerns poor system design that readily allows errors and makes them very hard for users to avoid. This report focuses on considering human factors as a way of improving system safety from these two perspectives. It presents cases where human factors have rendered systems unsafe and reviews researched approaches to reduce such safety breaches.


While the safety of safety-critical systems cannot be 100% guaranteed, the risk of failure should be minimized at all costs. Several studies have been conducted to establish the best approaches for achieving the safety of these systems. These approaches treat human error as a main factor, and the research conducted on system safety identifies the main areas where humans are the principal cause of unsafe systems.

In medicine, there are many areas where the safety of patients has been jeopardized by human mistakes. Betsy Lehman, the victim of a chemotherapy overdose, and Willie King, the victim of a wrong-leg amputation, are two major examples of system safety failures that hit the news headlines. According to research [9], 6,000 Americans lose their lives annually as a result of workplace injuries, and another 7,000 die annually due to medication errors. These errors cause not only the loss of life but also the loss of money: the same research [9] estimates that approximately $17-$29 billion is lost in America as a result of preventable mistakes. This amount accounts for victims' disability, health care, lost personal income, and lost productivity. Research [9] also shows that in every 100 admissions, at least two patients fall victim to preventable adverse drug events. This indicates a great need to establish frameworks that will help improve the safety of health systems.


In the aviation industry, NASA identifies the human contribution to system safety as a challenge in ensuring the safety of aeronautical systems. According to Aviation Safety magazine, 90% of reported aircraft accidents are the result of pilot error. However, not all the blame lies with the pilot; other accidents result from errors by air traffic controllers, crew members, and mechanics. The common thread in these aviation problems is that the accidents result from human factors.

In the rail industry, there have been accidents resulting from poor communication among employees, inadequate safety awareness, and the neglect of human factors such as fatigue, mental wellbeing, and other conditions that can prevent a person from operating properly. More than 12 rail accidents in which trains carrying oil derailed and caught fire have occurred since 2012. The most disastrous occurred in 2013 at Lac-Mégantic, Quebec, where a train carrying crude oil derailed and exploded, causing the loss of 47 lives and the destruction of approximately 30 buildings. Much of the blame was laid on the DOT-111 tank cars used to carry the oil, which were almost immediately replaced with the DOT-117 cars considered to be safer. Although some train accidents result from technical issues and the materials used to build the tank cars, humans are still partly to blame, because humans are involved in the design and manufacturing.


All these tragic occurrences due to human error prompted serious research on the best approaches to reduce system safety issues resulting from human factors. Reason's Swiss Cheese Model is one of the proven approaches to explaining the causes of human error in system safety. It traces human failures from the top organizational level down to the lowest level, where the accidents occur. The model identifies four major human factors as the root causes of the accidents experienced in these systems: organizational influences, operators' unsafe behaviors, preconditions for the unsafe behaviors, and poor supervision. According to the model, these four factors are interconnected such that a failure in one factor leads to failure in another. For instance, organizational influences lead to poor supervision, which creates the preconditions that encourage operators' unsafe behaviors, which eventually cause accidents. However, the model does not give a detailed description of these factors, making it less reliable for establishing safe systems. For this reason, the Human Factors Analysis and Classification System (HFACS) approach was introduced.
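As a rough illustration of the layered-defense idea behind the model, the Python sketch below estimates how rarely an accident slips through all four levels at once. The layer names follow the four factors above, but the failure probabilities are hypothetical, and the layers are treated as independent for simplicity, unlike the interconnected factors the model actually describes.

```python
import random

# Illustrative sketch of Reason's Swiss Cheese Model: an accident reaches the
# operational level only when a "hole" (failure) lines up in every defensive
# layer. The probabilities below are hypothetical values for illustration only.
LAYERS = [
    ("organizational influences", 0.10),
    ("unsafe supervision", 0.15),
    ("preconditions for unsafe acts", 0.20),
    ("unsafe acts of operators", 0.25),
]

def accident_occurs(rng: random.Random) -> bool:
    """Return True only if every layer fails on this trial (the holes align)."""
    return all(rng.random() < p for _, p in LAYERS)

def estimate_accident_rate(trials: int = 100_000, seed: int = 1) -> float:
    """Monte Carlo estimate of how often all four layers fail together."""
    rng = random.Random(seed)
    return sum(accident_occurs(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    # With independent layers the expected rate is the product of the
    # individual probabilities: 0.10 * 0.15 * 0.20 * 0.25 = 0.00075.
    print(f"Estimated accident rate: {estimate_accident_rate():.5f}")
```

The point of the sketch is that weakening any single layer (raising its probability) multiplies directly into the overall accident rate, which is why the model directs attention to every level rather than only to the operator.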

Literature Review

HFACS derives 19 causal categories from the four human factors of the Swiss Cheese Model. The framework attempts to establish the exact cause of the problem that needs to be dealt with rather than whom to blame [10]. It uses historical occurrences to establish the level at which the failure occurred among the four Swiss Cheese Model factors. Like the Swiss Cheese Model, it traces a problem from the top level to the lowest level to establish the point at which the human failure occurred. This allows a similar occurrence to be prevented by taking corrective measures at an early stage, before the problem becomes uncontrollable. In this framework, system safety problems are contained early, minimizing the number of accidents that can occur.
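The sketch below illustrates this idea of classifying an investigation finding into one of the four levels. The category lists are a paraphrased, representative subset of the 19 HFACS causal categories, and the keyword matching is an illustrative stand-in for the judgement an HFACS analyst applies to a real incident report, not part of the official framework.

```python
from __future__ import annotations

# A small, paraphrased subset of the HFACS causal categories, grouped under
# the four Swiss Cheese Model levels.
HFACS_TAXONOMY = {
    "organizational influences": [
        "resource management", "organizational climate", "organizational process",
    ],
    "unsafe supervision": [
        "inadequate supervision", "failure to correct a known problem",
    ],
    "preconditions for unsafe acts": [
        "adverse mental state", "physical environment", "crew resource management",
    ],
    "unsafe acts": [
        "skill-based error", "decision error", "violation",
    ],
}

def classify_finding(finding: str) -> tuple[str, str] | None:
    """Map an investigation finding to a (level, causal category) pair by
    keyword, or return None so the finding can be reviewed manually."""
    text = finding.lower()
    for level, categories in HFACS_TAXONOMY.items():
        for category in categories:
            if category in text:
                return level, category
    return None

if __name__ == "__main__":
    findings = [
        "Crew fatigue suggests an adverse mental state before departure.",
        "Supervisors showed a failure to correct a known problem with the checklist.",
        "The pilot made a decision error during the go-around.",
    ]
    for finding in findings:
        print(classify_finding(finding), "<-", finding)
```

Grouping findings this way is what lets an organization see whether failures cluster at the supervisory or organizational level rather than stopping the analysis at the operator.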

Another suitable yet simple approach to system safety is the training of employees. Training is a common practice in most organizations, but it is mostly focused on performance in employees' specific roles, with very little attention given to system safety. Organizations often assume that employees will read the safety practices from the default safety guidelines the organization has set out and adhere to them. Some organizations wait until a disaster has occurred before training their employees on the best safety practices. Safety practices should instead be emphasized from the outset, before a disaster occurs, in order to reduce the chances of a disaster occurring or to reduce its impact when it does.

The Systems Engineering Initiative for Patient Safety (SEIPS) model is a systems approach used to ensure the safety of patients in hospitals. The framework is a three-phase model: the first phase is the work system, the second the processes, and the third the outcomes. The work system phase covers the sociotechnical aspects of the work system, with the person at the center of five other interacting components: the external environment, the internal environment, tools and technologies, tasks, and the organization. It focuses on how these five components interact with and affect the person. Placing the person at the center indicates that systems should be created to support people, not to compensate for or replace them. At this phase, the model emphasizes that the person should be taken into consideration during system analysis and design. "Person" refers not only to the patient but also to the doctors and any other human linked to the hospital in one way or another.

Description of Research and Approaches Advocated to Improve System Safety

SEIPS depends on the analysis of the work system, which involves the people, the processes, and the outcomes of those processes, in order to arrive at appropriate solutions to human errors in healthcare. The processes phase involves the professional work done by medical professionals in attempting to treat patients. It combines social, physical, and cognitive work that must come together for the wellbeing of the patient, and it is at this stage that human error is likely to occur if the work system phase was not well designed. The outcomes phase captures the results of the processes phase; if the processes were characterized by error, the errors are reflected here. In SEIPS, as in HFACS, the phases are interconnected, so a problem in the first phase is reflected in the final result. Therefore, when the results at the outcomes phase are unsatisfactory, the problem is traced back to the work system phase, making it an iterative framework. SEIPS can also be described using three terms: configuration, which concerns the interacting parts of the system; engagement, which involves both the professionals and non-professionals who are part of the patient's recovery process; and adaptation, which provides the feedback mechanism of the entire process. The framework identifies not only human factors affecting system safety but also design, software, and management factors.
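The sketch below captures this three-phase, iterative structure in code. The component names mirror the SEIPS description above, while the healthcare example values and the simple feedback function are hypothetical illustrations rather than part of the published model.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Work system: the person at the centre of five interacting components.
@dataclass
class WorkSystem:
    person: str
    tasks: list[str]
    tools_and_technologies: list[str]
    organization: str
    internal_environment: str
    external_environment: str

# Processes: the work carried out within the work system.
@dataclass
class ProcessStep:
    description: str
    contributing_factors: list[str] = field(default_factory=list)

# Outcomes: the results of the processes.
@dataclass
class Outcome:
    description: str
    satisfactory: bool

def adaptation_feedback(steps: list[ProcessStep], outcomes: list[Outcome]) -> list[str]:
    """Adaptation loop: when any outcome is unsatisfactory, collect the
    contributing factors recorded in the processes so the work system can be
    redesigned around them."""
    if all(outcome.satisfactory for outcome in outcomes):
        return []
    return [factor for step in steps for factor in step.contributing_factors]

if __name__ == "__main__":
    # Hypothetical example of one pass through the three phases.
    work_system = WorkSystem(
        person="nurse administering chemotherapy",
        tasks=["verify dose", "administer infusion"],
        tools_and_technologies=["infusion pump", "electronic order entry"],
        organization="oncology unit staffing policy",
        internal_environment="noisy, interruption-prone ward",
        external_environment="regulatory reporting requirements",
    )
    steps = [ProcessStep("dose verification",
                         contributing_factors=["interruptions during double-check"])]
    outcomes = [Outcome("patient received the wrong dose", satisfactory=False)]
    print(adaptation_feedback(steps, outcomes))  # factors to address in redesign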

The HFACS framework is widely applied in the aviation industry. It was designed specifically for aviation, but it can be applied in other industries such as rail, healthcare, mining, marine, and construction. The framework was first used by the US Navy to solve the aviation challenges it was experiencing. The Navy had a high rate of aviation accidents resulting from human factors, and the framework enabled it to determine that the accidents were the result of violations of required routines by the responsible parties. This allowed the Navy to take appropriate, informed actions to prevent the accidents and to significantly reduce the percentage of its aviation accidents. The HFACS framework is also applied in the marine industry. Even though marine disasters may not be directly linked to humans, the framework attempts to establish where human factors connect to the system elements. This has greatly assisted the marine industry in risk management and in reducing the losses resulting from human error.

The SEIPS model was designed specifically for use in the health sector, particularly surgery. It was applied to outpatient surgery services at five centers in Madison. These were pilot programs intended to establish the effectiveness of the framework in pointing out safety threats in an organization. The framework's main role was to act as an assessment guide for the processes, systems, and outcomes of outpatient surgery. The assessment results would then be used to design interventions for the identified flaws.

Comparison of the Approaches

The approaches to system safety discussed above are all viable, but some are better suited to particular contexts than others. All of them focus on human safety while also considering the contribution of those same humans to the safety problems of the systems. Each approach was initially focused on one particular industry, but its ideas can be applied to other industries with safety-critical systems.

For instance, HFACS was found to be effective in the aviation industry, and its success is evident in US Navy aviation. SEIPS focuses particularly on the health sector; its research basis is hospital operations, technology, processes, and people. However, the same concept can be applied in other industries, particularly construction and mining, where people, tools, and technology are heavily involved.

Training as an approach to ensuring the safety of safety-critical systems is a basic one. It needs to be backed by another approach such as HFACS, SEIPS, or Reason's Swiss Cheese Model to produce the desired results, and it may only be fully effective where systems have been built with a low probability of error and only easily avoidable errors are possible.

Although SEIPS and HFACS both promote system safety by ensuring the safety of humans, they vary in how they approach the issues. SEIPS takes the broader perspective, covering the interaction of people with tools and technology, the internal and external environment, and the organization. HFACS, on the other hand, focuses solely on the role of humans in system safety. It shows that the behavior of top management affects everyone below them, including the operations personnel who interact directly with the safety-critical systems. According to the framework, poor exercise of authority can jeopardize the safety of safety-critical systems, and poor training of the operations team can lead to wrong operation of the system and hence to accidents.

Therefore, the HFACS framework is suitable for organizations whose system safety depends heavily on employee behavior and that have a strong desire to improve their workforce. The SEIPS model, on the other hand, is suitable for organizations that intend to improve overall system safety by considering the other factors in addition to human factors. Either framework can be used alongside training of employees on system safety measures for optimum results, and both are applicable depending on the needs of the organization.

References

Belvoir Media Group. (2018). Aviation Safety [online]. Available: https://www.aviationsafetymagazine.com/issues/38_9/

BMJ. (2000). Human Error: Models and Management [online]. Available: https://doi.org/10.1136/bmj.320.7237.768

D. Mallard. (2016). The Human factor in Safety and Operations [Online]. Available: https://www.ehstoday.com/human-factor-safety

D. A. Wiegmann and S.A. Shappell, Engineering and Technology, London: Routledge, 2017.

Dekker, S. (2001). The re-invention of human error. Human Factors and Aerospace Safety, 1, 247–266.

FAA. (2000, Dec. 30). Human Factors Engineering and Safety Principles and Practices [Online]. Available: https://www.faa.gov/regulations_policies/handbooks_manuals/aviation/risk_management/ss_handbook/media/Chap17_1200.pdf

HFACS. (2014). The HFACS Framework [Online]. Available: https://www.hfacs.com/hfacs-framework.html

Komaki, J., Heinzmann, A. T., & Lawson, L. (1980). Effect of training and feedback: Component analysis of a behavioral safety program. Journal of Applied Psychology, 65(3), 261-270. https://dx.doi.org/10.1037/0021-9010.65.3.261

L.T. Kohn, J.M. Corrigan and M.S. Donaldson (2000). To err is Human: Building Safer Health Systems [Online]. Available: https://books.google.co.ke/books?hl=en&lr=&id=Jj25GlLKXSgC&oi=fnd&pg=PT25&dq=describe+how+system+safety+can+be+improved+with+the+adoption+of++new+enhanced+approach+that+focus+inherently+safe+design+&ots=bIkamnQ55E&sig=mo83QV3LmdUT25v10PE2A0KNYOw&redir_esc=y#v=onepage&q&f=false

M. Celik and S. Cebi, “Analytical HFACS for Investigating Human Errors in Shipping Accidents,” Accident Analysis and Prevention, vol. 4, pp. 66-75, Jan. 2009.

National Research Council (1997). Flight to the Future: Human Factors in Air Traffic Control. Washington, DC: National Academy Press

NCBI. (2005). Work System Design for Patient Safety: The SEIPS Model [Online]. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2464868/

NCBI. (2014). Human factors systems approach to healthcare quality and patient safety [Online]. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3795965/

SAFESTART. (2018). 4 Railway Safety Concerns [Online]. Available: https://safestart.com/news/4-railway-safety-concerns/

SafetyWorks. (2013). Managing Safety and Health [Online]. Available: https://www.safetyworksmaine.gov/safe_workplace/safety_management/

SAH. (2018). Safety and Health Management System [Online]. Available: https://www.hsa.ie/eng/Topics/Managing_Health_and_Safety/Safety_and_Health_Management_Systems/

Slack Davis Sanger. (2018). Air Craft Accidents Due to Human Error [Online]. Available: https://www.slackdavis.com/blog/aircraft-accidents-due-to-human-error/

Z. Jiale, “An Ontological Approach to Safety Analysis of Safety-Critical Systems,” Ph.D. dissertation, Embedded Systems, Dept. Softw. Eng., Mälardalen Univ., Västerås, Sweden, 2017.