The importance of organizational culture in accidents: an insight into the human factor and the impact of automation

The article aims to describe the dynamics of accident events, taking into account the relationships between individual factors and organizational culture

Article ID: 164537 - Published: 3 May 2019

Risk management is the process of planning, organizing, directing and controlling an organization's human and material resources. Nowadays it is an aspect that cannot be ignored.


The present article is based on James Reason's work “Managing the Risks of Organizational Accidents” and aims to describe the dynamics of accident events, taking into account the existing relationships between individual and organizational factors. The theme emphasizes the arrival of automation and its consequences for the performance of tasks in highly hazardous jobs. The role of organizational culture in organizational failures will then be analyzed, and finally some considerations on the importance of organizational culture in the occurrence of accidents will be proposed.

Introduction

Nowadays, safety management is what sustains the performance of high-risk jobs and allows them to be carried out at all. Risk management is the process of planning, organizing, directing and controlling an organization's human and material resources. The objective of safety management is to mitigate the effects of the risks that may arise in the production of goods or services. Currently, automation provides important help in improving safety, but it also reveals some peculiarities that must be studied carefully to mitigate their effects and prevent safety levels from decreasing instead of increasing.

1. The impact of automation on highly hazardous jobs

The risk of an accident occurring during a productive activity is connected to the dangers inherent in the work tasks and in the environment or workplace itself. Today these tasks and environments are very different from those of ten or fifteen years ago: highly dangerous jobs have gradually incorporated technologies of increasing complexity.

The technologies that support highly dangerous work are constantly evolving, as are the characteristics of the work itself. The automation of tasks that require great concentration and attention, for instance in aviation (i.e., on the flight deck), has reached very complex levels. Although technology has not yet completely replaced the human operator, most calculation and precision activities are automated. With the emergence of automation in organizations, some problems are certainly solved, but the operation of those systems becomes more and more intricate.

Thus, when we think about gaining benefits such as increased efficiency and reduced costs, we must not forget that the human factor has not disappeared; on the contrary, it has a very large impact on the functioning of those systems.
According to David Woods and Nadine Sarter (1992; 1993), automated systems can increase the demands on their operators' memory, raising workload, stress and anxiety. In fact, Mulder's definition of workload (1980) reflects the number of stages of a process, or the number of processes required, in relation to the time needed to perform a task correctly.

For this reason, it may seem that human beings no longer play such an active role in the performance of complex tasks, in which operations are carried out through attention focused on precise machines at precise moments. Consider a practical example: on the flight deck, the mental load of horizontal navigation has decreased considerably thanks to computer control systems, but the load related to vertical navigation, which is more complex and dangerous, can increase significantly.

This brings us to the ironies identified by Lisanne Bainbridge (1987) with respect to the automation of complex tasks. Many years of accumulated experience, particularly in aviation, nuclear energy and the chemical industry, show that heavy automation sometimes shifts costs rather than cutting them radically. The ironies discussed by Bainbridge refer to the fact that the automation of some tasks makes some errors disappear, but at the same time introduces new types of errors.

Later on we will focus on how the perspective on the causes of accidents is shifting, day by day, towards the weight of organizational rather than individual factors.

2. The “Swiss cheese” model by James Reason

The idea that errors and accidents are generated by a human error and/or a technical failure is based on Newtonian-Cartesian dualism, which is inadequate when we talk about the complex events that happen in organizations (Dekker, 2005). On the basis of this dualistic conception, the mental world is separated from the material world (Descartes) and each event must have one and only one cause (Newton). However, empirical research in recent decades has widely demonstrated that a conception based only on human error does not match the complexity of the events it tries to explain; and if the analysis is not adequate, the solutions will not be identified either.

Different explanatory models have been developed for the motivations and causes of accidents and disasters in organizations. Each model has its own frame of reference and its own conception of error and accident, and promotes a practice of safety consistent with its implicit assumptions. The “systemic and organizational” model of James Reason (1997) promotes learning and is based on the principle that human error is inevitable; for this reason, although we cannot change human nature, we can change the conditions of the environments in which humans work. This view considers operators as the inheritors of system defects and proposes to increase the safety and reliability of organizations.

This conception of James Reason takes the shape of a model that explains accidents in organizations by taking into account active failures, which have a direct impact on the safety of the system. Active failures are actions or omissions, including errors and infractions, that have immediate adverse consequences. These failures are usually attributable to the personnel located at the human-system interface: the “frontline” personnel.

However, the human contribution is not limited to actions close to the accidental event. Strategic decisions made at high hierarchical levels, such as those of public administrators, managers, manufacturers and designers, are conceptualized as precursors of the latent conditions present in all systems. These conditions, in contrast to active failures, whose effects are immediate, require a long time before a damaging result is experienced.

When we analyze the genesis of an accident according to this model, we have to take into account that accidents are not caused by a single error; on the contrary, they are the result of a concatenation of “active failures” and “latent conditions”. From these considerations, James Reason's “Swiss cheese” model of defenses assumes the existence of layers of defense, at the different levels of action, that separate the hazards from the outcome. In this representation, each layer has holes whose position varies depending on local conditions; the accident occurs when those holes line up and open a window that leads directly to the accident.
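The alignment of holes across defensive layers can be sketched as a small Monte Carlo simulation. This is only an illustrative toy, not from Reason's text; the function name, per-layer probabilities and independence assumption are all assumptions of the sketch.

```python
import random

def simulate_swiss_cheese(hole_probs, trials=100_000, seed=42):
    """Estimate how often an accident trajectory passes all defenses.

    hole_probs: for each defensive layer, the probability that a
    "hole" (a local weakness) is open at a given moment.  Under the
    model, an accident occurs only when the holes in every layer
    happen to line up at the same time.
    """
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in hole_probs)
        for _ in range(trials)
    )
    return accidents / trials

# Four independent layers, each weak 10% of the time: the chance
# that all holes line up at once is about 0.1 ** 4 = 0.0001.
rate = simulate_swiss_cheese([0.1, 0.1, 0.1, 0.1])
```

The point the toy makes is Reason's: even mediocre layers, stacked, make the aligned-holes event rare, which is why single-cause explanations of accidents mislead.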

Awareness of the specific errors in risk management is essential for developing risk analysis actions. According to the literature there are different theories about fault typologies, but the main tendency is to refer to James Reason and his theory of errors. Among the factors that allow defenses to fail, human beings contribute through the production of active failures, typically at the frontline, while latent conditions are related to strategic, high-level organizational decisions.

3. Individual factors

The conception of human error proposed by James Reason is complex and well articulated. According to his vision, not all dangerous acts fall into the same category; errors take different shapes, have different psychological origins and occur in different parts of the system. The life of human beings, from a strictly psychological perspective, is based on decisions. This concept is reflected in the activities of a normal day: we get up and have to choose the clothes we are going to wear, thinking about the activities we are going to do (dressing more comfortably, more elegantly, etc.) or about the temperature outside our home (warmer clothes for low temperatures). These decisions occur naturally, without pressure, because, basically, they do not carry great risks. In high-risk jobs, decisions do not answer to criteria of comfort, but of efficiency and safety.

Making decisions, for a pilot, is a mental process that consists of choosing the most appropriate actions in certain situations, taking into account a very large number of inputs and pieces of information, provided by human beings or by automated machines. Based on the model proposed by Rasmussen (1987), James Reason distinguishes between failures of execution and acts that go wrong despite being performed according to intention, thus defining different types of errors (Reason, 1990). Assuming that all conditions favor the success of an operation, if an unfavorable event nevertheless occurs, slips or lapses may have taken place. Slips refer to attention- or perception-based failures in skilled performance (they occur in routine tasks, practiced in an automatic way, with only occasional verification of their progress); lapses, on the other hand, are more internal events, which generally involve failures of memory.
At a higher level there are mistakes, which arise from the mental processes involved in evaluating the available information.

In particular, mistakes are usually rule-based (when we need to modify our behavior to take changes into account and apply a familiar rule) or knowledge-based (when we have to think things through carefully and meticulously because no ready-made rule fits the situation).

In other words, the first class of errors refers to an incorrect application of rules and the second to the lack of the knowledge necessary to carry out an operation. In any case, all errors imply some kind of deviation: in slips and lapses the deviation is from the current intention, while in mistakes the deviation concerns the path taken to reach the goal. In these circumstances the operator is intentionally rational but, despite this, constrained by limited cognitive abilities and incomplete information. Attention, memory, understanding and communication shape decisions, and the result is that actions may not be rational, despite good intentions (March, 1994; Simon, 1947).
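Reason's distinctions above can be condensed into a small decision procedure. This is a simplified sketch of the taxonomy as summarized in this article; the enum values, flag names and return convention are illustrative assumptions, not Reason's own formalism.

```python
from enum import Enum

class UnsafeAct(Enum):
    SLIP = "attention/perception failure while executing a routine task"
    LAPSE = "memory failure: an intended step is forgotten or omitted"
    RULE_MISTAKE = "a familiar rule is applied to the wrong situation"
    KNOWLEDGE_MISTAKE = "no adequate rule exists and on-line reasoning fails"

def classify(plan_adequate: bool, executed_as_planned: bool,
             memory_failure: bool = False, rule_available: bool = True):
    """Rough decision procedure over Reason's distinctions.

    Execution failures of an adequate plan are slips or lapses;
    faithful execution of an inadequate plan is a mistake (rule-based
    if a known rule was misapplied, knowledge-based otherwise).
    """
    if plan_adequate and not executed_as_planned:
        return UnsafeAct.LAPSE if memory_failure else UnsafeAct.SLIP
    if not plan_adequate and executed_as_planned:
        return (UnsafeAct.RULE_MISTAKE if rule_available
                else UnsafeAct.KNOWLEDGE_MISTAKE)
    return None  # adequate plan, correct execution: no unsafe act here

# e.g. a forgotten checklist item with an otherwise sound plan:
act = classify(plan_adequate=True, executed_as_planned=False,
               memory_failure=True)
```

The deviation concept from the paragraph above is what the two `if` branches encode: deviation from intention (slips/lapses) versus deviation of the plan itself from the goal (mistakes).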

4. Difference between errors and infractions

The difference between errors and infractions is based on the concept of “intentionality”. Infractions are actions that involve the breaking of rules (or of codes of behavior shared within organizations). Infractions are usually intentional, but they do not always cause negative events. However, it is necessary to keep in mind that human beings do not plan and execute actions in an isolated environment but, on the contrary, in a well-defined context in which behavior is regulated by operational procedures, codes and rules. In this context, infractions are conceptualized as deviations from procedures or violations of the rules appropriate for operating safely.

In this regard, James Reason identifies three types of infractions, which take into consideration the level of intentionality of the individual acting:

  1. Routine infractions
    Routine infractions imply a faster way of acting and are usually a routine part of the behavioral pattern.
  2. Optimizing infractions
    They are committed for the advantage they bring and reflect the fact that human actions have a plurality of motivational purposes.
  3. Necessary infractions
    They are usually caused by organizational failures or by the workplace itself; they presuppose violating the rules in order to get the work done.

Analyzing the causes of these different types of infractions, we can notice that organizational factors are constantly present in their occurrence. Necessary infractions are a very clear example of how the organization of work plays a very important role in the occurrence of accidents and can be a precursor of organizational accidents. Routine infractions also have a direct link with an organizational culture that does not punish (or punishes wrongly) the infractions and thus feeds their frequency.

5. Organizational level factors

People act in a specific work environment, and the organizational level concerns the context in which individuals act when an accident occurs. The organizational level focuses on human-machine interactions, crew and group work, communication processes and operational coordination. These factors refer to dimensions that are far from the accident in time and space. This level matters for understanding the organizational processes, the activity systems, the strategies and the specific culture of each organization. More specifically, these factors are strictly related to the coordination and control system, training, the weaknesses of the defense system, management decisions, etc.

According to the sociologist Schein (1984), organizational culture is a coherent set of basic assumptions that a group shares and develops to face problems of external adaptation or internal integration. The culture continues to live in the organization because, over time, it has produced outcomes that were later considered successful. These procedures become “the way of doing things”: new members of an organization get to know them and put them into practice when a problem occurs.

In the field of safety in organizational cultures, Ron Westrum (1993) has classified organizational cultures according to the way in which they handle safety-related information. These cultures are reflected in three concepts that define them:

  1. Pathological
    It refers to a culture that actively discourages suggestions of new ideas, where responsibilities are shirked and failure is punished.
  2. Bureaucratic
    This culture is based on the compartmentalization of responsibilities, and power is usually distributed in a very hierarchical way within the organization. Failures are met with local repairs.
  3. Generative
    It generally receives new ideas openly and responds to failures with far-reaching reform.
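Westrum's three culture types can be condensed into a small lookup table, one entry per type, keyed by how each handles information and failure. The key names and phrasing are an illustrative condensation of the list above, not Westrum's own wording.

```python
# A compact summary of Westrum's (1993) typology as described above.
CULTURE_STYLES = {
    "pathological": {
        "new_ideas": "actively discouraged",
        "responsibility": "shirked",
        "failure": "punished",
    },
    "bureaucratic": {
        "new_ideas": "tolerated within compartments",
        "responsibility": "compartmentalized, hierarchical",
        "failure": "met with local repairs",
    },
    "generative": {
        "new_ideas": "received openly",
        "responsibility": "shared",
        "failure": "met with far-reaching reform",
    },
}

def response_to_failure(culture: str) -> str:
    """Look up how a given culture type responds to failure."""
    return CULTURE_STYLES[culture]["failure"]
```

The table makes the contrast that matters for safety explicit: only the generative row treats failure as information to act on rather than something to punish or patch locally.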

As Reason points out (p. 249), the culture of safety expresses shared values (what is important) and shared beliefs (how things work) and, interacting with the structures and control systems of the organization, produces norms of behavior (the way things are done) (Uttal, 1983). In an organization that respects safety, establishing a strong and informed safety culture is one of the primary objectives.

To understand why culture matters so much in the way highly dangerous jobs are performed, we refer to the concept of organizational culture described above. Culture, in fact, tells us the direction to choose in order to get things done according to the rules (for example, safety laws) and according to the standards of the organization (for example, the attention the company gives to quality). In an organization where mutual and reliable communication is not allowed (e.g., a pathological culture), latent conditions that result from slips, lapses or mistakes (for example in maintenance operations) can lead to accidents; but if they are communicated promptly, it is possible that they can be resolved.

So, more simply, we can say that in organizations with a culture where the fear of punishment is stronger than the fear of an accident, even if an operator identifies an error, he or she is unlikely to report it properly. Clearly these aspects are connected to how the organization handles guilt, punishment and the assumption of responsibilities. For this reason, promoting a generative culture is, without doubt, the most appropriate way for an organization to spread safety (Westrum, 1993).

Conclusions

The study of accidents in the transport industry and in nuclear or chemical plants has allowed a greater understanding of the causes of accidents, with an ever-stronger focus on pre-existing organizational factors. In fact, the fallible decisions of the highest levels of management are transmitted through the different departments and services of the organization to the “frontline” personnel, creating tasks and contextual conditions that can promote unsafe acts. These conditions include factors such as high workload and mental fatigue, inadequate knowledge, skill and experience, abrupt changes in the organization of work, an inadequate communication system, etc.

These factors can influence the performance of operators and increase the likelihood that an accident will occur. Clearly, for managers and those in charge of resources (human, technological, economic, etc.) it is very difficult to foresee the consequences of their own actions on the safety of the entire system and the impact they can have on the decisions of “frontline” operators, especially because they are located far away in the hierarchy.

For this reason, we can say that talking about errors in work environments also means talking about tolerance of errors, and it is important not to forget that human actions can only be understood within a “human environment”; consequently, this context represents one of the causes of human errors. An effective antidote to these failures is a strong and generative safety culture, which stresses the repercussions of one's actions on other people.

In this perspective, which moves away from the individual alone, close to the places where active failures occur, and focuses instead on the activity of all the members of the organization, it is important that responsibility for safety be shared among all company personnel. In fact, effective error management is based on continuous verification of the system and of the work environment, which makes errors visible so that they can be corrected immediately.


Bibliography

  • Bainbridge, L. (1987). Ironies of automation. In J. Rasmussen, K. Duncan and J. Leplat (eds.), New Technology and Human Error. Chichester: Wiley, pp. 271-283.
  • Dekker, S.W.A. (2005). Ten Questions about Human Error. London: Lawrence Erlbaum Associates.
  • March, J.G. (1994). A Primer on Decision Making: How Decisions Happen. New York: The Free Press.
  • Mulder, G. (1983). The heart of mental effort. Groningen (lecture). Cited in Drenth, P.J.
  • Reason, J. (1997). Managing the Risks of Organizational Accidents. Aldershot, Hampshire: Ashgate Publishing Limited.
  • Sarter, N.B., Woods, D.D. (1992). Mode error in the supervisory control of automated systems. In Proceedings of the Human Factors Society 36th Annual Meeting, Atlanta, GA.
  • Schein, E.H. (1984). Coming to a new awareness of organizational culture. Sloan Management Review, 25.
  • Simon, H. (1947). Administrative Behavior. New York: Macmillan.
  • Uttal, B. (1983). The corporate culture vultures. Fortune, 108(8), 66-72.
  • Woods, D.D. (1993). The price of flexibility. In Hefley, W. and Murray, D. (eds.), Proceedings of the International Workshop on Intelligent User Interfaces. ACM.
  • Westrum, R. (1993). Cultures with requisite imagination. In Verification and Validation of Complex Systems: Human Factors Issues (pp. 401-416). Springer Berlin Heidelberg.
State of Mind © 2011-2019 All rights reserved.