Safety Culture

Safety Culture Resources For Aviation And ALL High-Risk Industries!

What Is Safety Culture?

"Safety culture can be defined as the enduring value and priority placed on worker and public safety by everyone in every group at every level of an organization. It refers to the extent to which individuals and groups will commit to personal responsibility for safety; act to preserve, enhance, and communicate safety concerns; strive to actively learn, adapt, and modify (both individual and organizational) behavior based on lessons learned from mistakes; and be rewarded in a manner consistent with these values" (Zhang, Wiegmann, von Thaden, Gunjan, & Mitchell, 2002).

Several sentinel events in high-risk domains precipitated the need to address safety culture. Foremost was the world's worst nuclear power plant accident, at Chernobyl (in the former USSR) in 1986, which had caused 56 direct fatalities as of 2004, with further deaths, mostly from cancer, continuing to accrue. Coincidentally, that same year NASA lost the Space Shuttle Challenger shortly after liftoff, and all seven crewmembers perished. Extensive investigations revealed that both of these tragedies had human factors and safety culture-related underpinnings that contributed to the trigger events. However, it wasn't until the in-flight breakup of Continental Express Flight 2574 in 1991 that the aviation industry began to take a serious look at how a pathogenic safety culture can contribute to an aircraft accident. Among the findings of the National Transportation Safety Board (NTSB) investigation of Flight 2574 was the following statement of probable cause:

The failure of Continental Express maintenance and inspection personnel to adhere to proper maintenance and quality assurance procedures for the airplane's horizontal stabilizer deice boots that led to the sudden in-flight loss of the partially secured left horizontal stabilizer leading edge and the immediate severe nose-down pitchover and breakup of the airplane. Contributing to the cause of the accident was the failure of Continental Express management to ensure compliance with the approved maintenance procedures, and the failure of the FAA surveillance to detect and verify compliance with approved procedures (NTSB, 1992).

This report reveals, among other things, deficiencies in management oversight. However, the Probable Cause statement did not address the role that a poor safety culture played in the accident chain. That role was not elucidated until John Lauber (then an NTSB Board Member) offered a dissenting opinion. Lauber believed the Probable Cause was shortsighted precisely because it omitted the poor safety culture, and in his dissenting opinion letter he suggested that it be rewritten as follows:

The National Transportation Safety Board determines that the probable causes of this accident were (1) the failure of Continental Express management to establish a corporate culture which encouraged and enforced adherence to approved maintenance and quality assurance procedures, and (2) the consequent string of failures by Continental Express maintenance and inspection personnel to follow approved procedures for the replacement of the horizontal stabilizer deice boots. Contributing to the accident was the inadequate surveillance by the FAA of the Continental Express maintenance and quality assurance programs (NTSB, 1992).

It was this dissenting opinion by Member Lauber that set the wheels in motion for the proposition that a pathogenic safety culture can have a significant contributing effect on accident causation. Indeed, Continental Express Flight 2574 was the watershed event that made aviation take notice of this insidious, latent, and sometimes deadly part of the overall organizational culture.

 

Just Culture

According to Professor James Reason, "A Just Culture promotes an atmosphere of trust in which people are encouraged (even rewarded) for providing essential safety-related information, but in which they are also clear about where the line must be drawn between acceptable and unacceptable behavior." Reason also states that a Just Culture is, "A way of safety thinking that promotes a questioning attitude, is resistant to complacency, is committed to excellence, and fosters both personal accountability and corporate self-regulation in safety matters."  

A Just Culture promotes safety by acknowledging that humans are vulnerable to errors; errors will always occur, and some errors should not carry a personally harsh, punitive resolution when the system itself might be flawed. However, a clear line must be drawn between common, everyday human error and flagrant or willful violations, which can, and should, be dealt with more strictly.
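
To make that line-drawing concrete, here is a minimal, hypothetical Python sketch loosely based on Reason's culpability decision tree (which includes the "substitution test"). The categories and tests shown are illustrative assumptions, not an actual policy; real Just Culture determinations involve more tests and human judgment, not an automated verdict.

    from dataclasses import dataclass

    @dataclass
    class ErrorEvent:
        intended_harm: bool       # did the person intend the bad outcome?
        knowingly_violated: bool  # did they knowingly break a procedure?
        substitution_test: bool   # would a comparable, well-motivated peer
                                  # likely have made the same error?

    def assess(event: ErrorEvent) -> str:
        """Return a rough culpability category for a reported error."""
        if event.intended_harm:
            return "Sabotage: disciplinary/legal action"
        if event.knowingly_violated and not event.substitution_test:
            return "Reckless violation: punitive response is warranted"
        if event.substitution_test:
            return "System-induced error: fix the system, not the person"
        return "Human error: coach, learn, and share the lesson"

    # A lapse that most peers would also have made points at the system.
    print(assess(ErrorEvent(intended_harm=False, knowingly_violated=False,
                            substitution_test=True)))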

Another important aspect of a Just Culture is how errors are identified, addressed, and mitigated. This is accomplished through a reporting system. Certain incidents and accidents are subject to mandatory reporting, although these reports are typically filed after some kind of visible event. It is the unreported errors, those dismissed as insignificant or "too minor to report" (i.e., those that occur "below the waterline"), that are of most interest. If not addressed in their early or latent stages, these unreported errors may continue to feed an accident chain and lead to higher-level events.

A Just Culture promotes a voluntary error reporting system in which the "little and insignificant" mistakes are actively sought. Individual benefits of such a system include confidentiality and immunity from punitive action (with certain restrictions, such as criminal intent). Although a voluntary error reporting system can be a highly effective process for identifying errors before they become problematic, it is not without a dichotomous element. On one hand, voluntary reporting allows an organization to look deeper into errors it would not otherwise be aware of. On the other hand, employees might be unwilling to report errors voluntarily for reasons such as loss of confidentiality, embarrassment, lack of trust regarding the use of the information, and insufficient time or motivation to fill out the required forms. Therefore, trust, transparency, and training are requisite elements for any voluntary reporting system.

Once the error reports are entered into a database, the information can be analyzed for trends or recurring errors, "hot spot" identification, and other categories as required by the organization. The final stage of the error reporting system (a system whose goal is reducing errors) might include some or all of the following processes: feedback (through newsletters, emails, or safety pamphlets), training or retraining, safety seminars, or even job reassignment.
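
As a purely illustrative sketch (no particular reporting product or database schema is implied), the trend and "hot spot" analysis described above might look like the following in Python, assuming each report carries a date, a category, and a short narrative:

    from collections import Counter
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ErrorReport:
        reported_on: date   # when the error was reported
        category: str       # e.g., "towing", "fueling", "paperwork"
        narrative: str      # short free-text description

    def hot_spots(reports, top_n=3):
        """Rank error categories by frequency to flag recurring hot spots."""
        return Counter(r.category for r in reports).most_common(top_n)

    reports = [
        ErrorReport(date(2024, 5, 2), "towing", "Bypass pin not verified"),
        ErrorReport(date(2024, 5, 9), "towing", "Wing walker not posted"),
        ErrorReport(date(2024, 5, 20), "fueling", "Bonding wire skipped"),
        ErrorReport(date(2024, 5, 28), "towing", "Towbar shear pin reused"),
    ]
    print(hot_spots(reports))  # [('towing', 3), ('fueling', 1)]

A recurring category surfaced this way would then feed the feedback and retraining processes described above.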


High-Reliability Organization 

In order to have a profitable organization, you need to have a safe organization. Aviation is a high-risk industry, and Safety Risk Management (SRM) is key to proactively addressing hazards and risks before they become incidents or accidents. All organizations should strive to become high-reliability organizations (HROs).

Without true management buy-in and commitment, safety will not be effective at lower levels, and programs that depend on a healthy safety culture, such as SMS and Human Factors, are bound to fail. Accidents are typically not caused by a single errant individual, but rather by a chain of events that may have been hiding in the organizational system for months or even years (known as latent conditions). The individual who causes the accident may simply be the "trigger puller" of a chain of systemic failures. TACG President Dr. Bob Baron discusses this in more detail in his article, The Organizational Accident. Or not.

According to Karl Weick and Kathleen Sutcliffe, the primary authors on HROs: "HRO describes a subset of hazardous organizations that enjoy a high level of safety over long periods of time. What distinguishes types of high-risk systems is the source of risk, whether it is the technical or social factors that the system must control, or whether the environment, itself, constantly changes. This latter can be controversial to observers as environments change within a range of expected extremes. It is the surprise of the change, its unexpected presentation that influences the level of reliability. The coupling between technology and people creates the socio-technical system."

Four organizational characteristics of the HRO limit accidents or failures:

  • Prioritization of both safety and performance, and shared goals across the organization;
  • A “culture” of reliability (or, better, an attitude toward reliability) that simultaneously decentralizes and centralizes operations, allowing decision authority to migrate toward lower-ranking members;
  • A learning organization that uses “trial-and-error” learning to change for the better following accidents, incidents, and, most importantly, near misses;
  • A strategy of redundancy that extends beyond technology to behaviors, such as one person stepping in when a task needs completion.

An HRO is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity.

There are five characteristics of an HRO:

  • Preoccupation with failure
  • Reluctance to simplify interpretations
  • Sensitivity to operations
  • Commitment to resilience
  • Deference to expertise

Safety Culture Assessments

The Aviation Consulting Group provides safety culture assessments for organizations that wish to understand the current status of their safety culture. With our multi-pronged approach, we can unobtrusively drill down into your organizational culture to discover areas that could use improvement, as well as your safety strong points! Our methodology includes an online survey and an onsite visit.

Would you like to combine an online safety culture survey with our 2-day Safety Culture Development course? Click here for more details. 

 

Online Survey

We use our proprietary Safety Culture Assessment Tool (SCAT) as our online survey instrument, a very powerful tool that quantifies (and qualifies) employee attitudes towards safety culture. The SCAT measures an organization's culture on three scales: Corporate Culture, Safety Culture, and Corporate Communication, and only requires 10-15 minutes to complete.
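
For illustration only (the SCAT itself is proprietary, and its actual items and scoring are not shown here), a survey of this kind typically averages Likert-scale responses within each scale. A minimal sketch in Python, assuming hypothetical 5-point items grouped by the three scales:

    from statistics import mean

    # Hypothetical 5-point Likert responses (1 = strongly disagree,
    # 5 = strongly agree), grouped by scale.
    responses = {
        "Corporate Culture":       [4, 5, 3, 4],
        "Safety Culture":          [2, 3, 2, 3],
        "Corporate Communication": [3, 4, 4, 3],
    }

    # Average item scores within each scale to get a per-scale profile.
    scale_scores = {scale: round(mean(items), 2)
                    for scale, items in responses.items()}
    print(scale_scores)
    # {'Corporate Culture': 4.0, 'Safety Culture': 2.5, 'Corporate Communication': 3.5}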

SCAT Benefits:

  • Can be conducted completely online 
  • Collects employee opinions and attitudes about safety culture, completely anonymously
  • No setup needed on client side. TACG handles the entire online survey process
  • Data collected are both quantitative (scaled responses) as well as qualitative (short textual narratives) to give a more complete picture

 

Onsite Visit

Our onsite visit allows us to gain an objective, real-time peek into your safety culture. We will be as unobtrusive as possible as we conduct observations and anonymous, informal interviews with personnel at various levels of the organizational hierarchy.

Onsite Visit Benefits:

  • Real-time measurement of the organization's safety culture
  • Objective assessment by a neutral third-party
  • Not an audit; no judgment is passed
  • Can be conducted in conjunction with an SMS Gap Analysis

 

Analysis And Report

Once the data are collected, we then analyze the results and provide a comprehensive report of our findings, including recommendations and suggestions. All data collected are anonymous and maintained securely at our office.

It is important to keep in mind that a healthy safety culture is a prerequisite for implementing and maintaining a truly effective Safety Management System. Therefore, we recommend conducting an initial safety culture assessment and then conducting follow-up assessments every one to two years. This ongoing process allows the organization to see what safety culture changes, if any, are occurring, and in which direction.

 

Fees And Options

The fee for our full safety culture assessment is based on company size, making it affordable for any size operation!

We also offer the option to uncouple the online survey from the onsite visit; we can conduct the online survey only, or the onsite visit only. Although we recommend the coupled assessment for the most robust data gathering, we understand that some organizations may prefer to deploy just one method.

To receive a customized quote, or to request additional information, please contact us.

Ready to Build Your Safety Culture?

Dr. Baron is available for aviation and all high-risk industries!

Dr. Robert (Bob) Baron
TACG President/Chief Consultant
Full bio available here


TACG President and Chief Consultant Dr. Bob Baron has been involved in aviation since 1988, with extensive experience as a pilot, educator, and aviation safety advocate. Unlike some other "safety consultants," Dr. Baron has the proven and time-tested qualifications and experience to assist aviation organizations on a global basis. These include a Ph.D. in Industrial and Organizational Psychology (the discipline that safety culture and SMS are built on), as well as degrees in Aeronautical Science (with specializations in Human Factors and Aviation/Aerospace Safety Systems) and Professional Aeronautics (with a minor in Aviation/Aerospace Safety).

With an Organizational Psychologist who has extensive, practical experience in aviation, you can rest assured that you are making the right choice when selecting an organizational/change management consultant who can make demonstrable and measurable changes in your organization's safety culture.

Consulting and training now available for all high-risk industries, including: 

  • Nuclear
  • Rail
  • Energy
  • Maritime
  • Healthcare

Consulting and training services include:

TACG offers free initial consultations via Skype or teleconference! Click here to request your consult.


Safety Culture Blog Articles by Dr. Baron

 

Procedural Noncompliance: Why Pilots Don’t Always Play by the Book

Procedures, policies, and checklists are an effective way to ensure safety-related tasks are being conducted in a standardized, pragmatic way. They are also, in most cases, a regulatory requirement. Thus, all pilots are following the procedures…right?

Wrong. Procedural noncompliance, or procedural drift, has been a primary or contributing causal factor in the majority of aviation accidents. The term procedural drift refers to the continuum between textbook compliance and how a procedure is actually performed in the real world. Procedural drift is not something new; we have been drifting for a very, very long time. However, recent accidents have illuminated the ubiquity and severity of the problem, prompting the U.S. National Transportation Safety Board (NTSB) to add ‘Strengthen Procedural Compliance’ to its Most Wanted List of Aviation Safety Improvements (2015). This NTSB recommendation was largely in response to an Execuflight Hawker HS125-700A crash on approach to Akron, Ohio, in 2015, in which all onboard perished (2 crew and 7 passengers).

[Figure: James Reason's Swiss Cheese Model]

In the case of Execuflight, the links in the chain were connected in such a way that the pilots, on that day, were the enablers of an accident that was waiting to happen. The holes in the Swiss Cheese model lined up. As is often the case, there were latent threats that were inherent in the system. These latent threats included inadequate oversight by the FAA, inadequate pilot recruiting and training, procedural noncompliance, and a poor safety culture. These latent threats allowed the pilots to fly the trip together that day and commit a series of errors that culminated in a preventable crash that took the lives of all onboard.

The crash illuminated a number of issues related to procedural noncompliance. Latent threats set the preconditions, but the pilots were the ones who enabled the accident to occur that day. Instead of being the final safety nets, they were the “trigger pullers”. So why, on that day, did the pilots deviate so extensively from procedures? There are a number of reasons, which I delve into extensively here. Suffice it to say that there were issues with the culture, oversight, hiring practices, training, and CRM.

Unfortunately, there are still many charter outfits operating with a similar modus operandi, where the balance of protection versus production is weighted too heavily on the production side. This often sets the preconditions for bad things to happen, such as procedural drift or noncompliance.

The good news is that all of this is avoidable, as long as your company is willing to take a proactive approach; trust me, it will be much less expensive to prevent it from happening in the first place rather than being forced to fix it reactively after the accident…but you already knew that!

When National Culture Supersedes Safety Culture

As a global aviation safety consultant for a few decades now, I have been exposed to many different cultures. When I speak of culture, I am focusing on two types: the national culture and the organizational safety culture. Often the two are related: national culture has a direct effect on the organizational safety culture. As an example, in a high Power Distance (PD) national culture (one more accepting of an unequal power relationship), manifestations of excessive deference may appear in everyday flight operations. Within the context of the national culture, this is appropriate behavior; within flight operations, however, the inability or unwillingness to speak up, or to voice a concern to a superior (i.e., a Captain, Maintenance Crew Chief, etc.), can have (and has had) serious safety implications.

[Figure: Hofstede's Power Distance (PD) Scales]

So how can an aviation organization separate its national culture from its organizational safety culture? Or can it? In some cultures (i.e., low PD cultures), where the power disparity is minimal, it’s easier to have a flatter hierarchy, more open and transparent communication, and a collaborative working environment. High PD cultures, by contrast, tend to have a more rigid hierarchy, more suppressed communication, and a more autocratic working environment.

 

The seemingly simple solution to negating the effect of national culture is to ask employees to “check their national culture at the door” and adapt to the organizational safety culture during their shift. This is not an easy task, however, because safety culture principles may be in sharp contrast to the national culture. As an example, a few years back, a CRM course was conducted at an airline in Asia (a high PD culture). In the class, all of the pilots learned about, and tested well on, CRM principles such as teamwork, delegation, assertiveness, and speaking up. Yet, once in the cockpit, they defaulted to their instinctive national cultural behavior patterns (particularly, excessive deference and lack of assertiveness).

 

There are no simple solutions. However, the potential clash of national and organizational safety cultures is an issue which merits serious thought and consideration in all types of aviation organizations and operations.

A Just Culture

A healthy safety culture must include a Hazard/Error Reporting System (HERS), and the key to a successful HERS is a Just Culture. A Just Culture acknowledges that well-intentioned people still make mistakes and that they should not be punished for slips, lapses, mistakes, and other common, everyday UNINTENDED errors. Yet a line is still drawn: willful violations and purposeful unsafe acts must be dealt with punitively. The general indications are that only around 10 percent of actions contributing to bad events are judged as culpable (Reason, 2004). The bottom line of a Just Culture is trust. Employees must know that they can report hazards and errors without sanction. Once this trust is established, an organization can have a reporting culture, which provides the system with an accessible memory, which, in turn, is the essential underpinning of a learning culture (Reason, 2004).

 

Along the same lines, Eiff (1999) suggests that, “An effective and systematic reporting system is the keystone to identifying the weakness and vulnerability of safety management before an accident occurs. The willingness and ability of an organization to proactively learn and adapt its operations based on incidents and near misses before an accident occurs is critical to improving safety.”

 

Participation in hazard reporting is relatively easy because employees objectively report the things they "see." On the other hand, errors are much more challenging because employees may be reluctant to report the erroneous things they "do." Is there enough trust in your company culture so that employees feel comfortable reporting errors that they personally commit, even if the report is anonymous? Think about it.


A Poisoned Safety Culture

A few years ago I was teaching a human factors course to a helicopter operator in the United States. About halfway through the course, I was ready to teach the module on communication skills, specifically addressing the issue of assertiveness. At that moment there was a knock on the door, and the Director of Maintenance (DOM) walked in asking if he could have a few minutes alone with the class. Though the request was odd, I agreed and sat in the waiting area for what I thought would be a few minutes. About an hour later, the DOM exited the classroom and thanked me for giving him a “few minutes” with the class. Upon returning, I was told by the class attendees that the purpose of the interruption was for the DOM to make it clear to employees that “what he says goes” and that the class should basically ignore what I was teaching regarding speaking up when acting in, or observing, unsafe conditions.

That DOM is no longer with the company, but the event effectively highlighted the extreme end of a poisoned safety culture. In my more than 20 years of teaching human factors courses all over the world, I had never, and still have not, experienced anything quite as brazen as what occurred that day. Authoritarian, toxic leadership styles have absolutely no place in aviation, or, for that matter, in any other high-risk industry. How can you work on attaining, or maintaining, a generative safety culture when the role models of your organization are stuck in the pathogenic mode? Does any of this sound familiar?


Teaching Safety Culture in Human Factors Courses

As a follow-up to my last article on “Human Factors Hotspots,” I wanted to delve a bit deeper into one of the points I had stated. I wrote, “It is clear that safety culture and procedural deviations are two of the most significant contributing factors in aviation maintenance-related accidents and incidents (and, typically, procedural deviations are a manifestation of an unhealthy safety culture).”

 

The reason for zeroing in on this statement is to discuss in more detail some of the limitations of HF training with regard to procedural deviations as a manifestation of an unhealthy safety culture. These limitations, in no particular order, are as follows:


  • Although the topic of safety culture is important in an HF course, it’s mostly targeted towards awareness. Don’t expect to make paradigmatic culture changes as a result of your HF course!
  • The very people that can actually do something about making changes to the culture are most likely not even in your class (high-level managers often feel as if HF training is only for the people that turn wrenches).
  • Procedural error mitigation can certainly focus on the mechanics (since they are the last line of defense). However, if the mechanics are working within the brackets of a pathogenic safety culture, it will be difficult, if not impossible, to change the negative norms that have become ingrained in the culture. In other words, for real changes to happen, they must be initiated at the top of the organization.
  • The health of your organization’s safety culture can be very subjective, depending on whom you ask. Ask any upper-level manager and they will probably tell you that “the culture is fine.” Ask a line mechanic and he/she may tell you that “the company is an accident waiting to happen.”

Now, with all that being said, let’s assume you are your company’s HF instructor and you are going to teach a module on Safety Culture. Let’s also assume that your company’s safety culture is pathogenic (or, quite literally, “an accident waiting to happen”). How would you answer the following questions regarding the development and delivery of your Safety Culture training module?

 

  • In your HF course, would you skip the topic of Safety Culture altogether?
  • Would you ignore your own company’s safety culture issues and teach the topic from a neutral, objective position?
  • Would you try to change the safety culture by teaching people how to improve the culture? (keeping in mind that mechanics may not be able to change the culture themselves; change needs to start at the top—and the people at the top are probably not going to be in your class).
  • Would you try to develop a special course just for management to address safety culture to see if you can initiate change from the top? If so, do you think management would be receptive to a high-level safety culture course tailored to them?

These questions are certainly something to ponder as an HF instructor. Personally, I am confronted with this dilemma every time I teach an HF course. To make matters even more interesting, I facilitate HF courses at aviation organizations all over the world—some with outstanding safety cultures—some, not so much. Very often, while I’m teaching line mechanics, there are tacit, sometimes palpable, signs of frustration and angst when the subject of safety culture comes up; it can also get eerily quiet in the room. This provides evidence that the culture may be suppressive, unjust, and untrusting. If so, then we know that procedural deviations are most likely a manifestation of the unhealthy safety culture that exists, which can negatively affect mechanics’ performance through such channels as fatigue, pressure, norms, distractions, and stress. And if that’s the case, then you can expect your discussion on safety culture to be nothing more than nice-to-know information for your course attendees. The procedural deviations will just keep happening.


TACG Presentations

  • The CEO As A Top-Level Hazard
  • The Little Things
  • Safety Risk Management: Unplugged
  • Looking Forward To Your Next Accident
  • Procedural Drift: Causes And Consequences
  • The Organizational Accident. Or not