The Normalization of Deviance

Human beings are instinctively inclined to experiment and innovate. Whether it’s my mother tinkering with an old cake recipe or an engineer trying to figure out how to increase efficiency at his or her plant, humans are always looking for ways to do what they do better. For the readers of this newsletter, the human propensity to revise, modify, deviate from, or change existing standards and rules is especially important to appreciate. The reason is that certain professionals, such as health care providers, police officers, air traffic controllers, engineers, food producers, transportation workers, and scientists, work in densely rule-bound environments where rule compliance is required.

Nevertheless, the same urge that my mother has to tinker with her venerable cake recipes can be seen among these system operators as they perform their various rule-bound tasks. Just like my mother, they are always looking to make their deliverable (or the delivery process) better, which they sometimes do. At other times, though, their rule departures or “process variations” can invite disaster. Our rather extensive knowledge of disaster occurrence, in whichever industry we’re talking about (transportation, energy, health care, global investments, and so on), shows that catastrophic events are virtually always preceded by a lengthy period of “deviation,” wherein system operators depart from rules, regulations, policies, procedures, and standards in pursuit of their conviction that “their way” is better.

Sociologist Diane Vaughan coined the term “the normalization of deviance” to refer not only to non-adherence to rules but to the way that non-adherence becomes normalized in the work environment. Her research showed that personnel rarely depart from rules with the intent to do harm. Instead, rule deviations are virtually always a response to production pressures: system operators find their own way easier, faster, simpler, and less effortful. Vaughan noted that system operators break rules or violate standards because they believe:

  1. The relevant rules or standards are stupid. They are made up by administrators, not by the people in the trenches, and complying with all of them is remarkably onerous and an obstacle to efficiency and productivity.
  2. The work process itself often seems to demand rule or process variation, as personnel are always responding to production pressures while the system itself always runs in a degraded mode. Unanticipated problems arise constantly which seem to require rule-breaking, improvisation, or process variation.
  3. Personnel often don’t understand the rules that exist, don’t know which ones apply, or don’t know how to apply them. Alternatively, workers in one unit might have a very different understanding of compliance from workers in another unit.
  4. Leadership often doesn’t monitor or insist on rule compliance. Indeed, supervisors or authority figures are occasionally notorious rule breakers. Once personnel observe and become impressed by leadership’s rule breaking (or, as at Enron and other organizations iconic for their misdeeds, are rewarded for violating rules and standards in pursuit of the company’s goals), it is a virtual certainty that compliance with rules and standards will be minimized, if not abandoned altogether.
  5. The fact that nothing bad has yet happened despite rule deviation means it’s OK. This is one of the most pernicious beliefs that account for deviant practice: System operators are lulled into thinking that if untoward outcomes haven’t happened despite a long period of rule and standards deviation, they never will.

Safety experts can probably tick off a long list of catastrophes that were precipitated by years of rule and standards violations or, at the very least, by system operators who knew that operating procedures or technologies were unsafe. Investigations of both the Challenger and Columbia space shuttle disasters revealed NASA’s prior knowledge of the defective O-rings (Challenger) and foam debris shedding (Columbia) for years before these accidents occurred. Root cause analyses of “sentinel” or “never” events in hospitals virtually always uncover a long history of system faults or risky practices, where personnel had either been ignorant of standards or had simply gotten used to doing things their way. Investigations of the BP disaster in the Gulf have uncovered numerous rule and standards violations affecting worker safety prior to the explosion, very often committed in attempting to get the job done faster and cheaper. The point, therefore, is that rule or standard noncompliance can be positively disastrous. So, what should be done?

Space doesn’t permit listing the various recommendations that exist in the literature, so readers are encouraged to study Vaughan’s work as well as that of others, such as James Reason’s books on error or Marc Gerstein’s book Flirting With Disaster. My personal recommendation is that organizations cultivate a “speaking up” environment. The reason is that personnel often know about system operators who flout rules, or they know about serious or worrisome system weaknesses or practices, but they are reluctant to call attention to them, usually because they believe they will suffer retaliation, they think it’s not their job, or they just don’t know how.

If leadership, however, works to erase those beliefs and to improve relational and communication skills, we can make enormous progress in maintaining system safety and integrity. Leadership must realize that as long as rule compliance remains burdensome, people will always be tempted to ignore rules and standards. Consequently, rule deviation should be understood as a natural human reaction to productivity pressures, so that when it first appears, leadership’s response should probably not be punitive (unless the deviation is egregiously reckless or the individual had already been warned). The point is that if system operators know they will be penalized for rule deviations, those deviations will likely remain under leadership’s radar.

In addition to expecting that personnel will violate rules, leadership should take a proactive approach toward finding out what it was like to be the system operator in this or that instance of rule violation. Was he or she terribly pressed for time? Ignorant of the purpose of the rule? Taught that the rule was absurd or that everybody flouts it? Insights might be gleaned through focus groups, department meetings, and the like. But, again, staff must be assured that no retaliation will be forthcoming for airing these issues, whether privately or in the open.

Perhaps most importantly, leadership must educate personnel on how to speak up when rule violations are spotted or threatened. Personnel should be taught how to approach rule violators, point out the deviation, and discuss its implications. Research indicates that, more often than not, the simple act of alerting colleagues that they are violating some standard or regulation will act as a corrective. But in a minority of cases, the violator may become defensive and hostile. It is at moments like these that informed and skillful intervention, especially intervention that maintains a supportive and professional attitude toward everyone involved, is all-important.

Again, readers are encouraged to study the literature on the “normalization of deviance,” as it is extremely instructive, and applying its lessons may prevent any number of workplace unpleasantnesses, not to mention disasters. Intervening in instances of rule breaking can also be very stressful, so leadership ought never to take it lightly. However, insisting on rule compliance or, at least, analyzing departures from it and providing oversight is an integral feature of safe systems.

By John Banja, PhD, professor in the Department of Rehabilitation Medicine at Emory University and a medical ethicist at the Center for Ethics. He is the author of Medical Errors and Medical Narcissism and has written and lectured widely about medical errors and system failures. He can be reached at jbanja@emory.edu.

