Normalization of Deviance

The January/February 2016 issue of The Atlantic had an interesting article entitled, “What Was Volkswagen Thinking?” In it the author, Jerry Useem, explores examples of corporate cultures that resulted in decisions that are, to the uninitiated outsider, simply appalling. Ford’s refusal to recall Pintos with exploding gas tanks in the 1970s and NASA’s proceeding with the Challenger launch despite questions about freezing O-rings are two of the better-known examples Mr. Useem cites, as well as a lesser-known one involving B.F. Goodrich and aircraft brakes in 1968 (the brakes literally melted when used). And of course, Volkswagen’s decision to deliberately cheat on emissions testing in order to sell more cars.

One can make a case that the Challenger launch may have been ill-advised and may have ignored inconvenient data, but it doesn’t appear to have been intentionally devious or misleading. Not so with the other examples. With the Pinto and the aircraft brakes, evidence was buried and tests were deliberately falsified – as were the pollution tests by Volkswagen. As Useem says, “You cannot unconsciously install a ‘defeat device’ into hundreds of thousands of cars.” And although Useem doesn’t use it, the recent example of GM’s faulty ignition switches also appears to be a clear case of deliberate deception and/or covering up of unwelcome evidence.

What in the world causes this? I doubt that the people at NASA, Ford, GM, Volkswagen, and B.F. Goodrich are inherently evil or dishonest. What convinced them to put their code of ethics on hold? According to Diane Vaughan, a sociologist, it was nothing – or at least, no one thing. Rather, Vaughan says that actions that have always been thought of as “not okay” are slowly reclassified as “okay” over a period of time, through a gradual cultural shift. She calls this the Normalization of Deviance. The term “slippery slope” is terribly overused these days, but the Normalization of Deviance is just such a slope. People start by compromising their ethics on small issues, which often leads to situations in which one is forced to choose between compromising on larger issues or admitting one’s transgressions on the small issues. Having rationalized once, it’s much easier the second time around. And then those larger transgressions lead to critical transgressions. All the while, people rationalize their behavior.

The subtitle of Carol Tavris and Elliott Aronson’s excellent book, Mistakes Were Made – But Not By Me is, “Why we justify foolish beliefs, bad decisions, and hurtful acts.” I’ve written about this book before – it discusses the fact that when people are confronted with evidence that contradicts beliefs which they hold, most people will deny the evidence. This is especially true when the beliefs in question deal with oneself, and how one thinks of oneself. Most people think of themselves as good people, and that’s usually true. As I said, the people at these companies are probably no more immoral than most of us. It’s important to note that the O-ring engineers weren’t consciously lying; they actually believed that everything would be fine. Denny Gioia, the coordinator of product recalls at Ford during the 70s, wrote, “Before I went to Ford I would have argued strongly that Ford had an ethical obligation to recall. I now argue and teach that Ford had an ethical obligation to recall. But, while I was there, I perceived no strong obligation to recall and I remember no strong ethical overtones to the case whatsoever.” As Useem points out, this is Orwellian doublethink, in which a bureaucracy conceals evil not only from the public but from itself.

This raises the question: How does this happen? What causes people to stray on that first, small ethical issue? In many cases, it starts with an impossible commitment made at the top: 60 shuttle launches each year; a car that weighs less than 2000 lbs and costs less than $2000, ready for production in 25 months; a cheap, non-polluting, high-mileage diesel vehicle that will make the company the largest car manufacturer in the world.

But there is another enormous contributing factor: culture. People are extremely attuned to their cultural environment. If a company is known to be one where bending the rules is okay, where taking advantage of clients, the buying public, or even competitors is okay, employees will fairly rapidly start bending the rules themselves. And we’re not just talking about bureaucratic red tape rules; we’re talking about basic rules of right and wrong. Culture doesn’t just trump strategy (as has been said many times); culture trumps pretty much everything.

Useem includes a quote from a businessman who was being interviewed by the Association of Certified Fraud Examiners: “Culture starts at the top. But it doesn’t start at the top with pretty statements. Employees will see through empty rhetoric and will emulate the nature of top-management decision making.” The businessman was Andrew Fastow, the former CFO of Enron.

The author closes with the statement, “[Fastow] got one thing right: Decisions may be the product of culture. But culture is the product of decisions.” A well-designed decision process – one that involves transparency, rigorous questioning of the proposed strategy, and clear identification of company policies and standards – can help to shape the company’s culture in ways that make it less likely to slip into the Normalization of Deviance.
