February 13, 2003 Episode 35: The Way We've Always Done It
On the morning of February 1st, 2003, many of us in North America woke up to yet another day that we knew we would remember for the rest of our lives. The space shuttle Columbia was missing. Contact had been lost during what seemed like a routine landing. By the time the day was over, we knew that seven people had lost their lives and that a second shuttle had been destroyed. Inevitably, the period after is filled with a single question: Why? The answer to this question is determined, in part, by the limits of the knowledge of those who investigate it. Aeronautical engineers, physicists, technicians, human factors analysts, accident investigators and politicians each bring a certain kind of knowledge base to the question. Their knowledge is important and they will find some answers.
After the first shuttle accident, the explosion of Challenger shortly after lift-off, it was concluded that key players in the launch decision failed to stop the launch even though they suspected that the cold weather would adversely affect the O-ring seals in the solid rocket booster joints. After a nine-year review of the documentation of the launch decision and extensive interviewing of key personnel, Boston College sociologist Diane Vaughan came to a conclusion different from that of the official report. In her book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, Vaughan asserted that these key players acted according to the rules set out in the pre-launch procedures. The fault lay in a culture at NASA that had come to accept high risk as normal. We spoke with Vaughan this week about her thoughts on how the Columbia investigation should proceed and what contributions sociology could make to its findings.
Bureaucratic structures can often become so complex that no single member of the bureaucracy is capable of knowing the rules and regulations of his own department, much less seeing the larger organisational picture. We contemplate the price that is paid when no one is seeing that big picture.
On Saturday, February 1st, many of us woke up to the news that the space shuttle Columbia had been lost. As the day wore on, those of us old enough to remember the Challenger explosion in 1986 realized that seven more people had died and that another shuttle had been tragically destroyed.
The same kinds of questions that were asked then are being asked again: basic questions about space travel, scientific knowledge, government funding, and safety.
Boston College sociologist Diane Vaughan studied the organisational aspects of the Challenger launch decision. NASA faulted the launch decision-making process, stating that critical errors were made by key players in the hours before the launch. Vaughan, however, found that these key players had followed the decision-making processes by the book. The problem, she asserted, lay in the normalization of deviance found in the culture of NASA. Safety considerations were treated as routine incremental decisions, with little built into the system to allow the players to assess how these incremental decisions were increasing risk.
Today on First Person, Plural, we talk with Dr. Vaughan about her ideas as we consider the consequences of ignoring social contexts in an episode we call, The Way We've Always Done It.
An argument for the decline of American civilization is that when the space shuttle Columbia broke apart on February 1st, the most immediate public reaction, after the customary expressions of grief and disbelief, was to place fraudulent listings for pieces of shuttle debris on the eBay website. While eBay shut down these listings within hours of their posting, such behaviour was not unprecedented in American culture.
Perhaps more telling of the ritualized aspects of American society is that in less than a week after February 1st, questions arose about the appointment of the so-called "independent" panel of investigators to assess the shuttle accident. Within days of convening the panel, NASA had to reorganize it amid criticism that the panel personnel were too close to NASA leadership and that the panel was too much under NASA control. More independent members were added and the reporting structure shifted to the Naval Safety Center rather than NASA. To underscore the failure to achieve independence, the word "independent" was changed to "external" in the news briefings given by NASA officials. What was supposed to have been a built-in response to any shuttle accident ended up relying more upon the rhetoric of independence than upon ensuring a substantially independent probe. Again, while the lack of independence was eventually resolved, such disparities between speech and action were neither surprising nor unprecedented in American culture.
Americans often accept extreme levels of mechanical, ritualized decision-making, accompanied by code words that often are not grounded in practice. The custom of avoiding critical thinking in favour of performance of ritual behaviour is strong. In a bureaucracy that is sufficiently complex that no street-level bureaucrat can possibly know all of the regulations pertinent to his agency, let alone to any other organ of public policy administration, the stopgap solution is and must be for the low-ranking bureaucrat to comply with his superiors' most recent instructions and hope that such acquiescence is enough to save his job. Besides, as in private sector hierarchies, the boss may not always be right, but he is still the boss. The result is that rules are understood through authority only and are followed unquestioningly.
The possibility of democracy is the obvious first casualty of such resignation. A casualty that follows all too soon after is the possibility of the agency or agencies fulfilling any manifest function. Dr. Frankenstein could bring the monster to life, but he could not control him afterwards. Likewise, how the internal mechanisms of an agency function determines what it will produce or destroy in the end. An elected official can make as many speeches as he likes about the need to make government perform certain functions, but the intricacies of the mundane operation of the agencies involved must be addressed if the desired results are to be achieved. To blame superfluous levels of management is effective only up to a point: the fantasy that all members of an organization can "just know" what to do, and that they will do it upon such a realization, is one prone to serious battering when exposed to real-world considerations, even the real world of the public sector or sectors.
To give administration short shrift is to buy into the notion that all management is created equal and that all managers are created equal. A corollary is that it doesn't matter how anything is managed or by whom, that results in any given organization are independent of the actions of those who run it, the process by which it is decided who will run it, and the processes by which those who will run it are prepared for the tasks at hand.
The lapse in analytical diligence that presumes or simply declares that government is perfectly organized by some unspecified but uniquely determined standard is used every day as part of recitation after recitation that everything is in order, everything is proceeding well enough, no one is screaming and any assertions to the contrary must therefore be due to dark ulterior motives. Administrations that eschew any diagnostics at all pertaining to their dynamics have not only made themselves deaf to the loudest of wake-up calls, they have taken the telephone and pitched it out a closed window.
Such is the culture of many organisations in government and business. The results of this lack of self-reflection within the culture can be devastating in any context, but add to this the dangers of space flight, the realities of government funding, and the politics of public-private partnerships, and the results can be tragic. All too often this cultural assessment is left out of the technical, scientific and human factors assessments of health and safety issues. Sociological and organisational assessments are not welcomed, not funded and not considered in changes to public policies. Yet the cultural climate still gives context to the assessments. The lack of scrutiny does not lessen the effects. Pretending to live outside the fishbowl may be comforting, but the water still ripples when you swim.
In the late 1980s, Diane Vaughan took an interest in the organisational aspects of the Challenger launch decision. After nine years of research, involving thousands of pages of documents and extensive in-depth interviews with key participants in the launch decision-making process, Vaughan produced an assessment of what had gone wrong that differed sharply from the official one. Officially, the conclusion was that key players made fateful errors in judging the suitability of the launch under extreme cold weather conditions. The decision to launch in spite of warnings was sharply criticized after the fact. Vaughan argues persuasively, however, that this decision was not outside the organisational box. Like many other decisions made before each launch, the decision to dismiss the concerns was business as usual. The key players were following procedure, not breaking it.
Her book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, published in 1996, has become a must-read for many engineering students. It has not, however, been recognized or discussed openly by NASA officials. This silence affects how the Columbia investigation will proceed. Such investigations become the quintessential case studies in what can happen when an important stream of knowledge is simply ignored.
copyright by Pattie Thomas and Carl Wilkerson 2003