Fiona Godlee: Learning safety from other industries

At an international patient safety meeting I attended earlier this month (part of a series, see riskybusiness2008.com), I found myself remembering words from Atul Gawande’s book Complications (if you haven’t read it, I recommend it). Gawande writes: “We have come to view medicine as both more perfect than it is and less extraordinary than it can be.”

Allan Goldman, a consultant paediatrician at London’s Great Ormond Street Hospital, had gathered an extraordinary cast of speakers to help us get a grip on medicine’s imperfections. Listen to a podcast interview with him to find out more. Simon Yates was there – the man who cut the rope from which his fellow climber Joe Simpson, his leg broken, was hanging over an ice cliff. Roberto Canessa was there – one of the Uruguayan rugby players who survived a plane crash in the Andes by, among other things, eating the bodies of their dead companions.

Also there were the man who trained the New York Fire Department before and after 9/11, an astronaut turned surgeon who was at NASA during the Challenger and Columbia disasters, and General Sir Mike Jackson, who by his own account may have averted a second cold war by sharing his whisky flask with the Russian general set on seizing the airport in Kosovo.

All gripping stuff – high emotion, great entertainment, but what has this to do with health care? A presentation by Martin Bromiley left us in little doubt. Bromiley is an airline pilot with two small children. Five years ago his wife went into hospital for a routine sinus operation. She died as a result of prolonged hypoxia when mask ventilation and then intubation failed. Describing the events of that day, painstakingly reconstructed in the aviation-style independent investigation that Bromiley asked the hospital to undertake, he painted a picture of technical competence but disastrous failings in situational awareness, leadership, decision making, communication, and assertiveness – all “human factors” that the airline industry estimates cause 75% of aviation accidents.

In the vital three to six minutes after she was anaesthetised, the anaesthetic room filled with increasingly senior medical and nursing staff as attempts were made with different-sized masks and then tubes. The anaesthetists became fixated on one path – intubation – to the exclusion of other options. Several nurses thought of alternatives – tracheotomy, admission to intensive care – but were unwilling to speak up or were ignored. No one took overall charge. No one kept an eye on the clock.

Bromiley is clear that health care suffers from crucial cultural flaws. Inadvertent human error is seen as weakness, as poor performance, and as a result it is not talked about and cannot be anticipated. There is no mechanism for catching small errors before they become big ones. Technical skills are overemphasised in training to the exclusion of human factors, which make up 50% of the compulsory regular assessment for airline staff at every level of seniority. The voluntary group he has established, the Clinical Human Factors Group, aims to raise awareness of the importance of human factors in health care.

Other speakers used different language but reached many of the same conclusions. Simon Yates recognised that what led to the cutting of the rope was a series of small errors. Roberto Canessa, now a cardiologist in Uruguay, saw that it was the optimists who became the leaders: “the ones who say we’re going to make it.” Mike Jackson told us that if you want to get from A to B, “you’ve got to define what B is and it’s useful to have a fair idea what A is.”

The military way is to have open, unhierarchical debate about the options, followed by a clear decision from the commander, after which debate stops and everyone falls in behind the plan. James Bagian (again, listen to a podcast interview with him) concluded that NASA’s hierarchy was a major barrier to effective communication. People deferred authority to others, and it wasn’t clear whose responsibility it was to take the decision to stop the ill-fated flights in the face of intense political pressure to go ahead. (As an intriguing aside, the chance of a catastrophic disaster had been set at 4%, and it was the 25th flight that blew up.)

The audience reflected that medics are not good at giving feedback – it’s seen as critical rather than constructive – and that, with the European Working Time Directive, a clinical “team” may never have worked together before and there are no protocols or etiquette for establishing teams on the hoof.

Sir Liam Donaldson, England’s chief medical officer, acknowledged that some people (I have been one of them) get tired of analogies being drawn between medicine and aviation – “the fit is not perfect.” But as he listed the things medicine could learn from aviation and other industries, it was hard to muster good reasons why medicine resists these measures, other than the desire to hang on to traditional autonomies.

On his list were standard operating procedures, training for what might never happen, synchronised teamwork (training in teams and reviewing mistakes in teams), and scoping risks so that the things that could cause harm are clearly identified ahead of time. Airline pilots fought against these changes 30-40 years ago, said Bromiley. The key was to show them that they were not being de-skilled but re-skilled as safety and risk managers.

Over lunch, another airline pilot who is also a doctor said he found it embarrassing that medicine is so slow to accept the need for a different approach. Why, he asked, can’t we create standard operating procedures for more things in medicine? Here’s an opportunity for clinicians to take the lead, to re-skill themselves in the non-technical aspects of their work, to become risk managers, to develop teams that deliver safer care for patients. What are we afraid of?

Fiona Godlee, Editor in chief, BMJ