this week one of my friends asked the question,
are companies and hospitals with strong interpersonal relationships and a high degree of trust more efficient, and do they have better quality outcomes?
francis fukuyama describes the cultural foundations of toyota’s lean manufacturing operations in the “high-trust workplace” chapter of his 1995 book “Trust: The Social Virtues and the Creation of Prosperity” and compares them to auto manufacturing at companies like GM, with its highly contractual relationship between management and labor unions. he reports that toyota experiences far fewer defects and requires the fewest labor hours in the world to manufacture a car. to understand the difference, see the section on “labor relations” on the website for NUMMI, the joint GM/Toyota manufacturing plant in Fremont, CA — it is one of the case studies that fukuyama looks at in that chapter.
it took me a while to connect all the dots, but there is evidence from multiple sources that high (low) trust in organizations is linked to fewer (more) errors, failures, and mistakes in all kinds of highly technical industries such as hospitals, not just auto manufacturing plants. the flip side is that a lower level of trust and weaker relationships among doctors and nurses contribute to lower reporting rates of medical errors, and hence more mistakes in practice.
first dot. the fun, short 2007 book “the no asshole rule” by robert sutton (from stanford business school) — on p. 39,
“Edmondson did what she thought was a straightforward study of how leadership and coworker relationships influenced drug treatment errors in eight nursing units. she assumed that the better the leadership and coworker support, the fewer the mistakes people would make. yet edmondson, along with the harvard medical school physicians funding her research, were at first bewildered when questionnaires showed that units with the best leadership and coworker relationships reported the most errors: units with the best leaders reported making as many as ten times more errors than units with the worst leaders. after Edmondson pieced together all the evidence, she figured out that nurses in the best units reported far more errors because they felt psychologically safe to admit their mistakes. nurses in the best units said that mistakes were natural and normal to document and that mistakes are serious because of the toxicity of the drugs, so you are never afraid to tell the nurse manager. the story was completely different in the units where nurses rarely reported errors. fear ran rampant. nurses said things like the environment is unforgiving; heads will roll, you get put on trial, and that the nurse manager treats you as guilty if you make a mistake and treats you like a two year old. as the late corporate quality guru w. edwards deming concluded long ago, when fear rears its ugly head, people focus on protecting themselves, not on helping their organizations improve. edmondson’s research shows that this happens even when lives are at stake.”
second dot. in 2004 i started reading a book about medical errors called “internal bleeding” by wachter & shojania (at ucsf), p. 216
“in one survey of more than seven hundred nurses, 96 percent said they had witnessed or experienced disruptive behavior by physicians. nearly half had pointed to fear of retribution as the primary reason such acts were not reported to superiors.”
p. 222
“one study compared the attitudes of flight crews regarding teamwork to those held by surgical teams. the surgeons themselves thought the team functioned well (three quarters rated teamwork as “high”). the other members of the team begged to differ. only 39% of anesthesiologists, 28% of surgical nurses, 25% of anesthesia nurses, and 10% of anesthesia residents agreed that the level of teamwork was good. nearly half the surgeons felt that junior team members shouldn’t question the decisions of the senior physician. in contrast, 94% of airline pilots reject this sort of hierarchy – possibly because when the captain makes a mistake, everyone else goes down as well.”
wachter discusses an example of what they are doing to address this in an article:
“At my hospital (UCSF Medical Center) and several other centers around the United States, we have enlisted the help of commercial airline pilots to teach us how to communicate better, how to dampen down hierarchies (so that a young nurse feels comfortable questioning a senior doctor when something seems awry), and how to debrief participants after an operation, just as crew members are debriefed after a flight.”
also see a piece on medical ethics and the reporting of errors. another piece, on transforming hospital culture from secrecy to a safety orientation, explains it well:
Focus on Safety, Rather than Secrecy
In the current climate, hospitals often deny that mistakes exist. When they do happen, they react as if the event is an anomaly. The popular approach is to find the person responsible – the “bad apple” – and issue a punishment. Typically, the first question workers ask when they hear a mistake occurred is “who did it?” followed by “have they done it before?” The focus is on the individual rather than the system.
The reality, Dr. Pepper said, is that human beings will always make mistakes. “Human beings have flexibility. This is what distinguishes us from a computer, but it is also what makes us error-prone,” she said. “If we didn’t make errors, we couldn’t be creative.”
A safety culture has a different assumption: It says that errors are common and they are made by good people in a flawed system. It distinguishes between blame-worthy and blameless mistakes (those that are made out of vindication, carelessness or recklessness versus those that are unintentional).
In a safety culture, the discovery and reporting of errors is rewarded, not punished. This doesn’t mean health professionals are not accountable. Accountability is not being perfect, but rather it involves acknowledging the error, apologizing, repairing the harm, discovering the causes of the error and fixing the system or process.
Initiating this kind of safety culture does work, Dr. Pepper said.
A five-year study in a variety of industries demonstrated that a behavioral safety initiative resulted in a 29-percent improvement in safety practices in one year, which rose to a 69-percent improvement by the fifth year.
third dot. it turns out that hierarchy, secrecy, and underreporting of errors were major factors behind the poor safety record of the soviet nuclear industry, and in particular the poor design and operation of the chernobyl reactor, according to richard rhodes in “Arsenals of Folly: The Making of the Nuclear Arms Race.” from his first chapter, on chernobyl, p. 7:
“unknown to the soviet public and the world, at least 13 serious power reactor accidents had occurred in the soviet union before the one at chernobyl”
after reading “internal bleeding,” in late 2004 i started studying case studies of the largest technological disasters and found that “system understanding” was missing in all of these accidents. a good portion of this can be avoided with effective team reviews where trust already exists between team members; if trust does not exist, you might as well forget the reviews.
in the end, the lesson is simple. in highly technical industries where corporate success depends on every team member, hire and work with teammates whom you can trust and with whom you can be completely open. otherwise, you can expect failures at all levels that you won’t know about until it’s too late.