failure analysis tactics and cognitive bias

Garry Nieuwkamp Garry.Nieuwkamp at HEALTH.NSW.GOV.AU
Fri Aug 16 01:06:22 UTC 2013

I'm becoming increasingly sceptical of the utility of this entire body of 'research'. Metacognition might have some utility, but it is quite possible to become an experienced clinician who doesn't make errors without being familiar with any of this literature. So what happens in these instances?

In the June 27, 2013 edition of NEJM, the editor placed Pat Croskerry's article alongside an article titled "Uncertainty - The Other Side of Prognosis". It had me thinking about the fact quoted by Pat that in 55% of fatal PEs the diagnosis was missed. One is encouraged to think that the problem is one of a missed diagnosis, rather than one of the inherent problems of uncertainty and the opacity to our minds of aspects of the world we live in. Aristotle warned us against seeking precision beyond what the subject matter allows. To do so would itself be a form of cognitive bias. But now we're back at the beginning.

Garry Nieuwkamp MBBS FACEM DA MA PhD


From: Lorri Zipperer [Lorri at ZPM1.COM]
Sent: Friday, 16 August 2013 4:30 AM
To: Garry Nieuwkamp; Society to Improve Diagnosis in Medicine
Subject: Re: [IMPROVEDX] failure analysis tactics and cognitive bias

My colleague Dean Hooper shared these thoughts on my query.

I thought they'd be useful to include in our discussion here. Lorri

“What a healthy, useful, politically incorrect way to look at the problem!! In his seminal paper from the '83 NATO conference on human error, Hollnagel called for the focus of error research to be not the error itself, but the mental mechanisms we use to solve problems. Rasmussen subsequently created a taxonomy labeled "mechanisms of error" as direct causes of adverse events. These mechanisms were such things as inference, deduction, information processing, etc. The researcher then looks at these mechanisms to figure out what went wrong. I don't know of anyone else who comes close to formalizing cognitive bias as an immediate cause. I've always thought Rasmussen's terms were too broad. Using cognitive bias as a cause could add some precision. I would definitely like to discuss this further and hear what others have to say.

PS: Politically incorrect in that human factors practitioners don't like to look at the human as the source of error. Of course, you would still have to assess what caused the bias in the first place in order to determine effective mitigations.

Dean Hooper
Principal at HE Consulting LLC
dean.hooper at”

On Aug 14, 2013, at 4:06 PM, Lorri Zipperer <Lorri at ZPM1.COM> wrote:
I wondered if anyone has used system failure analysis techniques (5 Whys, FMEA, RCA, etc.) to identify cognitive biases (e.g., availability, confirmation, anchoring) and subsequently design mechanisms/strategies to reduce the impact of cognitive bias on decision making.


 Lorri Zipperer, Cybrarian and editor

Zipperer Project Management

Patient Safety: Perspectives in Evidence, Information and Knowledge Transfer

ISBN 978-1-4094-3857-1

Knowledge Management in Health Care

ISBN 978-1-4094-3883-0

lorri at


Data: nurse numbers

Information: nurse textbooks

Evidence: nurse effectiveness

Knowledge: nurse experience


Moderator: Lorri Zipperer (Lorri at), Communication co-chair, Society to Improve Diagnosis in Medicine


Save the date: Diagnostic Error in Medicine 2013, September 22-25, 2013, in Chicago, IL.

This message is intended for the addressee named and may contain confidential information. If you are not the intended recipient, please delete it and notify the sender.

Views expressed in this message are those of the individual sender, and are not necessarily the views of NSW Health or any of its entities.

