[EXTERNAL] Re: [IMPROVEDX] The Lab Says It’s Cancer. But Sometimes the Lab Is Wrong. - The New York Times

Samuel, Rana Rana.Samuel at VA.GOV
Thu Jun 29 21:27:40 UTC 2017

Helene – while I also chafe at the race, wealth, and gender discrimination that exists in medical research and medical policy making, your email offers a classic example of how error is enthroned and then perpetuated by poor peer review of articles submitted for publication, by sensational headlines, and by uncritical dissemination of the conclusions.

This study was not a comparison of breast biopsy diagnostic accuracy. It was just a study of discordance rates when pathologists were asked to make a diagnosis under highly artificial conditions far removed from the way pathology is practiced in the real world:



Participants were not given any patient history other than the patient's age and the type of biopsy.

We look up all prior diagnoses on each patient before the case is signed out. This helps me understand the medical context in which the diagnosis is being made, and helps make sure I am not missing anything (e.g., in a patient with a history of lung cancer who is now having a breast lump biopsied, I would probably do immunostains to differentiate a lung cancer metastatic to the breast from a primary breast carcinoma, especially if the tumor cells did not have the typical appearance expected in breast cancer).

Participants could only look at one slide

The entire case is examined. Clues or findings on one slide often make me go back to other slides to see if I missed similar findings the first time I looked.

Participants could not look at additional tissue levels

The standard of care is to look at multiple tissue levels, especially on small pieces of tissue, like breast core biopsies.

Participants could not correlate the findings on the slide with the clinical and radiologic findings and impression

This is the essence of all pathology: clinical-radiologic-pathologic correlation. If what I see on the slide does not explain the patient's signs, symptoms, and radiologic findings, then I need to get additional levels, submit more tissue (if a large resection), and if there is still no correlation, I raise the possibility that the lesion was not excised (was missed).

Participants could not discuss or show the case to others

Every case where I have ANY question at all about the diagnosis is discussed at a daily QA conference with all pathologists in the department at a multiheaded microscope. If the case is especially difficult, we circulate it among all pathologists for detailed individual review. If we still don’t achieve consensus, we send the case out to an ‘expert’ breast pathologist (and sometimes to more than one expert). In very rare cases where even the experts don’t agree, we sign out the case with a differential, explain the steps we took to ensure accuracy, and indicate the reason for the ongoing uncertainty of the diagnosis.

Participants could not perform any special stains to clarify the diagnosis

This would be failure to practice to the standard of care in the real world. Borderline cases almost always need immunostains.

I could go on and on, but I will stop by pointing out that the study pathologists performed as well as the reference pathologists:
The concordance rate for the study pathologists was 75%.
The concordance rate for independent diagnoses by the reference pathologists was also 75%!

The concordance rate for the consensus-driven reference diagnosis was only 90%. If the reference pathologists in this study had had access to some of the error-reduction tools used in real-world pathology (deeper sections, special stains, and multidisciplinary conferences where the surgeon, oncologist, radiologist, and pathologist all discuss the case and make sure that all findings are concordant before the patient is treated), they would probably have far less than 1% diagnostic disagreement that actually matters to the patient (i.e., affects treatment).

The title of the JAMA paper was “Diagnostic concordance among pathologists interpreting breast biopsy specimens.” A more accurate title would have been “Diagnostic concordance when not practicing to the standard of care due to an artificial environment.” The authors themselves conclude by saying, “Further research is needed to understand the relationship of these findings with patient management.”
There is no doubt that errors in diagnosis are happening every day, but this study, instead of identifying factors and conditions that contribute to accuracy or inaccuracy, just muddied the waters.

None of this takes away from the accuracy of your concern that a non-representative group of people is making discriminatory health policy decisions heavily biased by the narrow lens of their own life experiences.

From: HM Epstein [mailto:hmepstein at GMAIL.COM]
Sent: Wednesday, June 28, 2017 9:46 PM
Subject: [EXTERNAL] Re: [IMPROVEDX] The Lab Says It’s Cancer. But Sometimes the Lab Is Wrong. - The New York Times

We want to prevent every error we can. This article talks about identity mix-ups at the labs, not errors in evaluating the biopsy results, although obviously the lab errors contribute to inaccurate diagnostic communication and treatment.

I'd like to look at this from a different angle. I recognize we are comparing apples to oranges, because the type of Dx error is different, but the attitudes of the legislators burn me up.

Despite the relatively low rate of errors labs have seen with prostate biopsies, the article says our Congress has still introduced legislation this May to include DNA fingerprinting in Medicare coverage for prostate cancer only. But there are 806,000 prostate biopsies a year and an overall error rate of 0.5%. They'll negotiate a rate for the DNA fingerprinting, but at 50% of cost, that's $121 million per year.
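For what it's worth, the $121 million figure holds up as a quick back-of-envelope check, assuming the roughly $300-per-sample test price quoted in the article below and a negotiated Medicare rate of 50% of cost:

```python
# Back-of-envelope check of the ~$121M/year estimate.
# Assumptions: ~$300 per DNA-fingerprinting sample (the price quoted
# in the NYT article) and a negotiated Medicare rate of 50% of cost.
biopsies_per_year = 806_000
cost_per_sample = 300            # dollars, per the article
negotiated_fraction = 0.50       # assumed negotiated rate

annual_cost = biopsies_per_year * cost_per_sample * negotiated_fraction
print(f"${annual_cost / 1e6:.1f} million per year")  # prints "$120.9 million per year"
```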

Compare that to the error rate of breast cancer biopsies and look again at the legislation. In 2015, it was discovered that the error rate for the 1.6 million breast cancer biopsies was 20-25% (depending on the type of breast cancer tested). It's not an identity issue, although that probably exists as well; it's a diagnostic error for biopsies that show DCIS or atypical cells.

That's double the number of biopsies and 50x the error rate, but the male-dominated Congress members writing this legislation have decided that having Medicare cover costs for DNA fingerprinting is necessary for prostate biopsies, while breast cancer is a pre-existing condition that isn't covered. If a woman with breast cancer is initially told she doesn't have it, and then they discover she does, she won't be covered. On the other hand, if she is told she does have breast cancer (right or wrong) and endures the surgery (lumpectomy or mastectomy), radiation or chemotherapy, and reconstructive surgery, she can't change insurance policies or stop working, because she would lose her insurance. Then her breast cancer (real or mistaken) is a pre-existing condition. One more indignity: if the patient discovers she was mistakenly diagnosed with breast cancer and gets breast cancer later, she will have to go to court to prove the breast cancer is a new Dx.

Sorry to rail. But if we can't improve the error rates for breast cancer biopsies, if we aren't even researching how to fix this, we are creating a health care fiscal crisis for over 300 thousand women every year.


Breast Biopsies Leave Room for Doubt, Study Finds
March 17, 2015

Abby Howell chose to have a biopsy when a mammogram showed some calcification two years ago. Instead of being definitive, the biopsy found atypia — abnormal duct cells that are not cancerous but which some doctors recommend having removed.


Breast biopsies are good at telling the difference between healthy tissue and cancer, but less reliable for identifying more subtle abnormalities, a new study finds.

Because of the uncertainty, women whose results fall into the gray zone between normal and malignant — with diagnoses like “atypia” or “ductal carcinoma in situ” — should seek second opinions on their biopsies, researchers say. Misinterpretation can lead women to have surgery and other treatments they do not need, or to miss out on treatments they do need.

The new findings, reported Tuesday in JAMA,<http://jama.jamanetwork.com/article.aspx?doi=10.1001/jama.2015.1405> challenge the common belief that a biopsy is the gold standard and will resolve any questions that might arise from an unclear mammogram or ultrasound.

In the United States, about 1.6 million women a year have breast biopsies; about 20 percent of the tests find cancer. Ten percent identify atypia, a finding that cells inside breast ducts are abnormal but not cancerous. About 60,000 women each year are found to have ductal carcinoma in situ, or D.C.I.S., which also refers to abnormal cells that are confined inside the milk ducts and so are not considered invasive; experts disagree about whether D.C.I.S. <http://www.cancer.gov/cancertopics/pdq/screening/breast/healthprofessional/page3> is cancer.

“It is often thought that getting the biopsy will give definitive answers, but our study says maybe it won’t,” said Dr. Joann G. Elmore<http://www.uwmedicine.org/bios/joann-elmore>, a professor at the University of Washington School of Medicine in Seattle and the first author of the new study on the accuracy of breast biopsies.

Her team asked pathologists to examine biopsy slides, then compared their diagnoses with those given by a panel of leading experts who had seen the same slides. There were some important differences, especially in the gray zone.

An editorial in JAMA<http://jama.jamanetwork.com/article.aspx?doi=10.1001/jama.2015.1945> called the findings “disconcerting.” It said the study should be a call to action for pathologists and breast cancer scientists to improve the accuracy of biopsy readings, by consulting with one another more often on challenging cases and by creating clearer definitions for various abnormalities so that diagnoses will be more consistent and precise. The editorial also recommended second opinions in ambiguous cases.

A second opinion usually does not require another biopsy; it means asking one or more additional pathologists to look at the microscope slides made from the first biopsy. Dr. Elmore said doctors could help patients find a pathologist for a second opinion.

A surgeon not involved with the study, Dr. Elisa Port, a co-director of the Dubin Breast Center and the chief of breast surgery at Mount Sinai Hospital in Manhattan, said the research underlined how important it is that biopsies be interpreted by highly experienced pathologists who specialize in breast disease.

“As a surgeon, I only know what to do based on the guidance of my pathologist,” Dr. Port said. “Those people behind the scenes are actually the ones who dictate care.”

In Dr. Elmore’s study, the panel of three expert pathologists examined biopsy slides from 240 women, one slide per case, and came to a consensus about the diagnosis.
[Video: Pathology of Errors — as pathologists help doctors diagnose breast cancer at an earlier, more survivable stage, the potential for mistakes has grown.]

“These were very, very experienced breast pathologists who have written textbooks in the field,” Dr. Elmore said.

Then the slides were divided into four sets, and 60 slides were sent to each of 115 pathologists in eight states who routinely read breast biopsies. The doctors interpreted the slides and returned them, and the same set was sent to the next pathologist. The study took seven years to complete.

The goal was to find out how the practicing pathologists stacked up against the experts. The task was tougher than actual practice, because in real cases pathologists can consult colleagues about ambiguous findings and ask for additional slides. They could not do so in the study.

There was good news and bad news. When it came to invasive cancer — cancer that has begun growing beyond the layer of tissue in which it started, into nearby healthy tissue — the outside pathologists agreed with the experts in 96 percent of the interpretations, which Dr. Elmore called reassuring. They found the vast majority of the cancers.

For completely benign findings, the outside pathologists matched the experts in 87 percent of the readings, but misdiagnosed 13 percent of healthy ones as abnormal.

The next two categories occupied the gray zone. One was D.C.I.S. For this condition, the pathologists agreed with the experts on 84 percent of the cases. But they missed 13 percent of cases that the experts had found, and diagnosed D.C.I.S. in 3 percent of the readings where the experts had ruled it out.

The finding is of concern, because D.C.I.S. sometimes becomes invasive cancer, and it is often treated like an early-stage cancer, with surgery and radiation. Missing the diagnosis can leave a woman at increased risk for cancer — but calling something D.C.I.S. when it is not can result in needless tests and treatments.

The second finding in the gray zone was atypia, in which abnormal, but not cancerous, cells are found in breast ducts. Women with atypia have an increased risk of breast cancer, and some researchers recommend surgery to remove the abnormal tissue, as well as intensified screening and drugs to lower the risk of breast cancer.

But in the study, the outside pathologists and the experts agreed on atypia in only 48 percent of the interpretations. The outside pathologists diagnosed atypia in 17 percent of the readings where the experts had not, and missed it in 35 percent where the experts saw it.

“Women with atypia and D.C.I.S. need to stop and realize it’s not the same thing as invasive cancer, and they have time to stop and reflect and think about it, and ask for a second opinion,” Dr. Elmore said.

Abby Howell, 57, who lives in Seattle, two years ago had some calcifications show up on a mammogram, which are sometimes a sign of cancer. She was given the option of just mammograms every six months or having a biopsy. She chose the biopsy, thinking it would be definitive. But instead, it showed atypia.

Ms. Howell, who has a master’s degree in public health, looked up the condition and realized it was unclear whether those odd-looking cells would ever lead to cancer. Surgery was recommended, but she decided to watch and wait instead. So far, her mammograms have been normal, but the experience has shaken her peace of mind.

“If I had to do it all over again, I wouldn’t have jumped for the biopsy,” Ms. Howell said. “I really regret it. In a way it’s made more anxiety in my life.”

Sent from my iPhone

Helene's Website<http://hmepstein.com/>
Helene's Twitter Account<https://twitter.com/hmepstein>
Diagnostic Error's Twitter Account<https://twitter.com/DxErrors>
Diagnostic Errors on Facebook<https://www.facebook.com/DiagnosticErrors/>
Helene on LinkedIn<https://www.linkedin.com/in/helenekepstein/>

On Tue, Jun 27, 2017 at 9:59 PM, David Meyers <dm0015 at icloud.com<mailto:dm0015 at icloud.com>> wrote:

The Lab Says It’s Cancer. But Sometimes the Lab Is Wrong.
By GINA KOLATA<https://www.nytimes.com/by/gina-kolata>, June 26, 2017
Timothy Karman discovered he had prostate cancer a few days after being told he was cancer-free as a result of a lab error. (Photo: Logan R. Cyrus for The New York Times)

It was the sort of bad news every patient fears. Merlin Erickson, a 69-year-old retired engineer in Abingdon, Md., was told last year that a biopsy of his prostate was positive for cancer.

Mr. Erickson, worried, began investigating the options: whether to have his prostate removed, or perhaps to have radiation treatment. But a few days later, the doctor called again.

As it turned out, Mr. Erickson did not have cancer. The lab had mixed up his biopsy with someone else’s.

“Obviously, I felt great for me but sad for that other gentleman,” Mr. Erickson said.

The other gentleman was Timothy Karman, 65, a retired teacher in Grandy, N.C. At first, of course, he had been told he was cancer-free. The phone rang again a few days later with news of the mix-up and a diagnosis of cancer.

Ultimately he had his prostate removed. “I said, ‘Mistakes happen,’” Mr. Karman said.

They may be happening more often than doctors realize. There is no comprehensive data on how often pathology labs mix up cancer biopsy samples, but a few preliminary studies suggest that it may happen to thousands of patients each year.

Fortunately, there is now a high-tech solution: a way to fingerprint and track each sample with the donor’s own DNA.

But it costs the patient about $300 per sample, and labs have been slow to adopt it, saying that the errors are rare and the test too expensive, and that they have plenty of checks in place already to avoid mix-ups.

Dr. John Pfeifer, vice chairman for clinical affairs in the pathology and immunology department at Washington University School of Medicine in St. Louis, who has studied the problem, is not quite so sanguine.

“All the process improvement in the world does not get rid of human errors,” he said. “Millions get biopsies every year. Is society going to say, ‘Yeah, mistakes happen but we’re not going to look for them?’”

The fingerprinting method, offered by Strand Diagnostics, is simple: A doctor gets a DNA sample by swabbing inside a patient’s mouth. It is sent directly to Strand with a bar code identifying the patient.

That bar code is also used to label the patient’s biopsy. If it shows cancer, the pathologist sends the biopsy cells to Strand. The lab matches the DNA from the swab to that of the biopsy cells.

If the DNA fingerprints do not match, that signals a lab mix-up. That was how pathologists discovered that samples from Mr. Erickson and Mr. Karman had been switched.

Despite the best efforts of pathologists to avoid these mix-ups, hints of trouble have been turning up for years.

In 2011, researchers conducting a large clinical trial reported that two men who were found to have prostate cancer — and who had their prostates removed — did not have the disease at all.

Instead, their biopsy samples had been mishandled. (A third mix-up was caught before any action was taken.)

The researchers then performed a rigorous DNA analysis<https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3107764/> of more than 10,000 biopsies taken during the period. Twenty-seven were mislabeled. Among 6,733 blood samples, 31, or 0.5 percent, had been switched.

The percentage of errors may not be high. But each one may lead a patient down a life-altering path, to a grueling treatment that was unnecessary, or to the neglect of a cancer that may or may not prove deadly.

Pathologists see lab mix-ups routinely, but often the mistake is obvious — a sample supposed to be from a brain actually is from a lung, for example.

“You say, ‘O.K., yeah, there’s been a mistake,’” Dr. Pfeifer said. “I don’t know many pathologists who haven’t had that occur.”

But what about mix-ups that are not so obvious — two lung tissue samples that are switched, or two breast samples? Dr. Pfeifer turned to DNA fingerprinting to determine how often such samples are mixed up at Washington University.
Merlin Erickson was told he had cancer after a pathology lab mixed up his biopsy with another patient’s. (Photo: Nate Pesce for The New York Times)

He found a few errors<https://academic.oup.com/ajcp/article-lookup/doi/10.1309/AJCPLNO4PFVZVA4P>. One man’s lung tissue was cancerous, but DNA analysis showed the lung cells were not his.

Another patient had a liver biopsy that showed cancer, but the cells were from somebody else. Still another man was mistakenly thought to have advanced aggressive prostate cancer; again, DNA showed the tissue was somebody else’s.

To really get an idea of the frequency of these mix-ups nationwide, however, Dr. Pfeifer needed a large database.

Ted Schenberg, the chief executive at Strand, offered to supply the data: more than 13,000 biopsy results from men evaluated for prostate cancer at a number of laboratories.

Dr. Pfeifer agreed to review the data, although he knew the company had a significant financial interest in the outcome. To minimize conflicts of interest, Mr. Schenberg would not pay him for the work and would not be involved in the analysis.

Dr. Pfeifer documented<https://academic.oup.com/ajcp/article/139/1/93/1766518/Rate-of-Occult-Specimen-Provenance-Complications> two types of errors in this large sample: an “absolute switch,” in which one patient’s tissue was mixed up with another’s, and a “partial switch,” in which some of one patient’s cells ended up mixed in with cells from someone else.

“Every lab had both of these errors,” Dr. Pfeifer said. In general, the rates were low — 0.26 percent of samples were absolute switches, and 0.67 percent were partial switches.

But the rates were slightly higher among independent labs, including large commercial companies that handle huge numbers of specimens: 0.37 percent were absolute switches, and 3.14 percent were contaminated.
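For a sense of scale, applying those overall switch rates to the roughly 806,000 annual prostate biopsies cited elsewhere in this thread yields the kind of patient counts behind the article's "thousands of patients each year" estimate. This is a rough extrapolation for illustration, not a figure from Dr. Pfeifer's study:

```python
# Rough extrapolation (not a figure from the study itself): apply the
# reported switch rates to the ~806,000 U.S. prostate biopsies per year.
biopsies_per_year = 806_000
absolute_switch_rate = 0.0026    # 0.26% absolute switches
partial_switch_rate = 0.0067     # 0.67% partial switches

absolute_switches = round(biopsies_per_year * absolute_switch_rate)
partial_switches = round(biopsies_per_year * partial_switch_rate)
print(absolute_switches, partial_switches)  # prints "2096 5400"
```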

Remedying these infrequent errors is a costly endeavor. Most private insurers are willing to cover the testing; it’s far less expensive than paying for unnecessary treatment, or treatment late in the course of a disease that should have been identified sooner.

Medicare, on the other hand, does not cover DNA fingerprinting of biopsies, and many of the patients receiving cancer biopsies are older. (Legislation introduced in Congress in May would require the program to cover the service, but only for prostate biopsies.)

Consumers may request DNA fingerprinting themselves, but there is no guarantee that the pathology lab to which their biopsies are sent will offer the service.

Recently, a group of researchers led by Dr. Kirk Wojno, a pathologist at the Comprehensive Urology and Comprehensive Medical Center in Royal Oak, Mich., decided to address the financial obstacles to widespread DNA testing of biopsies, in this case specifically for prostate cancer.

Unnecessary treatments and lawsuits come with a high price tag, the researchers concluded.

There are about 806,000 prostate biopsies a year in the United States. Lab mix-ups of these biopsies alone cost the nation about $879.9 million per year<https://www.ncbi.nlm.nih.gov/pubmed/25463992>. That figure includes cost of lawsuits that result from mix-ups.

The cost of doing DNA fingerprinting, Dr. Pfeifer argues, “is well within the range of costs we see with other clinical testing.”

“You can make an argument that for prostate cancer you should probably do this for every patient at the time of initial diagnosis,” he added. “By extension, you probably have the same situation for other diseases.”

But other experts are not convinced the test is worth the cost.

While mix-ups do happen, pathologists have put a series of steps<http://www.cap.org/web/home/resources/cap-guidelines/current-cap-guidelines/uniform-labeling-surgical-pathology> in place to try to avoid them, including 26 requirements for labeling containers, identifying patients, and ordering tests, said Dr. Raouf Nakhleh, vice chair of the College of American Pathologists’ Council on Scientific Affairs and a professor of pathology at the Mayo Clinic in Jacksonville, Fla.

“We get paid $125 to process a specimen and produce a diagnosis,” he said. He turns to DNA fingerprinting only when he suspects a mix-up — for example, a clinical exam is at odds with a pathology report.

Dr. Jennifer Hunt, who chairs the Department of Pathology at the University of Arkansas for Medical Sciences, also objects to the cost. “It’s related to finances,” she said. “And the risk of error is extraordinarily low.”

Dr. Sanford Siegel of Chesapeake Urology used to feel the same way. But in 2015, a new patient had a blood test that indicated he might have prostate cancer. He had a biopsy, which confirmed it.

The man had his prostate removed — only to learn he had been the victim of a lab mix-up. His reaction, as Dr. Siegel recalled? “I am calling a lawyer.”

After that, Chesapeake Urology made the DNA test mandatory.

Correction: June 27, 2017
An earlier version of this article misstated the percentage of blood samples, out of 6,733, that were switched in a clinical trial in 2011. It was 0.5 percent, not 0.05 percent.



To unsubscribe from the IMPROVEDX list, click the following link:
http://list.improvediagnosis.org/scripts/wa-IMPDIAG.exe?SUBED1=IMPROVEDX&A=1

Visit the searchable archives or adjust your subscription at: http://list.improvediagnosis.org/scripts/wa-IMPDIAG.exe?INDEX

Moderator: David Meyers, Board Member, Society to Improve Diagnosis in Medicine

To learn more about SIDM visit:


More information about the Test mailing list