See - The FBI’s Forensics Disaster - Reason.com
"x x x.
The FBI has admitted that, prior to 2000, its hair examiners dished out claptrap in court and in their reports when describing hair analysis in hundreds of cases, though the flawed science involved could affect as many as 3,000 cases. The FBI itself quotes Peter Neufeld, co-founder of the Innocence Project: "These findings confirm that FBI microscopic hair analysts committed widespread, systematic error, grossly exaggerating the significance of their data under oath with the consequence of unfairly bolstering the prosecutions' case."
The FBI hair comparison experts were found to have made "erroneous statements" in about 96 percent of the studied cases in which "examiners provided testimony used to inculpate a defendant at trial." In 33 cases, errors were found in the analysis of defendants who were subsequently sentenced to death. Of those defendants, nine have already been executed, and five others died while on death row.
Worse still, this bad hair science is just the latest example of phony forensics. Another whole field of forensic science, compositional bullet lead analysis, was shown to be bogus in a 2004 National Academy of Sciences study. The FBI had been testifying that the chemical composition of a bullet could identify it down to the maker, or even the batch, or even, in some cases, the box. No, said the study: "The available data do not support any statement that a crime bullet came from a particular box of ammunition."
And then there's the much-questioned science of bite-mark analysis, extensively reported on by former Reason editor Radley Balko. A 2002 study found a "false positive" error rate of 64 percent in bite-mark analysis. The Chicago Tribune reported that the study's author "figured that on average, they falsely identified an innocent person as the biter nearly two-thirds of the time." Fortunately, the FBI does not do bite-mark analysis. Unfortunately, other labs do.
A 1992 study showed that many traditional arson investigation techniques were bogus. And yet Texas convicted Cameron Todd Willingham of murder mostly on the basis of those very techniques. In 2004, 12 years after the release of the report discrediting the crucial techniques used in Willingham's case, he was executed for his supposed crime.
Even fingerprints and DNA can go wrong. Fingerprints are pretty reliable when both the "known" and "unknown" images are clear and distinct. But the "unknown" image is often far from clear and distinct. It might be smudged, a small partial print, overlain by other possibly smudged prints, or deposited on an irregular surface like wood grain. In those cases, errors become more likely.
In 2004, the FBI made a "100 percent match" of a print from the deadly Madrid train bombing to Portland-area lawyer Brandon Mayfield. They turned out to be 100 percent wrong, however. The FBI later apologized to Mayfield, who claimed to have been profiled because he was a convert to Islam, and paid out $2 million to settle a suit he had filed against them. In another famous misidentification, that of Shirley McKie, a Scottish police agency was found to have mistaken wood grain for fingerprint ridges!
In ideal conditions, DNA is our most reliable forensic technique. Conditions are less than ideal if the crime-scene sample is small or corrupted, or if it has the DNA of more than one person mixed together in it. And we have seen mistakes there, too. Josiah Sutton was convicted of rape largely on DNA evidence that was later shown to be bogus. He was imprisoned at the age of 16 and released more than four years later.
What in the world is going on here? It's partly bad science, partly bad organization, and wholly unacceptable.
x x x."