If you were to judge only by what's shown on detective-procedural television shows like CSI, you might think forensic investigations and crime lab results are virtually infallible. But from time to time, a government study comes along to show just how far that picture is from the truth.
Take, for example, a groundbreaking study ordered by Congress and released in 2009 by the National Academy of Sciences' National Research Council. It pointed out numerous shortcomings, including scant scientific validation, for many forms of forensic evidence other than DNA, and urged more research, better standards and greater credentials for crime labs.
Then in April 2015, the Federal Bureau of Investigation issued a report admitting its examiners' microscopic hair analysis testimony frequently overstated the scientific reliability of such tests. In fact, DNA evidence in some instances revealed crime labs had wrongly identified the source of hairs found at crime scenes.
On September 20th, 2016, after a year-long review of research studies, the President's Council of Advisors on Science and Technology (PCAST) issued a new report that was sharply critical of some forensic evidence methods commonly used in federal and state criminal courts.
The PCAST report, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods, looked not only at the innate reliability of several types of forensic evidence -- including bite mark analysis, firearms identification, microscopic hair analysis, and footwear and tire-tread analysis -- but also at how the reliability of even better-validated types of evidence, such as DNA and latent fingerprints, is presented in criminal prosecutions.
The new study found quite a few potentially serious problems. For some types of forensic evidence -- notably bite marks -- proof of scientific validity was weak or absent, especially for "feature-comparison" attempts to link a particular sample to a particular source. Even for more reliably established types of evidence analysis, the report cautioned, experts may exaggerate their value by claiming greater-than-provable confidence in such findings.
The PCAST report also recommended specific actions that federal agencies -- such as Commerce's National Institute of Standards and Technology, the White House's Office of Science and Technology Policy, and the FBI Laboratory -- could take to bring greater scientific certainty to forensic testing, as well as steps the Justice Department and federal courts could take to improve courtroom use of forensic test results.
Perhaps predictably, the PCAST report drew mixed responses. Some noted jurists associated with the project, such as federal appellate judges Alex Kozinski and Harry Edwards, wrote op-eds praising the report, but the reaction from groups representing prosecutors and crime labs was quite different. The National District Attorneys Association, for example, shot off a press release calling the PCAST report "scientifically irresponsible" and attacking the panel's members as unqualified to pass judgment on the issues they addressed.
The FBI also dissented, saying it takes issue with "many of the scientific assertions and conclusions" in the PCAST report, and the Justice Department has advised federal and state prosecutors that it plans to send them materials they can use to counter claims in the report should litigants raise them.
Christopher Zoukis is the author of College for Convicts: The Case for Higher Education in American Prisons (McFarland & Co., 2014) and Prison Education Guide (Prison Legal News Publishing, 2016). He can be found online at ChristopherZoukis.com, PrisonEducation.com and PrisonLawBlog.com.