The Texas Prosecutor, Jan-Feb 2018

Responding to PCAST-based attacks on forensic science

In September 2016, a relatively obscure federal commission issued a report calling into question nearly every forensic science discipline currently used by law enforcement. While this report by the President’s Council of Advisors on Science and Technology (PCAST) was immediately controversial within the forensic science community, it has taken much longer for both prosecutors and defense attorneys to begin using it in litigation over expert testimony. However, a recent article in the American Bar Association’s Criminal Justice magazine indicates that PCAST Report-based attacks on forensic science are on the horizon.1 With an understanding of what PCAST is, what its report says, and the problems with the report, we prosecutors can be ready to respond to these attacks.

What is PCAST?
“PCAST is an advisory group of the nation’s leading scientists and engineers who directly advise the President and Executive Office of the President.”2 It is intended to make “policy recommendations in the many areas where understanding of science, technology, and innovation is key to strengthening our economy and forming policy that works for the American people.”3 PCAST’s published reports since 2014 have addressed such wide-ranging subjects as big data and privacy, systems engineering in healthcare, and ensuring long-term U.S. leadership in semiconductors. While PCAST’s membership consists of individuals who are distinguished in their fields, it is critical to note that virtually none of those fields are forensic disciplines. Its members include a systems engineer, a physician specializing in geriatric medicine, a string theorist, and the Executive Chairman of Alphabet, Google’s parent company.

The PCAST Report
The report itself focuses on six “forensic feature-comparison methods” that attempt to determine whether evidentiary samples can be associated with source samples based on the presence of similar patterns, characteristics, features, or impressions.4 The methods it examines are:
•    DNA analysis of single-source and simple mixture samples,
•    DNA analysis of complex mixture samples,
•    bitemark analysis,
•    latent fingerprint analysis,
•    firearm and toolmark analysis, and
•    footwear analysis.5
    The report primarily addresses the reliability of these disciplines for purposes of admissibility under Federal Rule of Evidence 702 (and by implication, its state equivalents, including Texas Rule of Evidence 702 and the Kelly test). Although the report claims to leave decisions about legal admissibility to the courts,6 it also attempts to establish its own threshold tests for admissibility based on error rates.7 The report creates its own concept, termed “foundational validity,” which “requires that [a method] be shown, based on empirical studies, to be repeatable, reproducible, and accurate.”8 The report then says that “foundational validity” corresponds to the legal requirement of “reliable principles and methods.”9 “Validity as applied” means “that the method has been reliably applied in practice”10 and corresponds to the legal requirement that those principles and methods be properly applied in the particular case.11
    The report heavily emphasizes error rates in both foundational validity12 and validity as applied,13 relying on studies that were designed to determine a method’s error rate by evaluating the error rates of individual analysts. The design of those studies and their focus on individual analyst error rates are at odds with reality in the laboratory. For example, standard practice in virtually all accredited laboratories involves quality assurance mechanisms designed to detect errors by individual analysts; indeed, the operation and effectiveness of those mechanisms are key components of the accreditation process.14 The report nonetheless relied upon studies that excluded such verification steps, which suggests that the error rate in actual practice is lower than the studies calculated.15 Additionally, in citing a false positive rate, the report relied on a latent fingerprint study that itself contained a calculation error that PCAST failed to detect.16 And by focusing on individual analysts, PCAST overlooks the fact that the studies do not establish the error rate of a discipline or method; they show only the error rates of the particular analysts studied.17
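    To see concretely why error-rate framing matters, consider a worked example with purely hypothetical numbers (these figures are illustrative only and are not drawn from any study PCAST cited). Suppose a black-box study records 5 false positives among 2,500 nonmated comparisons. The point estimate of the false positive rate is:

\[
\widehat{\mathrm{FPR}} = \frac{\text{false positives}}{\text{nonmated comparisons}} = \frac{5}{2{,}500} = 0.2\% \quad (\text{about 1 in 500}).
\]

The report, however, generally urges reporting the upper 95-percent confidence bound on the false positive rate rather than the point estimate. Using the standard one-sided Clopper-Pearson bound (computed here with the usual chi-square approximation for rare events):

\[
\mathrm{FPR}_{95} \approx \frac{\chi^2_{0.95,\;2(x+1)}}{2n} = \frac{21.03}{2 \times 2{,}500} \approx 0.42\% \quad (\text{about 1 in 240}),
\]

where x = 5 is the number of false positives and n = 2,500 is the number of comparisons. Because the upper bound roughly doubles the point estimate even in this simple example, which figure an expert is permitted to report can materially change how a jury hears the evidence, and prosecutors should understand where each number comes from.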

Responses from the forensic science community
Understandably, the report prompted a number of rebuttals from across the forensic science community and the federal government. Then-Attorney General Loretta Lynch released a statement advising that the U.S. Department of Justice would not adopt the report’s recommendations.18 The FBI published comments noting the report’s “subjectively derived” criteria and its disregard of numerous published studies that would meet the report’s own criteria for “foundational validity.”19 The American Society of Crime Laboratory Directors also released a response detailing the flaws in the report’s methodology.20 The Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) noted PCAST’s failure to address firearms and toolmark studies that had been submitted for its consideration.21 And the Association of Firearm and Tool Mark Examiners pointed out that the report’s insistence that a single study serve as the benchmark for foundational validity suggested a “fundamental lack of understanding” of the extent of research in the field.22

Use by the defense
Despite these numerous problems with the report’s methodology and findings, prosecutors should expect to see an increasing number of challenges to the State’s experts based upon the report. In the Summer 2017 issue of Criminal Justice, the chief defender for the Federal Public Defender’s Office in Puerto Rico laid out a four-step strategy for using the report to exclude or discredit the State’s forensic experts.23
•    Step One is to begin indoctrinating the judge through appeals to emotion rather than reason.24 “By establishing an alternative emotion, we increase our chances that the judge’s demand will pay homage to the NRC25 and PCAST reports while precluding or limiting the introduction of the government’s damaging expert testimony.”26
•    Step Two attempts to exclude the expert testimony entirely by arguing that the PCAST Report is “novel evidence” that should call into question even well-established forensic disciplines.27
•    Step Three, if the testimony cannot be excluded, is to limit it, especially in terms of the expert’s certainty about his conclusions.28
•    Finally, Step Four is to neutralize the expert testimony with a competing expert.29 Interestingly, the author does not recommend bringing a defense expert from the same field, as that would lend legitimacy to the State’s use of the forensic discipline.30 Instead, he recommends bringing in an academic from a local university, even if that person knows “little about the particular field in question.”31

Responding to the defense
Once we know the expected attacks on forensic disciplines using the report, it becomes much easier to defeat them. At any Rule 702 hearing, it is critical to highlight for the judge the significant flaws in the report’s methodology, the composition of its authoring body, and the fact that the report is the product of a policy-oriented (rather than science-oriented) body and process. As noted above, much of PCAST’s membership comes from outside the forensic disciplines the report addresses. Undeterred by this lack of subject-matter expertise, PCAST issued a number of “scientific findings” regarding the validity of various disciplines.32 Those findings are especially questionable given that the report was not itself peer-reviewed prior to release; ironically, one of the report’s own criteria for a study to count toward validity was that it be peer-reviewed. Nor can the report be considered a properly conducted scientific literature review,33 even though it claims to be one.34 A scientific literature review should include a summary, classification, and comparison of each article reviewed.35 PCAST purports to have reviewed more than 2,000 papers,36 but it fails to provide individual analyses of them.37
    With all of those flaws noted, argue to the judge that statements contained in the report should not be admissible under the Rule 803(18) exception for learned treatises because the report is not accepted as a reliable authority. The responses to the report from the various forensic discipline working groups, as well as from the Department of Justice and other federal agencies, should demonstrate to the judge that the report is not a reliable authority. We should also attempt to obtain specific findings of fact from the court regarding the report’s flaws to support appropriate conclusions of law. Findings that directly address the report’s authorship, lack of peer review, and general rejection throughout the forensic science community will be relatively straightforward to support from the record and should lead to conclusions regarding the report’s unreliability. If the defense offers a copy of the report for the record, prosecutors must ensure that we offer copies of any reports, studies, affidavits, or statements supporting the State’s opposition. Because our counterattack is against the report as a whole, responses from disciplines outside the scope of the motion at issue are still of value (e.g., filing the ATF and AFTE responses when opposing a motion to exclude latent print analysis). For example, one opposition to a motion to exclude firearm and toolmark testimony, used by the U.S. Attorney’s Office in the District of Columbia, included an appendix totaling over 1,100 pages. Establishing the report’s unreliability in the record early on will help shape appellate arguments regarding the defense’s challenge to forensic expert testimony. It will also help rebut attempts to use the report as “novel evidence” to attack forensic disciplines.
    Next, even if we preclude direct use of the report, we still have to prepare our expert witnesses for attacks based upon it. Whether preparing a DNA analyst, latent print examiner, or firearms and toolmark examiner, make sure that trial preparation includes reviewing the body of validation studies for the relevant field, especially those directly addressed in the report. Our experts should be familiar with the flaws in any study the report relies upon (such as the exclusion of verification processes or the use of incorrect statistical calculations) and with how PCAST used that study. This is also the point where prosecutors can anticipate more discipline-specific attacks and tailor our responses accordingly.
    In some cases, we may want to keep our powder dry and let the report come in. If we are trying a case before a judge who will admit the report regardless of the State’s objections (or if the report is being used by a defense expert whom we can discredit on cross-examination), there may be tactical value in not tipping our hand before dissecting the report in front of the jury. Whether to attempt outright exclusion or to use the report as fodder for cross-examination will be a situation-specific call for the prosecutor at trial.

Firearms and toolmark examiners
With a firearms and toolmark examiner, we can expect a PCAST-based challenge to claim that there has been only a single validation study for the field, which is insufficient to establish either foundational validity or validity as applied. Such a challenge will likely also attack the discipline as entirely subjective. Our response in this scenario would focus on the consecutive manufacture studies and the 10-barrel study.38 At its heart, firearms and toolmark identification relies upon the fact that even items manufactured to the same specifications will have minor variations due to the gradual, microscopic wear of the tools manufacturing them. In the case of firearms, this means that otherwise identical barrels will have slight variations in their rifling.