
SYM 04: Judging Expertise

Thursday, June 12, 2025
2:30 PM - 3:50 PM
Belling Suite

Overview

Symposium organiser: Tess Neal


Details

Judges, grant agency program officers, interdisciplinary journal editors, and policymakers, among other “gatekeepers,” are in the challenging position of being expected to recognize and screen out unreliable, poor-quality science proffered by experts across fields. They are expected to do so under intense pressure with regard to time, stakes, and resources. This symposium includes four papers unpacking when, how, and why this charge is so difficult, with a particular focus on the courts. Psychological insights for improving the assessment of expertise are offered as well.


Speaker

Prof Kristy Martire
Professor
UNSW Sydney

Psychological Insights for Judging Expertise

Symposium Presentation

This review, commissioned by Nature Reviews Psychology, addresses a critical societal problem that psychology is uniquely positioned to address: the challenge of distinguishing genuine experts from pseudo-experts. We examine the cognitive processes that underpin genuine expertise and explore the practical methods used to evaluate it. We distinguish between ‘show-it’ and ‘know-it’ expert performances. The key difference lies in their visibility, measurability, and immediacy: ‘know-it’ performances are particularly challenging and critical to interrogate. This distinction serves as a heuristic for identifying when evaluations of expertise require greater care and should incorporate a range of diagnostic factors, including foundational and applied validity.

Paper Number

500
Prof Tess Neal
Associate Professor and Dean's Professor
Iowa State University

Experts Screening Experts: U.S. Courts Are Failing to Gatekeep Psychological Expert Quality

Symposium Presentation

Psychological testing, based on psychometric science, often informs legal decisions that profoundly affect people’s lives, such as eligibility for disability benefits, child custody, and whether and where someone will serve a criminal sentence. We investigate psychological tests introduced as legal evidence throughout the entire history of U.S. case law, finding a sharp increase in recent years. Although the law requires judges to screen for quality before allowing an expert to testify, that rarely occurs (we estimate 1.66% of cases). This finding raises questions about the rigor of legal admissibility standards and the functioning of those rules regarding expert evidence quality.

Paper Number

499
Prof Tess Neal
Associate Professor and Dean's Professor
Iowa State University

Can Courts Effectively Gatekeep Expert Evidence?: Robust Data from One Domain

Symposium Presentation

U.S. law requires courts to recognize and screen the quality of evidence proffered by experts across fields. Yet, there is wide quality variation in expert evidence, with scant legal scrutiny. Are legal professionals able to distinguish varying qualities of expert evidence but do so sparingly for strategic or resource-related purposes, or are they unable to distinguish quality? Through preregistered experiments and surveys with judges, attorneys, and forensic psychologists, we find judges and attorneys can gauge evidence quality in laboratory conditions. However, we also report the reasons they outline for routine failure to do so in real-world cases.

Paper Number

522
Dr Jennifer Johan
PhD
University of New South Wales

Developing and Testing a Checklist to Improve Scientific Reasoning in Complex Decision Tasks

Symposium Presentation

Psychological expert evidence can play a substantial role in legal proceedings. However, research has identified problems with this evidence, including concerns about the reasoning underlying expert opinions. Across five studies (involving 1,579 participants), we constructed and evaluated the effectiveness of a reasoning checklist for improving the application of scientific reasoning in complex decision tasks. Across these studies, we find that the reasoning checklist improved reasoning and argument quality, as well as their evaluation by a legal audience. Accordingly, our findings lay a foundation for future research to continue exploring methods of improving the quality of expert opinion evidence.

Paper Number

491