Fingerprint Analysis & Documentation – Week 6 Lecture 1 Study Notes
Review of Week 5 and Recurring Problems
- False‐positive fingerprint conclusions discussed last week (cases in Scotland & USA) serve as cautionary tales.
- Core failures identified:
- Incomplete analysis of the unknown (latent) print during the ACE (Analysis–Comparison–Evaluation) process.
- Inadequate or missing documentation of each ACE step → makes peer review or court scrutiny impossible.
- Verification phase of ACE‐V (the “V”) failed as a QA/QC back-stop: verifiers either rubber-stamped the call or were deprived of sufficient documentation to evaluate it.
- Ethical implication: A deficient workflow can place innocent people at risk and erode public trust in forensic science.
Wrongful-Conviction Vocabulary
- Faulty evidence
- Testimony not actually supported by underlying science.
- Misleading evidence
- Opinions overstated or limitations/opposing data concealed.
- Why it matters: Courts and juries often treat forensic testimony as authoritative; unqualified statements magnify the danger.
Illustrative Misidentification Case Files
- Madrid Train Bombing (2004) – Brandon Mayfield mis-ID by FBI.
- Boston example (shown on slide but not detailed) – reminder that high-profile errors recur.
- Elkhart, Indiana
- Chart displayed latent vs. inked print; illustrates subjective mark-ups and potential quality pitfalls.
- Raises question: “Chart Quality?”—underscores documentation standards.
Foundations of Fingerprint Identification (Key Scientific Assertions)
- Individuality
- No two fingerprints share identical ridge configurations (Level 1 patterns & Level 2 minutiae).
- Permanence
- Once formed in utero, ridge detail persists throughout life barring dermal damage.
- Evidence base:
- Welcker longitudinal study (1856–1897) → identical impressions ~41 yrs apart.
- Faulds 1880; Galton 1892; Herschel 1916: early confirmations.
- Modern confirmations: Okajima 1979; Wertheim et al 2002; Wan et al 2003.
- Exceptions / modifiers:
- Scarring penetrating dermis.
- Aging → ridges flatten, dermis loses elasticity, pattern visibility lowers (Okajima 1979).
- Certain systemic medications can erode ridge detail.
- Human growth (child to adult) scales print but topology remains.
- Recoverability
- Latent ridge detail can be transferred to and subsequently visualized on diverse substrates.
Probability & Statistical Models for Identification
- Approx. two dozen models proposed since 1892; none yet validated under operational casework conditions.
- Common theme: all predict astronomically low probability that two random individuals share a given arrangement of minutiae—yet assumptions & independence questions remain.
Galton Model (1892)
- Relied on predicting minutiae occurrence from surrounding ridge layout; lacked empirical frequency data.
- Calculated joint probability of a specific minutiae arrangement as \tfrac{1}{68\,000\,000\,000}.
- Crude by modern standards but seminal—it framed the individuality debate.
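- Worked reconstruction (not spelled out on the slide; this is the factor breakdown usually reported in later reviews of Galton's model): \tfrac{1}{16} for pattern type \times \tfrac{1}{256} for correct ridge counts entering/exiting \times \left(\tfrac{1}{2}\right)^{24} for 24 independently guessed ridge regions = \tfrac{1}{2^{36}} \approx \tfrac{1}{68\,700\,000\,000}, consistent with the ~1-in-68-billion figure above.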
Henry Model (1900)
- Treated each ridge characteristic (rc) as an independent, identically distributed (i.i.d.) event with p=\tfrac{1}{4}.
- Probability of matching 12 rc: \left(\tfrac{1}{4}\right)^{12}=6\times10^{-8}\;(\approx 1{:}17\text{ million}).
- Added pattern‐type weighting—equates to 2 extra rc → for a whorl w/12 rc:
\left(\tfrac{1}{4}\right)^{14}=4\times10^{-9}\;(\approx 1{:}270\text{ million}).
- Critique: rc independence assumption unrealistic; still influential—spawned numerical thresholds (e.g., “12‐point rule”).
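A minimal arithmetic check of the Henry figures above, expressed as a short Python sketch (the function name and structure are mine; only the 1/4-per-rc assumption and the 2-rc pattern weighting come from the model):

```python
# Henry (1900): each ridge characteristic (rc) treated as an independent
# event with probability 1/4; pattern type counts as 2 additional rc.
def henry_match_probability(num_rc: int, pattern_weight: int = 0) -> float:
    """Chance-match probability under Henry's i.i.d. assumption."""
    return 0.25 ** (num_rc + pattern_weight)

print(henry_match_probability(12))                    # ~6.0e-08 (~1 in 17 million)
print(henry_match_probability(12, pattern_weight=2))  # ~3.7e-09 (~1 in 270 million)
```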
Balthazard Numerical Standard (1911)
- First explicit numeric threshold: ≥17 matching rc → identification.
- Applied probabilistic reasoning to a world population of 1.5 billion.
- Allowed reduced threshold for local‐suspect pools (town/country)—foreshadowed conditional probability/Bayesian thinking.
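One common reconstruction of how the ≥17-rc threshold can be recovered from the 1.5-billion population figure (a sketch under assumptions: it keeps Henry's 1/4-per-rc probability and asks when the expected number of chance agreements across all fingers falls below one; the lecture did not give this derivation explicitly):

```python
# Sketch: smallest n for which ~1.5e10 fingers (1.5 billion people x 10 fingers)
# are expected to produce fewer than one coincidental n-rc agreement at p = 1/4.
population_fingers = 1.5e9 * 10

n = 1
while population_fingers * 0.25 ** n >= 1:
    n += 1
print(n)  # 17 -> matches Balthazard's >= 17 rc identification threshold
```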
Locard’s Tripartite Rule (1914)
- NOT a statistical model—rather, an evidential sufficiency heuristic.
- >12 clear matching rc → certainty beyond debate.
- 8–12 rc → identification “marginal”; certainty depends on five quality factors:
a) overall clarity; b) rarity of minutiae; c) clear core & delta; d) visible pores; e) agreement of ridge/furrow width, flow direction, bifurcation angles.
- <8 rc → cannot establish ID; conveys only a presumption proportional to the number of points in agreement.
- Requires at least two competent examiners to concur for parts 1 & 2.
- Modern view (Champod 1995) sees part 3 as proto‐probabilistic.
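A schematic encoding of the tripartite rule above (illustrative only: the threshold bands come from the bullets, but the quality check is collapsed to a simple count and the ≥4-factor cut-off is a hypothetical stand-in, not part of Locard's rule):

```python
# Illustrative sketch of Locard's 1914 tripartite sufficiency heuristic.
def locard_tripartite(matching_rc: int, quality_factors_met: int) -> str:
    """matching_rc: clear ridge characteristics in agreement.
    quality_factors_met: how many of the five quality factors are satisfied
    (clarity, rarity, core/delta, pores, ridge/furrow agreement), 0-5."""
    if matching_rc > 12:
        return "identification: certainty beyond debate"
    if 8 <= matching_rc <= 12:
        # Marginal zone: Locard makes certainty depend on the quality factors;
        # the >= 4 cut-off below is hypothetical, chosen only for illustration.
        if quality_factors_met >= 4:
            return "identification (marginal, quality-dependent)"
        return "insufficient: escalate for review"
    return "no identification: presumption proportional to points in agreement"
```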
Synthesis of Models
- Regardless of approach, calculated random match probabilities extremely small.
- Limitation: none validated with large, real latent databases in operational environments → gap between theoretical certainty & courtroom reality.
Contemporary Fingerprint Research
Latent Residue: Drugs
- Sensitive MS techniques detect cocaine, heroin, morphine down to tens of picograms ( 10^{-11}\,g ) from a single print.
- Also identifies prescription medications—potential investigative intelligence (timeline, user habits).
Latent Residue: Aging of Prints
- Mass‐spec imaging tracked triacylglycerol decay over 7 days for individual donors.
- Rate of chemical degradation donor‐specific; still works on powder‐dusted prints.
- Could eventually provide “time‐since‐deposition” estimates.
Processing New Polymer Banknotes
- Pre‐2015 paper bills: ninhydrin / 1,2‐indanedione effective.
- Post‐2015 polymer notes:
- Apply cyanoacrylate fuming (limited to transparent windows).
- For opaque inked areas use Vacuum Metal Deposition (VMD): vaporized gold adheres to fatty residues, zinc coats gold creating contrast.
- Alternative research (Canadian study JEFSR 2016):
- Natural‐IR powders + forensic light sources.
- Silicone casting material & gelatin lifters as non-destructive options.
FRStat Software (Defense Forensic Science Center)
- Provides likelihood ratios to quantify evidential strength.
- Intended to complement examiner opinion, not replace it.
- Caveats:
- Quality of output proportional to quality of minutiae counts/classification fed into algorithm.
- Must be internally validated before courtroom deployment.
Sex Determination via Amino Acid Profiling
- Women’s latent sweat shows ≈2× higher concentrations of specific amino acids.
- 2015 Analytical Chemistry study (Vol 87 Issue 22) demonstrates laboratory viability; field kit development ongoing.
- Possible role as investigative adjunct or to corroborate other biological traces.
Vision for the Future – Insights from Champod (2015) Post-NAS Report
A) Clarifying the Inference Process
- Advocate abandoning categorical “identification”/“individualization” claims.
- Replace with Bayesian framework: evaluate P(E|Hp) vs. P(E|Hd) (evidence under prosecution vs. defence hypotheses).
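A minimal sketch of the likelihood-ratio form of that framework; the probabilities and prior below are hypothetical placeholders, not outputs of any validated tool such as FRStat:

```python
# Likelihood ratio: LR = P(E | Hp) / P(E | Hd); posterior odds = LR x prior odds.
p_e_given_hp = 0.80    # hypothetical P(evidence | prosecution hypothesis)
p_e_given_hd = 0.002   # hypothetical P(evidence | defence hypothesis)

lr = p_e_given_hp / p_e_given_hd   # ~400: evidence is ~400x more probable under Hp
prior_odds = 1 / 1000              # hypothetical prior odds on Hp
posterior_odds = lr * prior_odds   # ~0.4: still below even odds despite a large LR
print(lr, posterior_odds)
```

The toy numbers illustrate the point of the framework: a large likelihood ratio strengthens, but does not by itself establish, the prosecution hypothesis, which is why it is framed as a complement to examiner judgement rather than a replacement for it.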
B) Improving Transparency
- Full disclosure of case notes, comparison rationale, and analytic uncertainties.
- Move away from authority-based assertions (“unique because I say so”).
C) Avoiding the “Expert Black Box”
- Distinguish individual examiner error rate from systemic error rate.
- A claimed 0.1 % false-positive rate may sound reassuringly small, but it misleads unless accompanied by context (mark complexity, substrate, lab practices); see the volume arithmetic below.
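A small volume-of-casework illustration of why the bare figure needs context (the workload number is hypothetical):

```python
# A per-comparison false-positive rate only becomes meaningful alongside the
# number of non-mated comparisons actually performed.
false_positive_rate = 0.001      # the claimed 0.1 % per comparison
non_mated_comparisons = 50_000   # hypothetical annual volume of non-mated comparisons
expected_false_positives = false_positive_rate * non_mated_comparisons
print(expected_false_positives)  # 50.0 expected erroneous associations per year
```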
D) Introducing Statistics & Reconciling Conflicts
- Probability statements inevitable; courts must prepare for likelihood ratios.
- Research & operational validation still lacking → urgent priority.
- Training programs will need overhaul; legal community education essential.
E) Quantifying Mark Quality
- Need standardized metrics to grade latent print “information content.”
- Allows triage: high‐quality marks → minimal bias safeguards; low‐quality → strict blind verification & context management.
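A toy triage function for the idea (the 0–1 score scale and both cut-offs are hypothetical; as noted above, no standardized information-content metric yet exists):

```python
# Hypothetical triage: route a latent mark to safeguards by quality score.
def triage(quality_score: float) -> str:
    """quality_score: hypothetical information-content grade in [0, 1]."""
    if quality_score >= 0.7:   # hypothetical cut-off
        return "high quality: routine workflow and verification"
    if quality_score >= 0.4:   # hypothetical cut-off
        return "moderate quality: blind verification recommended"
    return "low quality: strict blind verification + context management"
```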
F) Managing Bias
- Bias-mitigation effort should scale with mark quality: the poorer the mark, the stronger the safeguards.
- High clarity: lower risk but still benefits from safeguards.
- Low clarity: enforce blinding, sequential unmasking, peer review.
Ethical, Legal & Practical Takeaways
- Robust documentation and validated quantitative tools are central to sustaining credibility.
- Courts increasingly expect empirical foundations; forensic discipline must evolve from tradition-based to evidence-based practice.
- Interdisciplinary collaboration (statisticians, cognitive scientists, chemists, software engineers) will drive next‐gen fingerprint science.