
What Is Forensic Science?

Forensic science is the application of scientific principles and methods to matters of law, encompassing the collection, preservation, analysis, and presentation of physical evidence in criminal and civil proceedings. It spans dozens of disciplines — from DNA analysis and fingerprint examination to toxicology, ballistics, and digital forensics — united by the common goal of using science to establish facts relevant to legal questions.

The Crime Scene: Where It All Starts

Everything in forensic science flows from the crime scene. Mess this up, and the best laboratory in the world can’t fix it.

Scene Documentation

Before anyone touches anything, the scene must be thoroughly documented. This means photography — wide-angle shots establishing the scene, mid-range photos showing the spatial relationships between evidence, and close-ups of individual items with and without measurement scales. Video walkthroughs capture the overall scene. Detailed sketches record precise measurements because photographs can distort distances and perspectives.

Modern scenes may also be documented with 3D laser scanning, creating a detailed digital model that can be revisited and measured long after the physical scene has been released. This technology — adapted from surveying and architectural applications — captures millions of spatial data points in minutes and allows investigators to virtually revisit the scene from any angle.

Evidence Collection

Different types of evidence require different collection methods. Blood samples are collected on sterile swabs and dried to prevent bacterial degradation of DNA. Trace evidence (fibers, hair, glass fragments) is collected with tweezers, tape lifts, or vacuuming. Firearms are packaged to preserve fingerprints on the surface and residue in the barrel. Documents are handled with gloves and placed in protective sleeves.

The chain of custody — a documented record of who handled the evidence, when, and where — must be maintained from crime scene to courtroom. Any break in the chain creates grounds for challenging the evidence’s admissibility. Every transfer is logged. Every opening of a sealed evidence package is recorded.
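Conceptually, a chain-of-custody record is an append-only log of hand-offs, where each transfer must pick up from whoever last held the item. A minimal sketch (all names, fields, and values here are hypothetical, not any agency's actual format):

```python
from dataclasses import dataclass, field

@dataclass
class Transfer:
    """One link in the chain: who handed the item to whom, when, and why."""
    timestamp: str      # e.g. an ISO 8601 string
    from_person: str
    to_person: str
    purpose: str        # e.g. "transport to lab", "DNA extraction"

@dataclass
class EvidenceItem:
    item_id: str
    description: str
    chain: list = field(default_factory=list)

    def log_transfer(self, timestamp, from_person, to_person, purpose):
        self.chain.append(Transfer(timestamp, from_person, to_person, purpose))

    def is_chain_continuous(self):
        """Each transfer must hand off from whoever last received the item."""
        for prev, cur in zip(self.chain, self.chain[1:]):
            if prev.to_person != cur.from_person:
                return False
        return True

item = EvidenceItem("E-001", "sterile swab, bloodstain, kitchen floor")
item.log_transfer("2024-05-01T10:12", "CSI Alvarez", "Evidence Tech Lee", "intake")
item.log_transfer("2024-05-02T09:00", "Evidence Tech Lee", "Analyst Park", "DNA extraction")
print(item.is_chain_continuous())  # True: every hand-off is accounted for
```

A gap in the log — a transfer whose sender isn't the last recorded recipient — is exactly the kind of break that invites an admissibility challenge.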

Contamination prevention is paramount. Crime scene investigators wear protective equipment not just for personal safety but to prevent their own DNA, fingerprints, and trace evidence from contaminating the scene. Separate tools are used for different evidence items. Items from different suspects are processed in different areas.

Locard’s Exchange Principle

Edmond Locard, a French forensic scientist working in the early 1900s, formulated the principle that underlies all physical evidence analysis: “every contact leaves a trace.” When two objects come into contact, material is transferred between them. A burglar entering through a window leaves fibers on the frame and carries glass fragments on their clothing. A killer leaves DNA on the victim and carries the victim’s blood on their hands.

This principle — simple in concept, enormously powerful in practice — drives forensic evidence collection. If a suspect was at the scene, there should be traces. If a weapon was used in contact, there should be transfer. The forensic scientist’s job is to find, preserve, and analyze those traces.

DNA Analysis: The Gold Standard

DNA fingerprinting has been called the greatest forensic advance since actual fingerprinting. And that’s not an exaggeration.

How It Works

Human DNA is 99.9% identical between individuals. Forensic DNA analysis focuses on the 0.1% that varies — specifically, on Short Tandem Repeats (STRs), regions of DNA where a short sequence (typically 4 bases long) repeats a variable number of times. Different people have different numbers of repeats at each location.

The FBI’s Combined DNA Index System (CODIS) uses 20 STR loci (locations) for identification. At each locus, a person has two alleles (one from each parent). The probability that two unrelated people would match at all 20 loci is astronomically small — on the order of one in billions of trillions.
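To see why those numbers get so small, note that the loci are inherited essentially independently, so per-locus match probabilities multiply. A back-of-the-envelope calculation (the 1-in-10 per-locus figure is purely illustrative, not a CODIS statistic; real per-locus values vary by allele frequencies and population):

```python
# Illustrative random-match-probability calculation.
# Assume roughly a 1-in-10 chance that two unrelated people share
# the same pair of alleles at any one STR locus (an assumed figure).
per_locus_match = 0.1
num_loci = 20

# Independent loci => the probabilities multiply.
combined = per_locus_match ** num_loci
print(f"Combined random match probability: about 1 in {1 / combined:.0e}")
# 0.1 ** 20 = 1e-20, i.e. about one in a hundred billion billion
```

Even with a generous per-locus match probability, twenty independent loci drive the combined probability far below one in the world's population.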

The laboratory process involves:

  1. Extraction: Separating DNA from the biological sample (blood, saliva, skin cells, semen).
  2. Quantification: Measuring how much DNA was recovered.
  3. Amplification: Using Polymerase Chain Reaction (PCR) to make millions of copies of the target STR regions. This is what makes modern DNA analysis so powerful — even tiny samples containing just a few cells can be amplified to detectable levels.
  4. Separation and detection: Using capillary electrophoresis to separate the amplified STR fragments by size and detect them with fluorescent labels.
  5. Interpretation: Comparing the resulting DNA profile to known reference samples or searching it against a database.
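The interpretation step for a clean single-source sample can be caricatured in a few lines: a profile is a set of allele pairs keyed by locus, and the comparison checks for agreement at every locus that was successfully typed. (Real interpretation, especially of mixtures and degraded samples, is far more involved; the locus names below are genuine CODIS loci, but the allele values are invented.)

```python
def profiles_match(evidence: dict, reference: dict) -> bool:
    """Compare single-source STR profiles: every typed locus must agree.

    Profiles map locus name -> sorted pair of allele repeat counts,
    e.g. {"D8S1179": (12, 14)}. Loci missing from the evidence profile
    (e.g. dropped out due to degradation) are simply not compared.
    """
    return all(
        reference.get(locus) == alleles
        for locus, alleles in evidence.items()
    )

evidence = {"D8S1179": (12, 14), "D21S11": (28, 30)}
suspect  = {"D8S1179": (12, 14), "D21S11": (28, 30), "TH01": (6, 9.3)}
print(profiles_match(evidence, suspect))  # True: agreement at every typed locus
```

A single non-matching locus excludes the reference sample outright, which is why exclusion is often the easiest and most confident conclusion DNA analysis can offer.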

Touch DNA and Low-Copy-Number Analysis

Modern techniques can extract DNA from incredibly small samples — the skin cells left behind when you touch a surface. This “touch DNA” has expanded the range of evidence that can be analyzed, but it’s also created new challenges.

With very small amounts of DNA, stochastic (random) effects can distort the results. An allele might not amplify. A contaminant’s DNA might be present at detectable levels. Mixtures of DNA from multiple contributors become extremely difficult to interpret. The line between a genuine match and an artifact of the analysis becomes blurry.

This has led to controversies around probabilistic genotyping software, which uses algorithms to calculate the likelihood ratios for DNA mixtures. Different software programs can produce different results from the same data. The scientific community hasn’t fully resolved questions about validation and transparency of these algorithms.
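The likelihood-ratio framework at the heart of these tools is simple to state, even though the mixture models built on it are elaborate. A toy single-source version with made-up numbers (real probabilistic genotyping must model allele dropout, drop-in, peak heights, and multiple contributors):

```python
# Toy likelihood ratio for single-source DNA evidence.
# Hp: the suspect is the source of the DNA.
# Hd: an unknown, unrelated person is the source.
# All probabilities below are illustrative, not from any real case.

p_evidence_given_suspect = 1.0     # suspect's profile matches exactly
p_evidence_given_random = 1e-20    # assumed random match probability

likelihood_ratio = p_evidence_given_suspect / p_evidence_given_random
print(f"LR = {likelihood_ratio:.0e}")
# The evidence is 10^20 times more probable if the suspect is the
# source than if an unrelated random person is.
```

The controversy is not over this ratio itself but over how the two probabilities are computed for messy, low-level mixtures — which is where different software packages can diverge.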

The DNA Database

CODIS contains over 22 million offender profiles and more than 5 million forensic profiles (from crime scenes). Since its inception, it has produced over 700,000 investigative leads. When a crime scene DNA profile is uploaded and matches an existing offender profile, investigators have a suspect they might never have identified otherwise.

Familial DNA searching — looking for partial matches that might indicate a relative of the offender is in the database — has solved high-profile cases including the Golden State Killer, identified through a genealogy database rather than CODIS.

But DNA databases raise privacy concerns. Who should be in the database? Just convicted felons? Anyone arrested? Everyone? Different jurisdictions draw the line differently, and the debate reflects fundamental tensions between public safety and privacy.

Fingerprint Analysis

Fingerprints remain one of the most widely used forms of forensic identification, despite being first used over a century ago.

The Science

The friction ridge patterns on your fingertips are formed during fetal development and remain unchanged throughout your life (barring injury). No two people — including identical twins — have been found to share identical fingerprints, though a definitive mathematical proof of uniqueness has never been established.

Fingerprint examiners compare prints using three levels of detail: the overall pattern (arch, loop, or whorl), the minutiae (specific features where ridges end, bifurcate, or form other characteristic shapes), and the fine detail (ridge edge shapes, pore positions). Identification is based on finding sufficient corresponding minutiae between the questioned print and a known print.
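A crude sketch of the minutiae-level comparison, assuming the two prints have already been aligned in a shared coordinate frame (real matchers handle rotation, skin distortion, and scoring far more carefully; all coordinates here are invented):

```python
import math

def count_corresponding(minutiae_a, minutiae_b, tolerance=10.0):
    """Count minutiae in A with a same-type neighbor in B within tolerance.

    Each minutia is (x, y, kind), where kind is "ending" or "bifurcation".
    Assumes the prints are pre-aligned; tolerance is in the same units
    as the coordinates.
    """
    count = 0
    for (xa, ya, ka) in minutiae_a:
        for (xb, yb, kb) in minutiae_b:
            if ka == kb and math.hypot(xa - xb, ya - yb) <= tolerance:
                count += 1
                break  # each minutia in A is matched at most once
    return count

latent = [(10, 20, "ending"), (35, 40, "bifurcation"), (60, 15, "ending")]
known  = [(12, 21, "ending"), (36, 38, "bifurcation"), (90, 90, "ending")]
print(count_corresponding(latent, known))  # 2 of the 3 minutiae correspond
```

What counts as "sufficient" correspondence is exactly the judgment call examiners make — some countries historically required a fixed minimum number of matching minutiae, while U.S. practice leaves it to the examiner.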

AFIS: The Automated System

The Automated Fingerprint Identification System (AFIS) — implemented nationally by the FBI, first as IAFIS and since 2014 as the Next Generation Identification (NGI) system — contains fingerprints from over 160 million individuals. When a crime scene fingerprint is entered, the system searches its database and returns a list of candidate matches, ranked by similarity.

Critically, AFIS doesn’t make identifications. It narrows the field. A trained examiner must then compare the candidates to the crime scene print and make the identification decision. This human element is both the system’s strength (pattern recognition that computers still struggle with) and its weakness (subjective judgment subject to bias).

Controversies and Reforms

The 2004 Brandon Mayfield case shook the fingerprint community. Mayfield, an Oregon attorney, was incorrectly linked to the Madrid train bombings based on a fingerprint match made by FBI examiners. Spanish authorities ultimately identified the actual source of the print. The error resulted from confirmation bias — once the initial examiner declared a match, verification examiners agreed despite discrepancies.

This case and others prompted significant reforms: blind verification procedures (the verifying examiner doesn’t know the first examiner’s conclusion), documentation requirements, proficiency testing, and research into error rates. A major study by the FBI and academic researchers found a false positive rate of approximately 1 in 306,000 comparisons — low, but not zero.

Toxicology

Forensic toxicology determines whether drugs, alcohol, poisons, or other chemical substances contributed to death or impairment.

Postmortem Toxicology

When someone dies unexpectedly, the medical examiner collects blood, urine, vitreous humor (eye fluid), liver tissue, and sometimes other specimens for toxicological analysis. Immunoassay screening tests detect broad categories of substances, and positive results are confirmed and quantified using gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-tandem mass spectrometry (LC-MS/MS).

Interpretation is where the complexity lies. A drug found in postmortem blood doesn’t necessarily mean it caused death. The concentration matters, the person’s tolerance matters, drug interactions matter, and postmortem redistribution (drugs moving between tissues after death) can alter concentrations. A competent forensic toxicologist considers all these factors when rendering an opinion.

Impaired Driving

Blood alcohol concentration (BAC) measurement for DUI cases is one of the highest-volume forensic analyses performed. Breath testing — typically fuel-cell sensors in roadside screening devices and infrared spectroscopy in evidential instruments — provides immediate results, while blood testing in the laboratory provides more accurate confirmation.

The legal limit in most U.S. states is 0.08% BAC. At this concentration, the average person experiences reduced coordination, impaired judgment, and slowed reaction times. But individual variation is substantial — chronic heavy drinkers may show little impairment at 0.08%, while infrequent drinkers may be significantly impaired at 0.05%.
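That individual variation is exactly what rough BAC estimators gloss over. The classic Widmark formula gives a ballpark peak concentration minus average elimination (the constants below are textbook averages; actual values differ substantially from person to person, which is why measured BAC, not an estimate, is what counts legally):

```python
def estimate_bac(drinks, weight_kg, hours, r=0.68):
    """Rough Widmark estimate of blood alcohol concentration (% w/v).

    drinks:    number of US standard drinks (~14 g ethanol each)
    weight_kg: body weight in kilograms
    hours:     hours elapsed since drinking began
    r:         Widmark distribution factor (~0.68 men, ~0.55 women)
    """
    alcohol_g = drinks * 14.0
    distribution_g = weight_kg * 1000.0 * r       # apparent distribution mass
    peak_bac = alcohol_g / distribution_g * 100.0 # grams per 100 mL, as %
    eliminated = 0.015 * hours                    # average elimination, %/hour
    return max(peak_bac - eliminated, 0.0)

# e.g. an 80 kg man, 4 standard drinks over 2 hours:
print(f"{estimate_bac(4, 80, 2):.3f}%")  # about 0.073%
```

Note how close that example lands to the 0.08% limit — small changes in body weight, sex, or timing push the estimate across the line in either direction.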

Drug-impaired driving is a growing concern, particularly with marijuana legalization. Unlike alcohol, there’s no established blood concentration that reliably predicts impairment for most drugs. THC concentrations don’t correlate well with impairment because the drug distributes into fat tissue and is released slowly. This makes prosecution of drug-impaired driving cases significantly more challenging than alcohol cases.

Firearms and Toolmark Analysis

Ballistics

When a gun fires, the barrel leaves microscopic marks on the bullet — striations from the rifling grooves that spin the bullet. The firing pin leaves a characteristic impression on the cartridge case. An ejector leaves marks. The breech face leaves marks.

Firearms examiners compare these marks under a comparison microscope to determine whether a specific firearm fired a specific bullet or cartridge case. The examiner looks for sufficient agreement in the microscopic toolmark patterns to declare an identification.

This discipline has faced scientific criticism. A 2009 National Academy of Sciences report noted that “sufficient agreement” is subjectively defined, error rates haven’t been rigorously established, and the fundamental assumption — that every firearm leaves unique marks — hasn’t been scientifically proven. Research is ongoing to develop more objective, statistically grounded methods.

Gunshot Residue

When a firearm discharges, a cloud of particles containing lead, barium, and antimony is expelled from the barrel and action. These particles can land on the shooter’s hands, clothing, and nearby surfaces. Detection of gunshot residue (GSR) using scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM/EDX) can indicate that a person was near a discharged firearm.

But GSR evidence has significant limitations. Particles can transfer from one surface to another (a police officer who handled a gun can transfer GSR to a suspect during arrest). Environmental contamination from other sources (fireworks, car brake pads) can produce similar particles. And GSR dissipates quickly — after a few hours of normal activity, most particles will have been lost from the hands.

Trace Evidence

Trace evidence includes the small — sometimes microscopic — materials transferred during physical contact.

Fibers

Textile fibers can link a suspect to a scene, a victim, or a vehicle. A wool fiber from a suspect’s sweater found on a victim’s clothing supports contact between them. The evidential value depends on the fiber type (a common white cotton fiber is much less significant than an unusual synthetic blend), the number of fibers found, and whether the transfer makes sense in context.

Fiber analysis uses microscopy and instrumental techniques like microspectrophotometry (measuring color precisely) and Fourier-transform infrared spectroscopy (FTIR, identifying the chemical composition). Fibers can usually be classified to type and color but rarely linked to a specific garment to the exclusion of all others.

Glass

Glass fragments from broken windows, headlights, or bottles can be analyzed for refractive index, elemental composition, and physical characteristics. Modern techniques using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) can distinguish glass from different manufacturing batches with high discrimination.
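In its simplest form, refractive-index comparison asks whether a questioned fragment falls within the spread of replicate measurements from the known source. A toy interval check (the measurement values are invented, and real casework uses calibrated instruments and more rigorous statistics):

```python
import statistics

def consistent_with_source(fragment_ri, source_ris, k=3.0):
    """Is the fragment's refractive index within k sample standard
    deviations of the known source's mean? A toy screening rule —
    consistency is not an identification."""
    mean = statistics.mean(source_ris)
    sd = statistics.stdev(source_ris)
    return abs(fragment_ri - mean) <= k * sd

# Replicate measurements from the broken window (invented values):
window = [1.51872, 1.51875, 1.51870, 1.51874, 1.51871]
print(consistent_with_source(1.51873, window))  # True
print(consistent_with_source(1.52100, window))  # False
```

Even when a fragment is consistent with the source, the evidential weight depends on how common glass with that refractive index is — which is where elemental techniques like LA-ICP-MS add discriminating power.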

Paint

Automotive paint analysis is particularly useful in hit-and-run cases. Vehicles have multiple paint layers (primer, basecoat, clearcoat) with specific compositions and thicknesses. A paint chip recovered from a victim’s clothing can be compared to paint databases to identify the vehicle make, model, and year range.

The Crisis of Confidence

Forensic science has faced a reckoning over the past two decades, and understanding this context is essential.

The 2009 National Academy of Sciences report, Strengthening Forensic Science in the United States: A Path Forward, was devastating. It found that many forensic disciplines lacked rigorous scientific validation, had no established error rates, and relied on subjective judgment without standardized criteria. Only DNA analysis was praised for having a strong scientific foundation.

Specific disciplines were singled out:

Bite mark analysis was found to have no scientific basis for claiming that bite marks can reliably identify a specific individual. Multiple wrongful convictions have resulted from bite mark testimony. The Texas Forensic Science Commission recommended a moratorium on bite mark evidence in 2016.

Microscopic hair comparison — examining hair under a microscope to associate it with a suspect — was found to produce false positive rates far higher than practitioners claimed. An FBI review of testimony by its hair examiners found that they had overstated the significance of hair matches in 90% of cases reviewed, contributing to wrongful convictions.

Arson investigation underwent a revolution when research demonstrated that many “indicators of arson” traditionally used by fire investigators — pour patterns, crazed glass, V-patterns — can be produced by ordinary fires. Cases where people were convicted (and in some cases executed) based on now-discredited arson science remain deeply troubling.

The response to these critiques has been constructive, if slow. The National Institute of Standards and Technology (NIST) established the Organization of Scientific Area Committees (OSAC) to develop forensic science standards. Research into error rates is expanding. Accreditation requirements are tightening. But the gap between what the public believes about forensic science (largely shaped by television) and the actual state of the evidence is still substantial.

The Role of Cognitive Bias

Forensic science has belatedly recognized that cognitive bias affects examiners just like everyone else. When a fingerprint examiner knows the suspect confessed, they’re more likely to find a “match.” When a forensic pathologist knows police suspect homicide, they’re more likely to rule the manner of death as homicide.

Research by Itiel Dror and others has demonstrated these effects experimentally. The same fingerprint evidence presented with different contextual information produces different conclusions from the same examiners. This isn’t fraud — it’s the well-documented operation of confirmation bias and other cognitive effects that all humans experience.

Countermeasures include:

  • Linear sequential unmasking: Examining evidence before learning case context, then gradually introducing contextual information and documenting whether conclusions change.
  • Blind verification: Having a second examiner verify conclusions without knowing the first examiner’s findings.
  • Task-relevant information only: Limiting the contextual information examiners receive to what’s necessary for the analysis.

These reforms are being adopted unevenly across laboratories. Some have embraced them; others resist, viewing them as unnecessary obstacles.

Emerging Technologies

Forensic science continues to evolve. Rapid DNA instruments can produce a DNA profile from a cheek swab in about 90 minutes, enabling identification at the booking stage rather than weeks later in the laboratory. Next-generation sequencing expands the amount of genetic information extractable from a sample, potentially providing physical appearance predictions (eye color, hair color, skin tone) and biogeographic ancestry estimates.

Forensic genealogy uses DNA profiles uploaded to public genealogy databases to identify suspects through their relatives — the technique that identified the Golden State Killer. It raises profound privacy questions about whether people who voluntarily upload their DNA for ancestry research have consented to being part of a law enforcement search tool.

AI and machine learning are being explored for pattern recognition tasks — automated fingerprint comparison, facial recognition from surveillance footage, and handwriting analysis. The promise is increased objectivity and consistency. The risk is automating biases present in training data.

Key Takeaways

Forensic science applies scientific methods to legal investigation across dozens of disciplines. DNA analysis provides the strongest evidence, with well-established scientific foundations and known error rates. Other disciplines — fingerprints, firearms, trace evidence — are valuable but face ongoing scrutiny regarding scientific rigor and the role of subjective judgment. The field has undergone a significant crisis of confidence following the 2009 NAS report, leading to reforms in standards, training, and bias mitigation. As technology advances — rapid DNA, forensic genealogy, AI-assisted analysis — the field continues to evolve, raising new questions about reliability, privacy, and the proper role of science in the justice system.

Frequently Asked Questions

How reliable is forensic science?

Reliability varies dramatically by discipline. DNA analysis is highly reliable with strong scientific foundations and known error rates. Fingerprint analysis has a solid track record but historically lacked standardized error rate studies. Other disciplines — bite mark analysis, hair comparison, footwear analysis — have been criticized for lacking rigorous scientific validation. A landmark 2009 National Academy of Sciences report called for significant reforms across many forensic disciplines.

Can DNA evidence be wrong?

The DNA analysis itself is extremely accurate, but errors can occur in sample collection (contamination), handling (mislabeling), interpretation (mixed profiles), and database searches (coincidental matches in large databases). Human error, not scientific failure, accounts for most DNA-related mistakes. Proper chain of custody and quality controls are essential.

What is the CSI effect?

The CSI effect is the belief that television crime dramas have influenced juror expectations about forensic evidence. Some prosecutors report that jurors expect extensive forensic testing in every case and may acquit when such evidence isn't presented, even when other evidence is strong. Research on whether this effect actually exists is mixed, but it has influenced how both sides present cases.

How long can DNA evidence last?

Under ideal conditions (cool, dry, dark environments), DNA can persist for thousands of years — ancient DNA has been recovered from remains tens of thousands of years old. Under poor conditions (heat, moisture, UV light), DNA can degrade within days or weeks. The quality of DNA evidence depends heavily on how the sample was stored and the environmental conditions it was exposed to.
